The Cost Of Child Protection: Is Apple fighting child pornography for the right reasons?

On August 5, 2021, Apple announced an upcoming update focused on combating child abuse. The update aims to mitigate the spread of digital content depicting children involved in sexually explicit activities. According to the National Center for Missing and Exploited Children (NCMEC), reports of Child Sexual Abuse Material (CSAM) increased by 28% in 2020. In response to these rising concerns, Apple's iOS 15 update scans users' photos, messages, and email attachments for CSAM. Despite the noble effort to reduce the sexualization of children, Apple's update raises serious user privacy concerns. This article argues that Apple's CSAM feature should not be released and examines the negative impacts such a system could have on society.
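To make the scanning mechanism concrete: Apple's system reportedly compares fingerprints of a user's images against a database of fingerprints of known CSAM supplied by NCMEC. Apple's actual design uses a perceptual hash (NeuralHash) and cryptographic matching protocols; the minimal sketch below substitutes a plain SHA-256 digest (which only matches exact byte-for-byte copies) and uses entirely hypothetical image data and names, purely to illustrate the hash-database idea.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image's exact contents.
    (Apple's real system uses a perceptual hash, not SHA-256.)"""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known prohibited images,
# standing in for the NCMEC hash list the real system matches against.
known_hashes = {fingerprint(b"known-bad-image-bytes")}

def is_flagged(image_bytes: bytes) -> bool:
    """Flag an image if its fingerprint appears in the known-hash set."""
    return fingerprint(image_bytes) in known_hashes

# A device-side scanner would run this check on each photo:
print(is_flagged(b"known-bad-image-bytes"))  # True: matches the database
print(is_flagged(b"ordinary-photo-bytes"))   # False: no match
```

The privacy debate in the rest of this article turns on who controls `known_hashes`: the same matching machinery works identically no matter what content the database is seeded with.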

Most people consider their photos, messages, emails, and searches to be sensitive information. Although Apple is trying to limit the sexualization of children on its devices, there are unintended consequences to building algorithms that can restrict and report on the transmission of information. Apple is currently deploying this technology for a noble cause; however, nothing stops the company from using the same systems to suppress other types of data. For example, Apple could censor political movements, groups, or anything else it deems a threat. Apple users could find themselves flagged for having a conversation over text or researching a political issue.

If users are flagged, they could find themselves unable to converse freely about certain topics or to research what is happening around the world. In that situation, Apple would be deciding what users can and cannot talk about, and users would lose the freedom to express their opinions and learn about the issues in our society. Fighting child abuse is a noble cause, but should it come at the expense of the user?

Apple’s new CSAM feature highlights a central question: how do we monitor and govern our technology in a way that promotes freedom and remains ethical? Although it is important to fight the mistreatment of children, that fight cannot come at the expense of personal freedom. If Apple releases this update, it needs to ensure the software is implemented properly. The mere idea of having their information monitored can make Apple users feel violated. Apple’s goals point in the right direction, but the company must confirm its software will not breach users’ privacy.
