Devices and Apps

Apple Delays Detection of Child Sexual Abuse Material

Apple announced last month that it would begin detecting Child Sexual Abuse Material (CSAM) by automatically scanning photos on users’ iPhones, an effort to curb child exploitation. However, the company has now delayed the rollout after public criticism.

The public reaction to Apple’s new tool on Twitter has been largely negative, with users describing it in phrases such as “big issue,” “fear,” and “deeply concerned.”

The Washington-based Center for Democracy and Technology (CDT) also issued a statement condemning the technology. Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project, stated, “The changes Apple announced are extremely disappointing, given the leadership and commitment to user privacy and communications security it has long demonstrated. Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.” Nojeim also feared that children from abusive homes, or teenagers who identify as LGBTQ, would be most at risk from misuse of this technology, since it could accidentally expose their sexuality (a matter unrelated to pedophilia) to caretakers who may be prone to anger and non-acceptance.

While Apple is improving the system, here’s what you should know about how the program works. First, Apple would cross-reference users’ photos against a database of known material compiled by the National Center for Missing and Exploited Children (NCMEC), using three technologies in combination. A new technology, NeuralHash, analyzes an image on the device and assigns it a unique number; only nearly identical pictures will receive the same number. Next, that number is compared against the numbers stored in NCMEC’s database of illegal material, a matching process carried out with a cryptographic technique known as “private set intersection.” The phone then creates a “safety voucher” recording the result of the match, which is uploaded to iCloud alongside the image. A third technique, “threshold secret sharing,” keeps the contents of the safety vouchers encrypted so that ordinary photos that do not match the NCMEC database remain private.
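For readers who want a more concrete picture of that flow, here is a heavily simplified sketch in Swift. It is not Apple’s NeuralHash or its private set intersection protocol; the hash values, the known-hash list, and the voucher structure below are invented purely to illustrate the hash-then-match-then-voucher sequence described above.

```swift
// Illustrative sketch only; the real system uses a neural perceptual hash
// and cryptographic private set intersection, neither of which is shown here.

// Stand-in for the number NeuralHash would assign to an image.
typealias PerceptualHash = UInt64

// Hypothetical list of hash numbers derived from NCMEC's known material.
let knownHashes: Set<PerceptualHash> = [0x1A2B3C4D5E6F7788, 0x99AA00BB11CC22DD]

// Simplified stand-in for the "safety voucher" stored alongside each photo.
// In the real design the match result is encrypted, not readable like this.
struct SafetyVoucher {
    let imageID: String
    let matchedKnownHash: Bool
}

// On-device step: hash the image, check it against the known list,
// and attach a voucher that travels with the photo to iCloud.
func makeVoucher(imageID: String, hash: PerceptualHash) -> SafetyVoucher {
    SafetyVoucher(imageID: imageID, matchedKnownHash: knownHashes.contains(hash))
}

let voucher = makeVoucher(imageID: "IMG_0042.HEIC", hash: 0x1A2B3C4D5E6F7788)
print("Voucher for \(voucher.imageID): matched = \(voucher.matchedKnownHash)")
```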

Apple will only be alerted if an account surpasses a threshold of known CSAM content; otherwise, the stored photographs remain private. Once Apple is alerted, human reviewers will manually check the flagged content for signs of CSAM and, if it is confirmed, report it to the NCMEC. However, Apple did not specify how many matches are required to cross this threshold. According to Apple, there is only a one-in-one-trillion probability that innocent material, such as a parent’s photo of their baby in the bath, will be flagged.
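That threshold amounts to a simple decision rule, sketched below in the same hypothetical style. Apple has not published the actual number, and in the real system the per-photo match results stay encrypted via threshold secret sharing until the limit is crossed, so the placeholder value and the plain integer count here are assumptions for illustration only.

```swift
// Hypothetical threshold; Apple has not disclosed the real value.
let reviewThreshold = 30

// Apple is alerted and human review begins only once an account's count of
// matched safety vouchers reaches the threshold; below it, nothing is revealed.
func shouldEscalateForHumanReview(matchedVoucherCount: Int) -> Bool {
    matchedVoucherCount >= reviewThreshold
}

print(shouldEscalateForHumanReview(matchedVoucherCount: 3))   // false
print(shouldEscalateForHumanReview(matchedVoucherCount: 42))  // true
```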

The program will roll out with iOS 15 and iPadOS 15.

For more New York City technology news and culture, follow us on Instagram and Twitter @NYCWired, and sign up for our newsletter below.

Ema Gavrilovic

Ema Gavrilovic is a graduate of DePaul University with an M.Ed. in clinical counseling. Her career accomplishments include freelance writing as well as social media and PR consulting. In her spare time, Ema likes exploring the outdoors, cooking, and yoga.