
Apple's Impossible War On Child Exploitation

Brain Food | October 31, 2023

Apple has long been praised for its commitment to user privacy. However, this commitment has come at a cost: the company has fallen behind its competitors in identifying and reporting child sexual abuse material (CSAM) on its platforms.

In 2022, Apple abandoned its plans to scan its systems for CSAM after concerns were raised about the privacy implications of doing so. Instead, the company relies on users to report CSAM when they encounter it. This approach has proven ineffective: Apple submitted just 234 CSAM reports to the National Center for Missing & Exploited Children (NCMEC) in 2022, in stark contrast to other tech companies such as Meta and Google, which submitted millions of reports to NCMEC in the same year.

Apple's lack of proactive scanning for CSAM has been criticized by child safety advocates and law enforcement officials, who argue that the company is putting its commitment to privacy ahead of the safety of children.

The Problem of CSAM

CSAM is a serious and growing problem. In 2022, NCMEC received over 29 million reports of CSAM, a 35% increase from the previous year. This disturbing trend is fueled by the rise of social media and other online platforms that make it easy for predators to share and distribute CSAM.

Apple's Approach to CSAM Detection

Apple has defended its decision not to scan its systems for CSAM, arguing that doing so would open the door to mass surveillance and other privacy abuses. However, child safety advocates and law enforcement officials argue that Apple could develop a way to scan for CSAM without sacrificing user privacy.

One way to do this is a technique called on-device scanning: files are checked on the user's device before they are uploaded to the cloud. This would allow Apple to keep CSAM off its servers without ever seeing the actual content of the files.
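
To make this concrete, here is a minimal Swift sketch of what an on-device, pre-upload check could look like. It is illustrative only: the knownFingerprints set and the shouldUpload function are hypothetical rather than part of any Apple API, and a real deployment would rely on a vetted, securely distributed hash list and would escalate matches for human review rather than simply blocking uploads.

```swift
import CryptoKit
import Foundation

// Hypothetical, locally stored set of hex-encoded SHA-256 fingerprints of
// known CSAM. In practice this list would be vetted and securely distributed.
let knownFingerprints: Set<String> = []

/// Hashes a file on the device and checks it against the local fingerprint
/// set before any upload happens. Only the yes/no result ever needs to
/// leave the device; the file contents do not.
func shouldUpload(fileURL: URL) throws -> Bool {
    let data = try Data(contentsOf: fileURL)
    let fingerprint = SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
    return !knownFingerprints.contains(fingerprint)
}
```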

Another way to scan for CSAM without sacrificing privacy is cryptographic hashing: each file is reduced to a unique digital fingerprint, which can then be compared against a database of known CSAM hashes. If there is a match, Apple can flag the file for further investigation.
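
The matching step itself can be sketched in a few lines. This is a simplified illustration, not Apple's actual mechanism: the HashMatcher type and its members are invented for this example, and production systems typically use perceptual hashes (such as PhotoDNA or Apple's NeuralHash) so that resized or re-encoded copies still match, whereas the exact SHA-256 digests below only match byte-identical files.

```swift
import CryptoKit
import Foundation

// A minimal sketch of fingerprint matching against a database of known
// hashes. HashMatcher and its members are hypothetical names.
struct HashMatcher {
    /// Hex-encoded SHA-256 digests of known CSAM files.
    let knownFingerprints: Set<String>

    /// Computes the digital fingerprint of a file's contents.
    func fingerprint(of data: Data) -> String {
        SHA256.hash(data: data)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Returns the files whose fingerprints appear in the database, so they
    /// can be flagged for further (human) review rather than acted on blindly.
    func flaggedFiles(in urls: [URL]) -> [URL] {
        urls.filter { url in
            guard let data = try? Data(contentsOf: url) else { return false }
            return knownFingerprints.contains(fingerprint(of: data))
        }
    }
}
```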

Apple's Competitors

Apple's competitors, such as Meta and Google, have already implemented proactive CSAM detection measures. Meta scans photos and videos uploaded to Facebook and Instagram against databases of known CSAM using hash-matching technology such as PhotoDNA, and Google applies a similar approach to content uploaded to Gmail and Google Drive.

The Debate Over Privacy and Safety

The debate over Apple's approach to CSAM detection highlights the difficult balance between user privacy and child safety. On one hand, Apple has a legitimate interest in protecting user privacy. On the other, child safety advocates and law enforcement officials argue that Apple has a moral obligation to do more to protect children.

Here are some additional thoughts on the issue:

  1. Apple has argued that on-device scanning is not feasible because it would drain the user's battery and reduce performance. However, other tech companies have implemented on-device scanning without these drawbacks.
  2. Apple has also argued that cryptographic hashing is not effective because it can produce false positives. However, there are ways to mitigate this risk, such as requiring agreement between different hashing algorithms (see the sketch after this list).
  3. Some privacy advocates have argued that scanning for CSAM sets a dangerous precedent and could lead to other forms of mass surveillance. However, it is important to note that CSAM is a unique category of content that is both illegal and harmful.
  4. Ultimately, the decision of whether or not to scan for CSAM is up to Apple. However, the company should be transparent about its decision-making process and listen to the concerns of child safety advocates and law enforcement officials.
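
On the false-positive point in item 2 above, one common mitigation is to require agreement between independent checks before anything is flagged. The Swift sketch below is an assumption-laden illustration with hypothetical names: it assumes two separately maintained hash databases and only flags a file when both its SHA-256 and SHA-512 digests match.

```swift
import CryptoKit
import Foundation

// Hex-encodes any sequence of bytes (e.g. a CryptoKit digest).
func hexString<S: Sequence>(_ bytes: S) -> String where S.Element == UInt8 {
    bytes.map { String(format: "%02x", $0) }.joined()
}

/// Flags a file only when it matches in *both* databases, so a false
/// positive in either one is not enough to trigger action on its own.
/// Both database parameters are hypothetical inputs for illustration.
func isFlagged(_ data: Data,
               sha256Database: Set<String>,
               sha512Database: Set<String>) -> Bool {
    let sha256 = hexString(SHA256.hash(data: data))
    let sha512 = hexString(SHA512.hash(data: data))
    return sha256Database.contains(sha256) && sha512Database.contains(sha512)
}
```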

Here are some things that Apple could do to improve its CSAM detection efforts:

  1. Implement on-device scanning and cryptographic hashing.
  2. Partner with child safety organizations to develop new and innovative ways to detect CSAM.
  3. Be more transparent about its CSAM detection efforts and work with child safety advocates and law enforcement officials to ensure that its approach is effective and balanced.

Conclusion

There is no easy answer to the question of how to balance user privacy and child safety. However, it is clear that Apple's current approach to CSAM detection is not effective enough. The company needs to develop a way to scan for CSAM without sacrificing user privacy.