The news: A global coalition of more than 90 policy and civil rights organizations is calling on Apple CEO Tim Cook to put the brakes on the company’s child sexual abuse material (CSAM) scanning tool. The groups—which include the ACLU, the Electronic Frontier Foundation, and the Center for Democracy and Technology—urged Cook to “abandon” the company’s plans, citing fears that the tools could be used to “censor protected speech, [and] threaten the privacy and security of people around the world.”
The letter also claims that algorithms designed to detect sexually explicit material are “notoriously unreliable,” often mistakenly flagging art, educational resources, and other imagery as sexual content.
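To see why such detection systems can misfire, here is a minimal, hypothetical sketch of threshold-based perceptual hash matching; it is illustrative only and is not Apple's actual NeuralHash pipeline. The point: two visually different images can produce hashes close enough to fall within a match threshold, which is the false-positive failure mode the letter describes.

```python
# A minimal, hypothetical sketch of threshold-based perceptual hash
# matching; illustrative only, NOT Apple's actual NeuralHash system.

def dhash(pixels):
    """Toy difference hash: one bit per horizontal neighbor comparison."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two visually different "images" (toy grayscale grids) whose hashes
# land within the match threshold: the false-positive failure mode.
known_image = dhash([[10, 20, 30], [40, 50, 60]])
other_image = dhash([[12, 9, 30], [40, 50, 60]])

THRESHOLD = 1  # hypothetical tolerance for near-duplicate matches
if hamming(known_image, other_image) <= THRESHOLD:
    print("flagged as a match (a false positive here)")
```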
The clock is ticking: The tools are part of the iOS 15 update, which is expected to roll out next month.
How we got here: Apple’s CSAM prevention efforts drew swift and fierce criticism almost as soon as they were announced. Privacy advocates worry the features could weaken Apple’s encryption or be repurposed by governments to quash political dissent.
Why this matters: Prolonged scrutiny of Apple’s privacy commitments could erode one of its key competitive advantages: its reputation for protecting consumer privacy.
The bigger picture: Continued pressure from advocacy groups dashes any hope Apple may have had that the issue would simply fizzle out.
Instead, Apple’s decision to stand by the tool (or even withdraw it) could be a watershed moment for the company, and an inflection point for consumer privacy writ large.