Privacy advocates are warning that China and other authoritarian governments could use Apple's new child pornography detection tool to hunt down political dissidents.
Apple announced last week that it will begin scanning user photos bound for iCloud against a database of known "Child Sexual Abuse Material." The company claims the system protects user privacy by performing the scan on the phone itself, rather than shipping photo data to hackable outside servers. But privacy advocates say foreign governments could hijack the tool to track down any material they find objectionable.
Once the detection tool takes effect, Apple will "be under huge pressure from governments to use this for purposes other than going after" child pornography, said the Lincoln Network's Zach Graves. "It's not that hard to imagine Beijing saying, 'You're going to use this for memes that criticize the government.'"
Apple maintains a close relationship with China, a major consumer market that houses much of the company's supply chain. The company has used Chinese slave labor to produce its products and lobbied against legislation restricting imports made with forced labor. Apple also stores the personal data of Chinese users on state-owned servers.
Tech companies such as Google already scan user content for child pornography, but Apple is the first to embed that function directly into users' devices. Apple's method relies on visual hashes: compact digital fingerprints designed so that the same photo, even after small edits, produces a nearly identical hash. The tool matches the hashes of user photos against a database of known Child Sexual Abuse Material compiled by the National Center for Missing and Exploited Children and other child safety organizations.
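Apple has not published the internals of its hashing scheme, branded NeuralHash, so the sketch below illustrates the general technique with a simple "average hash" in Python: shrink an image to a tiny grayscale grid, record which pixels are brighter than average, and flag photos whose hashes land within a few bits of a known-bad hash. The database, threshold, and function names here are hypothetical stand-ins, not Apple's algorithm.

```python
from PIL import Image  # Pillow: pip install Pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Compute a simple perceptual 'average hash' of an image.

    The image is shrunk to hash_size x hash_size grayscale pixels,
    and each bit records whether a pixel is brighter than the mean.
    Visually similar images yield hashes that differ in few bits.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")


def matches_database(photo_hash: int, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag a photo whose hash is within `threshold` bits of any known hash.

    `known_hashes` stands in for a database of hashes of known material;
    the threshold trades false negatives against false positives.
    """
    return any(hamming_distance(photo_hash, h) <= threshold for h in known_hashes)
```

Apple's actual protocol wraps the matching step in cryptography (a private set intersection scheme), so the device learns nothing about photos that do not match; the sketch above shows only the hash-and-compare core.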
Apple says it is committed to limiting the tool to child pornography and will refuse government requests to expand its scope. But nothing built into the system prevents Apple from swapping in other hash databases in the future. "Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," said India McKinney and Erica Portnoy of the Electronic Frontier Foundation.
Western law enforcement agencies have long pushed for similar access to messaging systems such as Facebook Messenger. Apple has rejected those requests in the past, but experts worry the new system makes it more likely that governments will demand backdoor access to encrypted data.
Other privacy advocates are raising concerns that Apple's detection tool could incorrectly flag innocent pictures. The company says the odds of falsely flagging an account are around one in a trillion, but outside observers suggest that figure holds only under controlled academic tests. Author and former Facebook employee Antonio García Martínez pointed out that Facebook, which uses similar technology, has banned many pictures of nursing mothers.
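To see how such a figure could arise, consider a back-of-the-envelope model: if per-image false matches are independent and an account is only flagged after a threshold number of matches, the account-level false-positive probability falls off extremely fast. The per-image rate, library size, and threshold below are illustrative assumptions, not Apple's published parameters.

```python
from math import exp, factorial


def prob_at_least(threshold: int, expected_matches: float, terms: int = 40) -> float:
    """P(X >= threshold) for X ~ Poisson(expected_matches).

    Approximates the chance that independent per-image false positives
    accumulate to the flagging threshold. Terms shrink rapidly, so
    summing a few dozen past the threshold is plenty.
    """
    return sum(
        exp(-expected_matches) * expected_matches**k / factorial(k)
        for k in range(threshold, threshold + terms)
    )


# Hypothetical inputs: 10,000 photos, a 1-in-a-million per-image false-match
# rate (so 0.01 expected false matches), and an assumed threshold of 30
# matches before an account is flagged.
p = prob_at_least(threshold=30, expected_matches=10_000 * 1e-6)
print(f"{p:.2e}")  # on the order of 1e-93 under these assumed inputs
```

The point of the criticism is not the arithmetic, which holds if the assumptions hold, but the assumptions themselves: real-world photo libraries are not independent random samples, and adversarially crafted images can collide with database hashes far more often than chance predicts.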