Business

Apple Plans to Have iPhones Detect Child Pornography, Fueling Privacy Debate

Apple Inc. plans to introduce new iPhone software designed to identify and report collections of sexually exploitative images of children, aiming to bridge the yearslong divide between the company’s pledge to protect customer privacy and law enforcement’s desire to learn of illegal activity happening on the device.

The software, slated for release in an update for U.S. users later this year, is part of a series of changes Apple is preparing for the iPhone to protect children from sexual predators, the company said Thursday.

Apple, which has built much of its brand image in recent years on promises to safeguard users’ privacy, says that its new software will further enhance those protections by avoiding any need for widespread scanning of images on the company’s servers, something that Apple currently doesn’t perform.

After news of Apple’s plans leaked out Wednesday, critics said they worried that by building software that can flag illegal content belonging to its users, Apple may be softening its stance on how it protects user data via encryption—a source of growing contention between the technology giant and law-enforcement organizations over the past decade.

Apple’s system will use new techniques in cryptography and artificial intelligence to identify child sexual abuse material when it is stored using iCloud Photos, the company said. Using software that runs on both the iPhone and Apple’s cloud, Apple will detect whether images on the device match a known database of these illegal images. If a certain number of them—Apple declined to say exactly how many—are uploaded to iCloud Photos, Apple will review the images. If they are found to be illegal, Apple says it will report them to the National Center for Missing and Exploited Children, a private, nonprofit organization established in 1984 under a congressional mandate that serves as a clearinghouse for reports of child abuse.
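The flow the article describes can be illustrated with a deliberately simplified sketch. This is not Apple's actual system, which relies on on-device perceptual hashing and cryptographic techniques the article does not detail; the fingerprint values and the threshold below are hypothetical placeholders, since Apple declined to disclose the real match count.

```python
# Illustrative sketch only: a simplified threshold-based matcher.
# Apple's real system uses perceptual hashing and cryptography not
# shown here; these fingerprints and the threshold are hypothetical.

KNOWN_BAD_HASHES = {"a3f1", "9bc2", "77de"}  # hypothetical fingerprint database
REPORT_THRESHOLD = 3  # Apple has not said what the real number is

def count_matches(uploaded_hashes):
    """Count how many uploaded image fingerprints match the known database."""
    return sum(1 for h in uploaded_hashes if h in KNOWN_BAD_HASHES)

def should_flag_for_review(uploaded_hashes):
    """Flag an account for human review only once the match count
    crosses the threshold, mirroring the flow described above."""
    return count_matches(uploaded_hashes) >= REPORT_THRESHOLD
```

The key design point the article reports is the threshold: a single match does not trigger review, only an accumulation of matches among images uploaded to iCloud Photos does.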

The software will detect illegal images but doesn’t work on videos, Apple says.

The system aims to provide a counter to law enforcement’s criticism that Apple doesn’t do enough to help identify criminals who hide their illegal activity behind encryption.

Apple has made the security it offers users on its iPhones and some other devices a key element of its pitch to consumers, adding ever more features to safeguard their privacy. The privacy stance, at times, has led the company to clash with governments. In the wake of the December 2019 attack by a Saudi aviation student that killed three people at a Florida Navy base, then-Attorney General William Barr called on Apple to find a way to crack the encrypted phones.

The Justice Department also attempted in 2016 under the Obama administration to push Apple to create a software update that would break the privacy protections of the iPhone to gain access to a phone linked to a dead gunman responsible for a 2015 terrorist attack in San Bernardino, Calif. The company has refused to build tools that break the iPhone’s encryption, saying that such software would undermine user privacy.

The issue of sexual images of children stored on Apple iPhones dates to the device’s early days: in 2008, an iPhone was found containing such imagery. A judge ordered Apple to assist the government in unlocking the iPhone. The company complied, though it has since significantly upgraded security features on the device, making data stored there more challenging to access. Law-enforcement authorities frequently point to child pornography and the use of encrypted communications by terrorists to argue for data access.

“Apple’s expanded protection for children is a game changer,” John Clark, president and CEO of the National Center for Missing and Exploited Children, said in a statement Thursday. “The reality is that privacy and child protection can coexist.”

The planned software update to detect child sexual abuse material wouldn’t affect the iPhone’s encryption system, and it would allow iPhone users to keep the data on their devices completely private, Apple says. Users who don’t upload their images to iCloud Photos wouldn’t trigger the detection system, Apple said.

As part of the enhancements, Apple will also add features that will give the iPhone a way of blurring out and then warning children if they are sending or receiving sexually explicit photos via the company’s Messages app. The software can be configured to alert parents, too.


Because Messages is the app used to send encrypted iMessages, Apple’s new alerting functionality could be of interest to governments looking to push Apple into conducting surveillance of its phones, said Matthew Green, an associate professor of computer science at Johns Hopkins University.

“Now Apple has demonstrated that they can build a surveillance system, for very specific purposes, that works with iMessage,” he said. “I wonder how long they’ll be able to hold out from the Chinese government?”

Apple says that its software isn’t designed for mass surveillance or the scanning of content on its devices.

It is an argument that isn’t sitting well with privacy proponents. “Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the Electronic Frontier Foundation, a digital-rights watchdog group, said Thursday.

The iPhone no longer represents the impenetrable data safe it once did as companies including Grayshift LLC, Israel’s Cellebrite Mobile Synchronization Ltd. and others have developed methods to retrieve data from the devices. Spyware from NSO Group, an Israeli company, also has been found on iPhones.

The iPhone maker isn’t the only tech company to spar with governments over the issue of encryption.

Facebook Inc. has rolled out encryption more widely across its platform, also drawing fire from law-enforcement authorities in the U.S. and abroad, who argued, in part, that it would make it more difficult to pursue child-exploitation cases.

“Companies cannot operate with impunity where lives and the safety of our children is at stake,” Mr. Barr said in an open letter criticizing Facebook’s encrypted messaging systems in 2019.

A Facebook spokesman was unable to immediately comment on whether the company would follow Apple’s move.

Apple’s privacy push hasn’t just led to battles with law enforcement. This year the company rolled out a new operating system that makes ad-tracking harder. The move created a clash with Facebook and others and has driven advertisers to shift how they market to consumers.

Write to Robert McMillan at Robert.Mcmillan@wsj.com

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.
