{"id":213995,"date":"2023-09-05T19:21:34","date_gmt":"2023-09-05T19:21:34","guid":{"rendered":"https:\/\/bestwnews.com\/?p=213995"},"modified":"2023-09-05T19:21:34","modified_gmt":"2023-09-05T19:21:34","slug":"in-monitoring-child-sex-abuse-apple-is-caught-between-safety-and-privacy","status":"publish","type":"post","link":"https:\/\/bestwnews.com\/technology\/in-monitoring-child-sex-abuse-apple-is-caught-between-safety-and-privacy\/","title":{"rendered":"In Monitoring Child Sex Abuse, Apple Is Caught Between Safety and Privacy"},"content":{"rendered":"
In 2021, Apple was embroiled in controversy over a plan to scan iPhones for child sexual abuse materials. Privacy experts warned that governments could abuse the system, and the backlash was so severe that Apple eventually abandoned the plan.

Two years later, Apple is facing criticism from child safety crusaders and activist investors who are calling on the company to do more to protect children from online abuse.

A child advocacy group, the Heat Initiative, has raised $2 million for a new national advertising campaign calling on Apple to detect, report and remove child sexual abuse materials from iCloud, its cloud storage platform.

Next week, the group will release digital advertisements on websites popular with policymakers in Washington, such as Politico. It will also put up posters across San Francisco and New York that say: “Child sexual abuse material is stored on iCloud. Apple allows it.”

The criticism speaks to a predicament that has dogged Apple for years. The company has made protecting privacy a central part of its iPhone pitch to consumers. But that promise of security has helped make its services and devices, two billion of which are in use, useful tools for sharing child sexual abuse imagery.

The company is caught between child safety groups, which want it to do more to stop the spread of such materials, and privacy experts, who want it to maintain the promise of secure devices.
A group of two dozen investors with nearly $1 trillion in assets under management has also called on Apple to publicly report the number of abusive images that it catches across its devices and services.

Two of those investors, Degroof Petercam, a Belgian asset manager, and Christian Brothers Investment Services, a Catholic investment firm, will submit a shareholder proposal this month that would require Apple to provide a detailed report on how effective its safety tools are at protecting children.
“Apple seems stuck between privacy and action,” said Matthew Welch, an investment specialist at Degroof Petercam. “We thought a proposal would wake up management and get them to take this more seriously.”

Apple has been quick to respond to child safety advocates. In early August, its privacy executives met with the group of investors, Mr. Welch said. Then, on Thursday, the company responded to an email from the Heat Initiative with a letter that defended its decision not to scan iCloud. It shared the correspondence with Wired, a technology publication.

In Apple’s letter, Erik Neuenschwander, the director for user privacy and child safety, said the company had concluded that “it was not practically possible” to scan iCloud photos without “imperiling the security and privacy of our users.”

“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems,” Mr. Neuenschwander said.

Apple, he added, has created a new default feature for all child accounts that intervenes with a warning if they receive or try to send nude images. It is designed to prevent the creation of new child sexual abuse material and limit the risk of predators coercing and blackmailing children for money or nude images. It has made those tools available to app developers as well.
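The developer-facing version of that capability is Apple’s SensitiveContentAnalysis framework, introduced with iOS 17. As a rough illustration, here is a minimal Swift sketch of how a messaging app might check a received image before displaying it. The function name shouldBlur and the fail-open error handling are illustrative choices, not Apple’s, and the API names reflect the iOS 17-era framework.

```swift
import Foundation
import SensitiveContentAnalysis // iOS 17+ / macOS 14+; requires the Sensitive Content Analysis entitlement

// Decide whether an incoming image should be blurred before display.
// Analysis runs entirely on device; the image is never uploaded.
@available(iOS 17.0, macOS 14.0, *)
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Respect the user's (or, for child accounts, the parent's) setting:
    // when the feature is off, skip analysis entirely.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // This sketch fails open and shows the image; a stricter app
        // might fail closed instead.
        return false
    }
}
```

Because the classifier runs on the device and only flags likely nudity at display time, this approach avoids the server-side scanning that Apple’s letter argues against.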
In 2021, Apple said it would use technology called image hashes to spot abusive material on iPhones and in iCloud.
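To make that abandoned design concrete: each photo would be hashed on the device and the hash compared against a database of known abuse imagery compiled by child safety organizations. The sketch below is deliberately simplified. SHA-256 stands in for NeuralHash, the perceptual hash Apple actually proposed, and the helper matchesKnownMaterial is hypothetical; the real system also blinded the hash database and used private set intersection with a match threshold, so Apple would learn nothing about photos that did not match.

```swift
import Foundation
import CryptoKit // SHA256 below stands in for NeuralHash, Apple's perceptual hash

// Return true if an image's digest appears in a set of known hashes.
// A cryptographic hash only catches byte-identical copies; NeuralHash
// was designed to survive resizing and re-encoding, and matches were
// to be revealed only after an account crossed a threshold.
func matchesKnownMaterial(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    let hex = SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownHashes.contains(hex)
}
```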
But the company failed to communicate that plan broadly to privacy experts, intensifying their skepticism and fueling concern that governments could abuse the technology, said Alex Stamos, the director of the Stanford Internet Observatory at the Cyber Policy Center, who opposed the idea.
Last year, the company discreetly abandoned its plan to scan iCloud, catching child safety groups by surprise.

Apple has won praise from both privacy and child safety groups for its efforts to blunt the creation of new nude images on iMessage and other services. But Mr. Stamos, who applauded the company’s decision not to scan iPhones, said it could do more to stop people from sharing problematic images in the cloud.

“You can have privacy if you store something for yourself, but if you share something with someone else, you don’t get the same privacy,” Mr. Stamos said.
Governments around the world are putting pressure on Apple to take action. Last year, Australia’s eSafety commissioner issued a report criticizing Apple and Microsoft for failing to proactively police their services for abusive material.
In the United States, Apple made 160 reports in 2021 to the National Center for Missing and Exploited Children, a federally designated clearinghouse for abusive material. Google made 875,783 reports, while Facebook made 22 million. These reports do not always reflect truly abusive material; some parents have had their Google accounts suspended and have been reported to the police for images of their children that were not criminal in nature.
The Heat Initiative timed its campaign ahead of Apple’s annual iPhone unveiling, which is scheduled for Sept. 12. The campaign is being led by Sarah Gardner, who was previously the vice president for external affairs at Thorn, a nonprofit founded by Ashton Kutcher and Demi Moore to combat child sexual abuse online. Ms. Gardner raised money from a number of child safety supporters, including the Children’s Investment Fund Foundation and the Oak Foundation.

The group has built a website that documents law enforcement cases where iCloud has been named. The list will include child pornography charges brought against a 55-year-old in New York who had more than 200 images stored in iCloud.

Ms. Gardner said the Heat Initiative planned to target advertising throughout the fall in areas where Apple customers and employees would encounter it. “The goal is to continue to run the tactics until Apple changes its policy,” Ms. Gardner said.
Kashmir Hill contributed reporting.

Tripp Mickle covers technology from San Francisco, including Apple and other companies. Previously, he spent eight years at The Wall Street Journal reporting on Apple, Google, bourbon and beer.