WhatsApp head blasts Apple over tools that can scan private photos on iPhones to curb child abuse
- Apple confirmed plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery.
- The tech giant clarified crucial details of the ongoing project.
- Cathcart, however, said that this is an Apple-built and -operated surveillance system.
New Delhi: WhatsApp Head Will Cathcart has slammed Apple over its plans to launch photo identification measures that would detect child abuse images in iOS photo libraries, saying the Apple software can scan all the private photos on a user's phone, which he called a clear privacy violation.
Stressing that WhatsApp will not allow such Apple tools to run on his platform, Cathcart said that Apple has long needed to do more to fight child sexual abuse material (CSAM), "but the approach they are taking introduces something very concerning into the world".
"I read the information Apple put out yesterday and I`m concerned. I think this is the wrong approach and a setback for people`s privacy all over the world. People have asked if we`ll adopt this system for WhatsApp. The answer is no," he posted in a Twitter thread late on Friday.
"Instead of focusing on making it easy for people to report content that`s shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven`t shared with anyone. That`s not privacy".
On Thursday, Apple confirmed plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery, and clarified crucial details of the project.
According to a report in The Verge, for devices in the US, new versions of iOS and iPadOS rolling out this fall will include "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".
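To make the reported mechanism concrete, the sketch below shows the general hash-matching idea behind such scanning: each photo is hashed, and the hash is checked against a database of known abuse imagery. This is not Apple's implementation (Apple's system reportedly uses a perceptual "NeuralHash" and on-device cryptographic matching rather than a plain file hash), and every name, path, and value in the sketch is hypothetical.

```python
# Purely illustrative sketch of hash-based image matching; not Apple's code.
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse imagery. Real systems use
# perceptual hashes supplied by child safety organizations; this placeholder
# entry is just the SHA-256 of an empty file.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(path: Path) -> str:
    """Hash a photo's raw bytes with SHA-256 (a stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return the photos in photo_dir whose hash appears in the known-hash set."""
    return [p for p in sorted(photo_dir.glob("*.jpg"))
            if image_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    # Hypothetical library location; a real scanner would walk the photo store.
    for match in flag_matches(Path("photos")):
        print("flagged:", match)
```

A cryptographic hash like SHA-256 changes completely if an image is resized or re-encoded, which is why systems of this kind rely on perceptual hashes that tolerate such transformations.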
Cathcart, however, said that this is an Apple-built and -operated surveillance system that could very easily be used to scan private content for anything Apple or a government decides it wants to control.
"Countries where iPhones are sold will have different definitions on what is acceptable," he added.
Apple has said that other child safety groups were likely to be added as hash sources as the programme expands.
"Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?" asked Cathcart.
"What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?" he lamented. Also Read: Now you can co-host on Twitter Spaces with THIS feature
According to 9to5Mac, an internal memo by Sebastian Marineau-Mes, a software vice president at Apple, acknowledged that the new child protections have some people "worried about the implications" but that the company will "maintain Apple's deep commitment to user privacy."