New Delhi: Apple on Friday announced that it would roll out a new system for checking photos on iPhones for child sexual abuse material (CSAM). The Cupertino giant said the decision was taken to curb the distribution of sexually explicit images concerning children.

However, Apple’s move has not gone down well with privacy advocates or with WhatsApp head Will Cathcart. In a series of tweets, Cathcart said that Apple’s approach is wrong and would be a huge setback for people’s privacy all over the world.

Apple is widely considered among the safest companies when it comes to user privacy and security, which makes its announcement that it would scan users’ iCloud photo uploads for child sexual abuse material all the more striking. While the system may succeed in slowing the spread of child abuse imagery, it could also open lapses in users’ privacy and security. In a series of tweets, the WhatsApp head explained why WhatsApp would not adopt Apple’s system.

“Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world. Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy,” Cathcart said in a tweet. He said that there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content.

Cathcart alleged that Apple’s new surveillance system could very easily be used to scan private content for anything the company or a government decides it wants to control. He also raised several questions about the system as a whole.

“What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system? There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this,” he added.

However, while announcing the new system, Apple had said that it learns nothing about images that do not match the known CSAM database. The Cupertino giant also stated that Apple “can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.”
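Apple's described behaviour — no information revealed about an account until a threshold of matches is exceeded — can be illustrated with a minimal sketch. This is purely hypothetical code: the hash values, threshold, and function names are invented for illustration, and the real system relies on cryptographic techniques (on-device perceptual hashing and private set intersection) that this toy comparison does not implement.

```python
# Illustrative sketch of threshold-based match disclosure.
# The hash database, threshold value, and matching logic are
# assumptions for demonstration, not Apple's actual implementation.

KNOWN_CSAM_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in hash database
MATCH_THRESHOLD = 2  # nothing is revealed until more than this many matches

def scan_account(photo_hashes):
    """Return matched hashes only once the match threshold is exceeded."""
    matches = [h for h in photo_hashes if h in KNOWN_CSAM_HASHES]
    if len(matches) <= MATCH_THRESHOLD:
        return None  # below threshold: no metadata or derivatives disclosed
    return matches

print(scan_account(["a1b2", "0000"]))          # below threshold -> None
print(scan_account(["a1b2", "c3d4", "e5f6"]))  # threshold exceeded -> matches
```

The key property sketched here is that a small number of matches yields no output at all; only crossing the threshold exposes anything about the account.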

(Inputs from BT)
