
Saturday, August 7, 2021

'Setback for privacy': WhatsApp on Apple's plan to scan iPhones for images of child abuse - Hindustan Times

Cathcart assured that WhatsApp will not adopt Apple’s approach to combat child sexual abuse. (AFP)


  • Apple has revealed its plan to identify and report iPhone users who store known images of child sexual abuse in their iCloud Photos accounts.
By hindustantimes.com | Written by Kunal Gaurav, Hindustan Times, New Delhi
PUBLISHED ON AUG 07, 2021 04:42 PM IST

The head of WhatsApp has criticised Apple’s plan to scan iPhones for images of child sexual abuse, saying the approach introduces “something very concerning into the world.” On Thursday, the tech giant revealed its plan to identify and report iPhone users who store known images of child sexual abuse in their iCloud Photos accounts. While Apple insisted the process is secure and designed to preserve user privacy, not everyone is convinced.

“I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world,” tweeted WhatsApp head Will Cathcart.

Apple said it will use a perceptual hashing tool, called NeuralHash, that analyses an image and maps it to a unique number. The system then performs on-device matching against a database of known hashes of child sexual abuse images provided by the National Center for Missing and Exploited Children (NCMEC) and other child-safety organisations. If a match is found, the image will be manually reviewed; upon confirmation, the user’s account will be disabled and NCMEC will be notified.
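To make the idea concrete, here is a minimal sketch of hash-based matching. This is NOT Apple's NeuralHash (which is derived from a neural network); it uses a simple "average hash" stand-in, where pixels brighter than the image's mean become 1-bits, so the same image after small edits still produces a nearby hash. The function names and the toy database are illustrative assumptions, not Apple's API.

```python
# Illustrative sketch of perceptual-hash matching, NOT Apple's actual
# NeuralHash algorithm. "Average hash": pixels brighter than the mean
# become 1-bits, so minor image edits flip few bits of the hash.

def average_hash(pixels):
    """Map a grayscale image (list of rows of 0-255 ints) to an int hash."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_database(image_hash, known_hashes, threshold=2):
    """On-device check: is the hash near any entry in the database?"""
    return any(hamming_distance(image_hash, h) <= threshold
               for h in known_hashes)

# Toy 4x4 "images": the second is a uniformly brightened copy of the first,
# so its average hash is identical and the match still fires.
original = [[10, 200, 30, 220],
            [15, 210, 25, 230],
            [12, 205, 28, 225],
            [11, 215, 22, 235]]
tweaked = [[p + 5 for p in row] for row in original]

known_hashes = {average_hash(original)}  # stand-in for the hash database
print(matches_known_database(average_hash(tweaked), known_hashes))  # True
```

The key property, and the source of Cathcart's concern, is that the matching logic is content-agnostic: the same mechanism works for any database of hashes the operator chooses to load.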


Cathcart said that WhatsApp, which has repeatedly prompted Indian users to accept its controversial privacy policy, will not adopt Apple’s approach to combat child sexual abuse. In a series of tweets, he said that WhatsApp reported more than 400,000 cases of child sexual abuse material (CSAM) to NCMEC in 2020 without breaking encryption.

The WhatsApp chief blasted Apple for building software that can scan all the private photos on a phone, instead of focusing on making it easy for users to report such content when it is shared with them. He raised concerns over the use of the scanning tool in countries like China, where the laws defining illegal content are broad and vague.

“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable,” Cathcart added.



