This article is written by Abhishree Paradkar, Symbiosis Law School, Pune
On Thursday, Apple announced a new policy under which it would introduce child safety measures on all its devices. The announcement prompted concern from several digital rights organizations, including the European Digital Rights network (EDRi) and the Electronic Frontier Foundation (EFF), about the security and privacy of the company's global customer base.
OVERVIEW OF APPLE’S NEW POLICY REGARDING CHILD SAFETY MEASURES
Under Apple's new policy, messages sent and received on a minor's account are scanned in order to warn children, as well as their parents, when sexually explicit photos are being received or sent. When a minor receives such a photo, it is blurred; the child is reassured that they do not have to view it and is given details of helpful resources. Children attempting to send such photos are warned first, and their parents may be notified if the child nevertheless decides to send or view sexually explicit pictures.
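The message-scanning flow described above amounts to a simple decision procedure on the minor's device. The sketch below is purely illustrative: the function name, the `direction` labels, and the notification behaviour are assumptions for the sake of the example, not Apple's actual implementation.

```python
def handle_explicit_photo(direction: str, child_consents: bool,
                          parental_notifications_on: bool) -> list[str]:
    """Illustrative decision flow for an explicit photo on a minor's account.

    direction: "received" or "sending" (assumed labels, not Apple's API).
    Returns the list of actions the device would take.
    """
    actions = []
    if direction == "received":
        actions.append("blur photo")
        actions.append("warn child and offer helpful resources")
    else:
        actions.append("warn child before sending")
    # The photo is shown or sent only if the child chooses to go ahead.
    if child_consents:
        actions.append("show or send photo")
        if parental_notifications_on:
            actions.append("notify parents")
    return actions
```

For example, a received photo that the child declines to view is simply blurred and accompanied by a warning, with no further action taken.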
In addition, Apple can scan photos that users upload to iCloud to identify any Child Sexual Abuse Material (CSAM) and report it to the National Center for Missing and Exploited Children (NCMEC). Apple matches this content against an unreadable database of known CSAM image hashes, which is provided by child safety organizations and stored in the operating system of the user's device.
If a match is found, a cryptographic safety voucher encoding the match result is uploaded to iCloud together with the image. Apple cannot interpret the contents of these vouchers unless the user's account crosses a match threshold previously set by the company. If that happens, each match is reviewed manually, and the user's account is disabled and reported to NCMEC.
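The on-device matching and server-side threshold flow described above can be sketched as follows. This is a simplified illustration, not Apple's implementation: Apple's real system uses a perceptual hash (NeuralHash) and cryptographic techniques so that vouchers genuinely cannot be read below the threshold, whereas the hash function, database contents, and threshold value below are all hypothetical stand-ins.

```python
import hashlib
from dataclasses import dataclass

# Hypothetical stand-ins: the real system uses a perceptual hash and a
# database supplied by child safety organizations, and the actual
# threshold value is not public.
KNOWN_CSAM_HASHES = {"hash-of-known-image-1", "hash-of-known-image-2"}
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Placeholder for a perceptual image hash (not really SHA-256)."""
    return hashlib.sha256(image_bytes).hexdigest()


@dataclass
class SafetyVoucher:
    """Uploaded with each photo; encodes whether it matched the database."""
    matched: bool


def scan_upload(image_bytes: bytes) -> SafetyVoucher:
    # On-device step: compare the photo's hash against the known database
    # and attach the result to the upload as a safety voucher.
    return SafetyVoucher(matched=image_hash(image_bytes) in KNOWN_CSAM_HASHES)


def review_account(vouchers: list[SafetyVoucher]) -> str:
    # Server-side step: vouchers remain unreadable until the account
    # crosses the match threshold; only then does manual review begin.
    matches = sum(v.matched for v in vouchers)
    if matches > MATCH_THRESHOLD:
        return "manual review, account disabled, reported to NCMEC"
    return "no action"
```

The design point the sketch captures is that no single match triggers any action: consequences follow only once the number of matching vouchers on an account exceeds the threshold.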
CONCERNS REGARDING APPLE’S NEW POLICY
The EDRi and EFF agree that child exploitation is a serious problem. However, they argue that the changes proposed by Apple build a backdoor into both its data storage and messaging systems, and that it is impossible to build a scanning system that can be used only to detect sexually explicit images sent to or by children. They also allege that Apple is compromising the end-to-end encryption that protects users and citizens against state surveillance. Tweaking or expanding the machine-learning system could allow every user's device to be scanned, opening the floodgates to potential misuse by authoritarian regimes.
It has also been pointed out that machine-learning classifiers for CSAM, when used without human oversight, often misclassify content, including sexually explicit content, and that using such methods to scan users' iCloud photos may produce what is known as a "chilling effect".
The new policy has also been criticized by experts such as Edward Snowden, Matthew Green, and Kendra Albert. Will Cathcart, the CEO of WhatsApp, stated that his company would not implement such a policy. A consortium of researchers, legal experts, cryptographers, professors, and Apple customers has sent an open letter to Apple requesting that it cease deployment of the new policy and reaffirm its commitment to end-to-end encryption and user privacy.