Apple announced on Thursday that iPhones and iPads will soon begin identifying and reporting photographs of child sexual abuse when they are uploaded to the company’s online storage in the United States, a move that privacy advocates warn could open the door to broader surveillance of users’ devices.
In an online article, Apple stated, “We aim to help safeguard children from predators who use communication tools to recruit and exploit them, as well as limit the distribution of child sexual abuse material (CSAM).”
According to Apple, new technology will allow software on its mobile devices to match photos on a user’s device against a database of known CSAM images provided by child safety organisations, and to flag any matches as the photos are uploaded to the company’s online iCloud storage.
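Apple has not published its matching code, but the approach it describes resembles perceptual hashing: each image is reduced to a compact fingerprint that survives minor edits such as resizing or recompression, and that fingerprint is checked against a blocklist of fingerprints of known images. The sketch below illustrates the general idea with a simple average-hash; the hash function and the empty blocklist standing in for the child safety organisations’ database are illustrative assumptions, not Apple’s actual system.

```python
from PIL import Image

# Hypothetical blocklist: in the described system this would hold
# fingerprints of known CSAM supplied by child safety organisations.
BLOCKLIST: set[str] = set()

def average_hash(path: str, size: int = 8) -> str:
    """Toy perceptual hash: downscale to greyscale, threshold on the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return f"{int(bits, 2):016x}"

def should_flag(path: str) -> bool:
    # In Apple's described design, this comparison happens on the device,
    # and only photos being uploaded to iCloud are checked.
    return average_hash(path) in BLOCKLIST
```

Unlike a cryptographic hash, a perceptual hash gives visually similar images identical or near-identical fingerprints, which is what lets such a system recognise a known image even after small alterations.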
Several digital rights organisations, however, argue that the changes to Apple’s operating systems create a potential “backdoor” into devices that could be exploited by governments or other actors.
Apple replies that it will not have direct access to the photographs and emphasises the privacy and security measures it has built in. According to the Silicon Valley-based tech giant, the matching of photos will be “powered by a cryptographic method” that can ascertain “whether there is a match without exposing the outcome”; the result is revealed only if an image is determined to contain depictions of child sexual abuse. Such photographs will then be reported to the National Center for Missing and Exploited Children, which collaborates with law enforcement, the company said.
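Apple’s accompanying technical summary describes this cryptographic method as a combination of private set intersection and threshold secret sharing: the server learns nothing about individual matches until an account accumulates more than a threshold number of them. The toy Shamir secret-sharing sketch below illustrates only that threshold idea; it is a generic textbook construction with made-up parameters, not Apple’s protocol.

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy field

def split(secret: int, threshold: int, n_shares: int):
    """Split `secret` into points on a random degree-(threshold-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n_shares + 1)]

def recover(points) -> int:
    """Lagrange interpolation at x = 0 over GF(PRIME)."""
    secret = 0
    for j, (xj, yj) in enumerate(points):
        num, den = 1, 1
        for m, (xm, _) in enumerate(points):
            if m != j:
                num = num * -xm % PRIME
                den = den * (xj - xm) % PRIME
        secret = (secret + yj * num * pow(den, -1, PRIME)) % PRIME
    return secret

# One share could accompany each flagged upload; the server can only
# reconstruct the key once it holds at least `threshold` shares.
shares = split(secret=123456789, threshold=3, n_shares=5)
assert recover(shares[:3]) == 123456789  # threshold met: secret recovered
# Fewer than 3 shares reveal nothing about the secret.
```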
“Apple’s compromise on end-to-end encryption may appease government agencies in the United States and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security,” wrote India McKinney and Erica Portnoy of the digital rights group Electronic Frontier Foundation in a blog post.
According to Apple, the new image-matching feature is one of several coming to its mobile devices. Its texting software, Messages, will use machine learning to recognise sexually explicit images and warn children and their parents when such photographs are received or sent (a rough sketch of this flow appears below). “The photo will be obscured and the youngster will be cautioned when receiving this type of information,” Apple added.

Apple has built a reputation for safeguarding privacy on its devices and services, despite pressure from politicians and authorities to gain access to people’s data in the name of preventing crime or terrorism.
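Apple has not released the Messages classifier, so everything in the sketch below is assumed: the model, the confidence threshold, and the helper names are hypothetical stand-ins used only to show the described receive, classify, blur, and warn flow.

```python
from dataclasses import dataclass

EXPLICIT_THRESHOLD = 0.9  # hypothetical confidence cutoff

@dataclass
class IncomingPhoto:
    image_bytes: bytes
    blurred: bool = False

def classify_explicit(image_bytes: bytes) -> float:
    """Stand-in for Apple's proprietary on-device model; always returns 0.0 here."""
    return 0.0

def warn_recipient() -> None:
    # Per Apple's description, the child is cautioned before viewing.
    print("This photo may be sensitive. Do you want to view it anyway?")

def handle_incoming(photo: IncomingPhoto, recipient_is_child: bool) -> None:
    score = classify_explicit(photo.image_bytes)
    if recipient_is_child and score >= EXPLICIT_THRESHOLD:
        photo.blurred = True  # obscure the image before it is displayed
        warn_recipient()
        # Apple also says parents can be alerted for younger children.
```

The key design point, as Apple presents it, is that the classification runs on the device itself rather than on a server, so the photos never leave the phone for analysis.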