iPhones will be scanned for images of child sexual abuse as part of an Apple update

Apple announced on Thursday that iPhones and iPads will soon begin identifying and reporting photographs of child sexual abuse when they are uploaded to the company’s online storage in the United States, a move that privacy advocates say raises serious surveillance concerns.

In an online article, Apple stated, “We aim to help safeguard children from predators who use communication tools to recruit and exploit them, as well as limit the distribution of child sexual abuse material (CSAM).”

According to Apple, new technology will allow software on its mobile devices to compare photos on a user’s phone against a database of known CSAM images provided by child safety organisations, and to flag matching images as they are uploaded to the company’s iCloud storage.
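The core of such a system can be sketched in a few lines. The example below is purely illustrative: it matches exact file hashes with SHA-256, whereas reports on Apple’s approach describe a perceptual hash, so that resized or re-encoded copies of a known image would still match. The database contents, constant, and function names here are hypothetical, not Apple APIs.

```python
# Minimal sketch of hash-based matching against a database of known images.
# Illustrative only: SHA-256 flags only byte-identical files; a production
# system would use a perceptual hash so visually similar copies also match.
# KNOWN_IMAGE_HASHES and scan_before_upload are hypothetical names.
import hashlib
from pathlib import Path

# Hypothetical hash database supplied by child-safety organisations.
KNOWN_IMAGE_HASHES: set[str] = {
    # "ab3f09...",  # placeholder entries; real databases are distributed opaquely
}

def image_hash(path: Path) -> str:
    """Hash the raw bytes of an image file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_upload(path: Path) -> bool:
    """Return True if this image should be flagged during a cloud upload."""
    return image_hash(path) in KNOWN_IMAGE_HASHES
```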

Several digital rights organisations, however, argue that the changes to Apple’s operating systems create a potential “backdoor” into devices that governments or other actors could exploit.

Apple responds that it will not have direct access to the photographs and emphasises the privacy and security measures it has implemented. The matching of photos will be “powered by a cryptographic method” that ascertains “whether there is a match without exposing the outcome” unless an image is determined to contain depictions of child sexual abuse, according to the Silicon Valley-based tech giant. Such photographs will be reported to the National Center for Missing and Exploited Children, which collaborates with law enforcement, Apple said.
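One textbook way to check for a match “without exposing the outcome” is private set intersection (PSI). Apple has not published protocol details in this announcement, so the Diffie-Hellman-style sketch below illustrates only the general idea, not Apple’s actual construction; in Apple’s description, the result is revealed only on the server side, and only after a match is confirmed.

```python
# Illustrative Diffie-Hellman-style private set intersection (PSI).
# Not Apple's protocol; one classic construction of "compare without
# revealing": neither side ever sees the other's raw items.
import hashlib
import secrets

# Toy 127-bit Mersenne prime for brevity; real deployments use
# standardized 2048-bit-plus groups.
P = 2**127 - 1
G = 3

def to_group(item: bytes) -> int:
    """Map an item (e.g. an image hash) to the group element g^H(item) mod p."""
    e = int.from_bytes(hashlib.sha256(item).digest(), "big")
    return pow(G, e, P)

# The database holder blinds its set with a secret exponent b.
b = secrets.randbelow(P - 2) + 1
db_blinded = {pow(to_group(x), b, P) for x in [b"known_image_1", b"known_image_2"]}

# The device blinds its items with a secret exponent a.
a = secrets.randbelow(P - 2) + 1
device_items = [b"holiday_photo", b"known_image_2"]
device_blinded = [pow(to_group(y), a, P) for y in device_items]

# The database holder raises the device's blinded items to b. Because
# exponentiation commutes, g^(H(x)*b*a) == g^(H(y)*a*b) exactly when the
# underlying items match, while non-matches stay hidden from both sides.
doubly_blinded = [pow(c, b, P) for c in device_blinded]
db_double = {pow(s, a, P) for s in db_blinded}

matches = [item for item, d in zip(device_items, doubly_blinded) if d in db_double]
print(matches)  # [b'known_image_2']
```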

“Apple’s compromise on end-to-end encryption may appease government agencies in the United States and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security,” wrote India McKinney and Erica Portnoy of the digital rights group Electronic Frontier Foundation in a blog post.

According to Apple, the new image-matching feature is one of several child-safety features coming to its mobile devices. Apple said its messaging app, Messages, will use machine learning to recognise sexually explicit photographs and to warn children and their parents when such images are received or sent. “The photo will be obscured and the youngster will be cautioned when receiving this type of information,” Apple added.

Apple has built a reputation for safeguarding privacy on its devices and services, despite pressure from politicians and authorities to gain access to people’s data in the name of preventing crime or terrorism.
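A rough sketch of the blur-and-warn flow Apple describes for Messages might look like the following. The classifier is a placeholder stub, since Apple’s on-device model and thresholds are not public; only the Pillow blur call is a real API, and all other names are assumptions.

```python
# Sketch of an on-device blur-and-warn flow for incoming photos.
# explicit_score is a hypothetical stand-in for a trained classifier;
# EXPLICIT_THRESHOLD is an assumed cutoff, not a documented Apple value.
from PIL import Image, ImageFilter

EXPLICIT_THRESHOLD = 0.9  # assumed confidence cutoff

def explicit_score(image: Image.Image) -> float:
    """Placeholder for an on-device ML model returning P(image is explicit).
    Always returns 0.0 here so the sketch stays runnable."""
    return 0.0

def handle_incoming_photo(image: Image.Image) -> tuple[Image.Image, bool]:
    """Return the (possibly blurred) photo and whether to warn the child."""
    if explicit_score(image) >= EXPLICIT_THRESHOLD:
        blurred = image.filter(ImageFilter.GaussianBlur(radius=24))
        return blurred, True
    return image, False
```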
