iPhones will be scanned for images of child sexual abuse as part of an Apple update

Apple announced on Thursday that iPhones and iPads will soon begin identifying and reporting photographs of child sexual abuse when they are uploaded to the company’s online storage in the United States, a move that privacy activists say raises serious privacy concerns.

In an online article, Apple stated, “We aim to help safeguard children from predators who use communication tools to recruit and exploit them, as well as limit the distribution of child sexual abuse material (CSAM).”

According to Apple, new technology will allow software on Apple mobile devices to compare abusive photos on a user’s phone to a database of known CSAM images provided by child safety organisations, and then flag the images as they are uploaded to the company’s online iCloud storage.
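The matching described above can be illustrated with a simplified sketch. This is not Apple's actual system: Apple uses a perceptual hash (NeuralHash) combined with cryptographic matching so that near-duplicate images still match and the device never learns the outcome, whereas this hypothetical example uses a plain cryptographic hash and a local lookup purely to show the flag-on-upload idea. All names here (`KNOWN_HASHES`, `flag_on_upload`) are illustrative assumptions.

```python
import hashlib

# Hypothetical database of hashes of known abusive images, as would be
# supplied by child safety organisations. (Real systems distribute this
# in a blinded form; here it is just a plain set for illustration.)
KNOWN_HASHES = {
    # sha256 of the stand-in byte string b"known-image"
    hashlib.sha256(b"known-image").hexdigest(),
}

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual image hash; a real system would use
    one that is robust to resizing and recompression."""
    return hashlib.sha256(data).hexdigest()

def flag_on_upload(data: bytes) -> bool:
    """Return True if the image being uploaded matches a known entry."""
    return image_hash(data) in KNOWN_HASHES

print(flag_on_upload(b"known-image"))   # matches the database entry
print(flag_on_upload(b"holiday photo")) # does not match
```

In the real design, the comparison happens on-device against an encrypted database, and the result is only revealed to Apple after a threshold number of matches, which this sketch does not attempt to model.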

Several digital rights organisations, on the other hand, claim that the changes to Apple’s operating systems offer a potential “backdoor” into devices that might be used by governments or other entities.

Apple replies that it will not have direct access to the photographs and emphasises the privacy and security measures it has implemented. Unless the image is determined to contain depictions of child sexual abuse, the matching of photos will be “powered by a cryptographic method” to ascertain “whether there is a match without exposing the outcome,” according to the Silicon Valley-based tech giant. According to an Apple statement, such photographs will be reported to the National Center for Missing and Exploited Children, which collaborates with law enforcement.

“Apple’s compromise on end-to-end encryption may appease government agencies in the United States and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security,” wrote India McKinney and Erica Portnoy of the digital rights group Electronic Frontier Foundation in a blog post.

According to Apple, the new image-monitoring feature is one of a set of features coming to Apple mobile devices. In a statement, Apple announced that its texting software, Messages, will use machine learning to recognise and alert children and their parents when they receive or send sexually inappropriate photographs. “The photo will be obscured and the youngster will be cautioned when receiving this type of information,” Apple added.

Despite pressure from politicians and authorities to acquire access to people’s data in the name of preventing crime or terrorism, Apple has built a reputation for safeguarding privacy on its devices and services.
