Saturday, Feb 22, 2025

Apple to scan users’ iPhones for images of child sexual abuse

Child protection groups have applauded the announcement, but some security researchers are concerned that the system could be misused, noting that users have little visibility into what other scanning Apple performs on their devices or how much of their chat, call, contact and location data is passed to third parties.

Apple unveiled plans to scan US iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool, called "neuralMatch," is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud.

If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will only flag images that are already in the centre's database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry.

But researchers say the matching tool — which doesn’t "see" such images, just mathematical "fingerprints" that represent them — could be put to more nefarious purposes.
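
Apple has not published the internals of neuralMatch, so the sketch below only illustrates the general fingerprint-matching idea: compute a compact perceptual hash of an image and flag it if that hash sits close to any entry in a database of fingerprints of known abuse imagery. The "average hash" used here, the placeholder database and the distance threshold are illustrative stand-ins, not Apple's algorithm.

# Illustrative sketch of fingerprint matching; not Apple's neuralMatch.
from PIL import Image

def average_hash(path: str) -> int:
    """Shrink to 8x8 grayscale and encode each pixel as above/below the mean."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # 64-bit fingerprint of the image's rough structure

def hamming(a: int, b: int) -> int:
    # Number of differing bits between two fingerprints.
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known images (placeholder values).
KNOWN_FINGERPRINTS = {0x1234ABCD5678EF90}

def matches_known(path: str, max_distance: int = 5) -> bool:
    """Flag the image if its fingerprint is near any known fingerprint."""
    h = average_hash(path)
    return any(hamming(h, k) <= max_distance for k in KNOWN_FINGERPRINTS)

Because only fingerprints are compared, the scanner never needs to "see" the photo itself; by the same token, whoever controls the fingerprint database controls what gets flagged, which is the crux of the researchers' concern.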

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement.

"Researchers have been able to do this pretty easily," he said of the ability to trick such systems.

Potential for abuse


Other abuses could include government surveillance of dissidents or protesters. "What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,'" Green asked.

"Does Apple say no? I hope they say no, but their technology won’t say no".

Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data, for child pornography.

Apple has been under government pressure for years to allow for increased surveillance of encrypted data.

Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

But a dejected Electronic Frontier Foundation, the online civil liberties pioneer, called Apple's compromise on privacy protections "a shocking about-face for users who have relied on the company’s leadership in privacy and security".

Meanwhile, the computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple's system but said it was far outweighed by the imperative of battling child sexual abuse.

"Is it possible? Of course. But is it something that I’m concerned about? No," said Hany Farid, a researcher at the University of California at Berkeley, who argues that plenty of other programmes designed to secure devices from various threats haven't seen "this type of mission creep".

For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but also employs a system for detecting malware and warning users not to click on harmful links.

'Gamechanger'


Apple was one of the first major companies to embrace "end-to-end" encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured the company for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

"Apple’s expanded protection for children is a gamechanger," John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. "With so many people using Apple products, these new safety measures have lifesaving potential for children".

Julia Cordua, the CEO of Thorn, said that Apple's technology balances “the need for privacy with digital safety for children." Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

Breaking security


But in a blistering critique, the Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company’s guarantee of "end-to-end encryption".

Scanning of messages for sexually explicit content on phones or computers effectively breaks the security, it said.

The organisation also questioned Apple’s technology for differentiating between dangerous content and something as tame as art or a meme. Such technologies are notoriously error-prone, CDT said in an emailed statement.

Apple denies that the changes amount to a backdoor that degrades its encryption. It says they are carefully considered innovations that do not disturb user privacy but rather strongly protect it.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children’s phones and can also warn the parents of younger children via text message. It also said that its software would “intervene” when users try to search for topics related to child sexual abuse.
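
Apple has not documented how the on-device check works, but the flow it describes, classifying a photo locally and blurring it before display, can be sketched as below. The explicit_score function is a hypothetical stand-in for the device's machine-learning model and the threshold is invented for illustration; nothing in this sketch leaves the phone.

# Illustrative sketch of on-device screening; the classifier is hypothetical.
from PIL import Image, ImageFilter

def explicit_score(img: Image.Image) -> float:
    """Hypothetical on-device model returning a probability in [0, 1]."""
    raise NotImplementedError("stand-in for the device's ML model")

def screen_incoming_photo(path: str, threshold: float = 0.9) -> Image.Image:
    img = Image.open(path)
    if explicit_score(img) >= threshold:
        # Blur the preview on the device; the original is not uploaded
        # anywhere and no report is sent to Apple or to police.
        return img.filter(ImageFilter.GaussianBlur(radius=25))
    return img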

In order to receive warnings about sexually explicit images on their children's devices, parents will have to enroll their child’s phone. Kids over 13 can unenroll, meaning parents of teenagers won’t get notifications.

Apple said neither feature would compromise the security of private communications or notify police.
