Apple announces new safety tools to detect child sexual abuse content

Cupertino, California – Apple has announced a trio of new child safety tools designed to protect young people and limit the spread of child sexual abuse material (CSAM).

Apple's iPhone has been hit with controversy over privacy and security concerns recently after the latest Pegasus hacking revelations.  © IMAGO / Pacific Press Agency

Among the features is a new technology that will allow Apple to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.

It will be joined by a new feature in the Messages app which will warn children and their parents using linked family accounts when sexually explicit photos are sent or received, with the images blocked from view and on-screen alerts displayed.

New guidance in Siri and Search will also point users to helpful resources when they perform searches related to CSAM.

The new tools are set to be introduced later this year as part of the iOS and iPadOS 15 software update due this fall. They will initially be available in the US only, with plans to expand further over time.

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user's photo album.

Instead, the system will look for matches, securely on the device, based on a database of "hashes" – a type of digital fingerprint – of known CSAM images provided by child safety organizations.

This matching will only take place when a user attempts to upload an image to their iCloud Photo Library.

Apple said that only if a threshold of matches for harmful content is exceeded would it be able to manually review the content to confirm the match and send a report to child safety organizations.
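To illustrate the general idea of on-device hash matching with a reporting threshold, here is a minimal, hypothetical sketch in Swift. It is not Apple's implementation: the company's system relies on perceptual image fingerprints and cryptographic matching techniques rather than a plain SHA-256 hash, and the type, method names, and threshold value below are invented for illustration only.

```swift
import Foundation
import CryptoKit

// A simplified, hypothetical sketch of on-device hash matching with a reporting
// threshold. This is NOT Apple's implementation: the real system uses perceptual
// image fingerprints and cryptographic matching, not a plain SHA-256 hash.
struct OnDeviceMatcher {
    // Fingerprints of known images, as supplied by child safety organizations.
    let knownHashes: Set<String>
    // Hypothetical threshold: nothing is flagged until matches exceed this count.
    let reportThreshold: Int
    private(set) var matchCount = 0

    // Derives a digital fingerprint from an image's bytes (illustrative only).
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Called only when the user attempts to upload an image to their cloud photo
    // library; returns true once enough matches accumulate to warrant manual review.
    mutating func checkBeforeUpload(_ imageData: Data) -> Bool {
        if knownHashes.contains(fingerprint(of: imageData)) {
            matchCount += 1
        }
        return matchCount > reportThreshold
    }
}
```

In this sketch, matching happens entirely on the device and no single match triggers anything; only an accumulation of matches past the threshold would surface an account for review, mirroring the safeguard Apple describes.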

The company reiterates that the new tools will not give unfiltered access to private camera rolls

Apple's system will look for matches from images a user attempts to upload to their iCloud Photo Library.  © IMAGO / ZUMA Wire

The company reiterated that the new CSAM detection tools would only apply to those using iCloud Photos and would not allow the firm or anyone else to scan the images on a user's camera roll.

The new Messages tool will show a warning to a child when they are sent sexually explicit photos, blurring the image, reassuring them that it is OK if they do not want to view it, and presenting them with helpful resources.

It will also inform them that as an extra precaution, their parents will be sent a notification if they do choose to view the image.

Similar protections will be in place if a child attempts to send a sexually explicit image, Apple said.

The announcement is the latest in a series of major updates from the iPhone maker, and other tech giants, aimed at improving user safety and addressing concerns for the protection of young users.

The addition follows a number of security updates the company rolled out earlier this year designed to cut down on third-party data collection and improve privacy for iPhone users.

Cover photo: IMAGO / ZUMA Wire
