Apple created a tool that scans iPhones for inappropriate child sexual images ... and it's about time!

GeorgeMartin

Following a report on work the company was doing to create a tool that scans iPhones for child abuse images, Apple has published a post that provides more details on its efforts related to child safety. With the release of iOS 15, watchOS 8 and macOS Monterey later this year, the company says it will introduce a variety of child safety features across Messages, Photos and Siri.
To start, the Messages app will include new notifications that will warn children, as well as their parents, when they either send or receive sexually explicit photos. When someone sends a child an inappropriate image, the app will blur it and display several warnings. "It's not your fault, but sensitive photos and videos can be used to hurt you," says one of the notifications, per a screenshot Apple shared.

As an additional precaution, the company says Messages can also notify parents if their child decides to go ahead and view a sensitive image. "Similar protections are available if a child attempts to send sexually explicit photos," according to Apple. The company notes the feature uses on-device machine learning to determine whether a photo is explicit. Moreover, Apple does not have access to the messages themselves. This feature will be available to family iCloud accounts.
Apple will also introduce new software tools in iOS and iPadOS that will allow the company to detect when someone uploads content to iCloud that shows children involved in sexually explicit acts. The company says it will use the technology to notify the National Center for Missing and Exploited Children (NCMEC), which will in turn work with law enforcement agencies across the US. "Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind," the company claims.

Rather than scanning photos when they're uploaded to the cloud, the system will use an on-device database of "known" images provided by NCMEC and other organizations. The company says each photo in the database is represented by a hash, which acts as a kind of digital fingerprint for it.
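To make that matching idea concrete, here's a minimal Python sketch: fingerprint a photo and check it against a local set of known hashes. This is an illustration only; Apple's actual system reportedly uses a perceptual hash (NeuralHash) rather than a plain cryptographic digest, and the file path and database entry below are made up.

```python
# Illustrative sketch only: Apple's real system uses a perceptual
# hash, not SHA-256, but the on-device matching idea is the same.
import hashlib

def fingerprint(path: str) -> str:
    """Hash a file in chunks and return its hex digest ('fingerprint')."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical on-device database of known hashes (placeholder entry).
KNOWN_HASHES = {"0" * 64}

def is_known(photo_path: str) -> bool:
    """True if the photo's fingerprint appears in the local database."""
    return fingerprint(photo_path) in KNOWN_HASHES
```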
A cryptographic technology called private set intersection allows Apple to determine if there's a match without seeing the result of the process. In the event of a match, an iPhone or iPad will create a cryptographic safety voucher that will encrypt the upload, along with additional data about it. Another technology called threshold secret sharing makes it so that the company can't see the contents of those vouchers unless someone passes an unspecified threshold of CSAM content. "The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account," according to the company.
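The threshold piece can be illustrated with the classic Shamir secret-sharing construction: a secret (say, a decryption key) is split into shares such that any `threshold` of them reconstruct it, while fewer reveal nothing. This is a generic sketch of the technique, not Apple's actual protocol; the prime, secret, and parameters below are arbitrary demo values.

```python
# Minimal sketch of the idea behind threshold secret sharing
# (Shamir's scheme over a prime field), for illustration only.
import random

PRIME = 2**127 - 1  # a prime large enough for a demo secret

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 to reconstruct the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

shares = make_shares(secret=123456789, threshold=3, count=5)
assert recover(shares[:3]) == 123456789  # 3 shares suffice
# recover(shares[:2]) yields an unrelated value: below the threshold,
# the shares reveal nothing about the secret.
```

Mapped loosely onto the vouchers: each flagged upload effectively contributes a share, and only once enough shares exist can the contents be decrypted.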
It's only when that threshold is crossed that the technology Apple plans to implement will allow the company to review the contents of the vouchers. At that point, the tech giant says it will manually review each report to confirm there's a match. In cases where there is one, it will disable the individual's iCloud account and forward a report to NCMEC. Users can appeal a suspension if they believe their account has been mistakenly flagged.

Siri Child Safety
Lastly, Siri, as well as the built-in search feature found in iOS and macOS, will point users to child safety resources. For instance, you'll be able to ask the company's digital assistant how to report child exploitation. Apple also plans to update Siri to intervene when someone tries to conduct any CSAM-related searches. The assistant will explain "that interest in this topic is harmful and problematic," as well as point the person to resources that offer help with the issue.
Apple's decision to effectively work with law enforcement agencies is likely to be seen as something of an about-face for the company. In 2016, it refused to help the FBI unlock the iPhone that had belonged to the man behind the San Bernardino terror attack. Although the government eventually turned to an outside firm to access the device, Tim Cook called the episode "chilling" and warned it could create a backdoor for more government surveillance down the road.
 
Great news, although for every advance in crime prevention there are criminals working to circumvent it. Hopefully the resources companies like Apple can throw at these projects outweigh what the perps can do.
 
I’m for anything that helps law enforcement in this area, but I can't help but wonder if it will tag this site and forum as such, and how it will determine if someone is searching for, let's say, how to spot a victim of CSA or CR in order to aid law enforcement. Some people will try to learn the signs of human trafficking, sexual abuse, and other crimes committed against humanity. I often search for signs that someone is being trafficked for exploitation so I can spot them when I see them and, if it's safe to do so, help them get away from their abusers. I'm the help-not-hurt type of person, within the confines of the law of course.
 
So I know a lot of people panic about the fact that Apple will have access to your phone, but what many don't understand is that they're not doing anything with the device itself and they're not taking anything from it. They're adding hashing functionality to the gallery.

A hash is a string of numbers and characters, derived from a file's contents, that is effectively unique to each digital item. These hashes can be used to check whether two files are the same without anyone ever having to look at the files. False positives are next to impossible with hashing; it's how material can be analyzed by digital forensic analysts without having to look at the image.
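As a rough illustration of that point, here's what comparing digests looks like in Python: changing even a single byte yields a completely different SHA-256 fingerprint, so two files can be compared by hash alone, without anyone viewing them. The byte strings below are stand-ins for real image data.

```python
import hashlib

# Stand-ins for two image files that differ by a single byte.
photo_a = b"...image bytes..."
photo_b = b"...image bytes,.."

digest_a = hashlib.sha256(photo_a).hexdigest()
digest_b = hashlib.sha256(photo_b).hexdigest()

print(digest_a)
print(digest_b)
print("identical files:", digest_a == digest_b)
# Even a one-byte difference produces totally unrelated digests,
# which is why analysts can compare files without opening them.
```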

They're doing this to try to find, on the device, hashes matching the lists that are shared between companies and law enforcement for this purpose, which might lead to saving lives down the road... I fully respect and support them on this. Kind of surprised it's taken them this long to do it.

Just had to post this because I know a lot of people would freak out thinking Apple is taking all their photos.
 