Apple to Enable Feature That Can Scan Incoming Messages for Nudity to Help Prevent Child Sex Abuse

Apple is set to implement features to help thwart child predators.

via Complex:

On Thursday, the tech giant confirmed it would begin using new software that will detect and report child sexual abuse material (CSAM) on U.S. iPhones. Apple will utilize a tool known as “NeuralHash,” which can help determine whether a user is trying to store known CSAM on iCloud.
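For a rough sense of what matching an image against a database of known fingerprints looks like, here is a minimal Swift sketch. It uses an ordinary SHA-256 digest and a plain set lookup purely for illustration; Apple's NeuralHash is a perceptual hash, and the actual matching is done cryptographically, as the quote below describes. All names and values here are hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative only: a real system would use a perceptual hash (robust to
// resizing/re-encoding) and a blinded database, not raw SHA-256 and a Set.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b" // placeholder
]

// Compute a hex digest of the image file's bytes.
func hashOfImage(at url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Flag the image if its digest appears in the known-hash database.
func shouldFlag(imageAt url: URL) -> Bool {
    guard let hash = try? hashOfImage(at: url) else { return false }
    return knownHashes.contains(hash)
}
```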

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result,” Apple wrote in the announcement. “The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
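The quoted description boils down to three steps on the device: compute a match result, encrypt it together with some data about the image, and upload that encrypted "voucher" alongside the photo. The toy Swift sketch below shows only that general shape using ordinary AES-GCM encryption; it does not implement the private set intersection Apple describes, and every type and function name in it is hypothetical.

```swift
import Foundation
import CryptoKit

// Toy illustration: a sealed payload carrying the match result plus metadata,
// uploaded next to the image. The real design hides the result from both the
// device and the server via private set intersection; this sketch does not.
struct SafetyVoucher: Codable {
    let imageID: UUID
    let sealedPayload: Data // AES-GCM ciphertext of the match result
}

func makeVoucher(imageID: UUID, matched: Bool, key: SymmetricKey) throws -> SafetyVoucher {
    // Encode the match result, then encrypt it under the given key.
    let payload = try JSONEncoder().encode(["matched": matched])
    let sealed = try AES.GCM.seal(payload, using: key)
    guard let combined = sealed.combined else {
        throw CocoaError(.coderInvalidValue)
    }
    return SafetyVoucher(imageID: imageID, sealedPayload: combined)
}
```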

Once the automated system finds a match, a human reviewer will examine the image in question and assess whether it is illegal. If the reviewer concludes the content qualifies as child pornography, the user’s account will be deactivated and the material will be reported to the National Center for Missing and Exploited Children (NCMEC).

While some have applauded Apple’s efforts to beef up its child protection policies, a number of technology experts have raised concerns that the tool could open the door to privacy abuses.

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, told the Associated Press. “Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”

NeuralHash will be introduced as part of the iOS 15 software update, which is expected to roll out within the next month or two.

When it comes to children, a little privacy might be worth sacrificing for overall safety.
