Gearlogy Staff

Apple Will Scan iPhones For Child Sexual Abuse Images.

The changes, planned for later this year, have raised concerns that the company is installing surveillance technology that governments could exploit.



Highlights:


  1. The software, reportedly called “NeuralMatch,” will compare images on a person’s iPhone with images in the U.S. law enforcement’s child sexual abuse database, and if it flags enough child abuse images, a human review will begin.

  2. Law enforcement will be alerted if reviewers find evidence that the photos are illegal.

  3. The system will check photos stored on the iPhone before they are uploaded to iCloud servers.


Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.


The tool, called "NeuralMatch," is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human.


If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.


Apple said it had designed the new features in a way that protected the privacy of users, including by ensuring that Apple will never see or find out about any nude images exchanged in a child’s text messages.


The scanning is done on the child’s device, and the notifications are sent only to parents’ devices. Apple provided quotes from some cybersecurity experts and child-safety groups that praised the company’s approach.


Other cybersecurity experts were still concerned. Matthew D. Green, a cryptography professor at Johns Hopkins University, said Apple’s new features set a dangerous precedent by creating surveillance technology that law enforcement or governments could exploit.


“They’ve been selling privacy to the world and making people trust their devices,” Mr. Green said. “But now they’re basically capitulating to the worst possible demands of every government. I don’t see how they’re going to say no from here on out.”

The iPhone operating system will soon store a database of hashes of known child sexual abuse material provided by organizations like the National Center for Missing & Exploited Children, and it will run those hashes against the hashes of each photo in a user’s iCloud to see if there is a match.


Once there are a certain number of matches, the photos will be shown to an Apple employee to ensure they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked.
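
The two paragraphs above describe a threshold-based hash comparison. The Swift sketch below illustrates only that general idea, under heavy simplification: it uses SHA-256 over raw photo bytes as a stand-in for Apple's undisclosed perceptual "NeuralHash," the type names, placeholder hash, and threshold value are hypothetical, and the cryptographic protections Apple describes (such as keeping matches unreadable until the threshold is crossed) are not modeled.

```swift
import Foundation
import CryptoKit

// Minimal sketch of threshold-based hash matching.
// NOTE: Apple's real system reportedly uses a perceptual hash ("NeuralMatch"/
// "NeuralHash") plus cryptographic protocols that are not reproduced here;
// SHA-256 over raw bytes is only a stand-in, and all names are hypothetical.
struct CSAMMatcher {
    /// Hex digests of known abusive images (would come from NCMEC-provided data).
    let knownHashes: Set<String>
    /// Number of matches required before anything is surfaced for human review.
    /// The value used below is illustrative; Apple has not disclosed the number.
    let reviewThreshold: Int

    var matchCount = 0

    /// Hash a photo's bytes and compare against the known-hash database.
    /// Returns true once the match count crosses the review threshold.
    mutating func check(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        if knownHashes.contains(hex) {
            matchCount += 1
        }
        return matchCount >= reviewThreshold
    }
}

// Usage: run each photo through the matcher before it would be uploaded.
var matcher = CSAMMatcher(knownHashes: ["<hash from database>"], reviewThreshold: 30)
let needsHumanReview = matcher.check(photoData: Data([0x00, 0x01]))
print(needsHumanReview ? "Flag account for human review" : "No action")
```
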


Apple said this approach meant that people without child sexual abuse material on their phones would not have their photos seen by Apple or the authorities.


Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan for child pornography in user files stored in its iCloud service, which is not as securely encrypted as its on-device data.


The Center for Democracy and Technology (CDT) also questioned Apple's ability to differentiate between dangerous content and something as tame as art or a meme. Such technologies are notoriously error-prone, CDT said in an emailed statement.


Apple denies that the changes amount to a backdoor that degrades its encryption, saying they are carefully considered innovations that do not disturb user privacy but rather strongly protect it.


Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children's phones and can also warn the parents of younger children via text message. It also said that its software would "intervene" when users try to search for topics related to child sexual abuse.


In order to receive the warnings about sexually explicit images on their children's devices, parents will have to enroll their child's phone. Kids over 13 can unenroll, meaning parents of teenagers won't get notifications.
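
As a rough illustration of the parental-notification rules described above, the Swift sketch below encodes them as plain conditional logic. The explicit-content classifier is the on-device machine learning Apple has not documented, so it appears only as a Bool input; every type and field name here is hypothetical and the rules are paraphrased from this article.

```swift
import Foundation

// Hypothetical model of a child's enrollment state, paraphrased from the article.
struct ChildAccount {
    let age: Int
    let parentOptedIn: Bool   // parent enrolled the child's phone
    let childOptedOut: Bool   // kids over 13 can unenroll
}

enum MessagePhotoAction {
    case showNormally
    case blurAndWarnChild(notifyParent: Bool)
}

func handleIncomingPhoto(isExplicit: Bool, account: ChildAccount) -> MessagePhotoAction {
    guard isExplicit else { return .showNormally }
    // Parents must have enrolled the phone, and teenagers who unenrolled
    // do not trigger parental notifications.
    let canUnenroll = account.age > 13
    let notifyParent = account.parentOptedIn && !(canUnenroll && account.childOptedOut)
    return .blurAndWarnChild(notifyParent: notifyParent)
}

// Example: a 12-year-old whose parent enrolled the device.
let action = handleIncomingPhoto(
    isExplicit: true,
    account: ChildAccount(age: 12, parentOptedIn: true, childOptedOut: false)
)
print(action)  // blurAndWarnChild(notifyParent: true)
```
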


Apple said neither feature would compromise the security of private communications or notify police.
