Apple Will Soon Scan Your Phone for Child Pornography

If you have an iPhone, you may soon be subjected to warrantless searches of your photo library by Apple. And if Apple finds any child pornography on your phone, you will most likely be arrested and prosecuted. This article explains this privacy intrusion and how a search of your phone’s images by Apple could lead to criminal charges against you.

As Apple’s own privacy policy notes, “[p]rivacy is a fundamental human right.” Apple’s privacy policy further states “[y]our devices are important to so many parts of your life. What you share from those experiences, and whom you share it with, should be up to you. We design Apple products to protect your privacy and give you control over your information.” Apple’s new scanning technology, however, appears inconsistent with this policy.

According to a recent article in the New York Times, Apple last week announced it plans to introduce new tools that will allow it to scan iPhones for images related to the sexual abuse and exploitation of children. Apple is promoting these innovations as part of a child safety initiative.

But as the Times article notes, these tools, which are scheduled to become operational in the coming months, may also lead to even more troubling forms of surveillance. The article suggests Apple should delay implementing these technologies until they are further studied and the attendant privacy risks are better understood.

Apple’s plan has two main prongs. First, parents can elect to have their children’s iMessage accounts scanned for nude images sent or received and, for users under age 13, to be notified when such an image is detected. All children will receive a warning if they attempt to view or share a sexually explicit image.

Second, Apple will scan the photos stored on your iPhone and check them against records of known child sexual abuse material provided by organizations such as the National Center for Missing and Exploited Children. Apple will do this only if you also upload your photos to iCloud Photos.
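To make the mechanics concrete, the sketch below illustrates the basic idea of checking photos against a list of hashes of known images. It is only a simplified illustration: Apple’s actual system reportedly uses a perceptual hash called NeuralHash together with cryptographic protocols so that matching occurs without revealing non-matching photos, and the blocklist digest shown here is made up.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal images, standing
# in for the kind of hash records a clearinghouse like NCMEC distributes.
# (This digest is invented; real systems do not ship plain file hashes.)
KNOWN_BAD_HASHES = {
    "9f2c61d1e9f8f4f0b7c3a5d8e1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0",
}

def file_sha256(path: str) -> str:
    """Hash a file's raw bytes in chunks so large photos fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_flag(photo_path: str) -> bool:
    """True if the photo's digest appears on the blocklist."""
    return file_sha256(photo_path) in KNOWN_BAD_HASHES
```

The weakness of hashing raw bytes this way is that changing a single pixel, or merely re-saving the file, produces a completely different digest. That is why, as discussed below, Apple’s system is based on a perceptual algorithm rather than exact file hashes.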

The technology involved in this plan is fundamentally novel. Although Facebook and Google have been scanning photographs shared on their platforms for quite some time, their systems do not process files on your own computer or phone. Because Apple’s new tools do have the power to process files stored on your phone, many view them as a serious threat to privacy rights.

In the case of the iMessage child safety service, the privacy intrusion is not especially serious. At no time is Apple or law enforcement informed of a nude image sent or received by a child, and children are given the ability to reverse a potentially serious mistake without their parents ever knowing.

But the technology that allows Apple to scan the photos on your phone has caused widespread concern. While Apple has pledged to use this technology to search only for child sexual abuse material among photos uploaded to iCloud Photos, a major concern is that nothing in principle prevents this type of technology from being used for other purposes and without your consent. As the Times article observes, “[i]t is reasonable to wonder if law enforcement in the United States could compel Apple (or any other company that develops such capacities) to use this technology to detect other kinds of images or documents stored on people’s computers or phones.”

Another concern is that the new technology has not been sufficiently tested. It is based on a new algorithm designed to recognize known child sexual abuse images even if they have been slightly altered. Apple says its algorithm is extremely unlikely to mistakenly flag legitimate content, and it has implemented additional safeguards, including having Apple employees confirm that images are illegal before forwarding them to the National Center for Missing and Exploited Children.
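To illustrate what recognizing an image “even if it has been slightly altered” means, here is a toy perceptual hash, an “average hash” that summarizes an image’s coarse brightness pattern. This is not Apple’s NeuralHash, which is a far more sophisticated neural-network-based system; the synthetic gradient “photo” and the alterations below are assumptions chosen purely for the demonstration.

```python
import numpy as np

def average_hash(img, grid=8):
    """Toy perceptual hash: crop to a multiple of the grid, average the
    brightness of each block, and threshold each block at the overall mean."""
    h, w = img.shape
    img = img[:h - h % grid, :w - w % grid]
    blocks = img.reshape(grid, img.shape[0] // grid,
                         grid, img.shape[1] // grid).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

def hamming(a, b):
    """Number of bits on which two hashes disagree."""
    return int(np.count_nonzero(a != b))

# A synthetic grayscale "photo" (a smooth gradient) and a slightly altered
# copy: mild noise plus a two-pixel crop, as re-saving or resizing might cause.
rng = np.random.default_rng(0)
photo = np.repeat(np.linspace(0, 255, 256)[:, None], 256, axis=1)
altered = (photo + rng.normal(0, 3, photo.shape))[2:, 2:]

print(hamming(average_hash(photo), average_hash(altered)))   # 0: still a match
# A cryptographic hash of the altered file would differ completely; the
# perceptual hash tolerates small edits by matching within a bit threshold.
```

Because matching is done by a distance threshold rather than exact equality, slight edits survive. But that same looseness is what researchers worry can be exploited, as the next paragraph explains.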

Finally, as reported in a recent AP News article, Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that Apple’s new technology could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. Such images could trick Apple’s algorithm into alerting law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to fool such systems.
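The sketch below shows why such framing attacks worry researchers, using the same toy average hash as above: because a perceptual hash only summarizes coarse image structure, an attacker who knows a blocklisted hash can construct an innocuous-looking image that reproduces it exactly. The 64-bit target pattern here is invented, and Apple’s NeuralHash is much harder to fool than this toy; the point is only that a perceptual-hash match shows two images hash alike, not that the flagged photo is actually illegal.

```python
import numpy as np

def average_hash(img, grid=8):
    """The same toy average hash (image sides assumed divisible by the grid)."""
    blocks = img.reshape(grid, img.shape[0] // grid,
                         grid, img.shape[1] // grid).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

# Suppose an attacker has learned the 64-bit hash of some blocklisted image.
# (This target pattern is invented for the demonstration.)
rng = np.random.default_rng(1)
target_bits = rng.integers(0, 2, 64).astype(bool)

# Start from a featureless gray square and nudge each 32x32 block slightly
# brighter or darker so its bit lands on the desired side of the mean. The
# result still looks like a plain gray image but hashes identically.
forged = np.full((256, 256), 128.0)
for i, bit in enumerate(target_bits):
    r, c = divmod(i, 8)
    forged[r * 32:(r + 1) * 32, c * 32:(c + 1) * 32] += 6.0 if bit else -6.0

assert np.array_equal(average_hash(forged), target_bits)  # a perfect false match
```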

If you were arrested as a result of the police finding child sexual abuse images / child pornography on your phone, computer or other electronic devices, you should consult a Jacksonville criminal defense attorney knowledgeable about both established and emerging law and the surveillance techniques involved in such cases. Doing so will give you the best chance of having your charges dropped or reduced. Call me for a free case strategy session to discuss how I can best help you with your child sexual abuse image / child pornography case in Jacksonville, Fernandina Beach, Yulee, Macclenny, Green Cove Springs, Middleburg, St. Augustine or surrounding areas.
