The Millennial Source
Apple iPhones will now flag child sexual abuse. What does this mean for data privacy?

By Jake Shropshire | Edited by Krystal Lai
August 9, 2021
in WORLD

The Apple Inc logo is seen hanging at the entrance to the Apple store on 5th Avenue in Manhattan, New York, U.S., October 16, 2019. REUTERS/Mike Segar

Historically, Apple has reported far fewer CSAM cases than other tech giants, such as Facebook Inc. Last year, Apple reported fewer than 300 cases, while Facebook reported over 20 million.

What’s the new flagging program?

  • One of the programs involves scanning photos that get uploaded to iCloud and then determining whether the content is child sexual abuse material, or CSAM. 
  • The program determines whether content is CSAM through an on-device matching process: before a photo is uploaded to the cloud, the device compares the image against a database of known CSAM image hashes.
  • These CSAM image hashes are provided by child safety organizations, such as the National Center for Missing & Exploited Children (NCMEC).
  • According to Apple Inc., the hashing technology, which is called NeuralHash, “analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number.”
  • If Apple’s system finds a match, an Apple employee reviews it to confirm the system got it right. Once verified, the information is forwarded to these child safety organizations and the user’s iCloud account is locked.
  • The other tool is aimed at parents: the program will alert them if their child sends or receives nude photos in text messages.
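The matching flow described above can be sketched in a few lines of Python. NeuralHash itself is a proprietary, neural-network-based perceptual hash and is not publicly available, so the toy "average hash" below is a hypothetical stand-in used purely to illustrate the idea: near-identical images produce nearly identical hashes, which are then compared against a set of known hashes.

```python
# Simplified sketch of on-device perceptual-hash matching, for illustration
# only. The real NeuralHash algorithm is proprietary; this toy average hash
# just demonstrates the same matching flow.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (8 rows of 8 ints, 0-255).

    Each bit records whether a pixel is brighter than the image mean,
    so small edits (compression, resizing) barely change the hash.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(image_hash, known_hashes, max_distance=2):
    """True if the hash is within max_distance bits of any known hash."""
    return any(hamming(image_hash, h) <= max_distance for h in known_hashes)

# A flagged image, a near-identical copy (one pixel brightened), and an
# unrelated (inverted) image.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
copy = [row[:] for row in original]
copy[0][0] = min(255, copy[0][0] + 40)
different = [[255 - p for p in row] for row in original]

known = {average_hash(original)}
print(matches_known(average_hash(copy), known))       # True: near-duplicate matches
print(matches_known(average_hash(different), known))  # False: unrelated image
```

The small Hamming-distance tolerance is what makes the match robust: a re-encoded or lightly edited copy of a known image still matches, while an unrelated image does not. This is what Apple means by "only another image that appears nearly identical can produce the same number."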

What’s Apple saying about it?

  • According to the company, the tool runs entirely on the physical iPhone itself, and Apple will have no way of seeing the photos.
  • Apple has emphasized that the new tools are designed in a way that protects user privacy, making sure that Apple doesn’t have access to things like the images exchanged in users’ text messages. 
  • In its feature announcement post, Apple also included technical assessments from three cybersecurity experts, all of whom say the privacy concerns have been handled properly.
  • “Apple has found a way to detect and report CSAM offenders while respecting these privacy constraints,” wrote Mihir Bellare, one of the cybersecurity experts. 
  • “Harmless users should experience minimal to no loss of privacy,” wrote David Forsyth, another expert consulted by Apple. 
  • “This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM,” Apple said in its announcement post. “And it does so while providing significant privacy benefits over existing techniques.” 
Source: https://www.apple.com/child-safety/

What about the critics?

  • Critics’ main concern is that the new features create potential privacy risks for users going forward.
  • “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this,” National Security Agency (NSA) whistleblower Edward Snowden wrote on Twitter on August 6. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs, without asking.”
  • According to Matthew D. Green, a cryptography professor at Johns Hopkins University, this technology sets a potentially dangerous precedent for the demands of governments trying to surveil their citizens. 
  • “They’ve been selling privacy to the world and making people trust their devices,” said Green in a New York Times interview. “But now they’re basically capitulating to the worst possible demands of every government. I don’t see how they’re going to say no from here on out.”

Wait, does that kind of thing happen?

  • There isn’t really a clear answer to that.
  • But in 2017, a Chinese law gave the Chinese government control over some data centers that hold information of Apple users in China.
  • Reports from The New York Times said that Apple had ceded control of its data centers in Guiyang and Inner Mongolia to the Chinese government, though Apple claimed the encryption used within those centers protected the privacy of the data stored there.
  • While the publication acknowledged it had seen no proof of the Chinese government accessing the information, and Apple denied the allegations, the underlying issue is that Chinese officials can still demand the data be handed over.
  • Tech experts critical of Apple’s new features worry that something similar could happen here, with a government forcing Apple to repurpose the tools to scan for other kinds of content.

What’s next?

  • It doesn’t look like Apple plans to slow the release of these features, and the company appears unlikely to back down.
  • In a media briefing last Friday, the company said it plans to expand the service country by country, depending on local laws, an approach it says will help it resist government pressure to flag content other than CSAM.
  • In the United States, companies are required to report CSAM to the authorities. As noted above, Apple has historically reported far fewer cases than tech giants such as Facebook, reporting under 300 last year versus Facebook’s more than 20 million.
  • Once Apple rolls out this program, that number should be expected to rise.
