Apple to scan iPhones for sexually explicit material involving children

Might be easier to scan for phones that don’t have pedo content, there.

1 Like

Apple is planning to build a backdoor into its data storage system and its messaging system.

So, at best, the iPhone is going to hash every image on your phone, then upload that hash to the Apple “illegal pictures server” and look for a match.

What’s the cost to the user? Just sending hash values, so minimal cellular data usage. However, processing every photo? photoanalysisd on my MacBook already does some quirky CPU-intensive things. So now it’s wearing out the battery and taking away CPU capacity to find non-existent child porn.
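For scale, here's a rough stdlib sketch (plain SHA-256 standing in for whatever Apple actually computes, and the file size is made up): a digest is 32 bytes regardless of how big the photo is, which is why shipping hashes costs almost nothing in bandwidth — the cost is the on-device computation.

```python
import hashlib

# Stand-in for a ~3 MB JPEG on the phone.
photo = b"\xff\xd8" + b"\x00" * 3_000_000

# Hashing reads every byte (CPU/battery cost stays proportional to
# library size), but only the fixed-size digest would be transmitted.
digest = hashlib.sha256(photo).digest()

print(len(photo), len(digest))  # 3000002 32
```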

How about this as a compromise? In the Security & Privacy area, have an opt-out dialog (default is opted in):
[  X ] Allow Apple to periodically scan product for child pornography
If you allow it to remain checked (opting in), Apple does not ever scan your phone.
If you uncheck it (that is, opt out of scanning), then and only then, does Apple scan your phone.

1 Like

The Dirty Streisand Effect?

2 Likes

On the plus side, it’ll be a great way (/s) to catch up with the Russians on citizen kompromat.

I wonder if it’d be possible to plant images on a phone — the way Pegasus was delivered to Khashoggi’s wife via a worm embedded in a text message — and then “find” them.


4 Likes

I’m sure we both agree it’s better if nobody does this, but why would you prefer a tech company over the government? Governments are well regulated, have transparency and freedom of information laws, accountability, electability, and distributed power structures. Private companies have none of that.

I get that people (especially Americans) hate their government, but the entire history of democracy is creating structures to diffuse and manage government power. Private companies are the opposite. Poorly regulated, secret, unaccountable to anyone. Why are so many people so happy to give power to them versus governments set up to use that power as safely as possible?

16 Likes

However, that’s not what they claim they will be doing. They claim three different things but only the first two are relevant to image processing:

  1. Use machine learning on the phone on messages so they can pop up warning notifications, and notify parents when their kids choose to view said messages anyway.
  2. Use perceptual hashing on the phone against a local dataset on content being uploaded to iCloud and including the encrypted test result with the upload.

Of course as everyone else has mentioned, said tools can be repurposed - including to implement your scenario - and demonstrate the degree of control Apple has over your phone…
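To illustrate what “perceptual hashing” means in item 2 — this is a toy average-hash over an 8×8 grayscale grid, not Apple’s NeuralHash, whose design is different — the idea is that small changes to an image (re-encoding, resizing) should leave most hash bits unchanged, unlike a cryptographic hash where one flipped bit scrambles the whole digest:

```python
def average_hash(pixels):
    """Toy perceptual 'average hash' of an 8x8 grayscale grid:
    each bit is 1 if that pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic gradient image, brightness 0..252.
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]

# Nudge one pixel slightly, as a stand-in for compression noise.
tweaked = [row[:] for row in img]
tweaked[0][0] += 3

# The perceptual hashes are (near-)identical despite the change.
print(hamming(average_hash(img), average_hash(tweaked)))  # 0
```

Matching is then a question of whether the Hamming distance falls below some threshold, rather than exact digest equality.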

3 Likes

Yeah, I don’t think we’ll need to go through the effort of false positives. I’m sure these dimwits have plenty of incriminating evidence on their phones. A certain Florida congressman comes to mind.

3 Likes

I’m British and of the belief that most government tech projects are poorly conceived, designed and executed. Typically they are outsourced to inadequate providers who overcharge and underdeliver. If the UK government were to be trusted with these powers, the system would be compromised within days, literally.

I also believe that our current government’s record on transparency, compliance with FoI, and accountability is utterly shameful.

On the other hand, Apple has made security, privacy and data ownership a selling point in recent years, and they have far more of a vested interest in upholding that reputation than anyone. Who would I trust to keep my data secure and private, faced with a choice between Apple and the UK government? The former, no contest.

5 Likes

I know Microsoft Intune, and probably other MDM solutions, have the ability to isolate corporate data on mobile devices and exclude it from iCloud backups, prevent copy/paste to non-corporate apps, prevent saving into non-compliant storage, etc.

2 Likes

How will you know what Apple is doing with your data?

4 Likes

There’s a technical solution. My employer prevents use of iCloud Drive and any other iCloud services like Keychain Syncing.
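Restrictions like these are typically pushed as an MDM configuration profile. A minimal sketch of a Restrictions payload fragment, assuming the standard `com.apple.applicationaccess` payload and its documented keys (some, such as `allowCloudDocumentSync`, apply only to supervised devices; an actual profile needs the surrounding plist envelope and payload identifiers):

```xml
<!-- Restrictions payload fragment: disables iCloud Drive
     document sync and iCloud Keychain syncing. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.applicationaccess</string>
    <key>allowCloudDocumentSync</key>
    <false/>
    <key>allowCloudKeychainSync</key>
    <false/>
</dict>
```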

1 Like

But that is not what they’re doing. Messages content will be scanned on-device and will remain end-to-end encrypted. Neither Apple nor anyone else will be able to read that content.

There is no back door. I’m a supporter of EFF but this announcement is just FUD.

2 Likes

Through their explicit data collection and use policies.

I imagine you think that’s naive; but Apple is more proactively transparent about what data they collect, and what they do with it, than any other big tech company; and certainly more so than governments.

2 Likes

Or Serco or G4S or a bloke the minister met in the club

2 Likes

You must have missed the part where parents can read their children’s messages in the supposedly end-to-end encrypted Messages app.

2 Likes

You must have missed the part where that is on-device, done by the device which can decrypt those messages.

2 Likes

FTA

The other feature scans all iMessage images sent or received by child accounts—that is, accounts designated as owned by a minor—for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.

First images, then what? What is deemed “sexually explicit”? What if a queer child is outed to their parents by this? I am normally against slippery slope arguments but Apple has been championing privacy for years now and this blatantly flies in the face of that.

10 Likes

Exactly. All of which have comically awful records when it comes to competence and ethics.

For the uninitiated: the fatuous Dido Harding was literally fired from TalkTalk for presiding over a series of serious data breaches; and then our extraordinarily corrupt government put her in charge of the national Covid-19 track and trace initiative, which has failed spectacularly, despite costing (via Serco) literally hundreds of times what experts believe it should have.

4 Likes

It’s scanning hashes of files that, for the most part, Apple doesn’t know the contents of. Apple is not going to be requesting the original child porn files when Interpol or the FBI or whoever provides an update to the database.

Is it conceivable that a government would want to discredit a specific individual, perhaps an activist, by adding a harmless photo that only that individual and a few friends would have to the database of Very Bad Files?

Activist posts a photo of himself with a few friends on social media. Almost no one will save that photo, except maybe activist and a few friends. That photo gets added to Very Bad Files database, and it’s a match!

Media: Activist’s phone had image from the Database of Known Child Pornography!
Media, six months later: FBI says photo on Activist’s phone “does not meet the legal definition of child pornography.” No charges to be filed.

Or, Someguy’s manifesto gets added to the database, because his government is very curious about who has or is sharing it.
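The scenario works precisely because the database is a bag of opaque digests: nothing in an entry records *why* it was added, so a match against a planted benign photo is indistinguishable from a match against actual contraband. A toy sketch (plain SHA-256 standing in for the real perceptual hash; all names and file contents invented):

```python
import hashlib

def flags(device_files, database):
    """Return the device files whose digests appear in the database."""
    return [name for name, data in device_files.items()
            if hashlib.sha256(data).hexdigest() in database]

# The "Very Bad Files" database: just digests, no provenance.
database = {hashlib.sha256(b"known contraband bytes").hexdigest()}

# A government quietly adds the digest of an activist's group photo...
database.add(hashlib.sha256(b"activist group photo").hexdigest())

# ...and any phone holding that photo now "matches".
device = {"selfie.jpg": b"activist group photo"}
print(flags(device, database))  # ['selfie.jpg']
```

Whoever controls additions to the database controls what gets flagged; the matching machinery itself cannot tell the difference.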

8 Likes

Parental Controls have always existed for their kids’ iPhones. The ability for parents to invade their children’s privacy isn’t new. It’s on-device, it’s still encrypted, and only happens if those controls are enabled (i.e., the phone belongs to a minor) AND that parent has enabled the scanning.

The article from EFF has its own spin and intentionally conflates a few different things to make them sound scary. The BB write-up is even worse. I definitely understand your concerns. But Apple has done a lot to secure devices and protect privacy—the new iCloud Private Relay is great—and I’m going to see how this works before condemning it.

4 Likes