Apple to scan iPhones for sexually explicit material involving children

One more reason, not that I need it, to never buy an Apple product.

You know there will be false positives, and people will get their front doors kicked in and wind up in an ugly, ugly situation.

Sitting in jail for weeks, months, or years while legal processes grind along glacially, just to get charges dismissed for lack of evidence, all while you're pressured to plead guilty to one thing or another so the prosecutors don't look like a pack of fools.

Far easier to plant images on a seized computer than drugs on an arrestee.

2 Likes

Sure, but that only gets you so far, because today it's Apple, tomorrow it's Google, and the day after that it's in law (see the Investigatory Powers Act 2016). Then all tech companies and ISPs have to implement it, and you'll never know, because they'll be slapped with a technical capability notice, which makes it a crime even to reveal you've received one.

Technical capability notices were introduced in the 2016 Investigatory Powers Act, which aimed to regularise government powers of snooping and hacking in the aftermath of the disclosures by Edward Snowden revealing the scale of covert mass surveillance operated by intelligence agencies in the UK.

Each individual capability notice is secret, as is their total number. They require phone and internet companies subject to one to build backdoor access into their systems, allowing them to respond promptly to legitimate surveillance requests. Until now, Facebook is not thought to have been subject to one.

This has the wretched stink of the "shitty technology adoption curve" all over it, and I can already foresee the arguments I'll be having with people who think I'm excusing child abuse imagery.

6 Likes

The right wing’s twitter like service (gab?) was recently in the news for spreading ISIS propaganda. Several other, more “mainstream sites” will ban accounts for this sort of nonsense, presumably because they have hashes of beheading photos, etc, and can spot this stuff before it’s seen.

Forbidden material need not be limited to child pornography. This is the tip of the spear.

1 Like

The part about your phone identifying sexts and ratting you out to your parents is kind of gross; it inserts machines into family privacy matters where I don't think it's wise to let them go.

But to be clear, that is predicated on your iCloud account being linked to your parents' account, in which case your parents could always read your messages (and everything else) at will, because it's basically their account and they hold all the crypto material. It is not a "backdoor", and doesn't involve changing anything about the underlying security.

And it’s the same with scanning iCloud against child porn fingerprints: any unencrypted cloud storage was always going to be subject to that sort of thing, and some jurisdictions require it. Apple are trying to claim they’re doing it with stronger safeguards than other providers have, which may well be true, though I wouldn’t attach much importance to that.

The reason big tech invades your privacy is advertising, period. So when ad-funded media (which is to say, all media) implies that the biggest threat to your privacy is Apple, bear in mind that this is exactly like Skeletor warning you about He-Man’s ambitions to rule Eternia.

EFF is not funded by advertising, and they have plenty to say about Facebook, Google et al. too, but they know very well that even fair criticism of Apple will be amplified far more. Their beef with Apple is ideological (hey, He-Man’s dad is a hereditary tyrant after all), but IMO they overplay this to the point of being counter-productive. They seem too focused on Stallman-like purity tests to admit that one company can be better for your privacy than any other, even in principle.

6 Likes

…And what exactly will Apple do if/when they detect kiddy porn?
And, now they’re self appointed mandated reporters.

OK, false positive? What’s the fallout of THAT?

Once they have dirt on someone, they’ll never let that person make the switch to Android.

Bwahaha

1 Like

A huge DB of illicit images just waiting to be hacked, huh? What could possibly go wrong?

1 Like

Snowden's Twitter feed offers manifold branches into the nuances of this story. Here's one that resonates with me.

Also notable is a leaked memo Apple sent to staff, using a catchy label to marginalize and discredit anyone who’s not down with this.

I guess this promise goes out the window.


-edited for spelling

9 Likes

A couple of years ago our employer passed out iPhones to our group (obviously for company business). Not a week had passed before porno pics started appearing on our phones. Our admin didn't seem surprised when people started turning in their sweet phones. (?) Just great.

If I could switch my government as easily as I could switch phone carriers, we’d be on a level playing field.

TSA was not supposed to look at naked body scans. But they did.

1 Like

Except you'll be switching to a phone made by someone else who's also mandated to scan your photos. I guarantee it. If Apple is caving to this pressure, so will everyone else.

You really seem to miss the point on this. I refer you once again to the EFF, who, unlike conservatives in government AND tech companies, have nothing to gain from their examination of this issue or from their views on it. You are being told it's totally fine to search your stuff for contraband, and it will NOT be limited to "child pornography," just as it has not been in other places where such things have been enacted. If you're afraid of government more than of tech companies, then you should REALLY be afraid when government and tech companies are colluding.

9 Likes

There are scant few in government who actually seem to be against this collusion.

1 Like

Which only serves to reaffirm the problem and the worry. One party or the other should be screaming bloody murder about this. The right to privacy and the presumption of innocence until proven guilty are baked into our Constitution for a reason. This is a fishing expedition that can and will be expanded over time, all while we're reassured that "if you're innocent, you have nothing to fear."

And then your "I'm smoking dope, which is legal in my state" photo gets you on the radar of an administration that dislikes your membership in (insert political activist group here), and before you can say "iPhone!" you're being picked up by the feds on a federal charge.

This will be abused. I guarantee it. If Apple is folding, they all will.

10 Likes

I took a look into what this whole thing is about, and to me it seems there is no new access to the phone involved, as parents already have full remote access to a child's device. That capability is itself built on corporate device-management software, the same kind that lets an IT department remotely manage your company iPhone.

What this scan involves stays within the realm of that program: the managing account can get pinged if the managed child's device is doing something iffy. It's all about devices being managed remotely, not about devices the user owns themselves.

It's kind of jarring because Apple has traditionally sold directly to the end user, where its historical competitors did not: cell phone makers sold their phones to telecoms, and Microsoft sold its software to IT department heads. This is a reminder that Apple has also been working to make its devices acceptable to customers who buy devices not to use them, but to dole them out.

2 Likes

“And yet here we are.”

2 Likes

The database only needs to hold the hashes. That avoids some problems, but it also limits who can verify that the database contains only content of the claimed type.
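To make that concrete, here's a toy sketch, not Apple's actual system: Apple describes a perceptual "NeuralHash", which also matches slightly altered copies, whereas plain SHA-256 here matches only byte-identical files. All names and values below are hypothetical.

```swift
import CryptoKit
import Foundation

// Toy model of the distributed database: just a set of opaque digests.
// (Hypothetical values for illustration.)
let knownHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    "60303ae22b998861bce3b28f33eec1be758a213c86c93c076dbe9f558c11c752",
]

/// Hex digest of an image's raw bytes (SHA-256 standing in for NeuralHash).
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// The verification problem in miniature: the only question the database can
/// answer is "is this exact image in the list?", and only for images you
/// already possess. The digests themselves reveal nothing about their sources.
func databaseContains(_ imageData: Data) -> Bool {
    knownHashes.contains(digest(of: imageData))
}
```

So an auditor can confirm that specific known images are in the list, but can't rule out that other entries were derived from something else entirely.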

2 Likes

It doesn’t happen on your iPhone though, does it?

Apple are scanning iCloud photo libraries which they are hosting on their servers (with the user’s consent). They are not scanning photos that are kept on people’s phones.

1 Like

No, the scan happens on the phone:

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
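To make the quoted flow concrete, here's a heavily simplified sketch. Assume SHA-256 and a plain set lookup as stand-ins for Apple's perceptual NeuralHash, blinded hash table, and server-side evaluation of encrypted "safety vouchers"; all names here are hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch of "match on device, before upload". In Apple's
// described design the device can't even learn the match result locally;
// the boolean returned here exists purely to show where the check happens.
struct PhotoUploader {
    /// Shipped with the OS as part of the iCloud Photos pipeline.
    let knownCSAMHashes: Set<String>

    private func hexDigest(of data: Data) -> String {
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    /// Runs on the phone, and only for images queued for iCloud Photos.
    /// Photos that never leave the device never reach this code path.
    func prepareForUpload(_ imageData: Data) -> (payload: Data, matchedKnownImage: Bool) {
        let matched = knownCSAMHashes.contains(hexDigest(of: imageData))
        return (imageData, matched)
    }
}
```

The point of contention is the placement: the lookup runs on hardware you own, even though it only fires on the way to Apple's servers.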

1 Like

I’m sure this won’t be used at all for anything horrible and that apple will limit this technology only to CSAM. I’m sure they won’t cave to other countries to use this to scan for other things and that we’d know about these things if it happened. And I’m sure the CSAM database will be audited by people regularly to make sure that hashes that make it into the database aren’t being misused.

1 Like

Yes, but only for images that are to be uploaded to iCloud. It’s a byzantine way of doing it, given that Apple is currently capable of reading any data you store (or “back up”) in iCloud, and it only seems to make sense if their intent is to move to end-to-end encryption. It has been widely reported that they planned to do that a few years ago but were dissuaded by the FBI. Once this system is in place, they would be able to implement E2EE for iCloud, making it impossible for them to see or disclose your data, while also claiming that iCloud cannot be used to propagate child porn.

It’s true that if Apple wanted to install spyware on your phone and simply lie about it, they could. But what they are saying is your phone will only report the presence of known child porn images, and only if you are using iCloud to store your photos.

Governments can add the fingerprints of specific images to the child-porn database (e.g. specific images of Che Guevara or Winnie-the-Pooh), and this mechanism will duly report you. But it can’t be adapted to search for images the government hasn’t already seen, such as pictures of your kids in the bath. That would require a different mechanism.
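Continuing the toy model from upthread (SHA-256 standing in for the perceptual NeuralHash, hypothetical data): matching is lookup, not recognition, which is why only already-seen images can be flagged.

```swift
import CryptoKit
import Foundation

// An entry exists only because the database's builder already had the image.
let imageTheCensorHasSeen = Data("known banned image bytes".utf8)
let database: Set<SHA256.Digest> = [SHA256.hash(data: imageTheCensorHasSeen)]

// A re-circulated copy of the known image matches and would be reported.
print(database.contains(SHA256.hash(data: imageTheCensorHasSeen)))  // true

// A novel photo, whatever it depicts, can't match: no prior copy, no entry.
let novelPhoto = Data("a photo nobody else has ever had".utf8)
print(database.contains(SHA256.hash(data: novelPhoto)))             // false
```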

Many people (me included) would be more comfortable knowing that there was no way for our phones to report us for anything. But at this point, if you turn off iCloud for photos, that is what you will get on iOS, and hopefully that will satisfy governments’ demands.

2 Likes