Originally published at: Apple to scan iPhones for sexually explicit material involving children | Boing Boing
…
Vatican City should do it.
It could be worse - at least the neural network is only going to be used to tattle on kids to their parents (AKA help parents monitor their children’s phone usage) and not to forward everyone’s sexts to law enforcement…
As for scanning images uploaded to iCloud… still working out risk and rewards in my head.
I think the next step is to start creating false positives for politicians; then we can see some sensible legislation.
> The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.
>
> When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.
>
> Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.
So, to recap:
- Your child’s iPhone will run local software to decide if something is a nudie pic, and
- having done so, will nevertheless allow the child to see some creepy stranger dude’s wang, or their same-age boy/girlfriend’s nudes, which are legally child pornography, BUT
- will also tell the parents, so that when the child proves that the picture in question was an obvious false positive (how exactly will it tell the difference between a picture of a topless teenage girl and a chubby topless teenage boy?), the parents will still have cause to be angry at the child for ignoring the warning because of their apparent desire to see nudes.
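To make that recap concrete, here is a toy sketch of the announced flow. Every name, the score threshold, and the under-13 cutoff are my own placeholders; Apple hasn’t published the actual classifier or policy logic, so treat this as an illustration of the described behaviour, not their implementation:

```python
# Toy sketch of the Messages flow described in the quote above.
# All names, the 0.5 threshold, and the under-13 cutoff are assumptions.

def message_photo_policy(explicit_score: float, child_age: int,
                         child_chooses_to_view: bool) -> list[str]:
    """Return the sequence of UI actions for an incoming photo."""
    actions = []
    if explicit_score < 0.5:            # assumed threshold for "sexually explicit"
        return ["show_photo"]

    actions += ["blur_photo", "warn_child", "offer_resources"]
    if child_chooses_to_view:
        actions.append("show_photo")    # the child can still look...
        if child_age < 13:              # assumed cutoff for parental alerts
            actions.append("notify_parents")   # ...but the parents get told
    return actions

print(message_photo_policy(0.9, 12, child_chooses_to_view=True))
# ['blur_photo', 'warn_child', 'offer_resources', 'show_photo', 'notify_parents']
```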
Hoo boy. Even for a giant tech company, this is a stupid and dangerous kludge.
Re: What could go wrong?
Hmmm…I mean, maybe not a children’s court judge (the trophy winner below), but someone involved in Apple IT, perhaps?
It didn’t already? They’re maybe the largest image aggregator/AI company in the world, and they don’t already have alarming details about what the pictures are of?
Assuming this tech giant already has data about the pictures on their servers, including identifying illegal activity, wouldn’t they be culpable for not sharing it with authorities?
One of the few cases where a slippery slope argument does represent a mountain of grease-soaked banana peels.
Without going into the wider ramifications (and honestly, if anyone were going to do this, I’d rather it were Apple than any other big tech company or, god forbid, a government), it’s worth pointing out that it’s not scanning for ‘things that look like porn’; it’s scanning for specific image matches from a database of known child exploitation images.
I assume Apple trained their AI by hiring ex-Fotomat employees and asking them which images they would have made copies of.
if apple is relying on machine learning on the user’s device, this is not going to end well. if apple’s relying on sha-256 hashes of known child abuse images (and the most common crop/rotate/mirror variants), this will probably miss some, but would also be less problematic.
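for what it’s worth, the exact-hash version is easy to sketch in a few lines. the digest below is just a placeholder (it’s the sha-256 of empty input), not anything from a real database; the point is that any crop, re-encode, or single-pixel change yields a completely different digest, which is why exact hashing misses variants:

```python
# Minimal sketch of a "known-image hash" check using plain SHA-256.
# The hash in the blocklist is a placeholder value, not real data.

import hashlib

def sha256_of_file(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical blocklist of digests of known images (placeholder entry).
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_image(path: str) -> bool:
    return sha256_of_file(path) in KNOWN_BAD_HASHES
```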
if a system is 99.99% accurate at finding something rare, like, say, a terrorist, that 0.01% error rate still buries you in false positives: scan 100 million people for the one real terrorist and you flag roughly 10,000 of them, at least 9,999 of whom are innocent.
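to put rough numbers on that (population, prevalence, and error rate below are all made up for illustration):

```python
# Base-rate arithmetic: a 99.99% accurate detector applied to a rare target.
# All figures are illustrative assumptions.

population     = 100_000_000   # people scanned
true_positives = 1             # actual "terrorists" in that population (assumed)
false_pos_rate = 0.0001        # 99.99% accurate => 0.01% of innocents flagged

false_positives = (population - true_positives) * false_pos_rate
precision = true_positives / (true_positives + false_positives)

print(f"false positives: {false_positives:,.0f}")              # ~10,000
print(f"chance a flagged person is guilty: {precision:.2%}")   # ~0.01%
```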
machine-learning onboard snitchware is a pandora’s box that ends with apple becoming the iron fisted assistant to fascist regimes. better to keep it stupid, simon, than to wind up enabling the future “Republic” Of Trump.
isn’t this the weird part? stranger danger being used to facilitate a backdoor into everyone’s phones. but aren’t the most common exploiters and abusers of kids the very people who would be notified by this system? their parents, family members, and other trusted adults.
does it really address the true causes of child abuse and exploitation?
that said, i will admit i don’t know the statistics on how many children receive explicit messages, and who they are from. does apple?
I’d support any initiative that stopped men from sending dick pics to my twelve-year-old daughter, but I don’t think software solutions deal with the problem.
Apple’s always been clear that iCloud is accessible by them. It’s a trade-off they’ve made to be able to get people back into their iCloud account if they lock themselves out. They provide a list of what they can read and what they can’t.
Scanning the photo library is new but I don’t believe they’re creating any new backdoors to make that possible.
Yes. It is for exactly this reason that my employer prohibits employees from using iCloud at all on company-issued iPhones and iPads (all of which the company retains ownership of). For better or for worse, they leave it up to the employee to keep it shut off. I have never looked into whether Apple prohibits a technical solution to that or if the company just hasn’t implemented one, but neither would surprise me.
It’s scanning for fuzzy matches, not exact matches, and it’s scanning for matches in whatever database Apple chooses to use. At the moment, they promise that’s the database of known child exploitation images. If in the future a government pressures them into looking for images “promoting terrorism”, we might not find out.
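To give a sense of what “fuzzy matching” means here: Apple’s NeuralHash is a proprietary, ML-derived perceptual hash, but a simple difference hash (“dHash”) shows the general idea. Near-duplicate images produce hashes that differ in only a few bits, so matching is done within a distance threshold rather than on exact equality. Everything below is a generic illustration, not Apple’s algorithm:

```python
# Generic perceptual-hash sketch (dHash), not Apple's NeuralHash.
# Near-duplicates hash to nearby values; matching uses a distance threshold.

from PIL import Image  # pip install Pillow

def dhash(path: str, size: int = 8) -> int:
    """64-bit difference hash: compare each pixel to its right-hand neighbour."""
    img = Image.open(path).convert("L").resize((size + 1, size), Image.LANCZOS)
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def matches_database(path: str, known_hashes: set[int], max_distance: int = 10) -> bool:
    """Flag an image if its hash is within max_distance bits of any known hash."""
    h = dhash(path)
    return any(hamming(h, k) <= max_distance for k in known_hashes)
```

Which is exactly where the worry lives: whoever controls the database and the distance threshold controls what gets flagged.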
No, they are creating a new backdoor so that they can now also scan images that you don’t upload. The announcement says they will be scanning iMessages. In future, they might scan more.
This. It sounds innocuous and great, but the logical extension of such things never leads to a good end. First it’s this database, followed by government pressure for ALL phone companies to scan their phones. Then it’s “jihadist tracts.” Pretty soon, it’s whatever the government says is criminal. They stop targeting a “known” database of files, start using fuzzy logic, and your picture of your kid taking a bath lands you in jail one night (I’m pretty sure innocuous bathing photos have ALREADY caused such problems).
And I haven’t even mentioned how machine learning is racist. Garbage in, garbage out.
This is an invasion of privacy. This is not a good thing, but a stepping stone to greater intrusion into everyone’s personal lives. And one more data point in my decision on whether or not I replace my current smart phone with a dumb one.
EDIT: The EFF seems to agree with my points, but states them far better than I do - Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life | Electronic Frontier Foundation
Because if 1984 can’t be 1984, at least 2021 can be.