Apple Privacy Issues

It strikes me as a little odd that BB hasn’t yet mentioned the violation of privacy that Apple is rolling out soon. Maybe they’re waiting for more information. Or maybe it doesn’t strike them as problematic, even though many privacy experts have expressed concerns.

That earlier thread is closed, though.

Maybe you have something to add?


Thanks for pointing out that it was addressed (seriously: I appreciate it). I missed it.

I am glad that BB did address it, at least.

I don’t have anything to add, myself. It does disturb me, though.

Apple says its CSAM scan code can be verified by researchers. Corellium starts throwing out dollar bills

Last week, Apple essentially invited security researchers to probe its forthcoming technology that’s supposed to help thwart the spread of known child sexual abuse material (CSAM).


Now, Florida-based infosec outfit Corellium is taking Apple up on that assertion. And yes, that’s the same Corellium Apple tried to drag through the courts, alleging “unlawful commercialization of Apple’s valuable copyrighted works,” until it gave up that fight last week.

With that victory, of sorts, under its belt, and Apple’s invitation to bug hunters and cryptography experts, Corellium, which previously accused Apple of trying to hinder external security research, this week heralded the iPhone maker’s “commitment to holding itself accountable” to researchers.


2021-08-19 ETA (damn you, 2-posts-limit):

Apple didn’t engage with the infosec world on CSAM scanning – so get used to a slow drip feed of revelations

Cross posting… didn’t realize there was a separate thread…


Apple’s bright idea for CSAM scanning could start ‘persecution on a global basis’ – 90+ civil rights groups

More than ninety human rights groups from around the world have signed a letter condemning Apple’s plans to scan devices for child sexual abuse material (CSAM) – and warned Cupertino could usher in “censorship, surveillance and persecution on a global basis.”

The US-based Center for Democracy and Technology organised the open letter [PDF], which calls on Apple to abandon its approach to mass scanning. Signatories include Liberty and Big Brother Watch in the UK, the Tor Project, and Privacy International.



Apple Has Reportedly Been Scanning Your iCloud Mail for Child Abuse Images Since 2019


Apple wants to scan iCloud to protect kids, can’t even keep them safe in its own App Store – report

Apple, having recently invoked the “think of the children” defense against rivals seeking to open competing iOS App Stores, has been accused of not thinking of the children.

In a report released on Wednesday, the Tech Transparency Project contends that Apple “is failing to take even the most basic steps to protect children” in the App Store, where failures in age verification have exposed children to pornography, gambling, and a host of other supposedly age-restricted apps.


The Register asked Apple to comment but the company did not reply, perhaps out of concern for its privacy.


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.