Now those would be worthy offerings for the truck-eating bridge.
As someone whose cells form two separate cell lines, and as a 23andme user, I should clear something up: what 23andme has is a database of SNPs, which is the sort of thing that could be used to identify a serial killer (though the Golden State Killer wasn’t identified using 23andme), plus personal data. What drug companies get is a pile of SNPs connected to a particular protein or disease, minus the personal data; something like the sex chromosomes could be omitted, since they aren’t useful for that.
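To make that concrete, here’s a rough sketch of what “SNPs minus the personal data” could look like. The field names, rsIDs, and genotypes are all made up for illustration; this is not 23andme’s actual schema or process.

```python
# Rough sketch with made-up field names; not 23andme's actual schema or process.
AUTOSOMES = {str(n) for n in range(1, 23)}  # chromosomes 1-22 only

def strip_personal_data(record):
    """Keep only autosomal SNP calls; name, email, etc. are simply never copied."""
    return {
        "genotypes": [
            snp for snp in record["genotypes"]
            if snp["chromosome"] in AUTOSOMES  # omit X/Y, as noted above
        ]
    }

customer = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "genotypes": [
        {"rsid": "rs4988235", "chromosome": "2", "genotype": "AG"},  # illustrative values
        {"rsid": "rs9999999", "chromosome": "X", "genotype": "CC"},
    ],
}

print(strip_personal_data(customer))
# -> {'genotypes': [{'rsid': 'rs4988235', 'chromosome': '2', 'genotype': 'AG'}]}
```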
Once, in the lab, someone complained about the biometrics used to enter extremely dangerous areas and how he didn’t want the State to know who he was. I thought to myself: please, no one tell him about the photo on driver’s licenses.
It’s worth noting that getting deanonymized data for an individual in response to a search warrant isn’t quite the same thing as getting anonymized data in bulk for research purposes.
Both Bear Brook and Science Friday had good episodes on this. If I remember correctly, the police didn’t get warrants for the search. Instead, they uploaded the profile they had to a public database and got close matches, which let them narrow down their suspects and eventually catch the killer.
In essence, if a relative uploads their info into a public database, you’re there as well.
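A toy sketch of why a relative’s upload is enough to put you in reach. Real genealogy matching works on long shared DNA segments rather than a naive per-SNP tally, and every rsID and genotype below is made up, so treat this only as an illustration of partial matching:

```python
# Toy illustration only: real genealogy matching estimates kinship from long
# shared DNA segments, not a naive per-SNP tally like this.
def naive_similarity(profile_a, profile_b):
    """Fraction of SNPs present in both profiles whose genotypes match exactly."""
    shared = profile_a.keys() & profile_b.keys()
    if not shared:
        return 0.0
    matches = sum(1 for rsid in shared if profile_a[rsid] == profile_b[rsid])
    return matches / len(shared)

# Made-up rsIDs and genotypes.
crime_scene = {"rs0001": "AG", "rs0002": "CC", "rs0003": "TT", "rs0004": "AG"}
relative    = {"rs0001": "AG", "rs0002": "CT", "rs0003": "TT", "rs0004": "AA"}

print(f"{naive_similarity(crime_scene, relative):.2f}")
# A high-but-imperfect score points investigators toward a family, not a person.
```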
Personally, I am very uncomfortable with that, as someone who has engaged, and still engages, in activity that was previously illegal, and whose kids, a generation ago, would have been illegal in half the United States.
Edit: it was the Bear Brook podcast, not Futility Closet, that devoted an episode to catching a killer through genealogy websites.
Don’t trust anyone who says they anonymized something.
… Besides, biometrics for access control aren’t all that secure.
Back in 2000, I watched as a set of twins went into an area controlled by badge / PIN / handprint access. Twin #1 badged in and entered their PIN at the scanner, and twin #2 put their hand on the scanner. The door the setup was controlling opened. The security guard was only mildly amused. (All three of us had access to go in, but it was still an eye-opening experiment in how secure some of those systems really are.)
That’s a valid point, and it certainly gets my attention to read that 23andme, Ancestry.com, and the Chinese government have the world’s largest genetic databases; there’s probably a dissertation to be written on that fact.
The thing that was more important to me is that sharing data with researchers is optional for customers.
I agree it’s highly dubious and I would never use one of these services myself, but if it’s optional to share data, well, that’s better at least.
It’s only optional while the company agrees to keep it optional. And assuming there’s no data breach. And assuming the company isn’t sold to someone else who has no interest in keeping data private. And, how would a customer know for sure their data is private? Because the company says so?
Is the argument that genetic data isn’t medical data? That’s pretty specious. A Google search turns up plenty saying you’re correct, though. Still insane to me. They are literally using your genetic data to tell you your risk for different medical conditions. If that doesn’t make them a healthcare provider, that seems ridiculous.
I think it’s more that they’re not a healthcare provider collecting and retaining medical information. They’re simply a private company working in a medically adjacent field. HIPAA is concerned with actual medical records generated by clinicians and with the patient’s right to privacy and “portability” among providers (i.e., their right to “own” and transfer their own records).
ETA: For instance, if someone underwent genetic analysis to determine the likelihood of passing conditions on to their offspring, that would be a HIPAA-protected document, but the same data generated outside of a medical context would not be.
With the number of healthcare provider and insurance company breaches reported every month, I tend to mention this whenever people bring up HIPAA. A hospital network in my area was hit by ransomware this year, not long after an acquisition. The number of portals they use keeps growing, because specialists each use their own and every entity wants its own copy of the reports, so there are ever more places for the data to leak.
Despite being in the same networks for years, patients are required to resubmit their medical history at every visit. Why? Nurses and office staff complain about the data entry, since firms that change software refuse to transfer the data from the old systems. At this point my focus is on assuming the data will be exposed and planning my defenses accordingly.