Brainhacking… shadowy hackers taking over our personal lives at random and demanding ransom… government corruption… corporate dominance over nearly every aspect of our lives… growing potential of cybernetic implants…
Okay, if we’re going to be living in a cyberpunk dystopia, I demand that people at least wear the trench coats and mirrorshades as mandated by the genre conventions!
Lead author here. Can confirm that our lab is decked out in mirrorshades, trenchcoats, and strategically hidden flechette pistols fresh from Chiba bay.
I do actually have a box of old brain implants under my desk (no joke!)
Cory’s recent comments to the FDA raise a particularly important issue in security research for all sorts of medical devices, particularly the implants discussed in the paper. The DMCA and similar pieces of legislation currently make it extremely difficult for researchers to do their work because they essentially make it illegal to break protections on the software, even if the researcher is doing it to make the devices safer in the long run.
This state of affairs really needs to change - it’s unrealistic to expect that manufacturers will catch every bug themselves and, when proper device function is as important as it is in medical implants, it’s crucial that there are independent researchers out there providing protection for these devices. This is especially vital as medical implants continue to get more complex and feature-rich, resulting in even more intricate codebases and more opportunities for subversion.
I really hope that the FDA introduces DMCA exemptions for researchers working in medical device security. There’s a hell of a lot of promise in these implants and it’d be such a shame if public confidence in them were shattered by a few high-profile security flaws resulting from poor aftermarket security practices.
My science fiction imagination sees this happening in a different way. Artificial intelligence would become so powerful that the human mind would be nothing but a pocket calculator in comparison. The AI would know exactly, down to the neural level, how a human mind functions, and could hack it with external sensory stimuli alone, no implanted electronics required. It could blink a strobe light in a complicated pattern; that would hack a person’s brain and gain executive control over the person’s body and mind without the person even knowing it’s going on.
We already have flashing lights that reprogram people’s cognitive processes and lead them to act in predictable and controllable ways. Perhaps you’ve heard of… television?
Neal Stephenson’s “Interface” seems like a relevant cautionary tale, from both the point of view of suborning an individual through brain implants they don’t have full control over and that of manipulating the public with mass-media techniques.
Did you catch the season opener for Person of Interest? The malevolent AI was able to hack bystanders by manipulating news feeds and social media, no magical morse code necessary. It didn’t need to turn a soccer mom into a murderer, it just had to find an off-duty cop with a vigilante complex.
That’s called Ghost in the Shell, and a bright future it is.
It’s gotta go beyond DMCA stuff. Before anyone puts a DBS device in my brain, I want the source code, and I want the right to have an independent expert review it and explain it to me in words a generally intelligent person can understand. (I’ve coded some complex stuff for a living, so I can probably understand more about code than the average joe, but the average joe needs to be able to understand it too.)
I know, I’m flogging Ghost in the Shell, but in the Stand Alone Complex TV series, the Major is able to hack a hostage-taker’s cyberbrain by calling the negotiation phone and playing an audible exploit at him, which forces him to open a back door for her to access his cyberbrain directly and take control of him.
Unfortunately all the code is currently proprietary, which I think is a real worry. I’m hoping to address this topic of closed-source code in neurological implants in a forthcoming paper on the ethical and legal issues surrounding neurosecurity and brainjacking.
Industry may want to play the FDA card, like “it was approved by the regulatory agency, so it must be OK!” Don’t let 'em. There are smart computer people working for the FDA, but not nearly enough of them (or enough time) to go through every line of code for every different device. They have to rely on what the manufacturers tell them: adherence to standards, manufacturing quality, risk assessments, and so on. But the software also needs to be opened to scrutiny by as many independent people as possible.
Yes, how did it become gray tac. hoodies and transparency cloaks? As long as it’s not Hackadoll The Animation. <–short shows, though Navigator Udine or something was nice. It doesn’t go from Fujushima’s Yggdrasil Operating System straight off the fanservice monocliff.
Wanted to include the Penny Arcade about making the leads 4% cooler by making their mirrorshades built-in.
This is lovely: Relatively tractable considerations. Mercifully the one illo. is from the cover of World Neurosurgery (of a surgeon absorbed in work.) I would like to see the OCD variation-o-matics…
What are you going to do in conflict of interest statement on a paper like this; ‘I won’t want to change anyone’s mind outside medical consultation?’ Bah. Cromulence.
This topic was automatically closed after 5 days. New replies are no longer allowed.