A.I. trained to read minds and translate private thought to text via brain scans

Originally published at: A.I. trained to read minds and translate private thought to text via brain scans | Boing Boing




I can read your mind like a magazine.


every brain has unique ways of representing meaning

I sure hope so.


via science.org:

To gather the necessary data, researchers scanned the brains of three participants while each listened to roughly 16 hours of storytelling podcasts such as The Moth Radio Hour and The New York Times’s Modern Love.

(Odd training wheels but whatever.)

Initially, the system struggled to turn brain scans into language. But then the researchers also incorporated the natural language model GPT to predict which word was likely to come next. Using the maps generated from the scans and the language model, they went through different possible phrases and sentences to see whether the predicted brain activity matched the actual brain activity. If it did, they kept that phrase and went on to the next one.

Afterward, the subjects listened to podcasts not used in the training. And little by little, the system produced words, phrases, and sentences, eventually producing ideas that accurately matched what the person was hearing. The technology was particularly good at getting the gist of the story, even if it didn’t always get every word right.
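Mechanically, the process the excerpt describes resembles a beam search: a language model proposes likely next words, an encoding model maps each candidate phrase to predicted brain activity, and the candidates whose predictions best match the real scan survive to the next step. Here is a toy sketch of that loop — every piece of it (the miniature "language model," the fake encoding model, the similarity score) is an invented stand-in for illustration, not anything from the actual study:

```python
# Toy illustration of candidate-scoring decoding, loosely modeled on the
# article's description. All models below are hypothetical stand-ins.

def lm_next_words(prefix):
    """Stand-in for a GPT-style model proposing likely next words."""
    vocab = {"": ["the", "a"], "the": ["dog", "story"], "a": ["dog", "story"],
             "dog": ["ran", "slept"], "story": ["began", "ended"],
             "ran": [], "slept": [], "began": [], "ended": []}
    last = prefix[-1] if prefix else ""
    return vocab.get(last, [])

def predict_brain_activity(words):
    """Stand-in encoding model: text -> predicted scan response (a vector).
    A deterministic pseudo-embedding, nothing like real fMRI features."""
    return [sum(ord(c) for c in w) % 7 for w in words]

def similarity(predicted, actual):
    """Higher is better; toy negative L1 distance over overlapping positions."""
    return -sum(abs(p - a) for p, a in zip(predicted, actual))

def decode(actual_activity, steps=3, beam_width=2):
    """Grow candidate phrases word by word, keeping only the phrases whose
    predicted brain activity best matches the observed scan."""
    beams = [[]]
    for _ in range(steps):
        candidates = []
        for prefix in beams:
            # Extend each surviving phrase with the language model's proposals
            # (or carry the phrase forward unchanged if nothing is proposed).
            for w in lm_next_words(prefix) or [None]:
                candidates.append(prefix + [w] if w else prefix)
        # Rank candidates by how well their predicted activity matches the scan.
        candidates.sort(
            key=lambda c: similarity(predict_brain_activity(c), actual_activity),
            reverse=True,
        )
        beams = candidates[:beam_width]
    return beams[0]

# With the toy models, decoding the "activity" for a known phrase recovers it.
print(decode(predict_brain_activity(["the", "dog", "ran"])))
```

The key design point, mirrored from the excerpt: the brain scan is never translated to text directly; it only acts as a scoring signal that prunes the language model's guesses, which is also why the decoder tends to capture the gist rather than exact wording.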


“[The technology] could be really transformational for people who need the ability to be able to communicate again, but the implications for the rest of us are profound.” - Nita Farahany, a bioethicist at Duke University.


When the researchers tried to use a decoder trained on one person to read the brain activity of another, it failed, suggesting that every brain has unique ways of representing meaning.

Well, this bit is highly encouraging. It means specific uses that involve the full cooperation of the person are possible, but some authoritarian government trying to use this technology in the context of grabbing someone off the street and trying to “read their mind” is going to fail. (And might always fail.)

Of course, the real and immediate problem is some bullshit technology that doesn’t even work being used to “prove” that someone is lying. (And studies like this, even if the tech is completely different and only works in highly specific use cases, get used to suggest “mind reading” is possible even in cases when it isn’t.)


Remember that PSA from the FBI about not plugging your phone into unknown charging ports in airports, etc.? Well, don’t be plugging your brain into unknown brain scanners either.


Honestly, if Palantir or some company like that isn’t already pitching sketchy tools for AI-assisted interrogation, I’ll be surprised.



(Translation prob.)

You will work with ML researchers on the latest deep learning models to build solutions in areas of information retrieval (IR), question answering,




It might not work for a “random person off the street,” but I can think of a way this could be weaponized. If I can surreptitiously set up a brain scanner to monitor someone without their knowledge, while also recording what they’re hearing (e.g., while they’re watching Netflix), I might be able to train a model for their brain without them knowing.


The “idea” claim seems unproven, and I’d imagine very likely wrong. To me it sounds like the algorithm is reproducing what the person hears, not what the person thinks.

The jump from one to the other seems like quite a stretch, unless I’m misunderstanding something… (hmm… maybe the machine could tell me :thinking: )


“… and I also see you don’t seem to be wearing your Brain Hat, citizen—please step out of the vehicle” :policeman:


Yeah, the defense against this falls apart if commercial brain scanning for purposes of “convenience” starts being a thing… but that requires a government (or other actor) to weaponize an existing body of data against a person (rather like the way phones and their data trails get used now…).

Their interest appears to be natural language processing for military AI… at this point.


Things like Stochastic Audio Processing perhaps?


Might need this inside to block the brain scans some day.


I was thinking more along the lines of a sort of wiretap, where a brain scanner is installed in someone’s house without their knowledge in order to first train on and then eavesdrop on a person’s thoughts. Unlikely to be practical at scale, but I could see oppressive governments using this for dissident leaders or other high-profile targets (or between governments for spying purposes).




I’ve just tried Bing’s Chat. Note my question and chat’s response:


This is less reading a person’s mind than it is watching the body language of someone you know and predicting what they’re about to say.
It’s a fun proof of concept, but there is no practical application.

Animals have evolved really impressive abilities to coordinate their brain activity into external signals through their motor system. A BMI (brain-machine interface) will always be cheaper and more effective if it uses muscle or motor-neuron activity rather than brain signals. But if you’re trying to pitch for funding for next-gen, mind-reading tech, decoding muscles just ain’t sexy enough. So millions more are spent on this sort of science fluff rather than on developing practical interfaces to help people who rely on computer-assisted technology.


I do wonder if surreptitious brain scans would ever be possible. Having the equipment right up against the head is pretty much a necessity; add any distance and noise would swamp the signal, even with (much, much) better sensors and signal processing. It would be more viable (and probably quite easy) to just convince people to do it of their own free will.