Words in the mouth: device lets you hear with your tongue


2 Likes

It's a great idea, but I can see it leading to Vurt Feathers.

1 Like

I heard about a similar device, but with accelerometers, as an augmentation for balance sensing when one’s own vestibular system is underperforming.

Also heard about its use for seeing.

Apparently a tongue is a potential alternative data input array…

Also, I've seen similar attempts (for vision) with electrode or vibration-motor arrays across one's back, covering a larger area but at much lower signal density.

1 Like

Does this mean Helen Keller jokes are no longer tasteless?

9 Likes

Reminds me… did anybody try to teach her Morse code? Given the speeds achievable by experienced operators, she could have been communicating quite quickly, and reception can be haptic as well as audiovisual.
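Just to make the idea concrete, here's a minimal sketch of encoding text as Morse on/off timings that a haptic buzzer could play back. The timing ratios are standard Morse conventions; the word rate and the pulse-pair output format are my own assumptions, not anyone's actual device.

```python
# Hedged sketch: encode text as Morse on/off timings for a hypothetical haptic driver.
# Dot/dash/gap ratios follow standard Morse; the ~20 WPM dot length is an assumption.

MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "H": "....", "L": ".-..", "O": "---", "S": "...", "T": "-",
}  # abbreviated table, enough for the demo below

DOT = 0.06  # seconds per dot, roughly 20 words per minute

def to_pulses(text):
    """Yield (on_seconds, off_seconds) pairs for a haptic driver."""
    for word in text.upper().split():
        for letter in word:
            for symbol in MORSE.get(letter, ""):
                on = DOT if symbol == "." else 3 * DOT   # dash = 3 dots
                yield on, DOT                            # 1-dot gap between symbols
            yield 0.0, 2 * DOT                           # extra gap: letter boundary (3 dots total)
        yield 0.0, 4 * DOT                               # extra gap: word boundary (7 dots total)

if __name__ == "__main__":
    for on, off in to_pulses("HELLO"):
        print(f"buzz {on * 1000:.0f} ms, pause {off * 1000:.0f} ms")
```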

That’s correct. Wicab, Inc. out of Madison, WI created similar tech that translates electrical signals on the tongue to give some visual acuity to people who are blind. They also have a device that uses the same electrical signals to help improve balance, particularly for stroke patients. I spoke to one of the engineers over at Wicab back in '08 or '09 when they were going through FDA trials. Because the device is placed inside the mouth, it was classified as an invasive device and required trials for approval. If I recall correctly, while it was clear that the devices worked, the FDA was asking many questions about what actually goes on in the brain that “rewires” the pathways. I’m not sure if they ever made it through that process. The device they list on their site now seems a bit different from the original that I saw.

They had a low-resolution black-and-white camera and translated those images into electrical signals. The signals were passed through the tongue, and the brain figures out how to translate them back into images. The engineer said that a sighted person could learn how to use the device after about two hours with all light blocked from their eyes.

The balance device, however, seemed to have more of a cumulative, therapeutic effect. The more you wore it, the less it was required for balance retention. It was retraining the brain somehow, so eventually, theoretically, it would no longer be needed. Very cool technology, and amazing to think of all the functionality the tongue has.
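For anyone curious what that camera-to-tongue pipeline might look like in code, here's a rough sketch of the idea as I understand it from the description above: downsample a grayscale frame to one value per electrode, then scale brightness into stimulation levels. The 20×20 grid size and the 0-255 intensity scale are assumptions for illustration, not Wicab's actual parameters.

```python
# Hedged sketch of the camera-to-tongue idea described above:
# grayscale frame -> block-average down to the electrode grid -> scale to stimulation levels.
# Grid size (20x20) and the 0-255 level range are assumptions, not the real device's specs.
import numpy as np

GRID = (20, 20)  # assumed electrode rows x columns

def frame_to_grid(frame, grid=GRID):
    """Average-pool a 2-D grayscale frame down to one value per electrode."""
    h, w = frame.shape
    gh, gw = grid
    frame = frame[: h - h % gh, : w - w % gw]          # crop so blocks divide evenly
    blocks = frame.reshape(gh, frame.shape[0] // gh, gw, frame.shape[1] // gw)
    return blocks.mean(axis=(1, 3))

def grid_to_stimulation(grid_values, levels=255):
    """Rescale pooled brightness into integer stimulation levels per electrode."""
    lo, hi = grid_values.min(), grid_values.max()
    if hi == lo:
        return np.zeros_like(grid_values, dtype=int)
    return np.round((grid_values - lo) / (hi - lo) * levels).astype(int)

if __name__ == "__main__":
    fake_frame = np.random.randint(0, 256, size=(120, 160))  # stand-in camera frame
    print(grid_to_stimulation(frame_to_grid(fake_frame)))
```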

3 Likes

Here is a Radiolab broadcast about a blind woman using the tongue sensor to see again. Amazing!

The brain doesn’t actually see or hear; it simply interprets electrical signals from sensory organs.

I think that it’s more accurate to say “the brain sees and hears by interpreting electrical signals from the sensory organs.” Seeing and hearing are subjective experiences, and from what I can tell they absolutely take place in the brain.

I wonder if the same could be done in an open-source-hardware way, without the profit structure attached and, especially, without the FDA to haggle with… (They could of course ban it as untested, but good luck stopping individual imports from DealExtreme.com or other such vendors.) There’s probably enough information in academic papers to roll one from scratch with off-the-shelf microcontrollers…

…random thought… FLIR Lepton sensor on the forehead, electrode array on the tongue, and have thermal vision in addition to normal visual inputs?

…why can’t we come up with a digital bus for attaching custom peripherals?
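In the spirit of that question, here's a back-of-the-envelope sketch of what one frame on such a "sensory peripheral bus" might look like. Everything here (the sync byte, field sizes, XOR checksum) is an assumption for illustration; no existing standard is implied.

```python
# Hypothetical frame format for a shared sensory-peripheral bus.
# Frame: sync(1) | peripheral id(1) | payload length(2, big-endian) | payload | xor checksum(1)
import struct

SYNC = 0xA5  # assumed frame-start marker

def pack_frame(peripheral_id: int, payload: bytes) -> bytes:
    """Wrap a payload in the assumed frame format."""
    header = struct.pack(">BBH", SYNC, peripheral_id, len(payload))
    body = header + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def unpack_frame(frame: bytes):
    """Return (peripheral_id, payload), or raise ValueError on a corrupt frame."""
    sync, pid, length = struct.unpack(">BBH", frame[:4])
    payload, checksum = frame[4:4 + length], frame[4 + length]
    expected = 0
    for b in frame[:4 + length]:
        expected ^= b
    if sync != SYNC or checksum != expected:
        raise ValueError("corrupt frame")
    return pid, payload

if __name__ == "__main__":
    frame = pack_frame(0x01, b"\x10\x20\x30")   # e.g. three electrode intensities
    print(unpack_frame(frame))
```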

1 Like

Just make sure that you use common sense when choosing a charging port during the winter months.

5 Likes

Potentially, but doubtful. I probably can’t say too much about the components of the device, but this group from CO will have the same FDA issue that Wicab had: the classification of “invasive,” along with the application of electrical signals directly to a human organ. What I like about this group is that they’re working on determining whether there is an optimal arrangement of pads. From what I knew (extremely limited), it didn’t really matter, but there were plenty of active discussions around how to improve image resolution. One thought was that more pads meant more throughput for higher-resolution visuals.

Taking your random thought a bit further, I don’t know whether it would be possible to mix the camera data delivered through the tongue with normal optical vision. If the designed pathways work, I’m not sure the brain would add another “lane” to the highway. However, there was consideration of non-medical applications, such as underwater navigation: put the receptor on the tongue and give the required stimuli as directions, in a “your destination is in this direction” fashion.
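One way that "this direction" encoding could work on a tongue pad array, sketched under my own assumptions (a 12×12 square grid, a single active pad on the rim pointing toward the destination):

```python
# Hedged sketch of the directional-navigation idea above: activate the one pad on the
# edge of an (assumed) 12x12 grid that lies toward the destination's relative bearing.
import math
import numpy as np

GRID = 12  # assumed square electrode grid

def bearing_to_pattern(bearing_deg: float) -> np.ndarray:
    """Return a GRID x GRID activation map with one active pad toward `bearing_deg`
    (0 = straight ahead / top of the grid, 90 = to the right)."""
    pattern = np.zeros((GRID, GRID), dtype=int)
    center = (GRID - 1) / 2
    radius = center                      # push the active pad out to the grid's edge
    theta = math.radians(bearing_deg)
    row = int(round(center - radius * math.cos(theta)))   # "ahead" is the top row
    col = int(round(center + radius * math.sin(theta)))
    pattern[row, col] = 1
    return pattern

if __name__ == "__main__":
    print(bearing_to_pattern(45))        # destination ahead and to the right
```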

2 Likes

My line of thought on open-source medical devices is to add a less-tested but far less expensive and more quickly available set of options to the market, so people can (or can ask their friends to) get cutting-edge experimental stuff that may (or may not) be of use to them for pennies on the dollar, years before the Official Stuff gets the Mighty Stamp of Approval.

The electrode array on the roof of the mouth, hinted at in the article above, could couple well with the mechanics of removable dentures. Once you get some, they are pretty customizable. (You can tweak the fit using a Dremel and other tools instead of multiple rounds with the doc who would otherwise do it herself.) Attaching an electrode array to this substrate is a solvable issue.

Though for initial testing I’d go with a circuit board potted in medical-grade silicone, with gold-plated arrays, much like what’s shown in the photos of the prototypes.

I rather worry about this as well, though the end effect with thermal imagers would not have to be a visual-grade response so much as “knowing” what is hot, and how hot, when looking at it.

I was thinking about VR goggles with synthetic vision for this kind of navigation. I wonder how this sort of tongue-cam would work…

We need some way to “borrow” a bunch of nerves to connect arbitrary hardware and get the brain to read and integrate the data. It Shouldn’t Be That Difficult, given those experiments with belts of vibration motors that constantly told the wearer where north is, which after a few weeks led to a greatly enhanced ability to not get lost.
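The logic behind those compass belts is simple enough to sketch: pick whichever motor around the waist currently points toward north and buzz it. The motor count and heading source below are assumptions for illustration, not a description of any particular belt.

```python
# Hedged sketch of the compass-belt idea: N motors spaced evenly around the waist,
# and the one closest to magnetic north buzzes. Motor count is an assumption.
N_MOTORS = 8  # assumed motors, evenly spaced, motor 0 at the navel

def north_motor(heading_deg: float) -> int:
    """Given the wearer's compass heading (0 = facing north, clockwise degrees),
    return the index of the motor currently pointing toward north."""
    relative = (-heading_deg) % 360                  # north's direction relative to the body
    return int(round(relative / (360 / N_MOTORS))) % N_MOTORS

if __name__ == "__main__":
    for heading in (0, 45, 90, 180, 270):
        print(heading, "->", north_motor(heading))   # e.g. facing east -> left-side motor
```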

1 Like

Sure, but what if I want to taste with my ear?

Then you are in the realm of chemical sensors coupled to acoustic output.

2 Likes

Wow. I want this. I read the book The Brain That Changes Itself years ago and have since hoped for something like this.

Tasting with your ear would be pretty straightforward, as long as you only want to hear the signals actually produced by taste sensors (sweet, salt, bitter, sour) and not the more complex ones from smell or touch (apparently the heat from peppers, the coolness from mint, and the pungency from horseradish are all handled by touch nerves in the mouth, not taste nerves). The other day Terry Gross on NPR was interviewing David Linden, a neuroscientist who’s written a book on touch, and he talked about that.

Fingers and lips are not only able to detect bumpiness, they can also localize its position quite well, so you can do things like read Braille. (OTOH, while genitals are quite sensitive to textures, they’re not very location-sensitive, so a researcher who tested it on herself and her partner found that those nerves don’t have the capability to read Braille.)
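A toy version of that "taste with your ear" mapping: give each of the four basic taste channels its own tone and scale its loudness by the sensor reading. The frequencies, the 0-1 sensor scale, and the channel names are my assumptions for illustration.

```python
# Hedged sketch: mix one sine tone per basic taste channel, loudness proportional
# to that channel's reading. Frequencies and the 0-1 reading scale are assumptions.
import numpy as np

TASTE_TONES_HZ = {"sweet": 330, "salt": 440, "bitter": 550, "sour": 660}  # assumed
SAMPLE_RATE = 44100

def taste_to_audio(readings: dict, seconds: float = 0.5) -> np.ndarray:
    """Return an audio buffer mixing one tone per taste channel present in `readings`."""
    t = np.linspace(0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    signal = np.zeros_like(t)
    for taste, freq in TASTE_TONES_HZ.items():
        signal += readings.get(taste, 0.0) * np.sin(2 * np.pi * freq * t)
    peak = np.abs(signal).max()
    return signal / peak if peak > 0 else signal     # normalize to avoid clipping

if __name__ == "__main__":
    chord = taste_to_audio({"sweet": 0.8, "sour": 0.3})  # mostly sweet, a little sour
    print(chord.shape, chord.min(), chord.max())
```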

2 Likes

Back in the 80s, I worked on Tactile Aids for the Deaf, where we tried to send auditory information to the brain by stimulating the skin through vibration or electrical stimulation. I wonder why electrical stimulation of the tongue is thought to be superior. Is the spatial and/or temporal resolution better? Is it much easier to stimulate a wet tongue than dry skin? What kind of signal processing is this project applying up front? (In our work, we tried to do processing that extracted and encoded spectral features useful for speech recognition.)
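In the spirit of that front-end processing, here's a minimal sketch of one common approach: split an audio frame's spectrum into a few bands and drive one vibrotactile channel per band with that band's energy. The band edges, channel count, and drive-level scaling are my assumptions, not the processing that project (or the 80s work) actually used.

```python
# Hedged sketch: band-energy spectral features driving a handful of tactile channels.
# Band edges, 5-channel count, and the 30 dB / 15-step drive mapping are assumptions.
import numpy as np

SAMPLE_RATE = 16000
BAND_EDGES_HZ = [100, 500, 1000, 2000, 4000, 8000]   # assumed -> 5 channels

def band_energies(frame: np.ndarray) -> np.ndarray:
    """Return one energy value per band for a single windowed audio frame."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1 / SAMPLE_RATE)
    energies = []
    for lo, hi in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:]):
        mask = (freqs >= lo) & (freqs < hi)
        energies.append(spectrum[mask].sum())
    return np.array(energies)

def to_drive_levels(energies: np.ndarray, levels: int = 15) -> np.ndarray:
    """Log-compress the band energies and quantize to stimulator drive steps."""
    db = 10 * np.log10(energies + 1e-12)
    db = np.clip(db - db.max() + 30, 0, 30)           # keep a 30 dB window below the peak
    return np.round(db / 30 * levels).astype(int)

if __name__ == "__main__":
    t = np.arange(512) / SAMPLE_RATE
    frame = np.sin(2 * np.pi * 700 * t)               # a 700 Hz test tone
    print(to_drive_levels(band_energies(frame)))      # most energy lands in the 500-1000 Hz channel
```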

2 Likes

This topic was automatically closed after 5 days. New replies are no longer allowed.