Analog computers could bring massive efficiency gains to machine learning

Originally published at:


Regulating conductance between a reference and an electrolyte/ion source to control a time-varying voltage is how actual neurons do analog processing. The convergence here seems remarkable. They’re trying to improve the digital models of neural networks and end up essentially reverse engineering the analog processors in our brains.

Btw, the reason that biological nervous systems use analog processing in the first place is that it’s much more energy efficient. The processor in a phone uses considerably more energy than our entire nervous system.


I like your post better than the OP.


One problem with analog electronics is that when you run a transistor in linear mode (i.e. not in saturation), it must dissipate a lot of power. A transistor in linear mode gets pretty hot, has a reduced lifespan, and can be less efficient.

1 Like
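The dissipation difference the comment above describes can be sketched with some back-of-envelope numbers (all values here are assumed/typical, not from any specific part):

```python
# Hypothetical numbers to illustrate why linear-mode transistors run hot.
# Consider a transistor passing 1 A from a 5 V rail.

supply_v = 5.0   # supply voltage (assumed)
load_i = 1.0     # load current in amps (assumed)

# Linear (analog) mode: the transistor drops, say, half the rail across itself.
v_drop_linear = 2.5
p_linear = v_drop_linear * load_i      # P = V_drop * I

# Used as a fully-on switch: only the on-resistance matters.
r_ds_on = 0.01                         # 10 mOhm, plausible for a power FET
p_switched = load_i**2 * r_ds_on       # P = I^2 * R

print(f"linear mode:   {p_linear:.2f} W dissipated")
print(f"switched mode: {p_switched:.2f} W dissipated")
```

With these made-up numbers, the same transistor burns 2.5 W in linear mode versus 0.01 W fully switched on, which is why analog stages run hot while digital logic gets away with switching.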

Always thought there was potential in having some sort of Universal Serial Bus (not to be confused with USB) to very differently physic’ed computer subunits. Sort’ve in the mode of the ant-farm as part of the ‘computer’ in Terry Pratchett’s Discworld’s Hex. Need to solve a nasty NP-complete problem?: hook up this slime-mold module and “they’ll” solve it for you with complicated chemical gradients and flows. Next, a culture of transformed (cockroach) neurons inna box. Here’s an olfactory sensor made of zebra fish adenoidal tissue. Thermal flow modeling with this thermoplastic layered on top of a titanium brick. And, of course, something something quantum mechanics sans unlucky cat (liquid boron for choice)


The whole idea of neural networks is to a large extent reverse engineering nature. Going analog rather than emulating it in digital computers has been a logical next step for a long time. The trick is building an analog device good enough.

Next problem would be connectivity. Instead of a high speed bus where all communication is centralized have a structure like in the brain where neurons talk directly to each other.


Hewlett Packard Labs has been talking up the use of analog properties in their memristor technology to solve AI problems many (thousands of) times faster than GPUs (and with way lower power consumption). I can’t claim to have a technical understanding of the principle, but there are scholarly papers going back several years.

1 Like
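As I understand the principle behind those memristor claims (this is my gloss, not from the post): store a weight matrix as a grid of conductances, apply input voltages to the rows, and Ohm’s and Kirchhoff’s laws deliver a full matrix-vector multiply as summed column currents in one physical step. A toy software model of that crossbar idea, with made-up numbers:

```python
# Toy model of an analog crossbar doing a matrix-vector multiply "in one step".
# Weights are conductances G (siemens); inputs are voltages V (volts);
# each output current is I_j = sum_i V_i * G[i][j] (Ohm + Kirchhoff).

G = [            # conductance matrix: rows = input lines, cols = output lines
    [1e-3, 2e-3],
    [3e-3, 4e-3],
]
V = [0.5, 1.0]   # input voltages (arbitrary illustrative values)

# The physics sums currents on each column wire; here we sum in software.
I = [sum(V[i] * G[i][j] for i in range(len(V))) for j in range(len(G[0]))]
print(I)  # column currents, i.e. V @ G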

On the other hand, memristors have been an up-and-coming technology for like 30 years now. There is always a lot of promise with them, but it never seems to pan out.

“Next Generation”, hah.


Thanks for posting this, very interesting. I have a dim recollection from the 1970s, I think, of someone making a neural net using electrochemical cells that did path reinforcement with deposition of material? Maybe I’m dreamin’. Also, if I may presume to ask, is your user name a tribute or an attribute?

1 Like

I know, right? … and whatever happened to bubble memory?


The phone can go for several active hours on 3 V × 2,000 mAh, in other words ~6 watt-hours. Net: the processor is using quite a bit less than a watt. The human brain runs about 20 watts, not varying dramatically between idle and active.
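Checking that back-of-envelope arithmetic (the battery figures are the post’s; the “several active hours” is an assumed 8):

```python
# Battery energy and average whole-phone power draw, using the post's numbers.
battery_v = 3.0      # volts
battery_mah = 2000   # milliamp-hours
energy_wh = battery_v * battery_mah / 1000   # Wh = V * Ah

active_hours = 8     # "several active hours" (assumption)
avg_power_w = energy_wh / active_hours       # average draw for the whole phone

print(f"battery energy: {energy_wh} Wh")
print(f"average draw over {active_hours} h: {avg_power_w:.2f} W (brain: ~20 W)")
```

That 6 Wh over 8 hours is 0.75 W for the entire phone, screen and radios included, so the processor alone is indeed well under a watt.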


that’s also key for 3d transforms. i can’t wait for my analog realtime raytracing graphics card.

where do i sign up?

Analog computers are potentially very powerful, but they’re also notorious resource hogs. Unless they’re carefully programmed, they’re prone to runaway processes that will consume literally every last bit of matter in real space and hyperspace. Then the only option is to let them become the God of the next iteration of the universe. (Or you can hit command-shift-Q.)

1 Like

possibly this?

1 Like

AFAIR, but I couldn’t quickly find anything to prove it, analogue computing lasted longer in the Soviet Bloc than it did in the West. It fitted neatly with then-prevalent ideas of cybernetics: integrating machine, biology, and even business and society in advanced networks of analogue feedback loops. But unlike the digital model, they didn’t scale well, were highly specialized, and were maintenance- and training-intensive. I never thought they’d make a comeback.

I wanted to make a CCD processor back in the eighties that could do digital and analogue calculations. You could have a digital number or a charge packet. The CCD technology could duplicate charge packets, invert them, and add or subtract charge packets. It would have an A/D and a D/A converter so you could convert from charge to digital, and back. You could multiply a charge by a digit. You could divide a charge by a charge and get a digit. And so on.

It turned out the whole idea had been patented by Motorola in 1978.
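A loose software model of that hybrid CCD idea (the operation names and the normalized full-well representation are mine, not from any Motorola patent; charge packets are floats in [0, 1], digits are ints):

```python
FULL_WELL = 1.0  # normalized full-well capacity (assumption)

def duplicate(q):
    """CCDs can copy a charge packet; return two identical packets."""
    return q, q

def invert(q):
    """Complement a packet against the full well."""
    return FULL_WELL - q

def add(q1, q2):
    """Merge two packets, clipping at the full well."""
    return min(q1 + q2, FULL_WELL)

def a_to_d(q, bits=4):
    """A/D: quantize a charge packet into a digital value."""
    return round(q * (2**bits - 1))

def d_to_a(d, bits=4):
    """D/A: regenerate a charge packet from a digital value."""
    return d / (2**bits - 1)

def mul_charge_by_digit(q, d):
    """'Multiply a charge by a digit': merge the packet with itself d times."""
    out = 0.0
    for _ in range(d):
        out = add(out, q)
    return out

q = 0.2
print(a_to_d(q))                  # quantized packet
print(mul_charge_by_digit(q, 3))  # roughly 0.6
```

Dividing a charge by a charge to get a digit would be the inverse loop: count how many times one packet subtracts from the other before it runs out.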

Analogue, asynchronous circuits are a nice idea. They ought to be very powerful. However, without some sort of binary logic and clock, we can’t really design the things, program them, or do a repeatable experiment. It’s sometimes hard to tell whether they are working at all. But we are making strides with AI, so we may beat the analogue barrier in time too.

1 Like

Looks pretty interesting, I’ll read that after work… thanks

Do they have an analog AI hat for the Raspberry Pi?

“It’s an analog, electrochemical substrate that excels at learning as long as you consistently provide it with stimuli and replenish its energy source from time to time. And burp it and change it.”


“Yes, unfortunately the device produces various forms of waste that need to be dealt with for optimal performance. And it gets annoyingly loud at random times. We should have a fix for that soon.”

“Dude, did you make a baby?”

1 Like