George Dyson: look to analog systems for autonomy and intelligence

Originally published at:


I don’t get it.


Well that was cheerful.

The next revolution will be the ascent of analog systems over which the dominion of digital programming comes to an end.

Am I the only one thinking that this is hopelessly naive? The whole article mostly confuses me but that last paragraph takes the cake.

Also, vacuum tubes? That shows the author’s age; I’m not sure what the reference was meant to accomplish there.


I think he might be referring to the popularity of building big, black-box neural networks that nobody understands but which have a habit of performing pretty well against the various, much more comprehensible, induction- and deduction-based expert systems that the optimists hoped would win; if only we could wrap a bit more epistemology in XML.

(That said, current ‘neural networks’ tend to be fairly loose analogs of the actual analog flavor, though there are some interesting, albeit mostly experimental, projects aimed at fabricating genuinely analog neural-network hardware, rather than using the very simplified versions, or the enormously expensive software simulations of the less simplified versions.)

The notion sort of reminds me of some of the late-model analog computers: the ones where part of the problem (often the integration) was still too computationally expensive to handle digitally, but where the housekeeping involved in configuring the system and reading out the results (previously involving a lot of knobs and switchboards and such) had become cheap enough to handle with a digital computer, so the analog components were wrapped in a digital layer for setup and readout.

I suppose he is proposing a tack back toward this arrangement, where digital components continue to handle things like I/O, storage, and arithmetic operations, but have big chunks of analog black box embedded and dictating much of the overall behavior of the system.
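That hybrid arrangement can be sketched in a few lines. This is a purely illustrative toy (the function names and the problem, dy/dt = −y, are made up for the example): a digital wrapper handles setup and readout, while the core computation is an analog-style integrator, here *simulated* with a fine time step rather than done with actual op-amps.

```python
# Toy sketch of the hybrid setup/readout arrangement (illustrative only).

def analog_integrator(dydt, y0, t_end, dt=1e-4):
    """Stand-in for the analog core: 'continuously' integrates dy/dt = f(t, y).
    A real analog integrator would do this with a capacitor, not a loop."""
    y, t = y0, 0.0
    while t < t_end:
        y += dydt(t, y) * dt
        t += dt
    return y

# Digital wrapper: configure the problem (dy/dt = -y, y(0) = 1),
# run the "analog" core, then read out and format the result digitally.
result = analog_integrator(lambda t, y: -y, y0=1.0, t_end=1.0)
print(f"y(1) ~= {result:.4f}")  # close to e^-1 ~= 0.3679
```

The knobs-and-switchboards era did the configuration step by hand; the point of the late-model machines was that only the middle function needed to stay analog.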

Edit: there’s also the (much less glamorous) possibility that he is referring to the tendency of large networks of digital systems, if shoddy enough (and ours are), to periodically descend into the sorts of control-theory hell scenarios that aren’t inherently ‘analog’, but which have historically proven amenable to modelling as analog systems. All sorts of fun failure cascades that unfold in similar ways at a high level, regardless of whether the nodes being considered are analog or digital in their implementation.


You’d have to read up on the difference between weak AI (narrow AI) and strong AI (AGI) to understand why strong AI is indeed looking to analog systems, and to things like fuzzy logic, neural nets, infinite variability, and even quantum computing. Analog systems use electricity, but perform their calculations and tasks in very different ways that are actually better suited to modeling the natural biological structures in which intelligence emerges. When you understand the differences between analog and digital circuits, it makes perfect sense why analog circuits could play an important role in a potential AGI.
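Fuzzy logic is a concrete example of that “infinite variability”: instead of a crisp True/False, a proposition holds to a *degree* in [0, 1]. A minimal sketch (the predicate name and the 15–25 °C ramp are invented for illustration):

```python
# Toy fuzzy-logic sketch; thresholds are made up for the example.

def warm(temp_c):
    """Degree to which a temperature counts as 'warm' (ramp from 15C to 25C)."""
    return min(1.0, max(0.0, (temp_c - 15) / 10))

def fuzzy_and(a, b):
    return min(a, b)   # one common choice of t-norm

def fuzzy_not(a):
    return 1.0 - a

print(warm(10))  # 0.0 -- clearly not warm
print(warm(20))  # 0.5 -- partially warm
print(warm(30))  # 1.0 -- fully warm
# Unlike Boolean logic, "warm AND not warm" need not be 0:
print(fuzzy_and(warm(20), fuzzy_not(warm(20))))  # 0.5
```

That last line is the whole departure from digital logic: there is no law of the excluded middle, which is closer to how a continuously varying analog signal behaves.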

Many analog computers do indeed use vacuum tubes even today. The only reason people view tubes as “dated” is that the switch from tubes to chips coincided with the switch from analog to digital computers. Why not switch to the more “modern” digital chips? Well, the answer is right there in the question.

Hope that helps. Cheers.

FYI, on a historical side note: the Jetsons live in an analog-computer alternate future. All their tech is analog.


Well said.

Here is a beauty of an analog computer… what a looker.


That’s OK. Dyson doesn’t get it either.

Actually, he probably does, but he cannot express it very well through this short format or his writing style (or both).

Too much over-simplification and generalization, and inadequate analogy. Like this:

“Nature uses digital coding for the storage, replication, recombination, and error correction of sequences of nucleotides, but relies on analog coding and analog computing for intelligence and control.”


Childhood’s End was Arthur C. Clarke’s masterpiece, published in 1953, chronicling the arrival of benevolent Overlords who bring many of the same conveniences now delivered by the Keepers of the Internet to Earth. It does not end well.

That’s a matter of viewpoint, and it certainly wasn’t the Overlords’ fault.


Hm, yeah. Analog is the new quantum, I guess.

Digital computers deal with integers, binary sequences, deterministic logic, algorithms, and time that is idealized into discrete increments. Analog computers deal with real numbers, non-deterministic logic, and continuous functions, including time as it exists as a continuum in the real world. In analog computing, complexity resides in topology, not code. Information is processed as continuous functions of values such as voltage and relative pulse frequency rather than by logical operations on discrete strings of bits. Digital computing, intolerant of error or ambiguity, depends upon precise definitions and error correction at every step. Analog computing not only tolerates errors and ambiguities, but thrives on them. Digital computers, in a technical sense, are analog computers, so hardened against noise that they have lost their immunity to it. Analog computers embrace noise; a real-world neural network needs a certain level of noise to work.

Total word salad, I’m afraid. I could try to pick the bits of chocolate out of the rabbit turds, but what’s the point?


A Dyson waits until Jan. 3 or so to phrase it ‘Skeuomorphic shit of high-touch leaders has defaulted and eaten humanity. The most sensitive app on my phone with vibration sensors that can tell what searches excite me is an intolerant alarm clock! 100,000 glyphs for Indian food are forever siloed in an app that will never get 5 minutes. Application groups are never to be touched in userland!’


Just as digital computation was first implemented using vacuum tube components,

Mechanical relays, actually. Anyone who knows some history of computation knows that. smh.


That seems reasonably coherent and comprehensible to me, albeit I can’t tell if some of the things in it are true, such as the last phrase in it.

Strictly speaking, wouldn’t it be fairer to say that it was implemented on paper first? Logicians were truth-tabling merrily for some time before the applicability of Boolean algebra to switching circuits was formalized.

It wasn’t remotely practical for most purposes until it had an electromechanical implementation or better; but the theory tended to precede the electrical engineers by a fair margin (and continues to in some cases: we already know a lot about what quantum computers could do, were someone to succeed in implementing them; and you can use oracle machines to study the behavior of all sorts of computers that either haven’t been built or may not be possible to build even in principle (plus, the licensing fees are lower than when using Oracle machines, which is a plus)).
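The “on paper first” point is easy to demonstrate: Boolean algebra lets you verify a switching circuit by truth-tabling it before anyone solders a relay. A small sketch (the helper name `equivalent` is mine), checking De Morgan’s law over every input combination exactly as a logician would:

```python
# Truth-tabling a la the pre-hardware logicians: two circuits are
# interchangeable iff they agree on every row of the truth table.
from itertools import product

def equivalent(f, g, n_inputs):
    """True iff f and g compute the same Boolean function."""
    return all(f(*bits) == g(*bits)
               for bits in product([False, True], repeat=n_inputs))

lhs = lambda a, b: not (a and b)          # NOT (a AND b)
rhs = lambda a, b: (not a) or (not b)     # (NOT a) OR (NOT b)
print(equivalent(lhs, rhs, 2))  # True: De Morgan's law holds
```

Shannon’s contribution was showing that exactly this kind of table-checking carries over to relay and switching circuits, which is what finally handed the theory to the electrical engineers.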


Zuse FTW.

Although I must admit I sometimes wonder what IT would look like today if the Babbage and Lovelace machines had worked right off the bat.


The Atanasoff–Berry computer, which was digital and binary, was at least a close contemporary of the Zuse machines, and certainly influenced computer development at the time through its ties to ENIAC. The Zuse machines were developed in isolation and could not be said to have had an impact on the development of computation in general. I’d be pretty surprised if George Dyson were not aware of Konrad Zuse :slight_smile: But, yeah, this Dyson essay doesn’t excite me…


Stephen Wolfram did a blog piece a few years ago, very well researched, where he speculates along the lines of “what if Ada had not died young”. He feels she could have project-managed Babbage to success… he also talks of her developing an interest in learning about electricity. Boole was also in her circle of associates. What might have been :slight_smile:



This topic was automatically closed after 5 days. New replies are no longer allowed.