Originally published at: The Analog Thing: an open-source, cutting-edge analog computer | Boing Boing
…
500 euros seems a bit steep for what I assume is mostly just a set of finely calibrated op amps, but at least it doesn’t leak.
I had a Calculo Analog Computer in 1959 that, with a little research, I found was $20 back then, the equivalent of $210 now, so yeah, a bit inflated no matter how you compute.
It’s pretty neat looking; but from a history-of-science perspective I can’t help but find the tone of the vendor’s copy pretty grating. Wide-eyed, naive surprise that we have somehow gone so wrong and abandoned natural computing in favor of brute algorithms just doesn’t sit well when the path of analog and hybrid computers was in fact pretty well trodden, both prior to the theoretical development of digital computers and prior to the feasibility of modelling nontrivial problems on them.
Anything with good fiddly adjustment knobs is good by me; but whoever writes their copy sounds like whatever the computer science analog (pun not really intended; but recognized) of a naturopath is; banging on ahistorically without any recognition of the rise, use, and decline (including compelling reasons for decline) of widespread analog modelling.
"Isn’t it strange? The world is quintessentially an analog place - not only with respect to data representation which is not digital but also, and even more important, with respect to it complete abstinence from algorithmic control.
There is essentially no natural process which is controlled by an algorithm. A blade of grass bending in the wind actually solves a partial differential equation (even a system of such equations) without ever resorting to an algorithm. It solves these equations by its very nature and it can serve as a model, a so-called analogon, to solve similar problems.
So, why don’t we use computers based on analogies, too? Why do we rely on cumbersome algorithmic approaches to problems which can be readily solved by physical models? Why not make use of the idea of analog computers again? The time is ripe; the technology for building small, cheap, and extremely powerful analog computers is available."
It’s really cool, but I’m really not sure who the target audience is. It says it’s educational, but it’s too expensive for a student who can get a MATLAB license for free. And a professional has better tools on a full-fledged computer that costs the same.
I had a Calculo Analog Computer in 1959 that, with a little research, I found was $20 back then, the equivalent of $210 now
The Calculo consisted of three potentiometers and a few switches, no operational amplifiers, so no integration or differentiation. The Heathkit EC-1 electronic analog computer was a bit more comparable to this one, but it cost $200 in 1960.
I’m just old enough to have taken a course that used an electronic analog computer (a small EAI desktop) for labs as a freshman in 1972; it was gone by my senior year. At the time, it was one of exactly four computers that my small engineering school owned; the others were an 8K PDP-11, a PDP-10 timesharing system, and an RCA Spectra 70 batch system used for FORTRAN and COBOL classes, as well as general administrative functions.
My Dad took one of the last classes using a mechanical differential analyzer, taught by Vannevar Bush himself back in 1950.
Heh, reminds me of Terry Pratchett’s Making Money.
Exactly. Analog computing had its day for fifty years or so, both in mechanical and electrical form. Perhaps most notably, the US Navy used a lot of mechanical analog computers on ships for calculating firing solutions. Cams turn out to be really good at that sort of thing, and you get continuous resolution rather than discrete steps. The Russians also used it for navigation in space, among other things. Google the Globus for a great example. Being continuous and stateless, it was a great match for that job. And being mechanical, it was immune to radiation and very tolerant of temperature extremes.
But yah, digital is basically better at everything, so we switched. No mystery to it, and melodrama about lost technology is not required.
Wonder how they would feel about a slide rule, which I know how to use thanks to 10th grade Chemistry class.
Going by the personal website of their ‘conceptual leader’, which goes into a great deal of detail, dates back substantially further, and possesses that artless old-web charm and enthusiasm that I’m totally incapable of distrusting; the melodrama on the shiny corporate site is not there for want of historical perspective.
It’s idle speculation; but I’d be curious about whether it’s purely there for VC hype; or whether it’s one of those cases where the mathematically inclined take on a slightly mystical attitude about some aspect of their area of interest. That seems to happen sometimes; and the dichotomy between analog systems that embody solutions and algorithmic methods that generate them seems like a reasonably fertile niche for it to do so.
I worked with a guy who started as an instrument tech in the USAF. [1] The bomb sights on an F-4 Phantom were controlled by a pneumatic computer. Unfortunately, to service this computer, you had to remove an access panel under the cockpit and wedge your head between the frame and the ejection charge. Hoo boy.
So much this.
[1] The Vietnam war was on and he couldn’t afford university, so he was prime draft material. He didn’t want to get his arse shot off, so he voluntarily signed up for a branch unlikely to get shot at.
For anyone not comprehending Rob’s reference:
(any excuse to possibly get someone new to watch Blackadder)
Also, this looks like a fun bit of stuff to play with, but a bit too expensive to purchase on a whim.
I’ll add my voice to the chorus of “neat, but too expensive”, with a couple more considerations:
- The schematic is nothing revolutionary, but I see no real problem there. The circuitry and components are quite commonplace, and this, in itself, is a good thing. It does not help justify the price, though.
- Open source? All I can find is a set of schematics, in PNG format (making it hard to follow named nets). No PCB design files, no panel measurements, not even a BOM. While the definition of Open Source Hardware is not as settled and clear-cut as it is for software, I would call this “the schematic is not a secret”, not Open Source. Maybe they intend to publish more? The text refers to articles, not technical information.
- Possibly the most important point: remember that the price of the computer itself is only half the story! You need an oscilloscope to visualize the interesting stuff, which can cost (almost) as much for an entry-level one from the likes of Rigol or Siglent. Moreover, the scope needs to be good at X-Y visualization: not all modern DSOs (Digital Storage Oscilloscopes) are; this is, in fact, one of the very few fields where old boat anchors shine (a sketch of the kind of X-Y picture involved follows below). I know my Rigol would be barely tolerable.
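To illustrate what X-Y mode is for: the classic analog-computer demo is the harmonic oscillator y'' = -y, patched from a couple of integrators, and feeding the two integrator outputs to the scope’s X and Y channels traces out the phase portrait. A minimal Python sketch of that picture (matplotlib assumed, values purely illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

# Solutions of y'' = -y: the two integrator outputs of the classic
# analog-computer patch are y and y'.
t = np.linspace(0, 2 * np.pi, 500)
y = np.cos(t)       # second integrator output (position)
dy = -np.sin(t)     # first integrator output (velocity)

# X-Y display: plot one signal against the other instead of against time.
plt.plot(y, dy)
plt.xlabel("y (scope channel X)")
plt.ylabel("y' (scope channel Y)")
plt.gca().set_aspect("equal")  # the phase portrait is a circle
plt.title("Phase portrait of y'' = -y, as X-Y mode would draw it")
plt.show()
```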
Indeed. This is the hardware equivalent of when Apple used to try to claim Mac OS X was open source because the Darwin core is. This claim glosses over the fact that Darwin is about 10% of the stack and includes nothing an enterprising engineer might actually want to build upon in the OS. They stopped making this claim around the time that “open source” lost all value as a hippie nerd marketing tool. Curious, that timing.
I was playing with some schematics for analog computers and a web circuit simulator, and my stumbling block was multiplying two inputs. This “open-source” computer just has “there’s a multiplier element here”. Real handy.
I can definitely appreciate the idea of “turn a knob and actually SEE what it means when an equation changes.” I never understood the true value of algebra until I started coding little games in Flash. I needed SOHCAHTOA to be visible and fiddle-able to understand it and become invested in it. Looking back at old space shooters, it’s now fun to think about why they selected the sine-wave flight patterns, radiating circles, etc.
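For the curious, here’s a minimal sketch of that sine-wave flight pattern idea; the names and constants are illustrative, not from any actual game:

```python
import math

SPEED_X = 2.0      # constant horizontal drift, pixels per frame
AMPLITUDE = 40.0   # how far the enemy swings off its base line, pixels
FREQUENCY = 0.05   # radians per frame; higher means faster weaving

def enemy_position(frame, base_y=100.0):
    """One tunable equation produces the whole swooping path."""
    x = frame * SPEED_X
    y = base_y + AMPLITUDE * math.sin(FREQUENCY * frame)
    return x, y

# Tweak AMPLITUDE or FREQUENCY and watch the pattern change -- the
# "turn a knob and SEE it" experience, in digital form.
for frame in (0, 30, 60, 90):
    print(frame, enemy_position(frame))
```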
The first couple of years of electrical engineering school are fascinating (and very very hard) for this same reason. Basically you learn that electronics is calculus and calculus explains all of electronics. Same for mechanical engineering, honestly. It’s insane how well calculus maps to physical phenomena, and an analog circuit is math made real. You can see differential equations playing out on the voltmeter and oscilloscope.
All of this is also why I couldn’t handle electrical engineering and dropped out. But it was fascinating while it lasted.
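To make the “electronics is calculus” point concrete, here’s a minimal sketch, assuming an ideal series RC circuit driven by a 5 V step: the capacitor voltage obeys dV/dt = (Vin - V)/(RC), and stepping that equation forward numerically is a crude digital imitation of what the circuit itself computes continuously (component values are just for illustration):

```python
R = 10_000.0   # ohms (illustrative value)
C = 1e-6       # farads, so the time constant RC is 10 ms
VIN = 5.0      # volts, step input
dt = 1e-5      # seconds per simulation step
v = 0.0        # capacitor starts discharged

for _ in range(5000):            # simulate 50 ms, i.e. 5 time constants
    dv_dt = (VIN - v) / (R * C)  # the differential equation itself
    v += dv_dt * dt              # Euler integration step

print(f"V after 50 ms: {v:.3f} V")  # ~4.97 V, matching 5 * (1 - e**-5)
```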
Completely agree. And now computing is a mature science, so there isn’t much rationale for such prices.
I also find their promo video claiming that analog computers save power pretty amusing. It’s true that high-end desktops are power-hungry, but you can do some quite sophisticated computing with a solar-powered calculator!
They are using the Analog Devices AD633, a nice four-quadrant multiplier, at a reasonable cost (about $13 in small quantities).
The inputs are divided by two, so the output is followed by a 4× amplifier/buffer; the result is then X×Y/10.
Maybe they could have exploited the fully differential inputs and the summing input (Z) for a more versatile block?
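For what it’s worth, the scaling chain works out neatly; here’s a sketch based on the AD633’s datasheet transfer function, W = (X1 - X2)(Y1 - Y2)/10 V + Z, with Z grounded and the halving and 4× stages as described above:

```python
def multiplier_block(x, y):
    """Model of the described multiplier stage, in volts."""
    x_in = x / 2.0              # input divider
    y_in = y / 2.0              # input divider
    w = (x_in * y_in) / 10.0    # AD633 core: (X1-X2)(Y1-Y2)/10V, Z = 0
    return 4.0 * w              # 4x amplifier/buffer after the chip

# Net transfer is X*Y/10, so full scale in gives full scale out:
print(multiplier_block(10.0, 10.0))   # -> 10.0
print(multiplier_block(10.0, -5.0))   # -> -5.0 (four-quadrant)
```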