Originally published at: http://boingboing.net/2016/12/27/arnold-spielberg-stevens-d.html
…
Nowadays, it takes hundreds of people to design a CPU. Back then, one person could design a mainframe.
That’s what I take away from this.
It doesn’t sound like he had any personal involvement with BASIC; Dartmouth’s Kemeny and Kurtz are the people who should receive credit for it. Yes, he designed a computer, but that’s not at all the same as writing whatever software people use it to make. The hardware engineers at DEC aren’t responsible for UNIX, Jobs and Woz didn’t invent spreadsheets, and the Remington typewriter company didn’t write On The Road.
This article just seems like a sketchy attempt by GE to sex up some (intrinsically interesting) computer history with spurious connections to Jurassic Park and BASIC.
I am relieved to see that BASIC wasn’t actually Arnold’s fault.
FYI: there is no such school as Dartmouth University. BASIC was invented by Kemeny and Kurtz at Dartmouth _College_ in 1964. And yes, they ran it on a GE computer. When GE sold their computer business to Honeywell, Dartmouth continued to run their time-sharing on newer, bigger versions of the Honeywell hardware into the 1980s.
I love hearing about the early days.
Somewhere a Lisp programmer whose favorite movie is Battleship Potemkin is preparing an especially erudite batch of snark.
Honeywell did change the name of the OS from GECOS to GCOS.
This is the thing. There were good programming languages around in 1964, but they had a high barrier to entry. BASIC has a low barrier to entry, but is so simple-minded that you can’t do much with it. Yet simple Python is no harder to use than simple BASIC; the designers of BASIC didn’t get that a powerful language could be built on simple ideas.
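For what it’s worth, the comparison is easy to see side by side — here is the classic beginner’s loop, Dartmouth-style BASIC shown as comments above the equivalent Python (my own illustrative fragment, not from the article):

```python
# The same first program in both languages. The BASIC version:
#
#   10 FOR I = 1 TO 3
#   20 PRINT I * I
#   30 NEXT I
#
# And the Python version -- no harder to write or to read:
squares = []
for i in range(1, 4):        # range(1, 4) counts 1, 2, 3, like FOR I = 1 TO 3
    squares.append(i * i)
print(squares)               # [1, 4, 9]
```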
Almost all the code written today is in languages descended from ALGOL, isn’t it?
BASIC was a dead end, and would have been completely forgotten if Microsoft hadn’t bought it from somebody and then sold it hard to people who didn’t know any better.
The computer makes the software possible. An interpreted language like BASIC could not have been run at acceptable cost on early computers. Many of the good ideas in software have been around for a very long time; the delay has been in hardware that made them feasible.
Whether or not BASIC itself was a wrong turning (I don’t think the case is as simplistic as some people above seem to think), designing a computer that could run it without breaking the bank in 1964 was quite an achievement. As someone whose first experience of a mainframe was in 1966, I have to say that if you had suggested running an interpreted language on it to the programmers, they would have looked at you in something like disbelief. Waste CPU cycles on an interpreter? Do you know how much those things cost?
In 1964, that was hardly surprising. ALGOL, FORTRAN and COBOL had considerable barriers to entry, and in fact ALGOL 68 was too advanced for its time. BASIC was the first computer language that a nonspecialist could pick up easily and use, and it could demonstrate the core concepts of sequence, selection and iteration in a way that did not require people to understand things like Backus-Naur notation and scope. I think it is very ahistorical to criticise the designers of BASIC.
Python is an example of a language that was designed by someone who wasn’t primarily a computer scientist, with the result that you have the fork between original Python, with its syntactical irregularities and design gotchas but big installed base, and Python 3, which is a more regular language.
You have to remember that some seventeen years elapsed between original BASIC and its implementation in the IBM PC. It’s reasonable to question that decision - Pascal might have been a better bet - but not apply fifty years of hindsight (yes, it is that long, I’m feeling my age) to the Dartmouth team.
LISP was/is interpreted and has been around since 1960.
I’m not saying the GE-225 wasn’t significant, and time-sharing was a milestone. But BASIC is pretty far removed from what Spielberg did.
BASIC was the killer app of the first generation personal computers (from the Altair through the Apple II) before shrinkwrapped apps like Visicalc took off.
8-bit machine code was very painful to write, and most of the systems didn’t come with an assembler, so you had to translate into hex opcodes by hand. But almost all of them came with BASIC or had it easily available, and that was approachable and fun, even for a 12-year-old like me. (And Microsoft’s BASIC was excellent, and advanced the language.)
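To give a flavour of what “translate into hex opcodes by hand” meant, here’s a hypothetical 6502 fragment of my own (the 6502 being what many of those machines ran); the opcode values are the standard published encodings:

```python
# Hand-assembly, sketched: you looked each mnemonic up in the opcode table
# and wrote the bytes down yourself. Standard 6502 encodings:
#   LDA #imm = 0xA9, STA abs = 0x8D, RTS = 0x60 (16-bit operands little-endian).
program = bytes([
    0xA9, 0x07,        # LDA #$07   ; load 7 into the accumulator
    0x8D, 0x00, 0x02,  # STA $0200  ; store it at address $0200
    0x60,              # RTS        ; return to caller
])
print(program.hex())   # a9078d000260
```

Six bytes for “store 7 somewhere” — and you kept the whole opcode table at your elbow.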
There wouldn’t have been a PC revolution without BASIC, or at least it would have been delayed many years.
Apparently not; Wikipedia says it was “designed by a team led by Homer R. ‘Barney’ Oldfield, and which included Arnold Spielberg”.
There wouldn’t have been a PC revolution without an accessible programming language and environment, but BASIC is still a dead end, even today.
()
Don’t leave me in suspense!
Both of those statements are true. Where you’re totally wrong is in implying that people could have implemented Python or C or ALGOL-68 on a 1970s personal computer.
I can’t think of any accessible alternatives to BASIC that could be implemented in 4K bytes of 8080 code and use less than 4K of RAM. FORTH is a possibility (FIG-FORTH for Z80 assembles to about 5K) but the syntax is pretty arcane, and it will crash hard if you make a mistake balancing stack pushes and pops.
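The stack-balance hazard can be sketched in a few lines — a toy model in Python, emphatically not real FIG-FORTH: every word pops its operands off one shared stack, and nothing checks the depth for you.

```python
# Toy FORTH-style operand stack. A word that pops more than its caller
# pushed simply underflows -- there is no safety net.
stack = []

def push(n):
    stack.append(n)

def add():                        # FORTH's "+": pop two, push the sum
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

push(2); push(3); add()           # balanced: "2 3 +" leaves 5
assert stack == [5]

underflowed = False
try:
    add()                         # only one item left: unbalanced pops
except IndexError:                # Python raises; a real FORTH just crashes
    underflowed = True
```

In real FORTH there is no exception to catch — the machine wanders off into whatever happened to be below the stack.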
The first compiled ALGOL-descended language I’m aware of for PCs was UCSD Pascal for the Apple II … which required 64K of RAM. That was a pretty extravagant requirement back in 1980. There were C compilers for CP/M, but they also demanded as much RAM. They also required floppy disks to store the temporary files, and took several minutes to compile anything more complex than Hello World.
(How old are you, anyway? Have you even experienced the constraints of a 1960s mainframe or a late-1970s personal computer? If you’re under 40 you probably have no idea.)
I didn’t say that. I was more saying: why invent a non-extensible language like BASIC when ALGOL was right there? And while a language was needed which was simple to get into, it didn’t have to be so limiting.
I was born in 1965 and my first system had a 6502 and 4K of RAM, plus some ROM. I wrote machine code to work around the limitations of BASIC. I had previously written some FORTRAN on card-based systems, and I spent my early career working on RSX-11M on PDP-11s (a legacy application, hard to port).
Of course I accept that something like Python was never going to run on a 6502 or similar machine.