It’s too bad the BBC Micro, by the time it made it to these shores, was too expensive to compete (not to mention needing to be reworked for NTSC video). Everything I’ve read about it indicates that it mops the floor with the Apple II, and I recall seeing a claim that BBC BASIC on a 2 MHz 6502 was faster than Microsoft BASIC on a 4.77 MHz 8088. The drive to keep interpreted BBC BASIC running at speeds rivaling native machine code was part of why Acorn went on to develop ARM.
I still had a lot of fun with the Apple, including teaching myself 6502 assembler. That served me well when I took a machine-level programming course later, in spite of the different CPU used. My school went with something a bit more ambitious, a PDP-11/34A. My very first-ever “Hello, world” program was keyed in on a DECwriter.
The 6502 was designed so that most instructions took only two or three cycles, with the simplest completing in two. The exceptions were instructions that had to perform extra memory accesses (or cross a page boundary), but those cases only add a cycle or two to the execution time.
The 8080, Z80, 8086, 8088 and so on all took many more cycles for most instructions. A 1 MHz 6502 could keep pace with a 4 MHz Z80 on a lot of workloads.
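That “keeping pace” claim is easy to sanity-check with back-of-the-envelope arithmetic. The average cycle counts below (~3 cycles/instruction for the 6502, ~11 for the Z80) are ballpark assumptions for illustration, not measured figures for any particular workload:

```python
def instructions_per_second(clock_hz, avg_cycles_per_instr):
    """Approximate instruction throughput for a simple in-order CPU."""
    return clock_hz / avg_cycles_per_instr

# Assumed average cycle counts -- rough illustrative numbers only.
mos_6502 = instructions_per_second(1_000_000, 3)    # 1 MHz 6502, ~3 cycles/instr
zilog_z80 = instructions_per_second(4_000_000, 11)  # 4 MHz Z80, ~11 cycles/instr

print(f"6502 @ 1 MHz: ~{mos_6502:,.0f} instructions/sec")
print(f"Z80  @ 4 MHz: ~{zilog_z80:,.0f} instructions/sec")
```

Under those assumptions the two come out within about 10% of each other, despite the 4x clock difference.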
This still haunts us today. All the x86 derivatives are saddled with multi-cycle instruction decode and argument fetch. Moore’s law just meant that the transistor budget for that wasn’t a big deal by the late 90s, and Intel and AMD could spend large amounts of die space on complicated out-of-order execution and pipelining. This is what let them catch up with all the fast RISC chips that operated more like the 6502: instructions weren’t necessarily single-cycle, but decode and fetch were vastly simpler. The OOO execution and pipelining are the parts of the chip that Meltdown- and Spectre-class attacks exploit.
RISC architectures also benefit from pipelining and out-of-order execution optimizations. Modern ARM chips are full of them. The decode simplicity actually makes it easier to build huge, fast processors, which is what Apple did with the M1. Intel and AMD won’t be matching Apple, because that complicated decode and backwards compatibility mean they can’t push things as far as Apple did.
(I’m not so secretly hoping this is the end of the x86 era.)
I would argue the stimulation and self-guided learning experience of early personal computers did “help with their homework,” although it wouldn’t surprise me to learn that many kids of that era had difficulty finishing coursework if they were up all night coding… My sister’s husband used the “we can put your recipes on computer file” justification several times; of course, she would have stabbed him with a butcher knife if he had ever tried doing that.
An extraordinary computer and an equally extraordinary initiative by the BBC to single-handedly drive computer literacy in the UK. It is up there with the creation of the Open University as an achievement in educating the population.
That ended soon after. I remember W H Smith and John Menzies having games sections in 1986, with C64, Spectrum and Amstrad games on cassette. There were probably also Amiga and Atari ST games but my parents were keeping my attention focused on the 8 bit games. If I was just a bit more aware I might have been able to work out what I was getting for Christmas.
Game companies had been formed and collapsed in the time in between.
I seem to have fallen into a weird in-between period, where I remember using and playing around on earlier machines (such as my namesake) and my first introduction to the very idea of programming was BASIC on an Apple ][ in middle school, but by far spent more time with IBM-compatible stuff and did the progression of DOS-thru-all-the-Windows-editions-ever. Didn’t even know what Linux was until somewhere in the 2000s when I got online in a serious way.
That’s a great documentary, always worth a watch. Have been meaning to show it to young folks of my acquaintance who are familiar with the Black Mirror “Bandersnatch” episode.
Re. @beschizza’s original post and the platform rivalry that was so closely related… always reminds me of a letter in one of the classic 80s computer mags (my fave was Big K, but it could have been the more mainstream C&VG or even Popular Computing Weekly) - anyway the entire letter was something like
C’mon folks, it’s time for Spectrum and Commodore 64 owners to bury the hatchet. Let’s bury it in the keyboard of a BBC Micro.
Yep. Back in the early 1980s, the pundits were proclaiming that programming was going to be a required course, and that people would be talking in BASIC. Computer science professors complained loudly that BASIC was the worst language to teach kids, since we picked up so many bad habits from it, and that Pascal was the way of the future. But hey, we did spend time typing in games that we found in magazines and books, and modding them to fit our needs, and so could not help but pick up a thing or two.
Ah, but then even before the Apple Macintosh, the IBM PC seemed to eschew BASIC, and those with IBM clones rarely typed in their own programs. And with the Mac, the Amiga and the Atari ST, computer owners were moving more into what my tribe called console users. Or even lusers. But I myself had drifted away from programming, other than writing HyperTalk in HyperCard.
Oh, and all that BASIC coding I did as a kid in high school? It did help me land a career in multimedia and later web development. So neener neener to you, Mr. Dijkstra.
I admit I was intrigued by the BBC Micro, but I grew up in Iowa, so my first computer was a Commodore 64. I still have it, though it now lives in the original box. I did most of my early programming on the Apple II series at school, but BASIC was BASIC, and going from Applesoft BASIC to Commodore BASIC was a non-issue.
Ohhhh, fuck no. You can play them in your browser. I have stuff To Do. I cannot be trying to dock spaceships onto spinning ring-orbitals so’s I can sell Space Drugs. Or, drive 16-bit tanks. Or all that other stuff. What the hell are you trying to do to me?