Originally published at: The BBC Micro at 40 | Boing Boing
…
i can testify to that personally, being in the older gen x cohort. i knew a little bit of programming in general from working with my ti-59 programmable calculator, and BASIC programming in particular-- i took a course in that my freshman year in college. so when a friend of mine who was a baseball enthusiast got a commodore 64 with BASIC, he suggested we make a baseball simulation. we worked on that for six months and could actually play one inning, then lance haffner games started putting out their sports games and we just stopped, because those were really better games than what we had.
still, i learned a lot about coding and got to know my way around a computer. i ended up not pursuing coding as a career because i just had too many interests apart from it, but i don’t regret the time i spent working on it.
I used this beast back in the day; what a magnificent piece of crap.
This is what annoys me about editorials that say things like “kids don’t need to learn to code; not everyone is going to be a professional computer programmer”. Knowing how to code is like knowing how to write or knowing mathematics-- you don’t have to be a professional novelist or mathematician to find those skills useful. I’m a professional biologist, but I write code to help me analyze data, and that’s a lot more productive and reproducible than using Excel (which is how scientists who don’t know how to code do it).
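For instance, even a tiny script like the sketch below records every step of an analysis, so a colleague can rerun it on the same data and get exactly the same numbers; the file name and column names here are just made-up placeholders:

```python
# Minimal sketch of a reproducible analysis script.
# "measurements.csv" and the column names are hypothetical examples.
import pandas as pd

df = pd.read_csv("measurements.csv")           # raw data stays untouched on disk
df = df.dropna(subset=["expression"])          # every cleaning step is on the record
summary = (
    df.groupby("treatment")["expression"]
      .agg(["mean", "std", "count"])           # per-group summary statistics
)
summary.to_csv("summary_by_treatment.csv")     # rerunning regenerates this file exactly
print(summary)
```

Good luck reconstructing the equivalent chain of copy-paste edits in a spreadsheet six months later.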
I had the precursor to the BBC Micro, the Acorn Atom. Looked like the BBC Micro, with I think 12K of RAM. The power supply sucked. Actually not a bad computer for 1980, but it was quickly sidelined by the Apple ][ and IBM PC clones on my desk.
The Complete BBC Micro Games Archive:
This is one of my biggest sources of rage. Growing up K-12, my focus was always on art and music. I started college as an art major, and it was there that I finally accepted that the skills required to bring other people’s ideas to life and monetize that work are something I am not good at. On the other hand, I can write code to spec, and while I will bristle at bad design and inform people why it’s bad, I don’t have the same emotional attachment to the work as I do with art.
So I do various art and music projects for my own learning, growth, and enjoyment. If someone hears about something I am doing, the first question is always how I am going to monetize it. When I say I am not, the usual response is “why bother then?”
This is the part that fills me with rage. This attitude that if you can’t be a rock star or a famous artist immediately, or even a moderately paid graphic artist, then it’s a worthless effort. And this all applies as well to mathematics, programming, gardening, really any skill. Do it for the joy of learning. Do it to make yourself a more well-rounded person. There is more to life than feeding the capitalist’s need for skilled labor.
Ok rant off. Walk it off. [paces back and forth]
Right? Not all things worth doing have monetary rewards…
Hobbyists of 40 years ago learned to code because there was nothing else to do with the machines
They were expensive puzzle boxes, Rubik’s Cubes that cost as much as a car
But they were bought to help with their homework… and the household accounts, if their dad ever works it all out.
Eh. Don’t count out Millennials or Zoomers on this one. Linux and open source have been a similar place for us to cut our teeth, and I expect more of us have been able to do so than Gen X.
Which is not to disparage the value of those 8-bit systems; I really agree with the soul of the argument. There’s a real way in which the concept of computer literacy is completely broken. My opinion is that the commonly understood definition of computer literacy is actually software tool literacy. That’s important; learning to program without knowing how to use the software that already exists isn’t really going to work. At the same time, a computer will do whatever you can explain to it, and that’s what programming is. Not knowing how to do that means missing out on the potential of the machine.
So much this. There are ways in which I’ve had to fall back in love with computers over the past couple of years. Working in the industry did ruin it a bit for me, and I’ve been wrestling with that a bunch. I’m not sure what the next job is going to be, but I’m definitely going to be pickier about it. In the meantime, I’m having a lot of fun playing with NetBSD. It’s simple and it’s lovely, and there’s basically no way a job is going to ruin it.
True, but this is a somewhat different issue. Yes, it is worthwhile to learn things that turn out to be enjoyable hobbies unrelated to your work, but I was talking more about things like coding and writing that can help one’s work even if that isn’t one’s primary role.
Yes, right. I think there’s specifically an Xennial/older millennial “gap” for people who came into computer ownership after the 8-bits but before Windows PCs became the standard family machine. That was a bit of a dead zone for learning computer science/programming, because the 16-bits lacked the instant BASIC prompt you got with the 8-bits but didn’t have the internet-age communities, access to knowledge, etc. that came later. Programming was something you had to really want to do, and even the BASIC dialects tended to be both less accessible and less ‘matched’ to the expectations one has of the machine.
I actually owned an Amstrad CPC first, for maybe 3-4 years before the Amiga, and I went from programming a lot on the Amstrad to never on the Amiga. I’m not even sure the Amiga came with BASIC. If it did, it would have required loading the OS from floppy, then loading the BASIC interpreter from another floppy, before even starting.
Yes, “balancing the checkbook” and “keeping track of recipes” were standard (generally unrealized) justifications for buying 8-bit computers.
I know. It’s in the song…
I am definitely going to have to agree with you there, because I am a Xennial, and a weirdo exception-to-yet-example-of that gap. My Dad taught me to use DOS on a 286 in the late 80s when I was, like, 5-6. I played with QBASIC a little bit, but never really got it (I was 9 when I discovered it). It took until the late 90s to get back to programming after I was introduced to Linux. You couldn’t keep me out of the machines, though, and I still learned a ton about networking, PC hardware and other things. This was also the time period where I got fed up with Windows 95 because you couldn’t hack on it. I went all-in on Linux because I realized I could learn as much about it as I wanted because I had all the source code, which is why I took a second crack at C and it stuck. (But, thanks to the limits of what I had access to, I didn’t install Linux until 1998.)
Learning new things in unrelated domains often boils down to learning new problem-solving methods.
I think that as a person grows their set of available “tools”, these things can collectively feed back into one’s work in ways beyond only learning things that have an immediate application.
i had three close friends who were into computers. one of them, mentioned above, had the commodore equipment with access to BASIC. another one had the dos-based ibm pcjr, which was a decent software handler, but you had to get special modules and stuff in order to do any programming. a third friend was into computers because his dad had been a programmer of guidance systems for military missiles; he started with an altair kit computer and had all kinds of odd stuff at his parents’ house. that last friend taught me some programming, but he did the bulk of his programming in machine code for the altair. when pc clones and msdos took over the field and commodores disappeared, i lost touch with programming.
tl;dr
there really was a golden window when people of gen-x could get into programming, which closed tight for a decade or so before things opened up again. i got lucky.