Originally published at: https://boingboing.net/2018/12/17/a-very-tandy-xmas.html
…
Covers?! How about full entire issues?
These are mostly later issues/magazines.
Oddly, the December 1976 issue of Byte is there, but not the December 1975 issue, which had a Christmas cover just a year after the Altair, when it was only the 4th issue of Byte. It was still a hobbyist niche. Unless Creative Computing had a Christmas cover in 1974, Byte defines it for the rest, at a time when kids weren't yet dominating the computer scene.
Computers were a lot of fun before they went mainstream. (adjusts hipster glasses)
don't call me a sexist, but she does stare right and only at the "joy-stick"…and it's the bigger one
And BYTE in particular. BYTE in the 1970s to early 1980s was a hobbyist magazine that helped introduce mindblowing things like object-oriented and functional programming to people without formal computer science training (they had whole issues dedicated to Smalltalk and Lisp, which were the first introductions to these languages that most people had). By the late 1980s, it had become just another corporate magazine read by your local IT department. For example, here's a cover from 1989 to show how boring it got (and no, TRON is not referring to the Jeff Bridges film but to a Japanese OS, granted, the most interesting thing in this issue).
It takes me back to opening a cardboard box full of shareware floppy disks, to discover whether any of the things I ordered were any good. I got Eamon, which seemed like it might be cool, but I could never get it to work (MS-DOS version). I also got a version of PROLOG so I could build my own expert system. I didn't have a clue why you'd want to build an expert system, and only years later did I learn that most of the proponents of those systems didn't either.
Racially-biased (or otherwise biased) profiling systems, like the “AI” of today?
Machine learning has the same problem we faced with computers in the '70s and '80s:
Garbage In, Garbage Out.
Give your “AI” a lot of bad data and it will come out biased exactly as it was taught. Computers have no ethics or morality of their own. And the system designers are often clueless when it comes to solving this problem of bias.
We can scream about the racism of our new digital overlords, but unless we offer some concrete steps to solve the problem there is almost no point in complaining.
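The garbage-in, garbage-out point above can be sketched in a few lines. This is a deliberately toy model (no real ML library, and the group names and data are invented for illustration): it just learns the majority outcome per group, which is enough to show a skewed history being faithfully reproduced as a skewed rule.

```python
# A minimal sketch of "garbage in, garbage out": a toy classifier that
# simply learns the most common label seen for each group in its training
# data. Groups "A"/"B" and the approve/deny labels are hypothetical.
from collections import Counter, defaultdict

def train(examples):
    """Learn the majority label per group from (group, label) pairs."""
    by_group = defaultdict(Counter)
    for group, label in examples:
        by_group[group][label] += 1
    return {g: counts.most_common(1)[0][0] for g, counts in by_group.items()}

# Biased training data: group "B" was historically denied far more often.
biased_data = ([("A", "approve")] * 9 + [("A", "deny")] * 1
               + [("B", "approve")] * 3 + [("B", "deny")] * 7)

model = train(biased_data)
print(model)  # {'A': 'approve', 'B': 'deny'} -- the bias goes in, the bias comes out
```

The model isn't malicious; it's doing exactly what it was asked to do with the data it was given, which is the commenter's point about designers needing to look at the data, not just the algorithm.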