Halt and Catch Fire: The Most Relevant Show on Television

That clip is so damn funny.

4 Likes

We watched the first season, and it was not very good. The writing and directing were half-assed at best. I didn’t really blame the actors, who seemed to be trying to make the best of terrible material, but unless you tell me they fired the writers and directors from the first season, I’m not sure it’s worth the effort of trying again.

2 Likes

Unless this guy shows up I don’t see the point in watching the show:

Why hire Toby unless you invite Rudy?

I do. Many styles we associate with the '70s were still in use in the '80s, especially the early '80s.

2 Likes

The literal cutoff for a decade and the style/marketing term for it never line up exactly. What most media refer to as “the '60s” was really the late '60s / early '70s.

4 Likes

They’re very confused about the whole “Halt and Catch Fire” term and where it comes from; they have it completely wrong.

“HCF - Halt and Catch Fire” had nothing to do with race conditions, parallelism, or threads. It originally came from a long list of joke opcodes, the sort of thing that used to circulate among programmers as endlessly recopied, increasingly blobby photocopies; the original dated to the '70s, not the '80s. I think “RWLP - Rewind Line Printer” was another one; I wish I could remember more of them.

It was purely a running joke for assembly programmers.

But I recall reading, back in those dinosaur days, that there were some computers - not microcontrollers, typically in the mini- to mainframe range - where the actual microcode for some of the opcodes was modifiable by system programs, a bit like a modern-day FPGA, and some programs actually did use that capability. (I’m thinking IBM mainframe I/O processors worked like that, but I could be remembering wrong.) In some of those cases, if you really, really screwed up writing your custom microcode - which was performing very low-level operations on chip-internal components - it was actually possible to mistakenly program it such that one of the opcodes would open a direct pathway from VCC to ground right through the middle of the chip. And if you accidentally did that, and then executed that opcode… yep.

Your very very expensive mainframe or minicomputer CPU would quite literally halt and catch fire.

6 Likes

Here’s a list from '89, at least 10 years after the first such list I remember seeing:
http://www.mit.edu/afs.new/sipb/user/andre/misc/op.codes

2 Likes

According to the link, it started as a joke, but the Motorola 6800 actually implemented it as an undocumented opcode used for testing purposes:

When this instruction is run the only way to see what it is doing is with an oscilloscope. From the user’s point of view the machine halts and defies most attempts to get it restarted. Those persons with indicator lamps on the address bus will see that the processor begins to read all of the memory, sequentially, very quickly. In effect, the address bus turns into a 16 bit counter. However, the processor takes no notice of what it is reading… it just reads.
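Purely as an illustration, the behavior described in that quote - the address bus turning into a free-running 16-bit counter that reads and discards - can be sketched in a few lines of Python. The function name is made up; this is a toy model, not anything from the actual 6800.

```python
# Toy model of the 6800's undocumented HCF behavior as described above:
# the address bus increments modulo 2**16, "reading" each location and
# taking no notice of the result. Names here are hypothetical.
def hcf_address_trace(start, steps):
    """Yield the addresses the bus would present, wrapping at 16 bits."""
    addr = start & 0xFFFF
    for _ in range(steps):
        yield addr
        addr = (addr + 1) & 0xFFFF  # 16-bit wraparound, like the real bus

# The counter wraps from 0xFFFF back to 0x0000:
print([hex(a) for a in hcf_address_trace(0xFFFE, 4)])
# -> ['0xfffe', '0xffff', '0x0', '0x1']
```

Which matches the description: anyone with indicator lamps on the address bus would just see that count march by, very quickly.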

2 Likes

Exactly. We are living in a world where 1980s set pieces are abundant (e.g. Stranger Things, The Americans) and most of them take far more care in getting the details right than the sloppiness of HCF. This isn’t the 1880s, people - there are plenty of people alive who were there that you can check things with.

4 Likes

Toby Huss: “Yeah, why aren’t more fuckers watching this?”

You’re a favorite character actor of mine, Mr. Huss… but the answer to your question is simple (and has been covered here on BB before):
Because those of us tech nerds for whom this show would have appeal - those of us who were there when the PC materialized, who typed in C64 code from COMPUTE!'s Gazette, who notched our floppies to make room for more IcePic’d games, who remember when the EA logo was three primitive shapes, the people this show should specifically speak to - have found the artistic license reaching a bit too far and completely failing to capture the zeitgeist of the clone era. And those who weren’t hunched over computers in the late '70s/early '80s couldn’t care less.

2 Likes

You seriously think Mr. Robot gets the details right?

I stopped watching because it got so terrible on so many levels, but a great example is when they cut to a screen of files (directories?) being encrypted and showed /dev and /proc in the output. Files don’t live in /dev or /proc. If you did somehow manage to encrypt the contents of either directory on a running system, the operating system would effectively halt and catch fire. Take a Linux VM, boot it up, log in as root, and issue the command ‘rm -rf /dev /proc’ and see what it gets you.
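The point is easy to check for yourself: entries in /dev are mostly device nodes, not regular files with contents to encrypt. A quick sketch in Python (the helper name is my own, and it assumes a typical Linux box with a standard /dev/null):

```python
import os
import stat

def file_kind(path):
    """Classify a path as a regular file, a character device, or other."""
    mode = os.lstat(path).st_mode
    if stat.S_ISREG(mode):
        return "regular"
    if stat.S_ISCHR(mode):
        return "char-device"
    return "other"

# On a typical Linux system, /dev/null is a character device node,
# not a regular file whose bytes you could encrypt in place.
print(file_kind("/dev/null"))  # prints 'char-device' on a typical Linux system
```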

Also, speaking for myself, I do remember blowing on a 2600 cart. Not saying it made a difference, but it was done.

2 Likes

Everything is a file. And you can certainly put regular files into /dev. And maybe the output was verbosely showing skipped files.
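For what it’s worth, the “verbosely showing skipped files” theory is plausible: a careful encryptor would stat each entry and only touch regular files, logging everything else as skipped. A minimal sketch of that filter, with a made-up function name, might look like:

```python
import os
import stat
import tempfile

def should_encrypt(path):
    """Only regular files are worth touching; device nodes, procfs
    entries, directories, sockets, etc. would be skipped (and perhaps
    still printed in verbose output)."""
    try:
        mode = os.lstat(path).st_mode
    except OSError:
        return False  # vanished or unreadable entry: skip it
    return stat.S_ISREG(mode)

# A plain temp file qualifies; a directory does not.
with tempfile.NamedTemporaryFile() as f:
    print(should_encrypt(f.name), should_encrypt(tempfile.gettempdir()))
# -> True False
```

So a screenful of /dev and /proc paths scrolling by isn’t automatically wrong; it depends on what the tool was claiming to do with them.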

That’s a pretty extreme nitpick anyway, considering how much the show does get right. If that level of accuracy is your standard you’d better stay away from TV and movies and novels altogether.

5 Likes

Scrollscrollscrollscrollscroll…

4 Likes

OUTSTANDING program. I worked for Dell in the ‘old days’ and much of this territory is comfortable… BUT, when you start digging as far back as Tracy Kidder at Data General for story history, the story of Tom West and his teams in Westboro, Mass. really starts to scrape right down to the metal. I worked on one of Tom’s teams when Magic (the internal code name for DG’s best O/S of all time) hit the street. It revolutionised computing.

Great story… ‘Don’t be evil’ describes MICROSOFT. It was Google’s description of Microsoft FOR YEARS!!! STILL true.

1 Like

I’m old enough and was geek enough but for me the show is good enough to be willingly blind to the artistic license.

Some early CPUs probably did get stuck owing to race conditions, but as this was in the days before instruction pipelines and simultaneous instruction execution, it was certainly not due to instructions competing. It was due to delays in internal logic causing flip-flop clocks to transition while the data on the inputs was either changing or tri-stated. The same effect could be achieved by trying to run memory with too few wait states. I recall a NatSemi ECL computer which, as part of its startup routine, varied its tail voltage to find the safest operating voltage, i.e. the one giving the best timing margin.

However, some steppings of the original Motorola 68000 did have an HCF instruction which had to be avoided - it turned one set of bus transceivers into outputs of zero and another into outputs of one, causing a large current to flow through the bus and overheat it. If the power supply was adequate - and in those days PSUs tended to have big output capacitors - the heat would cause the epoxy casing to rupture with a nasty smell. 68000s were expensive in those days, so until the problem was fixed, program development was very cautious.
What is mildly interesting about this is that the power consumption which destroyed the CPU and its case was around 15 W - which nowadays would barely be adequate for a laptop CPU.

3 Likes

Thirded.

Regarding the video clips on the blog view, does anyone else get the amusing bug where the wrong video appears in each post when you refresh the page sometimes?

Cool. I really wanted to like it and looked forward to Ep 1 as soon as I saw the commercials. It went on my DVR immediately. They got the look right, but it felt like they were all over the place, trying to incorporate things from every end of the microcomputer spectrum in those days. It was quite a few years before those of us on the clone path started trying to incorporate many of the tech features the series centered on early on. We just wanted them to run 1-2-3; we weren’t concerned about GUIs, etc. until much later (well after the Mac, in fact… more in line with when we first saw the Amiga).

This post is difficult to read, goes on for far too long, and really failed to catch my interest or attention.

For comparison’s sake, I read 80,000 words about James Bond in third-person all-caps over at Film Crit Hulk yesterday, and despite getting annoyed with the quirk, never got as bored reading it as I was two paragraphs into reading this.

I’m reminded of the old Twain quote: ‘Don’t use a five-dollar word when a fifty-cent word will do.’

There’s a time and place for long, elegant words. This was not such a time and place, and I found this post kept tripping over itself, and so I only skim-read it, and was unconvinced to watch the show.

ETA: I guess what I’m trying to say is this reads like a technical writer trying to write a non-technical essay. I’d guess the person who wrote this is a very good technical writer, but technical writing and writing for the public are very different skills, and if someone tries to use the former skill to write the latter, it’s almost necessarily going to be boring. And the effect will get worse the longer the piece gets.

1 Like

My startup page has links to BB normal, /blog, /category/post, and the BBS. Normal is the only one that doesn’t get clicked on a regular basis.