"IBM PC Compatible": how adversarial interoperability saved PCs from monopolization

Originally published at: https://boingboing.net/2019/08/05/ibm-pc-compatible-how-adv.html

3 Likes

I think you have munged your chronology a little. The story suggests that there were systems running MS-DOS before the IBM PC came out, which wasn’t the case. Before that most systems used home-grown software (albeit frequently incorporating MS Basic) or CP/M. PC clones did appear fairly quickly and Microsoft sold licenses to them, but software compatibility issues persisted for a while because many packages would bypass the operating system and address hardware directly (early versions of MS-DOS didn’t prevent this).

11 Likes

You could argue it didn’t prevent monopolies, it just moved them further down the chain. The popularity of the clones gave Intel a de facto monopoly, to the detriment of Motorola and other chip makers.

9 Likes

Another thing to bear in mind is the simplicity of the architecture.
There were Apple II and TRS-80 compatible (or almost compatible) home and personal computers, but no Commodore 64 clones were made. Commodore used custom-made VIC and SID chips, so building a C64 clone meant sourcing those custom parts. Almost the same thing happened with the ZX Spectrum: nearly all clones were built in the Soviet Union or its satellite states, because the official machines weren’t sold behind the Iron Curtain.

I suppose that the rise of IBM clones was initially wanted by IBM. They could have used a custom-built CPU or custom ICs otherwise.

6 Likes

Are you sure about that? I don’t remember any version of DOS preventing low-level direct access to hardware. In fact, when Windows finally got round to doing this in the late nineties, every game, DOS or Windows, became unplayable. My memory is shocking, though.

4 Likes

My guess is that IBM wasn’t too upset about clone Intel x86 chips, considering that they made some…

I really think that IBM didn’t expect the PC market to take off as much as it did, so they kind of half-arsed it. Which is good, because if they had created a fully integrated custom computer it would have semi-failed, like the VIC-20 / C64/128 or the Timex Sinclair and the TRS-80 and all the other almost-there-but-didn’t-catch-on PCs. IMHO, the IBM PC took off because of the clones; which is why Apple has been a very successful also-ran in the PC space for 40 years - no Mac clones.

1 Like

IBM had a few false starts in creating a PC, but they didn’t really get past the committee stage. The problem was that IBM was too big, and nobody had a clear vision of what the PC would be. In fact, IBM’s management included a large number of people who were openly hostile to the idea of personal computers, which made getting funds allocated difficult.

They actually had discussions about manufacturing something with Matsushita, TI and Atari, amongst others. In some cases it would be their own design, in other cases it would be a design based on the manufacturer’s existing hardware.

What seems to have scared IBM is that much of the management realised that the schools their children attended had bought Apple IIs. They began to realise that they were missing the window of market opportunity, and had to act fast.

So they built a team of about 12 people who created the first prototype in a month. It used off-the-shelf components because it had to, not because they wanted to. In theory IBM had the capability to create an entire PC with custom components - but in practice, they were incapable of managing that process. An all-IBM personal computer would have shipped in 1992, not 1982.

IBM shipped the resulting IBM 5150 PC partly because they were terrified that they’d left it too late to enter the PC market, and partly because they were reassured that the custom BIOS chips meant that even if you bought the exact same components and assembled them, you still didn’t have an IBM PC…

With hindsight, they were wrong on both counts. The business market was waiting for their entry into the personal computing market, meaning IBM effectively defined the standard for personal computing. And the custom BIOS chip wasn’t enough to stop others wanting a piece of that profitable market.

(The obvious irony being that if they’d been right on the first point and were too late, then the market for an IBM Compatible PC would have been too small for the cloning operation to be worth doing…)

[Edited to correct an egregious typo]

5 Likes

I’d make a different argument.

It took off because businesses were waiting for it.

IBM was THE standard for business computing. There were competitors, sure - but IBM had faced multiple investigations and antitrust complaints, which says a lot about how big they were. (https://en.wikipedia.org/wiki/History_of_IBM#1968-1984_Multiple_Government_and_Private_Antitrust_Complaints)

That old cliche of “Nobody got fired for buying IBM” was once truth, not cliche. The launch price of an IBM PC was around $1500 US. Imagine you’re putting one on the desk of everyone in finance - a department of at least ten people - and that’s fifteen thousand dollars, minimum. This is not a cheap investment.

Of course, you could be cutting edge and buy them all Apple IIs, or Commodores - but will you be able to get software for them? Will the company still be there in three years’ time? Will some awfulness befall the manufacturer of those PCs - an awfulness you have no control over, but which means you’re now the guy who blew tens of thousands of dollars on dead-end computers?

That’s the true meaning of “Nobody got fired for buying IBM”.

Clones happened later. Clones really started to matter in the mid to late eighties, after the IBM PC standard had been set but companies wanted better value…

Apart from those couple of years from 1995 to 1997, when Apple allowed Macintosh clones for a licensing fee. Without that, Apple may very well have gone under - they really needed the cash at that point!
(https://en.wikipedia.org/wiki/Macintosh_clone)

4 Likes

IBM looked at the existing small computers for design ideas. They saw the success of the Apple II and the CP/M machines and decided to follow, hence the expansion bus and the openness about what was inside the computer. I don’t think they foresaw cloning, just that making it easy for third parties to write software or make cards for the bus would make the computer easier to sell.

The fact that Microsoft retained rights to MS-DOS made a difference. They could then sell it to other companies or even modify it for them.

MS-DOS was at the very least influenced by CP/M, via 86-DOS. CP/M had segregated the I/O routines and kept them in a separate space, often in RAM. This meant the same distribution could be adapted to varying hardware. You needed the same sort of hardware for every machine, but the specifics varied. Unless they came from one manufacturer, you couldn’t count on CP/M machines being identical.

MS-DOS followed suit. The I/O was kept separate, called the BIOS as in the CP/M days. In the IBM PC the BIOS was in ROM, though it could have been in RAM; ROM made it a bit harder to reconfigure for different hardware.

But since the BIOS was separate, and “open”, it was easy to adapt to different hardware.

In the early days there were lots of hardware configurations. Each manufacturer had some reason, and sometimes it meant improved specs (like better graphics).

The problem was that relatively early on, the BIOS was slowing things down, and some software decided to deal with the hardware directly, bypassing the operating system at times, or used bits of the BIOS without going through the system calls. It was the same situation as before there were operating systems, when every bit of software had to include code to address the I/O hardware.

This made the odd hardware incompatible. The applications ceased to work with non-standard hardware.
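The breakage described above can be sketched with a toy model. This is illustrative Python, not real firmware: the class, function names, and the "Oddball" machine's address are all hypothetical, though 0xB8000 really was the de facto IBM PC colour text-mode buffer address that applications poked directly.

```python
# Toy model (hypothetical names) of why software that bypassed the BIOS
# broke on MS-DOS machines whose hardware didn't match IBM's layout.

class Machine:
    def __init__(self, name, video_base):
        self.name = name
        self.video_base = video_base      # where THIS machine's text buffer lives
        self.memory = {}                  # sparse "RAM"

    def bios_write_char(self, offset, char):
        # A BIOS call: the firmware knows its own hardware layout,
        # so the same application code works on any machine.
        self.memory[self.video_base + offset] = char

    def screen(self, length):
        # What actually shows up in this machine's video buffer.
        return "".join(self.memory.get(self.video_base + i, " ")
                       for i in range(length))

IBM_VIDEO_BASE = 0xB8000                  # de facto IBM text-mode address

def app_direct(machine, text):
    # "Fast" application: pokes the IBM address directly, skipping the BIOS.
    for i, ch in enumerate(text):
        machine.memory[IBM_VIDEO_BASE + i] = ch

ibm = Machine("IBM PC", video_base=0xB8000)
odd = Machine("Oddball", video_base=0xA0000)   # hypothetical non-IBM layout

for i, ch in enumerate("OK"):
    odd.bios_write_char(i, ch)            # via the BIOS: lands in the right place

app_direct(ibm, "HI")                     # addresses happen to match: works
app_direct(odd, "HI")                     # writes land outside the real buffer

print(ibm.screen(2))                      # HI
print(odd.screen(2))                      # OK  (the direct write never appeared)
```

The BIOS path works everywhere because the indirection lives in the firmware; the direct path only works where the hardware happens to match IBM's.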

Soon there was no variety; everyone just copied IBM directly. And they needed to copy the BIOS code, which was copyrighted by IBM. Previously they could figure out how to write their own BIOS drivers, but since the apps no longer went through the BIOS calls, the BIOS had to be identical to IBM’s.

That’s where Phoenix stepped in. They made a list of what was needed and where, and wrote a new BIOS from scratch, so they didn’t copy IBM’s BIOS except in terms of results. The routines had to be in the same places, but the code was original. It was a lot of work, especially because you had to anticipate what might be used by the apps, to have it all in the right place.
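The entry-point constraint can be sketched as a toy model in Python. The offsets and return strings here are made up for illustration; the point is that applications jumped to fixed addresses, so only location compatibility mattered, not the code behind it.

```python
# Toy model (hypothetical offsets) of the clean-room BIOS constraint:
# applications jumped straight to known addresses, so a compatible BIOS
# needed equivalent routines at the same entry points.

def ibm_bios():
    return {0xE05B: lambda: "reset (IBM code)",
            0xE6F2: lambda: "boot (IBM code)"}

def phoenix_bios():
    # Independently written routines, but at matching entry points.
    return {0xE05B: lambda: "reset (clean-room code)",
            0xE6F2: lambda: "boot (clean-room code)"}

def sloppy_clone_bios():
    # Same functionality, but the routines landed at different addresses.
    return {0x1000: lambda: "reset", 0x2000: lambda: "boot"}

def app_boot(bios):
    # The app doesn't use documented system calls; it jumps to a fixed
    # address, so only the location of the routine matters.
    routine = bios.get(0xE6F2)
    return routine() if routine else "crash"

print(app_boot(ibm_bios()))           # boot (IBM code)
print(app_boot(phoenix_bios()))       # boot (clean-room code)
print(app_boot(sloppy_clone_bios()))  # crash
```

A functionally equivalent BIOS with routines in the wrong places still "crashes" the app, which is why the clean-room work had to pin down addresses as well as behaviour.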

I gather the BIOS isn’t used much these days, except to load software. So it still exists, but is mostly unused.

5 Likes

Still, if they had really wanted to, they could have put in a ULA, like the Spectrum’s, containing all the glue logic and reducing the price. Sinclair did it, and Commodore made its own CPUs and peripherals too. Actually, Commodore’s IBM PC compatibles had custom MOS Technology ICs.
The Apple II was made with off-the-shelf ICs because Apple started small, but if one has an IC foundry, or a business relationship with someone building ASICs, making a couple of mask-programmed LSI chips isn’t a big problem.

I get the impression that some of those non-PC-compatible MS-DOS systems were made that way on purpose. The DEC Rainbow was chock full of weirdness, including a floppy drive that wasn’t really compatible with standard 5-1/4" floppy discs. The hub rings on most discs would physically damage the RX50 drive mechanism due to its mechanical tolerances, and DEC deliberately didn’t include a FORMAT command in their MS-DOS implementation. That’s right, you had to buy pre-formatted floppies at a premium price. It also had a Z80 and could run CP/M; in 8088 mode it used the Z80 as a floppy controller.

DEC could have captured personal computing with the LSI-11 CPU well before IBM came on the market, but they were so afraid of cannibalizing their minicomputer business that they tended to hobble their systems with things like the RX50, and release multiple incompatible small systems. I’m surprised they allowed Heathkit to release an LSI-11 system.

5 Likes

I second Ceran_Swicegood’s refinement. (Bought my first small computer in 1978.) Pre-IBM computers generally ran either a proprietary OS (Radio Shack’s BASIC-as-OS) or a flavor of CP/M. And before the 8080/Z-80 family, the motherboard might run a 6502 (Apple) or 6800 chip and have proprietary or “standard” associated support machinery, including peripheral interfaces and data storage systems (S-100 buss? what size floppy disk? what format?). I got through the compatibility wars and owned PCs that were more or less IBM-compatible, generally with Lotus being the make-or-break application. (The Heath/Zenith 150 series turned out to be one of the best choices, but in many ways it was a step backward from the non-compatible H/Z-100 series, with its S-100 buss and dual-processor architecture.)

The rest of Cory’s account is pretty much what I observed in a couple decades of tech-journalism work, with the lawyers and captive legislators and regulators doing what IBM couldn’t.

2 Likes

I agree wholeheartedly with the point this post is making, but (rather ironically) you missed one even earlier example of adversarial interoperability:

The PC revolution owes much to Intel’s 8080 chip, a cheap processor that originally found a market in embedded controllers but eventually became the basis for early personal computers, often built by hobbyists.

That’s only partially true: in fact, quite a number of pre-IBM PC personal computers were based on Zilog’s Z-80 chip, which could run 8080 assembler* without modification but had many enhancements that made it a more attractive basis for a small, single-board computer.

I don’t have statistics available for relative numbers of 8080/Z-80 PCs, but I do know that the original TRS-80 portable, the CP/M expansion board for the Apple II, and my own first computer (a Kaypro 10) were based on the Z-80 and therefore owed their existence to adversarial interoperability.
Edit: I forgot to mention that every original Pac-Man arcade console (and probably a lot of others) had a Z-80 inside.

*Edit: “8080 assembler” was bad phrasing, but I can’t think how to clarify without making it long-winded. The Z-80 could run compiled 8080 machine code without modification, AND could run Intel’s 8080 assembler to generate 8080-compatible machine code itself; you could use your Z-80-based CP/M computer to develop software either for the broader market or specifically for the Z-80.

4 Likes

x86 is a different part of the story, though. IBM was using 3rd-party, off-the-shelf parts, so additional sources and competition in the market for x86-compatible processors could really only benefit them. But x86 wasn’t the only part of the “IBM PC” platform; as the article discusses, IBM’s proprietary pieces were the ROM chip and the particular way the components were put together.

But proliferation of x86 was similarly driven by the same sort of interoperability. Aside from unlicensed clones, government procurement at the time required a second source for any key items/components. So when the government started purchasing early x86 chips and systems or equipment that used them, they required Intel to license the tech to another company. AMD was selected as that second vendor, and that’s how they ended up with the rights to produce actual x86 processors. While they produced clones at first, they were much better clones than could have been produced without direct access to the full instruction set. And AMD moved on to creating their own architectures with improvements and additions. A fair bit of the advancement that’s kept x86 relevant all this time has come either out of AMD or out of the back and forth between the two companies. The really major ones are the 64-bit extension of x86 (which is mostly why AMD still has a license) and multicore processors.

You add that nugget into this story and you get that it wasn’t just interoperability that drove this; it was also pro-competition government regulation. Which is a pretty damn important peg to how the current tech situation developed.

8 Likes

Oh, if only that trend had ended so early! For a few years in the Pentium II/III era, Dell used power supplies and motherboards that had intentionally-scrambled pinouts. If your Dell power supply died (the power supply being, by far, the most common point of failure) and you replaced it with a standard ATX power supply from a different manufacturer, it would fry your motherboard.

That particular shabby trick poisoned my attitude toward Dell for a long, long time. It wasn’t until the Capacitor Plague - when Dell acted in a stand-up way and fixed their affected machines quickly and for free, while other manufacturers (cough cough Apple cough) screwed over their customers - that I started to trust them again.

5 Likes

The Radio Shack Model 100 laptop used the 8085, which was an 8080 with an on-board clock generator (the 8080 needed a separate IC for that) and two I/O pins for rudimentary serial. It only had new instructions for those I/O pins.

The Z-80 was a big improvement over the 8080, extra registers and instructions, and improved hardware.

But a lot of software ignored the extra instructions in order to stay compatible with the 8080, so there often wasn’t an advantage until much later.
CP/M was even partly rewritten to use the extra instructions, though that was really more like third-party mods.

You see this in the dual-CPU boards companies started offering for the S-100 bus. They’d add an 8088 but move back from the Z-80 to the 8080, or, for hardware simplicity, the 8085. It seemed like a step backwards, except that the Z-80’s extras often didn’t deliver as much as their potential suggested.

1 Like

Cory, you left out one big non-legal obstacle to reverse engineering and adversarial interoperability:

Crypto.

Firmware can now be signed/encrypted and this is enforced at the hardware level, using OTP keys that are unique to the device. Even if there were no legal obstacles, there would be technical ones. If you do it right, with effort you might break into individual devices but not all of them.
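The gatekeeping pattern described above can be sketched in a few lines. This is a deliberately simplified model: real boot ROMs typically verify a public-key signature (RSA or ECDSA) against a hash fused into the chip, whereas this sketch uses HMAC-SHA256 from the Python standard library just to keep it self-contained; the `Device` class and method names are invented for illustration.

```python
# Simplified sketch of firmware authentication against a per-device
# one-time-programmable (OTP) key. Real devices use public-key signatures
# in mask ROM; HMAC stands in here so the example is self-contained.
import hmac
import hashlib
import os

class Device:
    def __init__(self):
        # Burned at the factory: unique per device, never readable off-chip.
        self._otp_key = os.urandom(32)

    def sign_firmware(self, image: bytes) -> bytes:
        # Done by the vendor, who knows the key (or holds the private half).
        return hmac.new(self._otp_key, image, hashlib.sha256).digest()

    def boot(self, image: bytes, tag: bytes) -> bool:
        # Boot ROM recomputes the tag and refuses unsigned/modified firmware.
        expected = hmac.new(self._otp_key, image, hashlib.sha256).digest()
        return hmac.compare_digest(expected, tag)

dev = Device()
official = b"official firmware v1.0"
tag = dev.sign_firmware(official)

print(dev.boot(official, tag))             # True: genuine image boots
print(dev.boot(b"patched firmware", tag))  # False: modified image is rejected

other = Device()                           # different device, different OTP key
print(other.boot(official, tag))           # False: cracking one device's key
                                           # doesn't unlock the rest
```

The last case is the point made above: because each device holds a unique key, even a successful break against one unit doesn't generalize to the whole fleet.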

3 Likes

While I knew that he worked at Phoenix, I didn’t know that Tom did the description part of the reverse engineering of the Phoenix BIOS. I’ll have to ask him about that soon. I was firmly stuck in CP/M land at the time, writing custom BIOS ports for STD-bus computers.

I was also using a custom version of Cromemco CDOS, which my coworker Jim had reverse-engineered, modified and recompiled to run on other S100 Z-80 computers. We were developing optical testing machines at a university, so it was okay.

5 Likes

Yeah, I don’t really know when that got fixed, I wasn’t paying as much attention after I was finally able to afford a Mac.

I don’t think many companies tried to be different in order to lock people into their products. In the MS-DOS era, they were trying to offer some better performance, though DEC seemed to be an exception to that.

I have an HP Compaq i7 and it uses an odd power supply. But I could trace the wiring and keep it running with an external supply if needed.

1 Like