Great article about the "ill-fated" Apple Lisa, 40 years later

Originally published at: Great article about the "ill-fated" Apple Lisa, 40 years later | Boing Boing


Let’s not forget that Apple purposefully hobbled the early Macintosh System to not allow true multitasking, reserving that for the Lisa. Jwa van der Vuurst released Multi-Mac in 1985 for the Macintosh 512K with the original 64K ROM, making true multitasking possible on the Macintosh hardware.



Yeah, they invented all that after stealing it from Xerox.


Xerox wasn’t using it…it was just sitting there!


Something else people don’t appreciate about the first Mac is what a heroic piece of programming QuickDraw was. Bill Atkinson and Andy Hertzfeld managed to get an entire modern GUI system into 64K of ROM. There are stories out there of some of the amazing feats required to achieve this. For example, shaving bytes out of the routine that draws rounded-rectangle corners; those almost didn’t make the cut, which would have altered the entire look of the system. Not a single byte was wasted in that ROM.

It’s a deeply technical and esoteric topic (raise your hand if you even know what QuickDraw was) so what they did may never be sufficiently appreciated. Such is the unglamorous life of a systems engineer.

This is a much abridged and much less technical version of the story on the Lisa. Best I could find, unfortunately. These stories are getting harder to find as people care less and less about the engineering that went into these early machines.


Isn’t that the computer named after Steve Jobs’ daughter even though he was still denying paternity and generally treating her like garbage at the time?


My recollection from the time was that color computers were possible (e.g. the Apple ][) but that Jobs preferred the B&W aesthetic. In which case, the computer deserved to die. I remember when the Mac, grudgingly, finally gave us a nearly B&W screen except for the Apple logo.



Though they claimed it stood for ‘Locally Integrated Software Architecture’.


That’s reductionist to the point of being untrue. By that logic, Xerox stole it all from Doug Engelbart following the Mother Of All Demos. Xerox bought a pre-IPO stake in Apple, and part of the deal was showing Jobs their technology.


It’s not that simple. They chose black and white because they could get a crisper display at higher resolution with less RAM that way. It was a business targeted machine, and monochrome was the standard in that space.

The Apple II is an odd comparison to make, since it’s from 1977. It was the first home computer with colour and with pixel-addressable graphics. That was a big deal at the time. By 1984 though, colour was becoming common in home computers via the Commodore 64 and Atari 400/800. However business machines remained monochrome for a long time because it was considered more “serious”. Yes, really. Business computing has always been a weird market. Colour was considered a feature of “toy game machines”.


It’s right up there with GEOS which put a desktop on the Commodore 64 - and somehow managed to produce a workable computer.

Both the Lisa and the original Mac would have been much more usable had they come with a graphics coprocessor rather than forcing the poor old 68000 to do all the hard work.

Strangely enough, Jobs turned down the Amiga chipset when it was shown to him. That would have transformed the original Mac.


Steve Jobs was a jerk! What? I thought he was god-tech-emperor-cool-guy-black-turtleneck-hipsters-dude…


Why, oh why, do you hate innovation and cool capitalism! WHY?!? /s


I mean, I joke, but A) it’s true, and B) Apple did do a significant amount of both conceptual and development work on their GUI, as the article makes clear.


Likewise GEOS on the Apple II is very impressive.

For better or for worse, Jobs understood that what Amiga did wasn’t what businesses wanted. Amiga was doomed to be viewed as a game machine because customers didn’t understand the potential. They didn’t know to want what didn’t exist yet.


Read the attached article, finally. It is better written than most mainstream stuff about retro computers (written by 20-somethings) usually is. I need to nitpick a few things, though.

The Alto workstation, which was never sold to the public

While that’s true in the strictest sense, Xerox did attempt to commercialize their technology with the Star workstation. Being built on a discrete CPU minicomputer platform, though, it was big, loud, power hungry, and expensive. It was doomed to fail before it ever left the drawing board. This is the irony of PARC. Someone was gonna “steal” what they did sooner or later anyway because they could never make a go of it.

The computer came with two Apple-designed “Twiggy” 5.25-inch floppy drives, but it was designed to be used with Apple’s “ProFile” 5MB hard drive, which sat on top

This is untrue, and based on the often-reproduced photo of the Lisa with the ProFile sitting on top. The ProFile was an Apple /// accessory (yes, there was an Apple ///), which is why it’s so obviously in a different design style than the Lisa and looks so awkward sitting on top. The team used ProFiles internally because they were there, and the Lisa did support hard drives (a big deal at the time), but the machine was never “designed” for the ProFile. Some Lisa customers did end up ordering the ProFile with their Lisa just because it was the only HD that Apple sold. Like the first Mac though, the Lisa was designed to boot and run entirely from floppies, which was very typical at the time.

the original Mac could only run one application at a time. Autosaving was also gone, as was virtual memory and memory protection

The Lisa did not have memory protection in its multitasking. That requires hardware MMU support, which did not exist in the 68000 processor. It’s the same reason the Amiga’s multitasking was so unstable (Guru Meditation error, anyone?). Multitasking is a very dangerous game without restricting memory access right on the address bus where it’s guaranteed to be safe. It’s why kernel panics on modern machines are so incredibly rare. Applications can’t crash each other. Device drivers still can, which is why Windows BSODs exist (Windows’ driver layer is hot garbage), but the Lisa could only dream of such hardware support.

Regarding the tragedy of the Amiga, it’s helpful to remember that people had to be taught what computers are good for. For the first couple generations of machines, people thought computers were for three things: games, spreadsheets, and word processing. You had to convince them they were good for anything else. Desktop Publishing was the first niche outside that triangle, and Apple managed to create it with the Mac/LaserWriter combo. Eventually people caught on to using computers to make music, art, and other things, but it took a long time. The Amiga could do all those things at a time when nobody knew they wanted to do those things. It was stuck in the original triangle of use-cases so it got filed under “games machine” and that was the death of it in the marketplace.


That Folklore site (and the book) are absolutely splendid.


More or less, yes. Video production turned out to be its niche, with the A2000, Lightwave, and Video Toaster.


It wasn’t intentionally hobbled. It had 128KB and a floppy disk drive. Releasing the single-app Finder made perfect sense.

Yeah, and it was intentionally hobbled with a tiny black-and-white screen (not even grayscale), and it didn’t have a proper GPU.


Steve Jobs invited my boss and me to see his new computer. It was an early Mac, back when the prototype still had a 5.25" floppy drive because the 3.5" ones weren’t ready yet. One demo was a chess game where you moved the Alice character around with the mouse while the various chess pieces, in almost-perspective, would leap from square to square to try to crush you. It was an amazing QuickDraw demo and an amazing piece of code. I’ve never seen that demo since; it seems to have vanished.


Then there was the Acorn Archimedes, which was successful in education (being successor to the BBC Micro) and a few other niches. Its ARM processor architecture went on to have a spectacularly successful afterlife.