Watch: How "oldschool" computer graphics worked back in the eighties

[Read the post]


I’m old, I know, but it is really, really weird to consider the NES an “early game console”. Have the Atari 2600/Odyssey2/ColecoVision/Intellivision been completely forgotten at this point? They had come and gone before the NES was even released.


I don’t believe in the term “hipster”, but I think I finally found someone worthy of the scorn:


I remember playing a game called “Digger” that was written in pure assembly for the 4.77 MHz XT (you had to turn off the turbo or it ran too fast). It had no OS, just a bootable floppy that went straight into the game, and it played a lot of tricks to get decent full-motion color graphics on a CGA display, including disabling all interrupts, even the clock.

One day a friend called me over to check out the brand new 386 his office had just gotten - their first computer! It had one of the fancy new “Vee Gee Ay” displays and I was curious what the Digger screens would look like on it so I brought a disk with me and asked if I could try it.

The first screen the CRT painted was the scrolling credits at the end - in the time it took for the monitor to change modes all the character lives had been spent and the game was over!


Old school on a PC would be chunky graphics (IIRC 256 by 128 pixels) with a simple low-resolution font. People used to code in BASIC on that. My Superboard II had a character-generator PROM, so my graphics capabilities were limited to a few graphics characters crammed into a 256-byte PROM.


I never really used it, but I learned some programming for my old Mac under System 3 or something. I can remember some crazy graphics hacks involving drawing the next frame off-screen and then moving the screen pointer…


Sounds like double buffering, which is still used today, depending on how your particular graphics hardware/drivers/etc. work. (Some queue up draw instructions until the next screen paint for you, so it’s transparent.)
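For anyone who hasn’t run into it, here’s a minimal sketch of the pattern in Python. The tiny character-cell “display”, the buffer size, and all the names are made up for illustration; nothing here is a real graphics API. The point is the shape of the technique: draw the next frame into a hidden buffer, then swap, so the viewer only ever sees completed frames.

```python
# Minimal double-buffering sketch on a hypothetical 8x8 character "display".

WIDTH, HEIGHT = 8, 8

def make_buffer(fill=" "):
    return [[fill] * WIDTH for _ in range(HEIGHT)]

front = make_buffer()   # what the "screen" is currently showing
back = make_buffer()    # where the next frame is drawn, off-screen

def draw_frame(buf, x, y, glyph):
    # Render the next frame entirely off-screen: clear, then draw.
    for row in buf:
        for i in range(WIDTH):
            row[i] = " "
    buf[y][x] = glyph

def flip():
    # The swap: point the "screen" at the freshly drawn buffer.
    global front, back
    front, back = back, front

# Animate a glyph moving across the screen; no half-drawn frame is ever visible.
for x in range(WIDTH):
    draw_frame(back, x, 3, "*")
    flip()
```

The driver-side queuing mentioned above amounts to the same thing done for you: your draw calls land somewhere invisible until the next paint.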



Only $2?!?! :::sigh:::


Ah, sprites. That’s how I learned binary.


For me it was fonts. Sprites followed.


When I finally read in a magazine about the similar technique on the Apple II (I’m guessing around ’83 or ’84), I was actually kinda pissed. I’d been frustrated for years, giving up on most game ideas because it was just too difficult to get smooth motion graphics. WHY DIDN’T THEY JUST PUT THIS INFO IN THE MANUAL? They had a whole section in the book about vector graphics and mapping bits and such, but couldn’t be bothered to mention that you could draw off-screen and then, with one poke, instantaneously flip the screen, making motion seamless. My cynical side was convinced Apple left this out on purpose to give developers an edge over home hobbyists.
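The “one poke” version of the trick is slightly different from swapping buffer pointers: the two hi-res pages sit at fixed addresses, and a single write to a soft switch tells the video hardware which one to scan out. Here’s a rough simulation in Python; the sizes, names, and switch behavior are stand-ins, not the actual Apple II memory map.

```python
# Sketch of Apple II-style page flipping (illustrative, not real Apple II code).

PAGE1 = bytearray(16)   # stand-ins for the two fixed hi-res pages
PAGE2 = bytearray(16)
pages = [PAGE1, PAGE2]

visible = 0             # the "soft switch": which page is being displayed

def poke_flip():
    # The single poke that swaps which page the video hardware scans out.
    global visible
    visible = 1 - visible

def hidden_page():
    return pages[1 - visible]

# Draw a sprite byte into the hidden page, then flip; the viewer never
# sees a partially drawn frame.
hidden_page()[5] = 0b01111110
poke_flip()
```

Same idea as double buffering, but nothing is copied or even re-pointed in software; the hardware just starts reading from the other page.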


In the video, his overview is actually pretty agnostic, focusing on the techniques rather than the implementations. If anything, he spends the most time using a C64 to demonstrate.

Similarly, I liked this series that was linked here before.

He mentions in the video that the next two in the series will be about NTSC artifact color (as on the Apple ][) and CPU-driven graphics (Atari 2600). I reckon that’s in increasing order of difficulty to program. On the 2600 your code has to explicitly generate the VSYNC signal 60 times per second, in between all the game logic.
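To give a feel for what “CPU-driven graphics” means, here’s a rough Python sketch of the NTSC frame structure a 2600 program has to generate by hand, 60 times a second. The scanline counts are the standard NTSC budget (3 + 37 + 192 + 30 = 262 lines); everything hardware-specific (the actual TIA register writes) is abstracted into string labels.

```python
# Sketch of the frame loop a 2600 program must drive itself, every frame.

VSYNC_LINES = 3       # the CPU asserts VSYNC itself for 3 scanlines
VBLANK_LINES = 37     # game logic traditionally runs in vertical blank
VISIBLE_LINES = 192   # per-scanline graphics: "racing the beam"
OVERSCAN_LINES = 30   # a little more time for logic before the next frame

def run_frame(update_logic, draw_scanline):
    lines = []
    lines += ["VSYNC"] * VSYNC_LINES        # explicit sync, sent by your code
    update_logic()                          # fit the game logic in the blank
    lines += ["VBLANK"] * VBLANK_LINES
    for y in range(VISIBLE_LINES):
        lines.append(draw_scanline(y))      # the beam waits for no one
    lines += ["OVERSCAN"] * OVERSCAN_LINES
    return lines                            # 262 scanlines per NTSC frame

frame = run_frame(lambda: None, lambda y: f"scanline {y}")
```

Miss your scanline budget anywhere and the picture rolls, which is why 2600 programming sits at the hard end of that difficulty ordering.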


Wonderful memories!

“Mapping the Commodore 64” and “The Commodore 64 Programmer’s Reference Guide” were my two bibles in the 1980s. Back when the only “framework” or APIs you had to work with were routines from the Kernal (yes, I spelled it right – the wrong way!). Good luck finding a contiguous block of RAM without swapping various components of BASIC and the ROM in and out. But what a thrill ride those days were!

Search YouTube for videos of old LucasFilm Games elder statesmen talking about the advanced tools they built using more powerful machines in order to create their masterpieces for the C64 and Atari computers. There’s also a thorough explanation titled “The Ultimate C64 Talk” and streams of current demos from the scene showing just how much this little “bittybox” can do when you understand the nuances of its design and chips.

I can’t remember how I first learned binary. But in high school calc, we decided to do a unit on logarithms, and that’s when I learned how to do arithmetic in a few different bases and how to convert between them. Sure the common log is easy for most people, but I had a ball doing logs in Hex, Bin and Octal. It was really fun handing in a paper with all the answers being “10”.
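For anyone who wants in on the joke: in any base b, the number b itself is written “10”, so asking for the base in its own base always gives the same answer. A quick sketch of conversion by repeated division (the helper name is mine, not anything standard):

```python
# Convert a non-negative integer to its representation in an arbitrary base.

DIGITS = "0123456789ABCDEF"

def to_base(n, base):
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, base)   # peel off the least significant digit
        out.append(DIGITS[r])
    return "".join(reversed(out))

# The base itself always comes out as "10":
assert to_base(2, 2) == "10"
assert to_base(8, 8) == "10"
assert to_base(16, 16) == "10"
assert to_base(255, 16) == "FF"
```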

When I was a kid, I sucked so hard at mental math. I’m still slow at mental multiplication and rely a lot on multiplying by 10 and then dividing by 2 to get to the answers. I couldn’t hold the times tables in my head. Most kids with ADHD can’t, because it’s boring as hell. But I got so frustrated at apparently sucking at math in everyone’s eyes, even though I didn’t have problems understanding the concepts, that I decided to make math my strongest subject. And I did. By sophomore year I was teaching math to everyone in the class when the teacher couldn’t intelligibly instruct us.


Well, it’s relative, isn’t it? Unless you’re commenting from the ’90s, the NES is still an early console; it’s just not the earliest.

I mean look at the Nintendo home console lineup:

  • NES
  • SNES
  • N64
  • GameCube
  • Wii
  • Wii U

NES looks pretty early to me :slight_smile:


This topic was automatically closed after 5 days. New replies are no longer allowed.