Why CRT monitors were better

I feel like most of the latency people complain about is introduced by the display’s video processing. The HDMI signal chain feels super slow compared to composite/component, to the point where games that rely on speed and timing can become frustrating, if not impossible.

The chief example I’m remembering is Super Puzzle Fighter II Turbo, which was beautiful on our CRT TV circa 2002 but whose re-release (Super Puzzle Fighter II Turbo HD Remix) was gratingly unresponsive on a plasma screen circa 2012. Just infuriating.


I think I split the difference between the CRT and LCD issue…

I purchased two Panasonic plasma big screens right before they stopped making plasmas. I knew it was a dead technology, but I pulled the trigger anyway.

I don’t know what my response time is in gaming terms (these are big flat screens used primarily for TV), but the blacks are true black, not the lit-from-behind grey pixels LCDs have. I think it’s closer to OLED than LCD. I can’t tell the difference between a black screen and the TV being off; that’s how black it is.

They also get warm, but not overly hot. They’re a bit thicker than an LCD, but not by much. Plus they have 3D (not that I’ve ever used it), YouTube, etc. built in.


Burn-in is not a major issue in real life. It tended to occur only in displays left on all the time at max brightness, like in bars or airports, and especially in cheap projection TVs where the small CRTs are driven hard and may have inadequate cooling. I used plenty of CRT monitors all workday and never had a problem, and I never saw it on coworkers’ monitors either.

The OLED screens I have owned for years sometimes show temporary burn-in, but it goes away with use, and they automatically run some sort of electronic anti-burn-in cycle periodically that removes it completely.


Oh, I’ve seen it more than a few times over the years on regular desktops when I worked as an IT repair tech. Mostly on the older monochrome monitors, but a few times on newer ones in the late 1990s that ended up with the Windows menu bar burned in from being left on continuously.

Even further back, there were old TVs that had been left on all night, back in the day when channels signed off and switched to a test signal. Nothing like having your picture overlaid with that grid and those circles.




TVs, in particular, are atrocious about the ‘processing’ they do. Does it look good on the floor at Best Buy? Cram it in. Latency? Meh. Pixel accurate rendering of text? Sir will find our line of “signage” TVs over here for only $2500 more.


The worst were the institutions that locked their CRTs to 60 Hz. I took many a programming class where the monitors could be controlled by the lecturer, but that scheme relied on a 60 Hz refresh rate.

My eyes! The goggles do nothing!


And on a very narrow range of CRTs.

The ones that are good are really good. Most are just mediocre.


Unless you want to play a light gun game.

But in all seriousness the blurring is generally more annoying than the lag itself.

Learn the settings. You can turn most of that crap off. Any halfway decent modern TV should also have a game mode that turns pretty much everything off.


I had a 20" CRT monitor; I hauled it off to the recycler after I found an old junk LCD on the street.

Maybe I would have kept the CRT if I’d known hipsters thought it was cool

but I still wouldn’t use it


Finding an LCD monitor on the sidewalk is being a hipster.

Which was exactly the problem with the used studio monitors I got when the original users went digital. They’d been in use in the control room 24/7 for $LC_DEITY knows how long, and they were toast.

72 posts and no one has brought up dot pitch yet? Or how it changed from center to edge (at least on flat CRTs like my old PF790)?

Sharp center, less so elsewhere.


The only way a CRT is better is launched from a trebuchet.


I prefer pianos myself.


I can’t tell you the “why” because I’m not a neurologist. However, as a career game developer going on 30 years, I can confirm this is real. We did blinded tests on one project I was on, to see how much input latency we could get away with (and by extension how much we could cheat into our ms budget for the frame). Anecdotally, some players complained about as little as 2ms at rates higher than predicted by chance. Not saying this was a statistically rigorous study or anything, but we were sufficiently convinced to minimize latency more than we were planning to do. This wasn’t for a hyper twitch shooter or anything, either. It’s down to how the game feels. Too much latency and the controls feel unsatisfying in a way that players can’t necessarily articulate.
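To give a sense of what “at rates higher than predicted by chance” means in a blinded test like the one described, here’s a minimal sketch of the usual sanity check: if players were only guessing between the laggier and baseline builds, correct picks would follow a coin flip, and we can compute how surprising the observed tally is. The trial counts are made up for illustration, not taken from that project.

```python
# Sketch of checking whether blinded latency detections beat chance.
# The numbers (40 trials, 28 correct) are hypothetical.
from math import comb

def binom_p_at_least(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more
    correct identifications if players were just guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Suppose 40 blinded trials (added-latency build vs. baseline) and
# players picked out the laggier build 28 times.
p_value = binom_p_at_least(28, 40)
print(f"P(>=28/40 correct by pure guessing) = {p_value:.4f}")
```

With those hypothetical numbers the tail probability comes out well under 0.05, which is the kind of result that would make a team take 2 ms seriously even without a formally rigorous study.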

If I had to guess, it’s because the brain is faster than you’re describing; it just pipelines. Yes, we’re all living 200 ms in the past, but that doesn’t mean responses are quantized into 200 ms chunks. We do see what happened 7 ms ago; we just find out about it 193 ms from now.
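The pipelining point above can be sketched as a toy model: a constant processing delay shifts *when* events are perceived but preserves the gaps *between* them, so a 7 ms difference survives a 200 ms pipeline intact. The delay and gap values are just the numbers from the post.

```python
# Toy model of "pipelined" perception: a constant delay shifts every
# event's perceived time but leaves inter-event intervals untouched.
PIPELINE_DELAY_MS = 200

def perceived(event_times_ms):
    """Each event is reported PIPELINE_DELAY_MS after it happens."""
    return [t + PIPELINE_DELAY_MS for t in event_times_ms]

events = [0, 7]              # two stimuli 7 ms apart
reports = perceived(events)

gap_in = events[1] - events[0]
gap_out = reports[1] - reports[0]
print(f"input gap = {gap_in} ms, perceived gap = {gap_out} ms")
```

The perceived gap equals the input gap: latency is added, but temporal resolution is not thrown away, which is why tiny input-lag differences can still register as “feel.”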


One thing I miss about CRTs that hasn’t been mentioned yet is that their horizontal resolution was truly analog. That meant almost any horizontal resolution (up to the limits of the phosphor screen) looked good. LCDs badly alias everything that isn’t a perfect match for their native resolution, which is annoying when playing, for example, any game made between 1990 and 2008. That was the period when PC resolutions were all over the place and most games only had a couple of options.
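You can see why a fixed pixel grid aliases non-native resolutions with a quick sketch. Under a simple nearest-neighbor scaler (an assumption; real scalers vary), mapping source columns onto a panel at a non-integer ratio duplicates some columns more than others, so equal-width source features come out unequal on screen. The 640-to-1024 figures are illustrative.

```python
# Why non-native resolutions alias on a fixed pixel grid:
# nearest-neighbor mapping of 640 source columns onto a hypothetical
# 1024-column panel shows some columns once and others twice.
def column_repeat_counts(src_w, dst_w):
    counts = [0] * src_w
    for x in range(dst_w):
        counts[x * src_w // dst_w] += 1   # which source column this panel pixel samples
    return counts

counts = column_repeat_counts(640, 1024)
print(sorted(set(counts)))   # a mix of repeat factors -> uneven pixel widths
```

An analog CRT has no such grid: the beam just sweeps, so any horizontal sample count within bandwidth limits lands smoothly on the phosphor.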

CRTs also had a substantial smoothing effect on graphics, which many game systems exploited. The N64 (the first system I developed for professionally) had a really clever analog anti-aliasing stage after the rasterizer that relied on the analog behavior of CRT scanning. Many arcade graphics were designed with CRT blurring in mind and look weird on LCDs.
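For a rough idea of that smoothing effect, here’s a sketch that blends each pixel in a scanline with its horizontal neighbors, softening the hard stair-steps that look harsh on a fixed-grid LCD. The kernel weights are illustrative only, not the N64’s actual filter coefficients or a real CRT spot profile.

```python
# Rough sketch of the horizontal smoothing a CRT's scanning spot applies:
# a small 1-D kernel blends each pixel with its neighbors.
def crt_blur_row(row, kernel=(0.25, 0.5, 0.25)):
    out = []
    for i in range(len(row)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - 1, 0), len(row) - 1)  # clamp at the row edges
            acc += w * row[j]
        out.append(acc)
    return out

# A hard black-to-white edge gains intermediate steps, like phosphor bleed.
edge = [0, 0, 0, 255, 255, 255]
print(crt_blur_row(edge))
```

Pixel art drawn for CRTs counted on exactly this kind of bleed to round off dithering and jaggies, which is why it looks blocky when an LCD reproduces every pixel as a crisp square.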


This topic was automatically closed after 5 days. New replies are no longer allowed.