Why CRT monitors were better

Burn-in is not a major issue in real life. It only tended to occur in displays left on all the time at max brightness, like in bars or airports, and especially in cheap projection TVs where the small CRTs are driven hard and may have inadequate cooling. I have used plenty of CRT monitors all work day and never had a problem, and I never saw it on coworkers' monitors either.

The OLED screens I have owned for years sometimes show temporary burn-in, but it goes away with use, and they automatically run some sort of electronic anti-burn-in routine periodically which completely removes it.

Oh, I’ve seen it more than a few times over the years on regular desktops when I worked as an IT repair tech. Mostly on the older monochrome monitors, but a few times with newer ones in the late 1990s that ended up with the Windows menu bar burned in from being left on continuously.

Even further back, there were old TVs that had been left on all night back in the day when the channels switched off and went to a test signal. Nothing like having your picture overlaid with those grids and circles.

[embedded video: “Repo Man__JFP__X-Rays”]

TVs, in particular, are atrocious about the ‘processing’ they do. Does it look good on the floor at Best Buy? Cram it in. Latency? Meh. Pixel accurate rendering of text? Sir will find our line of “signage” TVs over here for only $2500 more.

The worst were the institutions that locked their CRTs to 60 Hz. I took many a programming class where the monitors could be controlled by the lecturer, but that scheme relied on a 60 Hz refresh rate.

My eyes! The goggles do nothing!

And on a very narrow range of CRTs.

The ones that are good are really good. Most are just mediocre.

Unless you want to play a light gun game.

But in all seriousness, the blurring is generally more annoying than the lag itself.

Learn the settings. You can turn most of that crap off. Any halfway decent modern TV should also have a game mode, which turns pretty much everything off.

I had a 20" CRT monitor; I hauled it off to the recycler after I found an old junk LCD on the street.

Maybe I would have kept the CRT if I’d known hipsters thought it was cool,

but I still wouldn’t use it.

Finding an LCD monitor on the sidewalk is being a hipster.

Which was exactly the problem with the used studio monitors I got when the original users went digital. They’d been in use in the control room 24/7 for $LC_DEITY knows how long, and they were toast.

72 posts and no one has brought up dot pitch yet? Or how it changed from center to edge (at least on flat CRTs like my old PF790)?

Sharp in the center, less so elsewhere.

The only way a CRT is better is launched from a trebuchet.

I prefer pianos myself.

I can’t tell you the “why” because I’m not a neurologist. However, as a career game developer going on 30 years, I can confirm this is real. We did blinded tests on one project I was on to see how much input latency we could get away with (and, by extension, how much we could cheat into our ms budget for the frame). Anecdotally, some players complained about as little as 2 ms of added latency, at rates higher than predicted by chance. I’m not saying this was a statistically rigorous study or anything, but we were sufficiently convinced to minimize latency more than we were planning to. This wasn’t for a hyper-twitch shooter or anything, either. It’s down to how the game feels. Too much latency and the controls feel unsatisfying in a way that players can’t necessarily articulate.
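
For anyone curious what a test like that might look like, here’s a minimal sketch (hypothetical, not the harness we actually used): input events are queued and released after a hidden, randomly assigned delay, so testers can only report how the build feels.

```python
import random
from collections import deque

class BlindedLatencyTest:
    """Adds a hidden, randomly chosen delay to input events so testers
    can't know which latency condition they're playing under."""

    def __init__(self, delay_options_ms=(0, 2, 8, 16)):
        # Delay options are illustrative; the post only mentions ~2 ms.
        self.added_ms = random.choice(delay_options_ms)
        self._queue = deque()  # entries are (release_time_ms, event)

    def push(self, event, now_ms):
        """Called when a raw input event arrives."""
        self._queue.append((now_ms + self.added_ms, event))

    def poll(self, now_ms):
        """Called once per frame; returns events whose delay has elapsed."""
        ready = []
        while self._queue and self._queue[0][0] <= now_ms:
            ready.append(self._queue.popleft()[1])
        return ready
```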

If I had to guess, it’s because the brain is faster than you’re describing; it just pipelines. Yes, we’re all living 200 ms in the past, but that doesn’t mean responses are quantized into 200 ms chunks. We do see what happened 7 ms ago, we just find out about it 193 ms from now.
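
The same distinction shows up in any pipeline: latency is constant, but temporal resolution is preserved. A toy illustration (the 7 ms and 200 ms figures are just the ones from the post above):

```python
PIPELINE_MS = 200  # fixed processing latency
SAMPLE_MS = 7      # spacing between events

# Each event is perceived exactly 200 ms after it happens, but events
# 7 ms apart stay 7 ms apart; nothing gets rounded to 200 ms chunks.
for t in range(0, 5 * SAMPLE_MS, SAMPLE_MS):
    print(f"event at {t:2d} ms -> perceived at {t + PIPELINE_MS} ms")
```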

One thing I miss about CRTs that hasn’t been mentioned yet is that their horizontal resolution was truly analog. That meant almost any horizontal resolution (up to the limits of the phosphor screen) looked good. LCDs badly alias everything that isn’t a perfect match for their native resolution, which is annoying when, for example, playing any game made between 1990 and 2008. That was the period when PC resolutions were all over the place and most games only offered a couple of options.
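
The usual workaround on LCDs is integer scaling: only scale by whole-number factors so each source pixel maps to an exact block of panel pixels, trading borders for the shimmering you get from fractional ratios. A minimal sketch (the function name and example resolutions are mine, just for illustration):

```python
def best_integer_scale(src_w, src_h, dst_w, dst_h):
    """Largest whole-number factor at which the source image still
    fits the panel; fractional factors are what cause the aliasing."""
    return max(min(dst_w // src_w, dst_h // src_h), 1)

# A 640x480 game on a 1920x1080 panel: min(3, 2) = 2, so it renders
# at 1280x960 with black borders instead of a shimmering 2.25x stretch.
print(best_integer_scale(640, 480, 1920, 1080))  # 2
```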

CRTs also had a substantial smoothing effect on graphics, which many game systems exploited. The N64 (the first system I developed for professionally) had a really clever anti-aliasing stage after the rasterizer that relied on the analog behavior of CRT scanning. Many arcade graphics were designed with CRT blurring in mind and look weird on LCDs.
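
This is why CRT-style filters exist in most emulators. At its simplest, the effect is just a horizontal low-pass over each scanline; a crude sketch (the kernel weights are made up, not measured from real hardware):

```python
import numpy as np

def crt_horizontal_blur(frame, kernel=(0.25, 0.5, 0.25)):
    """Smear each scanline horizontally, a rough stand-in for the way
    an analog CRT beam blends adjacent pixels together."""
    k = np.asarray(kernel, dtype=float)
    out = np.empty(frame.shape, dtype=float)
    for y in range(frame.shape[0]):        # one scanline at a time
        for c in range(frame.shape[2]):    # per color channel
            out[y, :, c] = np.convolve(frame[y, :, c], k, mode="same")
    return out.astype(frame.dtype)

# Example: blur a random 240-line, 320-pixel-wide RGB frame.
frame = (np.random.rand(240, 320, 3) * 255).astype(np.uint8)
print(crt_horizontal_blur(frame).shape)  # (240, 320, 3)
```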
