But these are different things -- color hacks and overscan or switching display modes are about circumventing known hardware limitations in predictable and clever ways.
The topic under discussion here is the pixel art -- the idea that the artist would be relying on a fixed amount of horizontal blur to get the amount of glint in the eye "just right". And that's what you couldn't do, because that blur would be dramatically different on different CRTs.
The art was designed to be robust under blurry conditions that had extreme variation. It wasn't designed for some kind of ideal CRT so it would look "just right".
You're assuming that in the analog era content creators wouldn't bother because different analog TVs and monitors had differing fidelity and quality (or could be mis-adjusted). But we did bother. Most of us cared a lot about the pixels we made - maybe too much. We worked our asses off to make them as good as we could. It's no different than when I worked in an audio mixing studio. We had four different sets of stereo speakers sitting on the console and tied to a switchbox. When we were getting close to the final mix, and certainly during all of the mastering process, we'd switch from the audiophile-grade speakers to the K-Mart speakers to the shitty car speakers. Of course, it sounded better on the better speakers and there was much less clarity in the cheap speakers. But we needed to make sure it sounded good enough on the bad speakers. This was just the normal way content creators in the analog distribution era worked.
When making games I'd check the graphics on a good composite monitor but also on a TV through an RF switchbox. In the Amiga/Atari ST era we checked on analog RGB too. Commodore 64s had optional S-Video output which looked very good compared to composite video and light years better than RF. We checked it all and created content with it in mind. In the analog era I worked in games, video production and audio production. And in all three I can recall specific instances where we worked on aspects we knew would probably only ever be appreciated by those with better gear. This was especially true with visual effects work we did for broadcast television. We added detail that we saw on the studio master tape but which a lot of people never saw at home (at least until home DVD re-issues became a thing). We hated the limitations of the current distribution standards and of the gear we authored on (even though it was the best money could buy at the time). And we struggled mightily to overcome those limitations and preserve every shred of quality we could.
Also, keep in mind that arcade cabinets weren't variable like consumer TVs. They used very specific monitors which had specific, sometimes non-standard, refresh rates. I never worked at an arcade company but I knew people who did and they often had the bare monitor tube that would be in the cabinet right on their desk during development. And in that era we only ever saw our game graphics on composite displays. All our monitors were composite video, unless you were senior enough to have an 80x25 serial terminal (which displayed amber or green text only). On the quad-sync analog RGB display I have now in my arcade cabinet, I've installed around 40 specific modelines so that they exactly match the original vertical and horizontal frequencies of the monitor in, for example, a Williams Joust cabinet when I'm playing Joust. The CRT I have was made by Wells Gardner, a company that specialized in making CRTs for arcade cabinet manufacturers like Atari, Sega, Namco, etc.
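If you've never seen one, a modeline is just a single line of video timing numbers. Here's the general shape of a ~15 kHz arcade-rate mode (the timing values below are purely illustrative - I'm not quoting the Williams monitor manual from memory; the real numbers come from the original service docs or the MAME driver for the game):

    # Modeline "name"  pclk(MHz)  hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal  flags
    Modeline "292x240_joust_ish"  6.14  292 306 332 384  240 243 246 266  -HSync -VSync
    # horizontal rate = 6.14 MHz / 384 = ~16.0 kHz, vertical = ~16.0 kHz / 266 = ~60.1 Hz

The point is that every one of those numbers is pinned down, so once the real values are plugged in, the scan rates the tube sees match what the original PCB generated.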
First, I just want to thank you for all your extensive comments. It's really cool to get to hear from someone involved in all of it. It sounds like I probably played a bunch of stuff you were involved with! :)
And I don't think we're really disagreeing -- what you're describing is exactly what I meant when I said "the art was designed to be robust". Just like the sound that still works on bad speakers.
I never meant to imply there was any kind of lack of care or attention to quality. It's more that I see a certain fetishization of "one true image" that never existed in the first place. Rather, the art was intentionally (and carefully) designed to be robust -- and of course more detail would come through on better displays.
You make a great point about the arcade cabinets though, where they did have that level of control, where maybe it really was "one true image" -- I was definitely thinking only about the consumer systems I grew up with. I can certainly appreciate that the art was specifically fine-tuned for that one display. I am curious if there are CRT emulators that try to replicate the individual monitor models used in arcades, as opposed to more generic TVs and monitors...
Thanks again for your comments and for engaging! This is the stuff I love HN for.
Yes, I think we broadly agree. There was quite a bit of variability in home CRT games and much less variability in arcade cabinet games. However, there was a clear specification for what these games should look like on a CRT, set by the RS-170A composite video standard - even if some home TVs fell short of this goal due to age or maladjustment. Our goal in the 80s and 90s was to create game graphics that were as high-quality as we could and ensure the graphics we shipped would look correct on any CRT set to the RS-170A standard. To accomplish this we actually calibrated the composite video monitors on our desks to match the broadcast video standard. I recall one time when a new artist joined the team and his monitor wasn't set up right. The first floppy disk of image tile data he gave me had some odd color choices that were puzzling until I went and looked at his screen - where the same images looked fine. Of course, he had to redo the bitmaps but he did learn a valuable lesson about always checking the calibration on a new monitor. His monitor was literally out of phase with the rest of the universe. :-)
> It's more that I see a certain fetishization of "one true image" that never existed in the first place.
Well, it's a matter of degree. The nature of analog composite video is that it can never be as precise as 16 or 24 bit digital color. But it also wasn't a 'horseshoes and hand-grenades' approximation. Just because analog video is old doesn't mean these standards aren't capable of being very precise. It's possible to adjust a decent composite video monitor very close to objectively "correct" per the specification in a few seconds with just standard color bars. Many people assume the standard color bar test signal only allows calibrating correct color with the tint knob. However, it also allows calibrating correct brightness and contrast if you know what you're doing (the PLUGE bars at the bottom of the pattern set black level, the 100 IRE white chip sets contrast, and viewing the bars through a blue filter or blue-only mode nails hue and saturation). So, we were creating our game content targeting a precise objective standard.
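And the "out of phase" line above is literal, by the way. Roughly (glossing over plenty of detail): NTSC/RS-170A encodes color as two signals, I and Q, modulated in quadrature on the 3.579545 MHz color subcarrier, with the color burst on each line serving as the phase reference. The tint knob just rotates the decoder's demodulation phase by some angle theta, and every hue in the picture rotates with it:

    I' = I*cos(theta) - Q*sin(theta)
    Q' = I*sin(theta) + Q*cos(theta)

Get that phase reference right and the color is objectively correct per the spec; get it wrong and every hue in the frame shifts together - which is presumably roughly what was happening on that artist's monitor.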
As for fetishization of vintage or retro... I hate it. Hopefully I've made clear I have no interest in arbitrarily injecting the limitations or shortcomings of the analog past if there's any way to avoid it. I love today's 4K 10-bit HDR+ video sources and have a perfectly calibrated, ultra high-end home theater with 150-inch screen, 3-laser projector and 7.4.2 THX surround sound that can damn near make your eyes and ears bleed. It's about as good as it's possible to do in 2025 - and most days I wish it was possible to achieve even better quality. I really want 1,000 nit video projection and 12-bit sources. So, those people degrading video quality to match some nostalgic memory of the past are misguided in my view. 40 years ago those of us making the content hated that the tech wasn't better. The tech today has improved 10x and I still hate that it isn't even better. :-)
That said, when we're playing old analog era content, whether a retro-game, laserdisc or whatever, we should make sure we're putting all the quality that was in the original up on the screen and that our replay environment correctly matches the standards the content was originally created to match. Back in the day, doing that used to be really hard. Today it's damn near trivial. Which is why it makes me maybe a little extra crazy that some people who profess to love "retro" don't even bother to do it.
> You make a great point about the arcade cabinets though...
> I am curious if there are CRT emulators that try to replicate the individual monitor models used in arcades, as opposed to more generic TV's and monitors
Oh yes, indeed there are! Hundreds in fact. And it's a glorious rabbit hole to dive down. I'll just point you to this forum to get started: https://forums.libretro.com/c/retroarch-additions/retroarch-... First, there are shaders, shader packs and shader presets. The lovely thing is that it's easy for anyone to examine, adjust and remix components between various shaders. While the RetroArch emulator system has its pros and cons, it's undoubtedly excellent for auditioning, adjusting and remixing shaders and presets.
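To give you a flavor of how remixable these are: a preset is just a text file, and recent RetroArch versions support "simple presets" that reference another preset and override a few of its parameters. Something like this (the path and parameter names below are placeholders - open the shader you're actually using to see its real parameter list):

    # my-royale-tweak.slangp
    #reference "shaders_slang/crt/crt-royale.slangp"
    # override a couple of the base preset's parameters
    beam_max_sigma = "0.30"
    mask_type = "1"

Drop that next to the installed shaders, load it, and you can keep tweaking values live from the shader parameters menu.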
In general, I recommend CRT Royale as a good baseline for shader newbies as it's good and not too complicated. However, I'm personally quite impressed by CyberLab's recent work on the Death to Pixels shaders. https://forums.libretro.com/t/cyberlab-death-to-pixels-shade.... Download and install the latest shader sets into RetroArch. Find and follow an online guide if it's confusing. There are tons. Advanced shader authors like CyberLab and a handful of others have been doing some incredible work over the last year. Crazy stuff like researching the phosphors used in certain CRTs and doing physically based modeling on the data. There are shaders specifically for emulating CRTs with dot masks, slot masks and aperture grilles (used in Sony Trinitron CRTs). There are also shaders that target specific classes of legendary CRTs like Sony WEGA, PVM (professional grade) and BVM (broadcast grade). Others target emulating different kinds of cable connections: RF, composite, S-Video, YUV, and RGB. One of the latest trends is creating shaders which rely on what kind of flat screen technology you have. So, for example, a shader that more correctly emulates a certain Trinitron CRT by leveraging the uniquely wide contrast range of an OLED monitor, but doesn't look as good on a non-OLED monitor. The same is happening around both HDR monitors and high-FPS monitors, as each enables better fidelity when a shader takes advantage of those traits.
Personally, I prefer maximum quality, fidelity and authenticity (and zero nostalgic degradation). So, I avoid the entire mega-bezel series as that takes up precious screen space for rendered monitor bezels with reflected screen glow. It's cute but simply a waste of space and the bounceback reflections wash out the original image. I focus on RGB shaders and set them to minimal blurring and minimal scanlines. Have fun exploring and trying different things.