Some people love CRTs to a degree that the uninitiated may find obsessive. We all have our thing, and for [Found Tech], it’s absolutely pointing particle accelerators at his face to play video games. He likes modern games, with modern resolutions– none of this 1080p nonsense. Today’s gamers demand 4K! Can a CRT keep up? The answer is a resounding “No, but actually, yes!”
[Found Tech] has an IBM P275 monitor, one of the last generation of CRTs. Officially, the resolution maxes out at 1920 dots by 1440 lines. While one might (inaccurately) call that output “2K”, you certainly cannot claim it is 4K. So, what’s the secret? Interlacing. Yes, interlacing, like old analog TV signals.
Apparently, in spite of what the manual says, getting the screen to accept the 2880×2160 interlaced signal wasn’t the hard part; generating it was. NVIDIA and AMD graphics cards are absolutely unable to create an interlaced signal, but Intel integrated GPUs are– if you get the right combo of chip and old driver. Sadly, the video doesn’t list exactly what he used. Of course an iGPU isn’t going to give you a very good gaming experience at this resolution, so [Found Tech] has his games render on the discrete card before piping the output over to the iGPU for display on the CRT.
Technically, you still can’t call the 2880×2160 picture “4K”, as that trademark refers to 2160p at 16:9, and this is both interlaced and 4:3. Still, close enough. In spite of the artifacting that turned us all against interlaced signals back in the day, this apparently has [Found Tech]’s eyes fooled– he says it’s as good as 2160p on his OLED, plus the extra magic that comes with glowing phosphors.
It certainly looks great in a recording, but the recording itself isn’t at a high enough resolution to say for sure whether it’s 4K. Still, if you’re into CRT gaming, maybe give this high-res interlacing a try. If you still don’t get what’s so great about CRTs, check here, and remember it could be worse– at least we’re not going on about Plasma TVs.

Maybe it does, maybe it doesn’t. The Photoshopped thumbnail made me not interested in consuming this content.
The video should be re-titled: “Pushing 4K-ish analog signal to an old UXGA resolution CRT monitor allows content farmer to make unsubstantiated claims about image quality.”
Simple answer: it isn’t 4K – it’s UXGA. It’s physically fixed by the dot pitch of the display. Though the aperture grille has no “dots” in the vertical direction, it’s just going to blend two scan lines into one, because the lines will be separated by less than the beam spot size.
https://en.wikipedia.org/wiki/Aperture_grille#/media/File:Aperture_grille_closeup_teletext.jpg
If you double up the line resolution from what the display is expecting to draw, the scan lines overlap into one another. You don’t actually get more resolution, you get a blurry image. That’s obvious if you think about it: the spot size is tuned for the native resolution of the display, so it wouldn’t leave unnecessary gaps between the scan lines, and you’re probably not going to be able to focus it any better than that because the beam spot size is a compromise against brightness (beam current).
In other words, these CRTs were already at their limits of physics at their native resolution. Pretending to draw 4K on them is just pointless youtube kak.
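The geometry behind that argument can be sketched with rough numbers (a back-of-the-envelope sketch; the 291 mm viewable height is from the P275’s spec quoted later in the thread, and the beam spot size being tuned to the native 1440-line pitch is an assumption):

```python
# Rough sketch of the scan-line overlap argument. Assumed numbers:
# 291 mm viewable height, beam spot roughly as wide as the native pitch.
visible_height_mm = 291

def line_pitch_mm(lines):
    """Vertical distance between adjacent scan lines."""
    return visible_height_mm / lines

native_pitch = line_pitch_mm(1440)    # ~0.20 mm at the native 1440-line mode
doubled_pitch = line_pitch_mm(2160)   # ~0.13 mm at the interlaced trick

# If the spot is about as wide as the native pitch, the doubled-up
# lines land closer together than the spot itself and blur into one.
spot_size_mm = native_pitch
print(doubled_pitch < spot_size_mm)   # True: adjacent lines overlap
```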
He’s very short on details but honestly I was more interested in the “oh, you can do that?” hack of getting the interlaced signal out of a modern PC than what it looks like on the screen.
I mean, I want to give him the benefit of the doubt and say interlacing might help with the scanlines blurring together, but yeah, the dot pitch is the dot pitch. Perhaps I should have taken a more skeptical tone in the article, but how could I deny the comments the fun?
It might, but at the higher line rates the deflection signal itself is subject to noise, distortion, and bandwidth limits of the electronics. The scan lines wobble and drift up and down, which further blends them together in the eyes of the viewer.
On the point of feeding monitors signals they’re not meant to receive, wasn’t there some scare story in the past about a computer virus that would overclock your monitor and cause it to burn out the deflection circuitry?
You can think of it like, ok, you’re drawing every other line so they don’t end up overlapping, but then you draw the next half-frame on top of that. Where are those lines going to go? Do they land exactly in between the previous?
Probably not. You get the new scan lines criss-crossing over the previous half-frame and drifting in and out of phase, because the monitor can’t keep the alignment or focus perfectly steady. Electrons repel each other, so things like which color you’re trying to draw bend the beam out of its path. Each line shifts and morphs subtly depending on what’s actually being drawn, and on what was previously drawn on the line above.
It’s rather mind blowing that we could actually build high resolution color CRT tubes with near perfect convergence and focus in the first place.
It’s also mindblowing that we can get millions of LEDs not to burn out and drive them at 200hz+
It’s why we geek out on technology after all.
I mean, you can probably, at half the refresh rate. Since CRTs revert to black after every frame (because no electron beam), if the display can do 50+ hertz and the “empty” lines are black too, the viewer wouldn’t notice the difference.
Edit: oh, right, it’s about getting more resolution out of the CRT display, not pushing more through the line, my bad.
No, that’s not possible with software alone. You’d need to replace the shadow mask, and likely a more precise electron gun too.
That’s why we made sure to use a screenshot where you can see the scanning. ;)
And this is why I like the Hackaday YouTube writeups. Sure, it is a little bit of a cheat compared to an article written from scratch, but it’s a great summary without the clickbait and saves a lot of people’s time.
HaD falls into clickbaiting itself. They’re basically borrowing the clickbait of the original video to get you to read their article… Consider the following:
The actual secret is that the monitor’s dot pitch runs out at this resolution and it cannot physically display any more detail. Whatever tricks you pull beyond that, it all gets blurred down to the dot pitch limit, while introducing problems and artifacts like moire lines, which makes the whole thing completely bunk. You don’t even need to watch the video to know that it’s nonsense.
Technically, the dot pitch of that monitor is 0.24 mm and its viewable area is 388 × 291 mm so the real maximum resolution the monitor can actually display is…. drumroll please…
1600 × 1200.
Not even 2K. It can sync to higher resolutions, but any improvement in picture you would see will be pure imagination.
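A quick sketch of that arithmetic (using the 0.24 mm pitch and 388 × 291 mm viewable area quoted above; treating the pitch as the same in both directions is the same simplification the figures above make, since an aperture grille only has a horizontal pitch):

```python
# Dot-pitch-limited resolution of the P275, using the figures above
# (0.24 mm pitch, 388 x 291 mm viewable area).
pitch_mm = 0.24
width_mm, height_mm = 388, 291

max_h = int(width_mm / pitch_mm)    # 1616 phosphor triads across
max_v = int(height_mm / pitch_mm)   # 1212 down

print(max_h, max_v)  # 1616 1212 -- i.e. roughly 1600 x 1200
```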
i really dont get the crt obsession. every crt monitor ive used seems washed out to me. especially now that their phosphors have had time to burn. it might have, at one time, been better than the lcds we had 10-15 years ago, but a modern display, no. if you could get a brand new crt display that might be different. perhaps new old stock, but i bet there are enough videophiles out there to make it cheaper to just get a modern display.
i got rid of my last one at least a decade ago, it was nice but it was dark and washed out. i used it only because it could do 120 hz and i was into stereo vision. now i get 144hz in a monitor bright enough to light a room.
People for some reason unknown to me adore old technology.
Because it was still fascinating, maybe. Not just a blackbox.
It’s comparable to a home cooked meal and fast food/junk food, maybe.
My theory is it’s a bit like sports.
Sports are defined by what you’re not allowed to do. Soccer is soccer because you’re not allowed to pick up the ball, you’re not allowed to tackle other players, etc. The gameplay emerges from the restrictions.
If there are no restrictions, the game isn’t interesting.
Old technology is also interesting because of the restrictions. Creating a spinning cube on a Threadripper isn’t interesting. Creating it on a Commodore 64 is.
I know football but what is soccer ?
aSOCCiation Football
It’s what the Brits called it in the 19th century.
Old tech don’t spy on us
Because people are curious about the what-ifs. What if the better, fancier technology had been invented later? How much more potential could we squeeze out of it if we didn’t simply abandon it in favor of the shiny new stuff?
I second the “old tech don’t spy on us”, and it’s often not even capable of being retrofitted to send cookies to servers. That’s nice. More to the point of CRTs: it’s not the quality of the picture, it’s the particular, peculiar “look” of glowing phosphors with video games of the 80s and 90s. I mean, you might as well say there’s no point in playing old PC and console games at all because modern ones are so much better looking and sounding. That not only misunderstands nostalgia, it misses the quality of the games themselves. How fun a game is has little to do with resolution or realism. You can go back to the original Tetris and even text games; Zork is not going to play any better on the latest OLED, and personally, I’d rather see vintage games in low resolution like I did as a kid, with qualities like the green glow of text on an old monochrome screen.

Speaking of which, my terminal of choice is Cool Retro Term, and it can replicate a lot of the qualities of old terminals like that. Personally, it brings me back to happier times, when I first could put magic words on the command line and it was continually surprising to see the computer obey the commands I typed.

If you don’t “get” that, you are probably one of the many people (it’s surprised me how many) who are completely unsentimental, but I think you’re missing what can be a very healing experience, especially when recovering from trauma. Please allow me to be dramatic: the inability to appreciate old tech, and CRTs in particular, is like criticizing the Venus de Milo because the arms have fallen off. It’s true that it is now armless, but there’s such a depth of aesthetic quality in what remains of the famous marble statue that harping on the missing arms misses the point entirely.
The youth of today are a big part of that. To us, a CRT is the old school monitor (shout-out to amber, screw green!) we lived with, and we happily rejoiced when it was replaced with the newer flatscreens. To us, LCDs and LEDs are better in almost every way. But younger people today are looking back at what we had as something mechanically very special. And it was. An electron gun in every home is pretty wild when you think about it.
My 13-year-old nephew and niece look at cassette tapes and records like they’re magic, even though they’ve grown up in the age of streaming, silicon, and SSDs. They have billions of transistors packed into the teeniest piece of plastic, but it’s not mechanical. It’s basically invisible. Inaccessible. To them, storing data on bumps and grooves, or on microscopic magnetic lines somehow set in a thin strip of plastic, is just magic. Why? Maybe because they can see it. Vinyl has ridges that look like simple spirals. Cassettes are still somehow as interesting today as when I was a kid loading ‘Lemonade Stand’ onto our Commodore 64. A CRT is still a marvel of engineering today, and it should be.
What was old is new today for the young people, perhaps just because it was something so physical, tangible. Kind of like when we kids of the 70s and 80s looked at computers built with vacuum tubes fed by paper with little punches taken out. Paper with slots punched turned into a computer program!
Some of the reason for the adoration of old technology is nostalgia. But equally powerful today is that plenty of young people look at what we had back in the day and wonder, like we older people do today with (often crappy) new tech…but can it run Doom? It’s kind of like asking why Doom itself is still so popular. Because it’s freaking cool!
I for one am happy these old technologies are still regarded as interesting. Maybe these kids will help prevent a future where our TV and gutters and front door rely on the Internet—to ensure we’ve paid our subscription fee before doing the simple job they were meant to do.
Cassettes were still in wide use for audiobooks for children, last time I checked. At least where I live.
Music cassettes are a very robust media, after all.
There are mono cassette recorders made for toddlers, I remember.
Those colorful models sold by Fisher Price and other manufacturers.
They’re still being produced and sold new.
Example of a classic model:
https://www.youtube.com/watch?v=Fq9retdEy48
Absolutely. There’s a Youtube video I saw a while ago that explains it, too.
Long story short, the youth are done with swiping on a glass plate all the time.
For a change, they want something real they can feel. Which makes sense.
https://www.youtube.com/watch?v=0dEJiQnotR8
Me, too. And there’s a purpose for each of them, too.
Cassettes are very robust and small, Vinyl is “cult” and has great cover art worth collecting, CDs have very good quality,
are digital and have no copy protection (not part of red book, I mean).
CD audio is also uncompressed, so it can still be high definition by today’s standards if the audio source used in the mastering process wasn’t an MP3 or badly normalized (see loudness wars).
Descent is my Doom, it had 6DOF and VR support (not to mention 800×600 in the 90s), but carry on.
Old games were made for CRTs, so they tend to look better on CRTs. Especially pixel art blends very well on CRTs so you get much more “detail” on a CRT than you would get on a modern display.
I find games on my old consoles played on my CRT TV to just look gorgeous. While I could emulate many games with upscaling on my 4K OLED, something just feels off. Especially 2000’s 3D games, with 4K texture packs and upscaling they look crisp and clean on a modern display, but they also just look kind of too clean. Never tried the CRT shaders with BFI on a 120Hz+ display though, maybe that would “fix” the issues I have.
I don’t think there are that many CRT fans, it’s a minority, rather, far from mainstream.
So there’s no “obsession”, either in that narrow sense, I think.
I think there are simply open-minded people who recognize the positive aspects of CRT technology, just like with tube technology in general.
The slight softness, the more “organic” image is a feature, not a bug to them.
It had the effect of a natural anti-aliasing, I’d say.
Worked nicely on the Windows XP GUI with its slightly rough icons, I remember.
That being said, consumer grade CRT screens (household TV) and CAD monitors are very different.
The former is fine for old video games (dot pitch 0.4 or larger), while the latter is not far away from a good LCD panel in terms of sharpness.
The good thing about the CRT image was that it looked high-quality with a lower resolution source already.
TFTs/LCDs are a primitive technology in terms of ingenuity that needs brute force to look good:
a high amount of raw pixels and a hot-running on-board computer pushing them.
A real innovation would have been OLED screens (active OLED pixels, not OLED background light) or laser based projection.
Maybe through rear projection to not harm human vision.
But so far, we’re using the same LCD technology that was available in the mid-90s on the Sega Game Gear or pocket TVs. Just slightly more refined.
“slightly more refined” is kind of an understatement. that game gear display was awful. it wasnt until the ’10s that consumer displays started getting good. in the last few years a lot of new capabilities have started appearing. new monitors are capable of things that even high end gpus have trouble keeping up with. seemed like the screen was the limit for the longest time because you could only feed it pixels so fast, now they are voracious.
crts may be good for nostalgic 8-bit gamers who want an authentic experience. im happy with an emulator and dont even bother to turn on the filters that make them look more authentic. i recently did a playthrough of super mario 3 on my steamdeck, looked pretty good on the oled, and i played it outside on the porch in broad daylight. crt aint doing that.
but if you can convince people to buy gold plated silver speaker cables, you can convince people to buy a legacy technology at four times the original price. frankly im surprised there isnt a premium brand of modern crt displays, because im sure you will find buyers.
Maybe, think what you want.
To me, there has been little progress since the 2000s.
Fundamental progress, I mean. The technology merely got more miniaturized.
But that’s not completely right, maybe. It got worse in some respects, even.
For 20 years we have had boring smartphones that look and feel the same.
We have black, ugly PCs. Now without any drive bays, even.
Mechanical media are no longer as common.
Instead, flash media are everywhere and will lose data over time as the cells degrade.
Our interfaces (Windows/Mac/Android etc) no longer look colorful, optimistic and skeuomorphic.
Instead, everything is boring and minimalistic. Just like society, maybe.
A far cry from the 90s and early 2000s when applications looked like appliances (music player looking like a hi-fi deck etc).
The build quality of technology has worsened.
Cheap plastic everywhere, fewer screws and metal parts.
High tech, low life. Basically. The list goes on.
In PC gaming, about the only notable enhancement that comes to mind was DirectX 11 with Tesselation. And Mantle API (lives on in Vulkan).
And the re-discovery of Raytracing (once popular in the 90s; POV-Ray etc).
Seriously, as a casual gamer I saw the same mediocre 3D graphics for 20+ years over and over.
No real improvement, except higher resolution/more FPS, but always the same off-the-shelf rendering engines in use.
Merely indie games were refreshingly different, I think.
Looking back, current modern 3D games for PC do often look worse than games from, say, 2005-2015.
No longer as polished and well animated. They look dull, sometimes.
If I look back at the 2000s, then I see pretty designs such as Sony VAIO computers
or the Compaq IPAQ, Sony CLIE PDAs and other foldable and ergonomic gadgets.
In the 2000s, PCs had pretty OSes such as Mac OS X with Aqua or Windows Vista with Aero Glass. Or Linux with Compiz, even.
Compared to that, our modern tech has become devastatingly unsophisticated. In my opinion, I mean.
That’s why I can relate to why some people won’t let go the past.
The modern tech simply isn’t for them. They have higher/different standards.
Yes, I vaguely remember. They added sync features, so the TFT could run at a variable refresh rate etc?
Previously, the timing was internally fixed and electronics had to convert external frequencies/refresh rate to native one.
That’s why 60 Hz or 120 Hz was often used, I assume.
And why VGA connection was useful to emulators,
it could transport non-standard refresh rates (50 Hz, 59.9 Hz, 70 Hz etc) that the emulated games used (the TFT’s ADC then did the conversion).
Yes, but mainly because of the flickery CCFL backlight tube and the noisy Composite video connection.
If it was attached via S-Video or RGB (or a shielded Composite cabling) and had a clean background light,
then the LCD panel itself would have worked more or less fine. 🙂
I do like emulators, too, I also liked the 15″ LCD monitors when new.
I remember using them on 486 laptops first time.
Beige NEC and Belinea LCDs monitors were also not uncommon.
However, they looked a bit pixelated at the low resolutions of the day
(below 640×480; 800×600 pixels and up started to look natural)
but at least they didn’t flicker at 60 Hz, which I found to be positive at the time. 🙂
Real OLED monitors would be cool, I think.
I think many classic games using 320×200 pixels and similar low-res resolutions
also might look fine on small screens because the perceived pixel density increases (it’s comparable to sitting far away from the TV).
That’s an alternative approach to using vintage CRT monitors with big dot pitches that smooth out/filter the low-res graphics.
Every single thing you said is wrong.
LCD technology itself was simply not ready. (it was CSTN!) The Game Gear is somewhat famous for being the reason that Citizen massively invested in figuring out how to make color pigments for LCDs.
The CCFL tube deliberately flickers at 60Hz, locked to the display, probably to try to reduce the amount of ghosting. It’s not the same full strobe effect that CRTs had – it only dims about 50% during the dim portion. The connection is already fully digital (4 data lines emitting sequential RGB subpixels) and not composite.
The static contrast ratio is abominable (about 1:2). Panels have row/column crosstalk. The LC itself is horribly slow, with ghosting all over the place – gray-to-gray times are multiple tens of milliseconds.
Really? I thought merely half of it. 😅
Seriously though, thanks for pointing out my mistakes. 🙂
Ah yes, I remember differences such as active matrix vs passive matrix..
I wouldn’t say it wasn’t ready, though. It just was what it was, rather.
Modern LCD/TFT technology simply tries to better cover up its deficiencies, I’d say.
There also was a high voltage generator inside, if I remember correctly?
Anyway, there are replacement backlights on LED basis now.
Haven’t tried them, though.
Yes, you’re right. I got that wrong. My bad.
I think it came to mind because of the LC display mods.
Many years ago when retro gaming started there were attempts at replacing the panel with generic LCDs.
Nowadays there are digital converters and LED based replacement backlights.
The resolution of the GG was very, um, humble, as well.
Something like 160×144 pixels, while the Master System was 256×192.
A better screen perhaps wasn’t worth it, not sure.
The TV tuner worked well enough with the screen, though.
My Roadstar/Casio pocket TV had ghosting, too.
It didn’t bother me that much, though, because proper CRT monitors were around, too.
I also liked mono screen in 486 laptops. The ghosting looked quite elegant, I think.
But again, proper CRTs were around, too, at the time so it was a non-issue.
And on the Windows desktop with static images, the slow LCDs were okay I think.
crt shoot electron, steer with magnets, hit screen, make glow
lcd flat rectangle mystery box from korea, indecipherable
To be clear, he does say that the CRT looks better when it’s in the dark. In a regularly lit room, yeah, it’s going to look washed out.
It’s going to look better in the dark anyhow because your visual acuity goes down.
The normal average assumption of 1 arc-minute of resolution for the human eye no longer applies and pixels appear “bloomier” and blend into one another, fading away small edge artifacts and pixelation at the usual viewing distances. At the same time, visual noise increases and this may be interpreted as “detail” the same as how vinyl hiss is interpreted as extra content in the sound even if it’s just white noise, because the brain tries to make sense of it and invents the missing bits.
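To put a number on that 1-arc-minute figure (a rough sketch; the 600 mm viewing distance is an assumed typical desktop distance, and 0.24 mm is a typical fine CRT dot pitch):

```python
import math

# Smallest feature a 1-arc-minute eye can resolve at a given distance.
# (That's the bright-light figure; in the dark, acuity is several times worse.)
def resolvable_mm(distance_mm, arc_minutes=1.0):
    return distance_mm * math.tan(math.radians(arc_minutes / 60))

# At an assumed 600 mm desktop viewing distance:
print(f"{resolvable_mm(600):.2f} mm")  # ~0.17 mm, finer than a 0.24 mm dot pitch
```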
CRTs for the most part have higher refresh rates, lower latency and better response time than modern monitors.
Yeah, i hate that. And the ones with good contrast don’t get dark enough for all day use, i have to trick with xrandr --brightness & --gamma.

I’m still waiting for someone to create a “retro” OLED display that accurately recreates raster scan for old light gun games. Duck hunt with a Wiimote just isn’t the same.
That would indeed be very cool.
You may already know this, but for anyone who doesn’t – there are actually ROM patches available for Duck Hunt and other NES light gun games that will allow them to run on most LCD/OLED TVs.
This is because Duck Hunt (and most other light gun games on the NES) don’t actually need emulated raster scan per se, they just need extremely precise timing and no added latency between the NES and display.
When you pull the trigger in Duck Hunt, the NES displays one or more black frames with a white box where a duck/target should be, while also checking if the light gun is registering. It doesn’t care where the beam is exactly; it just expects the TV to be displaying this entire frame at the same time that it checks the light gun sensor. In practice this means it will sample the sensor during the VBLANK interval after it’s rendered a frame, so there can be at most around 1.2 ms between when the full image has been sent to the display and when the NES expects it to be there to sample.
What the ROM patches do is add an adjustable amount of latency to this sampling process, so that you can re-synchronize the light gun sampling with the latency of your specific LCD/OLED display. If this is well implemented, and you have a reasonably fast display, it ends up being awfully close to the original experience.
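A toy model of that timing problem (illustrative millisecond numbers only; the real patches adjust the delay in whole frames/scanlines, not milliseconds):

```python
# Toy model of the light-gun timing described above.
SAMPLE_WINDOW_MS = 1.2   # NES samples within ~1.2 ms of sending the frame

def hit_registered(display_latency_ms, patch_delay_ms):
    """True if the sensor is sampled while the target frame is on screen."""
    sample_time = SAMPLE_WINDOW_MS + patch_delay_ms
    return sample_time >= display_latency_ms

print(hit_registered(0, 0))    # True:  CRT, effectively zero latency
print(hit_registered(20, 0))   # False: LCD shows the frame ~20 ms too late
print(hit_registered(20, 20))  # True:  patch delays sampling to match
```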
A couple decades ago I had a set of grayscale monitors made by Radius (nominal 1152×870 @ 75Hz) and I did something similar there. (Grayscale means no aperture grille, which means no theoretical upper resolution limit.) Unfortunately, the graphics card I had couldn’t display more than 2048 pixels horizontally, so I ended up configuring it to draw 2048×1728 (at 75Hz per field). It was ok? The electron beam was a little thick for the resolution (things were a bit blurry and I didn’t think to adjust the focus) and the interlacing was objectionable.
On a color CRT, the aperture grille is really going to be a problem, but modern high-DPI rendering intentionally blurs everything anyway so that there’s no aliasing from the pixel grid…
Most consumers buy whatever’s cheap, and they mostly bought crappy CRTs. Same thing with modern TVs: cheap crappy CRTs look like crap and cheap crappy LCD panels look like crap. You also have to use an adjustment tool to get the brightness/sharpness/etc. settings correct. Most people have eyes like a dead fish and will leave their TVs unadjusted or will change settings that make them look even worse. One of the Toy Story movies, when it first came out on DVD, had a really nice adjustment tool to help users get their TVs honed in and looking nice.

If you get your hands on a nice quality CRT that’s been adjusted correctly, it’s beautiful. I still have my 32″ Toshiba Cinema Series CRT that I use to watch 4×3 content because it just looks nicer on the CRT. CRTs also excel at black levels, something that modern displays STILL struggle with. I also have a modern display for 16×9 content which looks nice too. Both displays have their uses.

CRTs also don’t have lag, which is important if playing games on older video game consoles. Modern displays have done quite a bit of work to reduce lag but can’t eliminate it entirely.
LCD/LED also don’t have lag, they smear.
Maybe I am uneducated or misunderstanding something but is a CRT’s resolution not tied to the DPI of its shadow mask? You would think that would be a cap on your image quality.
That is correct.
And if it’s an analog signal, it will also be bandwidth limited. How good your monitor cable is becomes a limiting factor, and then what the circuitry in the monitor will be able to handle. This monitor does happen to have a DVI connector, but it’s also using analog RGB through the cable, not digital. That was a fallback option, usually no better than a regular VGA cable. This is why some high-end monitors had a third option of separate BNC connectors with shielded cables, to get a sharper image even at the native resolution.
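To put the bandwidth point in numbers, here is a rough pixel-clock estimate for the 2880×2160 interlaced mode (the 60 Hz field rate and ~25% blanking overhead are assumptions for illustration, not the video’s actual timings):

```python
# Rough pixel-clock estimate for 2880x2160 interlaced.
# Assumed: 60 Hz field rate, ~25% combined blanking overhead.
h_active, v_active = 2880, 2160
field_rate_hz = 60
blanking_overhead = 1.25

lines_per_field = v_active // 2          # interlaced: half the lines per field
pixel_clock_hz = h_active * lines_per_field * field_rate_hz * blanking_overhead

print(f"{pixel_clock_hz / 1e6:.0f} MHz pixel clock")  # ~233 MHz
# The analog amplifiers and cabling have to pass a substantial fraction
# of that as video bandwidth, which is why cable quality starts to matter.
```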
Reminds me of running 2048×1536 interlaced on a 14″ daewoo crt in highschool. My eyes hurt now thinking about it nearly 30 years later.