Pixel Art And The Myth Of The CRT Effect

The ‘CRT Effect’ myth holds that the pixel art of old games looked as good as it did because of the smoothing and blending effects of the cathode-ray tube (CRT) displays that were everywhere until the early 2000s. This bit of mistaken nostalgia has given us both modern-day extreme-cubism pixel art and video game ‘CRT’ filters, which respectively miss what pixel art was actually about and why old games looked the way they did on our NES and SNES consoles. This is a point which [Carl Svensson] argues vehemently from a position of experience, and one which is likely shared by quite a few of our readers.

Although CRTs do show some color bleed and other artefacts due to the shadow mask (or the aperture grille in Sony’s Trinitrons), there was no extreme separation between pixels, nor the massive bleed-over into neighboring pixels that supposedly provided built-in anti-aliasing, unless you were using a very old, very cheap, or dying CRT TV. Where such effects did occur, they came mostly from the signal being fed into the CRT, which ranged from the horrid (RF, composite) to the not-so-terrible (S-Video, component) to the sublime (SCART RGB), with RGB video (SCART or VGA) especially busting the CRT effect myth.

Where the pixel art of yesteryear shines is in its careful use of dithering and anti-aliasing to work around limited color palettes and other hardware limitations. Back in the Atari 2600 days those limits did lead to the extreme cubism which we’re seeing again in modern ‘retro pixel art’ games, but yesterday’s artists worked with the hardware to create stunning works of art, which looked great on high-end CRTs connected via RGB and decent via composite on the kids’ second-hand 14″ color set with misaligned electron guns.

58 thoughts on “Pixel Art And The Myth Of The CRT Effect”

  1. I retired my last CRT some 10-15 years ago. I was only using it for stereo vision because LCDs weren’t up to 120 Hz yet, at least none that I could afford. I eventually retired it when the stereo-enabled drivers were getting quite dated. The only thing going for it was that it could do 1600×1200 at 80 Hz and 1024×768 at 120 Hz (I used 800×600 in some games that the GPU couldn’t keep up with).

    I never really bought into the whole “CRT looks better” hype. Perhaps it was because mine was already an old monitor when I got it and everything just sorta looked burned out. You would have to compare it with a virgin monitor (perhaps new old stock) to get a useful comparison, but I doubt it would hold up very well to modern display technology, which has improved a lot in the last 10-15 years.

    1. Even in 1997 I preferred emulators over the real thing because even the cheapest VGA card coupled to the cheapest CRT monitor would produce a much better image than most gaming consoles connected to a TV.

    2. I still use a CRT for old console/arcade games. For me, it isn’t just about the looks of RGB vs digital (which I find similar, other than scan lines), but about the lack of delay. For platforming and fighting games, even a few frames of delay is significant. For modern games designed to be played on digital displays, this is generally accounted for and isn’t an issue.

      1. RetroArch has what seems like a magic feature called Run-Ahead. It effectively inserts negative input latency by taking advantage of the fact that a lot of older games introduce a 2-3 frame delay before acting on an input, so the displayed frame is already rendered that far ahead by the time the game acts on your input. Super Mario World feels native with this on.
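
        Roughly, the idea is something like this (a toy sketch of the concept, not RetroArch’s actual code; the Core interface here is made up):

          # Hypothetical emulator core with save_state(), load_state() and run_frame(input, render)
          def host_frame(core, pad_input, lookahead=2):
              core.run_frame(pad_input, render=False)       # advance the real game state, hidden
              snapshot = core.save_state()                   # remember where we really are
              for i in range(lookahead):                     # speculate ahead with the same input
                  core.run_frame(pad_input, render=(i == lookahead - 1))  # show only the last frame
              core.load_state(snapshot)                      # roll back: real time still advances one frame per call

        Since the game only reacts to a press a couple of frames after it happens, showing the speculative frame makes that reaction appear on the very next displayed frame.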

  2. Every time some display or audio technology or format becomes obsolete and widely unused, some folks will claim it looked/sounded better all along… like with the recent IMAX craze. Yes, when you watch that in an actual theater that uses optical film you get enormous resolution, so there is a point to it, but for the most part, and for what most people actually get to see, it’s “just” a revival of (almost) the old 4:3 screen ratio. We’re not used to seeing that (any more), so we go “wow” because it’s just so unusual to see in a theater.
    And all the weaknesses of the old thing suddenly seem to become strengths… like the blurriness of a CRT… the crackling of a record player… I could go on.
    Maybe there is something that got lost on the way of trying to modernize. Maybe not. But nostalgia certainly plays a key role, as well as the general idea of a “lost” thing being brought back.

    1. The point of records is the better mastering. Many were made before the loudness war, and you literally cannot master a record as loud as a digital file because of the acceleration limitations of the cutting head. There is a lot of audio processing going on with records (at the very very least some sort of acceleration limitation) but 9 times out of 10 that specific type sounds much nicer than the ‘flatten everything out and make it a wall of sound’ type of mastering done with music today.
      This is why the objectively way inferior medium can sound so much better than a practically perfect digital file. In theory, the digital files can be mastered in the same or even better ways, but since those are made for mass market appeal, they usually aren’t. Mobile Fidelity Sound Labs is one of the studios that do digital media right.

      1. Yes, this! An LP has many technical limitations, much more than even the Red Book CD format (with the possible exception of containing frequency content above 22 kHz, which humans can’t even hear anyway). But many people claim that LP is better than “digital” because analog is just better than digital, right? In this case it (LP) is better only because of the mastering, not the technical limitations of “digital”.

        You can digitize your LP with a good quality turntable and a good quality ADC, and if done properly you’ll be unable to hear the difference between the original LP and the digital version. You don’t even need to go hog wild with the sample rate or bit depth either—the bog standard 48 kHz and 16 bits are more than sufficient (you can resample transparently to 44.1 kHz, if you need to, with any good sample rate converter). The best LP on the best turntable won’t give you more than about 75 dB of dynamic range (SNR), which is about 21 dB worse than what 16-bit PCM is capable of (which also means you don’t need to add any dither if you happened to record at more than 16-bit and then scaled down to 16-bit).
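
        If you want to sanity-check the “about 21 dB” figure, the back-of-the-envelope math is simply (idealized quantization, ignoring dither and converter noise):

          import math
          bits = 16
          pcm_dr_db = 20 * math.log10(2 ** bits)    # ~96.3 dB for ideal 16-bit quantization
          lp_snr_db = 75                            # a generous figure for a very good LP rig
          print(round(pcm_dr_db - lp_snr_db))       # ~21 dB of headroom left over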

        Keep a lossless copy (FLAC or similar) and make lossy copies from that “master” lossless version. For lossy, Opus is the current king. It’s transparent at 192 kbps (based on published ABX tests), and very close to transparent at 128 kbps. For music ripped from an LP, 128 kbps or even 96 kbps might actually be transparent (i.e., indistinguishable from the original). For speech, 32 kbps is “essentially transparent”. I have an old Flip Wilson comedy LP that should sound just fine at that bit rate if I were to rip it.

    2. “And all the weaknesses of the old thing suddenly seem to become strengths… like the blurriness of a CRT… the crackling of a record player… I could go on.”

      I think this misses the point. It’s not about good/bad but about suitable/unsuitable.

      For example, if you have 600-ohm studio headphones, they also need a matching higher-impedance source.
      Or let’s say you have a 50-ohm radio transceiver; you also need 50-ohm coaxial cable and an antenna with a 50-ohm feed point.

      The same goes for frequency range. The higher-end your headphones are, the better the source material has to be,
      because the imperfections of a bad source become audible on higher-end headphones.

      And that’s much the same thing that happens with 320×200 graphics on a CAD-grade CRT monitor, a category which also includes HD CRT TVs.
      There’s no blur like there is on lower-end CRT tubes with a lower physical resolution.

      Unfortunately, the designers of the 70s, 80s and early 90s did take those monitors into account,
      simply because those monitors were still in wide use. It wasn’t because of nostalgia or some fetish or whatever.

      The actual development was done on good hardware, sure, but beta testing happened on ordinary, consumer-grade hardware setups.

      In the case of PC games it may have been a no-name 12″ VGA monitor from the Far East and a random 80286 PC with a Trident 8900 or Paradise VGA card.
      Your typical boring home-office PC, in other words.

      NES or Genesis games would have been tested on a 14″ portable TV with an RF jack, or on a Commodore 1702 or 1084 video monitor via composite.
      Both monitors were very popular in broadcast environments, too, and known to every C64 user.

      That’s what was being used, generally speaking,
      rather than, say, a 17″ VGA monitor with an on-screen display or a 32″ 100 Hz/120 Hz TV.
      Those came later, in the Windows 95/98 and N64 or PlayStation era,
      when games started to address 640×480 pixel resolutions rather than NES or C64 resolutions.

      Speaking under correction.

    3. I think it’s because more flawed and analog things that require fiddling and labor and impose restrictions on you are closer to humanity. It’s similar to the concept of wabi-sabi.

  3. Composite artifact colors associated with CRT displays were absolutely part of what made the old computers and consoles of the 8 and (to a limited extent) 16 bit era look the way they did.

    https://en.wikipedia.org/wiki/Composite_artifact_colors

    Example: Apple II computers had 4 graphics modes, but they all resulted in a 560×192 display where, depending on the mode, you had more or fewer ways to control the color of each pixel, and that color was based on the 3 pixels before it and where they appear in the scanline. If you ignore this and treat the graphics modes as 140×192 with a simple RGB conversion, it looks like absolute crap and text becomes unreadable in many games. I wrote an Apple II emulator with NTSC emulation and studied it extensively.
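
    A common simplified way to model this (a toy sketch, not necessarily how any particular emulator does it, and the palette below is just a placeholder) is to slide a 4-bit window over the 560-dot scanline and pick a color from the window’s bits plus its phase within the 3.58 MHz color cycle:

      PALETTE = [f"ntsc_color_{i:02d}" for i in range(16)]    # placeholder names, not measured values

      def artifact_scanline(bits):              # bits: 560 ints (0/1) for one hi-res scanline
          out = []
          for x in range(len(bits)):
              window = bits[max(0, x - 3):x + 1]              # this dot plus the 3 before it
              window = [0] * (4 - len(window)) + window       # pad at the start of the line
              phase = x % 4                                   # position within the color cycle
              idx = 0
              for i in range(4):        # rotate by phase so the same pattern always maps to one hue
                  idx |= window[(i + phase) % 4] << i
              out.append(PALETTE[idx])
          return out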

    Example: NES games (some, not all) used odd-looking colors and dither patterns that look a little off in emulators, but with composite filtering the blending produces much nicer-looking results, and no surprise, it’s because the games themselves were designed on CRT displays to look that way. Examples: Batman, Empire Strikes Back, Castlevania 3, and Micro Mages (especially the second level).

    Example: Many Sonic games on Sega Genesis/Megadrive used a dither/interlaced pattern which looks like a checkerboard in RGB mode but on composite blends into a beautiful alpha transparency for things like shadow and waterfall effects.
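
    You can see why with a one-line toy model: composite video low-passes each scanline (chroma especially), so a one-pixel checkerboard averages out. Very roughly, with a crude box filter standing in for the limited bandwidth:

      import numpy as np

      def lowpass_scanline(line, taps=4):
          kernel = np.ones(taps) / taps                 # crude stand-in for composite bandwidth
          return np.convolve(line, kernel, mode="same")

      waterfall = np.array([0.2, 0.8] * 8)              # alternating dither pattern on one scanline
      print(lowpass_scanline(waterfall).round(2))       # interior values land at 0.5: a 50% blend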

    TL;DR version: you’re dead wrong and you need to research this further. Get off my lawn.

    1. FTA: “One exception to this rule about signal quality is CGA color blending on old IBM PCs. This uses artefacts of the NTSC composite video signal – not the screen – to display more colors than normally available via the RGB signal on the same system. CGA color blending is described in greater detail on this eminent page, from where I also brazenly stole the illustration above. To the left is the combination of RGB colors and patterns that will produce the NTSC color on the right. This can only be achieved using composite video and has nothing to do with the CRT itself. On a CRT capable of displaying both a composite and RGB signal, the effect will only appear when selecting the composite input. “

      1. When the article says that CGA is the exception, it’s flat-out wrong. There’s no shortage of primary sources from people who developed games on the NES, Megadrive, and other composite-first systems where the artists explicitly took advantage of the reduced chroma bandwidth, the slightly-reduced luma bandwidth, and the luma-chroma crosstalk.

    2. Not just composite artifacting. Simply the RGB colour mask arrangement would provide extra texture on top of the base image, and sometimes designers could take advantage of it.

      It’s a bit telling when the article goes off on how a particular image which clearly shows that effect must have been a simulated rendering rather than a real photo… only to then be forced to post a correction because it absolutely *is* a photo, which completely undermines his entire argument.

      Oh, and the checkerboard alpha blending doesn’t need composite; I do recall it being used on the GBA’s LCD screen to good effect as well.

  4. Seriously, you should consider using a video monitor or VGA monitor from the 80s.
    They had a dot pitch of 0.4 mm to 0.6 mm and it DOES make a big difference.
    Commodore 1702 (1982) 0.64mm, Commodore 1084 (1987) 0.42 mm, IBM 8512 VGA Monitor 12″ (1987) 0.41 mm.

    An average PC monitor from the 90s has a 0.21 to 0.28 mm dot pitch and is thus NOT a valid monitor for comparison.
    It’s a CAD level monitor and very sharp, of course. Made in the Windows era (640×480 and up being common).
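
    A rough back-of-the-envelope illustration (hand-wavy: it assumes a 4:3 tube with about 90% of the nominal diagonal visible and ignores the mask geometry):

      def triads_across(diagonal_in, dot_pitch_mm, visible_fraction=0.9):
          width_mm = diagonal_in * visible_fraction * 25.4 * 0.8   # 4:3, so width is 0.8x the diagonal
          return width_mm / dot_pitch_mm

      print(round(triads_across(14, 0.64)))   # ~400 phosphor triads across for a 320-pixel-wide picture
      print(round(triads_across(14, 0.28)))   # over 900 triads: headroom to spare even at 640+ pixels

    With only ~400 triads across, each pixel of a 320-wide mode barely gets more than one triad and inevitably softens into its neighbours; at 0.28 mm there are triads to spare and the squares stay square.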

    If you don’t believe me, that’s fine, but please double-check it for yourself at least.
    Here’s an example of an old low-end VGA monitor that is good for pixel art.
    It’s a bad VGA monitor, really, but it visualizes the effect very well, I think.
    https://www.youtube.com/watch?v=m79HxULt3O8

    1. Van Nes and Bouman published the frequency response of the human eye, around 1956, using a black and white CRT to generate the test targets. I’ve often wondered if the CRT frequency response influenced the results and if a 4K or 8K UHD TV would make any difference.

    1. Thinkin’ of the IBM 5151, the MDA monitor, by any chance ? ;)
      https://www.youtube.com/watch?v=BEkDRa–YaY

      The green or amber monitors from the Apple II era were also great for interlaced video modes.
      An Amiga in interlaced resolution modes looks quite stable on such a monitor with afterglow.
      https://www.youtube.com/watch?v=8v4BaWwoyA0&t=280

      It used to be an analog solution for flickery images.
      This was before scan-doublers/flicker‐fixers or 100 Hz/120 Hz CRT TVs with a digital framebuffer got more popular.

      The nice thing about those slow-decay screens was great compatibility, too.
      They worked with most TV timings from foreign countries and could still be used with light pens or light guns.

      That’s something modern TFTs don’t offer.
      Mono screens can be adjusted to display various resolutions natively, by turning the adjustment pots.
      NTSC, PAL, SECAM, 50 Hz, 60 Hz, interlaced or not etc. It didn’t matter.
      Some mono screens could handle up to 1000 lines and the full 6 MHz video bandwidth.

      Some green monitors could even display Hercules graphics,
      because the 18 kHz sync was still within the upper limits of the electronics and the flyback.
      It was borderline, sure, but it was possible. And a cool hack.
      http://boginjr.com/electronics/old/tesla-pmd60/

  5. I wrote my first little 4k intro for the Amiga in 1989: a raster bar moving up and down using the copper, changing the foreground colours progressively from dark towards the centre so they were exposed only where the bar was positioned, plus some of the hardware sprites whizzing round and a tiny chip tune.
    When we looked at it with my TV as the display, the background screen appeared to swell dimensionally as the bar got near the centre of the screen, and it only added to how good it looked to my eye.
    I was quite pleased with it, especially managing to get it in 4k, but I took it to a computer club and it looked pants on someone’s fancy flatscreen monitor that I couldn’t afford, and we realized that it was the CRT.
    So in that specific case, yes it looked better on a crt.

  6. I don’t remember people replacing their TVs very often.

    We upgraded from b&w to color in the mid to late 70s.

    I think we replaced the color tv after 1990.

    Nobody replaced TVs every 2-3 years, and they were definitely not accurately set up for color rendering.

    And comparisons to mid-90s SVGA monitors are completely messed up.

    1. Also, the artist obviously used a CRT to develop the sprites. So that is how it was intended to look. I mean unless he had a time machine or did it on graph paper (which did happen sometimes)

    1. Most modern CRT phosphors couldn’t even cover 90% of sRGB; meanwhile you can put a QLED (quantum-dot) backlight behind the sh*ttiest LCD panel and get over 150% coverage of Adobe RGB.

      1. Using a percentage is misleading when LCDs can only produce a finite number of colors and brightness levels while any color space has infinitely many. CRTs do not have color banding and can produce infinite colors.

        1. >CRTs do not have color banding and can produce infinite colors.
          No, they are limited by the signal-to-noise ratio and by the phosphors’ range.
          They also had a worse black level than OLED.

  7. Loved my old Trinitron and Viewsonic monitors, but really glad of (relatively) light, hi-res, flat TFT monitors. So nice to have the desk space back and not have to strain my back moving the damn things. Also: the power savings. I get the nostalgia, but am a huge fan of progress in display hardware.

    1. Yeah, I don’t see how they could possibly relegate this to myth. Pixel artists used CRTs to create the art. The medium always influences the artwork, even if you aren’t consciously deciding to “optimize” pixel art for CRTs.

      It was originally made with all the effects and defects of CRT technology, so that’s how it was meant to look. Why would they have developed the artwork around how it would look on some flatscreen tech that doesn’t exist yet?

  8. VGA DOS games typically used a 320×200 resolution where each pixel was doubled (the signal was equivalent to 640×400). The pixels appeared very distinct and square even on CRT monitors of the time, which were capable of 640×480 or higher. The artwork was still beautiful.

    The real reason I suspect is behind the beauty of the graphics is that the artists paid attention to every pixel, because every pixel mattered.

    1. Hi there, I don’t mean to disagree; that’s all correct as far as I know.
      However, in the early days VGA’s mode 13h (non-tweaked) was often referred to as MCGA or “PS/2 graphics”.
      And the matching IBM monitor of the day had a dot pitch of 0.4 mm, and it showed.

      It was an intended compromise between text quality and picture quality (photos, games),
      because otherwise, at 320×200 resolution, a digitized image of, say, a flower or a parrot would have looked ugly even on contemporary hardware.

      The IBM PS/2 Model 30, the 8086 version, had MCGA on-board and used exactly such a blurry type of monitor.

      I mean, one of the reasons why IBM went analog after EGA was the wish for more colorful images.
      Anyway, I don’t mean to sound like a nitpicker.
      It’s just that vintage computing is a hobby of mine and that I had the original hardware back then.
      I hope you don’t mind; I don’t mean to lecture you or anything.

      Links/references:
      http://ps-2.kev009.com/pcpartnerinfo/ctstips/7492.htm
      http://ps-2.kev009.com/pcpartnerinfo/ctstips/ee1a.htm
      https://ardent-tool.com/docs/pdf/brochures/ibm-ps2-colordisplay8512.pdf
      https://en.wikipedia.org/wiki/IBM_PS/2_Model_30#Model_30

      PS: That being said, other, much higher-end PS/2 models with XGA, 8514/A, VGA
      and so on already had IBM monitors with a finer dot pitch (0.28 mm etc.) at the same time.

      They weren’t meant for home use, though, but for professional use (lab, office or art department).
      The typical home user didn’t yet seek out such a quality monitor, but rather an affordable MCGA/VGA monitor.

      And there was quite a market for it, as there was with Turbo XT motherboards before.
      Basic VGA compatible cards were sought after, too.
      Just like how countless AdLib and Sound Blaster clones were mass produced afterwards.

      By the late 80s many emerging graphics standards were on the horizon, but users were already satisfied with anything VGA.
      They were thinking, “I don’t care what it is, as long as it’s VGA compatible.”

      And this is understandable, considering that many EGA users at home didn’t even have access to a real EGA monitor.
      With a CGA monitor, they couldn’t do 640×350 mode, the native EGA mode. VGA hardware solved all these issues.
      (Many early VGA cards were really Super VGA cards and had additional capabilities beyond plain VGA.)

      Suddenly, users could use all previous video standards, including Hercules and 132-column modes (via a mode utility).
      This was such a relief! A slightly blurry 12″ or 14″ monitor didn’t harm the fun. After all, it was VGA, most importantly.
      VGA even had a better text mode than Hercules: monochrome VGA monitors were produced as well.

  9. Don’t know how this is trying to be sold as myth when people with the equipment have eyes to see the difference with. It also gets a lot of wires crossed between VGA monitors & TVs; the source and the intended output have a huge relation to each other.

    Early 90s MS-DOS games had visibly square pixels, because the resolution on PC monitors was doubled. 90s console games did not have square pixels because the vertical resolution was halved to achieve high refresh rates with the same frequency on TVs designed for content at 30fps (or 25 for PAL).

    The use of dithering to achieve colour blending, transparency effects & texture isn’t some limited edge case; it’s how artists expanded the perception of limited colour spaces, and you can see this easily all the way through to 256-colour PC games at 640×480 resolution. Textures which seem like a crude smear when presented on a modern display suddenly have a texture that looks a lot more like their intended materials. The ‘natural antialiasing’ on both TVs and monitors comes from the subpixel dot mask not being made up of perfect squares, making aliasing artifacts like stair-stepping less noticeable, and that’s without things like signal blur or colour bleed.

    All this can even be observed with a decent shader that reproduces the effects of the display contemporary to the material & the effect is obvious. (You need to take both the source resolution and your own display into account to have the vertical resolution evenly divisible or it just looks like a mess)
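
    For the divisibility point, the check is trivial (this assumes plain integer scaling before the shader runs):

      def integer_scale(src_h, display_h):
          s = display_h // src_h                 # largest whole multiple that fits
          if s == 0:
              raise ValueError("display is smaller than the source")
          return s, src_h * s                    # scale factor and the scaled picture height

      print(integer_scale(224, 1080))            # a 224-line SNES picture on 1080p -> (4, 896)
      print(integer_scale(224, 2160))            # the same picture on a 4K panel -> (9, 2016)

    Anything non-integer and the scanline/mask pattern beats against the panel’s pixel grid, which is the mess I mean.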

    Either this article is blatant engagement bait, bad faith, or the author’s eyes are literally built different.

    1. “Early 90s MS-DOS games had visibly square pixels, because the resolution on PC monitors was doubled. 90s console games did not have square pixels because the vertical resolution was halved to achieve high refresh rates with the same frequency on TVs designed for content at 30fps (or 25 for PAL).”

      It’s right that 200-line modes usually got doubled to 400 lines.
      That’s how the low resolutions were made ~31 kHz compatible, too.
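
      For the numbers (standard VGA 400-line timing, roughly from memory):

        h_scan_hz = 31_469               # fixed VGA horizontal scan rate, in Hz
        total_lines = 449                # 400 visible lines plus blanking in the doubled 200-line modes
        print(h_scan_hz / total_lines)   # ~70 Hz refresh; an undoubled ~224-line frame at 31 kHz
                                         # would refresh at ~140 Hz, far outside what these monitors expect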

      However, there were also games and demoscene productions that used the full 400-line resolution with actual pixel data:
      by disabling line doubling, the video mode became 320×400.

      This tweaked mode dropped compatibility with real MCGA hardware, of course.
      The MCGA circuit in the IBM PS/2 Model 30 (8086) was a dumb framebuffer.
      It also had merely 64 KB of video RAM, so mode 11h and mode 13h were the upper limit.

      There’s a technical article here:
      https://www.phatcode.net/res/224/files/html/ch31/31-01.html

      PS: The original MCGA hardware could drive a 15 kHz monitor, essentially a TV set.
      It had a monitor detection pin that wasn’t documented.
      There’s a technical article here: https://www.swiat-owocow.pl/lang/en/1285.html

      Also worth mentioning is that the VGA CRTC is very, very programmable.
      There are DOS and Windows utilities that reprogram VGA to output NTSC- or PAL-compatible timings. All it needs is a custom-made cable.

      A utility named VERDE can disable line doubling, for users who wish to play CGA games with black “scanlines” or in green.

      PS2: I think the line doubling from 320×200 to 320×400 was one reason, but not the sole reason, that PC games looked pixelated.
      The line doubling had a certain influence, but it didn’t automatically pixelate everything.

      Otherwise, VGA CRT monitors from the 80s/early 90s would no longer have had an effect on 640×480 content (the Windows 3.1 desktop, mode 12h), which they still did in the early 90s.
      The typical no-name monitor provided a slightly blurry Windows 3.1 experience at 640×480.

      So I think the higher-end mid-90s VGA monitors, with CRT tubes able to resolve the full VGA resolution, were also at fault in making existing DOS games look more pixelated:

      because, by contrast, the older VGA CRT tubes that were in wide use before were rather TV-grade and barely resolved 400- or 480-line modes.

      Thus, in combination with a coarse dot pitch and an imperfect shadow mask, the pixels were simply too close together to do anything but partially blend into each other.

      Speaking under correction. Sorry for my poor English.

  11. So what if the writer uses a CRT; many of us retro enthusiasts do. Their opinion piece keeps passing off their opinion as fact, and they don’t mention which CRT(s) and inputs they are using. CRTs are not universal in appearance and artists are not universal in their intent for their artwork. I have and use higher-resolution VGA monitors, HD CRT TVs, SD CRT TVs, small black-and-white CRTs, arcade CRTs, and vector CRTs, and I wouldn’t speak in absolutes about how CRTs should look, about artists’ intents, or about how modern retro games should look. There wasn’t a universal look to pixel games or CRTs back then, there doesn’t need to be a universal look now, and their preference isn’t fact.

    1. “CRTs are not universal in appearance and artists are not universal in their intent for their artwork.”

      Absolutely, right. There were different CRT types with different characteristics.
      For example, here’s a small list of different setups that used to exist:
      – The Amdek (?) monitors used by Apple II users
      – A portable Black and White TV set used on so many ZX81 (soft image, good for semi graphics)
      – Your consumer grade TV at home that connects to your kid’s NES via RF/AV (no visible scan lines)
      – The RGB CRT in an arcade cabinet (with thick scan lines)
      – The Commodore 1701/1702 monitor used by C64/C128 owners
      – The popular Commodore 1084 monitor used by Amiga users
      – The various VGA monitors used by PC users over three decades
      – The Atari SM124 monochrome monitor used by Atari ST users for work
      – The IBM 5151 TTL monitor that had been used by PC/XT users with an MDA or Hercules card
      – The IBM 5153 CGA monitor that CGA/EGA users had used
      – NTSC video monitors used by CGA and Apple II users to get artifact colors
      – Your generic video monitor, used to display anything with VBS or SVBS output
      – The various NEC MultiSync monitors used by professional users
      – Large 20″ CAD/CAM monitors used by SGI users
      – the list goes on

      What’s important, though, is that certain combinations were more common than others.
      The TRS-80 and the Amstrad CPC 464 shipped with a standard monitor.
      ZX81 users almost always used a b/w portable.
      The IBM 5151 had many clones and was the standard PC/XT monitor in business.
      The kids in the UK grew up with the BBC Micro standard monitors, such as the practical Microvitec Cub 14″.

      So there were reference monitors, too, despite the diversity.
      And they’re needed in order to perceive the art as the computer artists saw it back in the day.

  12. Ahh, so we are saying that the CRT is a perfect rendering and ONLY the analog signal compression causes the anti-aliasing? Various styles of shadow mask, beam convergence, scan sync, power supply consistency, etc. don’t affect it. Good to know. The beams only hit one set of RGB phosphors and never miss.

  13. CRTs aren’t magical, but the rabbit hole goes deeper than you think.

    The reality is that all games were targeting different consumer TVs or monitors, because things were not as standardized in the analog days.
    Tubes could range from 150 to 300 TV lines of horizontal resolution, and they could also range in screen size and in mask pitch.

    Think about color, too: Japanese phosphors had different color primaries from their US/EU counterparts, and they used a 9300 K white point instead of 6500 K.
    Standard phosphors from the 80s were different from the ones used in the 90s, too,
    which is why you now have color correction filters on the libretro forums, like Dogway’s Grade shader for example.

    Component is about as good as RGB, and S-Video is close to as good as them.
    If the cables are quality-made, there is not a huge difference between them.
    There were also later sets with phenomenal comb filters that made even composite video look almost as good and sharp as RGB or component.

    All in all, this article is a waste of bandwidth.

  14. What are we talking about again? My eyes are getting bleary just thinking about looking at a CRT. Even the really nice ones I had used here and there. So many headaches!
