The reports of the death of the VGA connector are greatly exaggerated. Rumors of the demise of the VGA connector have been going around for a decade now, but VGA has been remarkably resilient in the face of its impending doom; this post was written on a nine-month-old laptop connected to an external monitor through the very familiar thick cable with two blue ends. VGA is a port that can still be found on the back of millions of TVs and monitors that will ship this year.
This year is, however, the year that VGA finally dies. After 30 years, after being deprecated by several newer technologies, and after it became easy to put a VGA output on everything from an eight-pin microcontroller to a Raspberry Pi, VGA has died. It's not supported by the latest Intel chips, and it's hard to find a motherboard with the very familiar VGA connector.
The History Of Computer Video
Before the introduction of VGA in 1987, graphics chips for personal computers were either custom designs, low-resolution, or exceptionally weird. One of the first computers with built-in video output, the Apple II, simply threw a lot of CPU time at a character generator, a shift register, and a few other bits of supporting circuitry to write memory to a video output.
The state of the art for video displays in 1980 included the Motorola 6845 CRT controller and 6847 video display generator. To the modern eye, these chips were terrible; they topped out at a resolution of 256 by 192 pixels, incredibly small by today's standards.
Other custom chips found in home computers of the day were not quite as limited. The VIC-II, a custom video chip built for the Commodore 64, could display up to 16 colors with a resolution of 320 by 200 pixels. Trickery abounds in the Commodore 64 demoscene, and these graphics capabilities can be pushed further than the original designers ever dreamed possible.
When the original IBM PC was released, video was not available on the most bare-bones box. Video was extra, and IBM offered two options. The Monochrome Display Adapter (MDA) could display 80 columns and 25 lines of high-resolution text. This was not a graphics display; the MDA could only display the 128 standard ASCII characters plus another 128 characters that are still found in the 'special character' selection of just about every text editor. The hearts, diamonds, clubs, and spades, ♥ ♦ ♣ ♠, were especially useful when building a blackjack game for DOS.
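Those card suits weren't an extended font trick: in IBM's code page 437 they live down in the control-code range, at code points 3 through 6, and a DOS game would write those byte values straight into text-mode video RAM. Here's a minimal sketch of that mapping; the function name is mine, and the UTF-8 strings are a modern stand-in for the glyphs the character ROM would have drawn.

```c
#include <stddef.h>

/* Code page 437 puts the card-suit glyphs at byte values 3..6:
 * 3 = heart, 4 = diamond, 5 = club, 6 = spade. A DOS program got
 * them on screen just by storing those bytes into video memory. */
static const char *cp437_suit(unsigned char code)
{
    switch (code) {
    case 3:  return "\u2665"; /* ♥ heart   */
    case 4:  return "\u2666"; /* ♦ diamond */
    case 5:  return "\u2663"; /* ♣ club    */
    case 6:  return "\u2660"; /* ♠ spade   */
    default: return "?";      /* not a suit glyph */
    }
}
```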
IBM's second offering for the original PC was a far more colorful option. The Color Graphics Adapter (CGA) turned the PC into a home computer. Up to 16 colors could be displayed with the CGA card, and resolutions ranged from 40×25 and 80×25 text modes to a 640×200 graphics mode.
Both the MDA and CGA adapters offered by IBM were based on the Motorola 6845 with a few extra bits of hardware for interfacing with the 8-bit ISA bus, and in the case of many cards, a parallel port. This basic circuit would be turned into a vastly superior graphics card released in 1982, the Hercules Graphics Card.
Hercules offered an 80×25 text mode and a graphics mode with a resolution of 720×348 pixels. Hercules' resolution was enormous at the time, and was still useful for many, many years after the introduction of the superior VGA. Most dual-monitor setups in the DOS era used Hercules for a second display, and some software packages, AutoCAD included, used a second Hercules display for UI elements and dialog boxes.
Still, even with so many display adapters available for the IBM PC, graphics on the desktop was a messy proposition. Video cards included dozens of individual chips, implementing the video circuit on a single board was difficult, resolution wasn't that great, and everything was ultimately based on a Motorola CRT controller. Something had to be done.
The Introduction of VGA
While the PC world was dealing with graphics adapters consisting of dozens of different chips, all based on a CRT controller designed in the late 70s, the rest of the computing world saw steady improvement. 1987 saw the introduction of the Macintosh II, the first Mac with a color display. Resolutions were enormous for the time, and full-color graphics were possible. There is a reason designers and digital artists prefer Macs, and for a time in the late 80s and early 90s, it was the graphics capabilities that made the Mac the logical choice.
Other video standards blossomed during this time. Silicon Graphics introduced their IRIS graphics, and Sun was driving 1152×900 displays. Workstation graphics, the kind used in $10,000 machines, were very good. So good, in fact, that resolutions available on these machines frequently bested the resolution found in cheap consumer laptops of today.
By 1986, the state of graphics on the personal computer was terrible. The early 80s saw a race for faster processors, more memory, and an oft-forgotten race to have more pixels on a screen. The competition for more pixels was so intense it was defined in the specs for the 3M Computer – a computer with a megabyte of memory, a megaFLOP of processing power, and a megapixel display. Putting more pixels on a display was just as important as having a fast processor, and in 1986, the PC graphics card with the best resolution – Hercules – could only display 0.25 megapixels.
In 1987, IBM defined a new graphics standard to push graphics on their PCs to levels that could compete with workstations from Apple, Sun, and SGI. This was the VGA standard. It was not built on a CRT controller; instead, the heart of the VGA chipset was a custom ASIC, a crystal, a bit of video RAM, and a digital-to-analog converter. This basic setup would be found in nearly every PC for the next 20 years, and the ASIC would go through a few die shrinks and would eventually be integrated into Intel chipsets. It was the first true standard for PC video and is by far the longest-lived port on the PC.
When discussing the history of VGA, it's important to define what VGA is. To everyone today, VGA is just the old-looking blue port on the back of a computer used for video. This is somewhat true, but a lie of omission – the VGA standard is more than just a blue DE-15 connector. The specification for VGA defines everything about the video signals, adapters, graphics cards, and signal timing. The first VGA adapters had 256 kB of video RAM, 16- and 256-color palettes, and a maximum resolution of 640×480. There was no blitter, there were no sprites, and there was no hardware graphics acceleration; the VGA standard was just a way to write values to RAM and spit them out on a monitor.
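"Write values to RAM" is meant literally. In VGA's famous 256-color mode (mode 13h, 320×200), the screen is one flat 64,000-byte buffer at segment 0xA000, and plotting a pixel is a single byte store. The sketch below shows that arithmetic; the framebuffer is passed in as a plain array so it can run anywhere, where on real DOS hardware the pointer would aim at physical address 0xA0000. The function names are mine.

```c
#include <stddef.h>
#include <stdint.h>

/* VGA mode 13h: 320x200, one byte per pixel (a palette index),
 * laid out row after row with no padding. No blitter, no sprites --
 * drawing anything means computing an offset and storing a byte. */
#define MODE13_WIDTH  320
#define MODE13_HEIGHT 200
#define MODE13_SIZE   (MODE13_WIDTH * MODE13_HEIGHT)  /* 64,000 bytes */

/* Byte offset of pixel (x, y) within the linear framebuffer. */
static size_t mode13_offset(int x, int y)
{
    return (size_t)y * MODE13_WIDTH + (size_t)x;
}

/* On DOS, framebuffer would be (uint8_t *)0xA0000. */
static void put_pixel(uint8_t *framebuffer, int x, int y, uint8_t color)
{
    framebuffer[mode13_offset(x, y)] = color;
}
```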
Still, all of the pre-VGA graphics cards used a DE-9 connector for video output. This connector – the same one used in old 'hardware' serial ports – had nine pins. VGA stuffed 15 pins into a shell of the same size. The extra pins would be extremely useful in the coming years; data lines would be used to identify the make and model of the monitor, what resolutions it could handle, and what refresh rates would work.
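The scheme that eventually ran over those data lines is VESA's DDC/EDID, standardized in the mid-90s, well after VGA itself: the monitor serves a small data block over an I²C-style bus, starting with a fixed 8-byte magic header that the host checks before trusting the resolution and timing data that follows. A minimal sketch of that first sanity check (the function name is mine; the header bytes are the ones the EDID standard defines):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Every EDID base block begins with this fixed 8-byte header.
 * If it doesn't match, the rest of the block (preferred timings,
 * supported modes, vendor ID) can't be trusted. */
static const uint8_t EDID_HEADER[8] =
    { 0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00 };

static bool edid_header_valid(const uint8_t *block, size_t len)
{
    return len >= sizeof EDID_HEADER &&
           memcmp(block, EDID_HEADER, sizeof EDID_HEADER) == 0;
}
```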
The Downfall of VGA
VGA would be improved through the 1980s and 1990s with the introduction of SVGA, XGA, and Super XGA, all offering higher resolutions through the same clunky connector. This connector was inherently designed for CRTs, though; the H-sync and V-sync pins on the VGA connector are of no use at all to LCD monitors. Unless the monitor you're viewing this on weighs more than 20 pounds and is shooting x-rays into your eyes, there's no reason for your monitor to use a VGA connector.
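Those sync pins exist because a CRT's electron beam has to be told when to fly back. Every line and every frame is padded with "front porch", sync, and "back porch" intervals during which nothing visible is drawn. The numbers below are the industry-standard 640×480@60 Hz timings, and the refresh rate falls out of the pixel clock divided by the total (visible plus blanking) pixel count; this is a back-of-the-envelope sketch, not a signal generator.

```c
/* Canonical VGA 640x480 timing. The pixel clock ticks once per pixel,
 * including the invisible blanking pixels the CRT beam needs to retrace. */
#define PIXEL_CLOCK_HZ 25175000.0  /* 25.175 MHz */

/* Horizontal timing, in pixels per scanline. */
#define H_VISIBLE 640
#define H_FRONT    16  /* front porch */
#define H_SYNC     96  /* H-sync pulse */
#define H_BACK     48  /* back porch */
#define H_TOTAL   (H_VISIBLE + H_FRONT + H_SYNC + H_BACK)  /* 800 */

/* Vertical timing, in scanlines per frame. */
#define V_VISIBLE 480
#define V_FRONT    10
#define V_SYNC      2  /* V-sync pulse */
#define V_BACK     33
#define V_TOTAL   (V_VISIBLE + V_FRONT + V_SYNC + V_BACK)  /* 525 */

/* Frames per second = pixels per second / pixels per frame. */
static double vga_refresh_hz(void)
{
    return PIXEL_CLOCK_HZ / (H_TOTAL * V_TOTAL);  /* ~59.94 Hz */
}
```

Note that a quarter of each frame's pixel clocks are spent on blanking an LCD never needed, which is exactly why digital links dropped this structure.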
The transition away from VGA began alongside the introduction of LCD monitors in the mid-2000s. By 2010, the writing was on the wall: VGA would be replaced with DisplayPort or HDMI, or another cable designed for digital signals needed by today’s LCDs, and not analog signals used by yesteryear’s CRTs.
Despite this, DE-15 ports still abound in the workplace, and until a few years ago, most motherboards provided a D-sub connector, just in case someone wanted to use the integrated graphics. This year, though, VGA died. In Intel's Skylake, the newest chips now appearing in laptops introduced at CES this month, VGA support has been removed. You can no longer buy a new computer with VGA.
VGA is gone from the latest CPUs, but an announcement from Intel is a bang, and VGA was always meant to go out quietly. Somehow, without anyone noticing, Newegg no longer lets you search for a motherboard by its VGA connector. VGA is slowly disappearing from graphics cards, and currently the only cards you can buy with the bright blue plug are entry-level cards built on years-old technology.
VGA died quietly, with its cables stuffed in a box in a closet and the ports on the back of a monitor growing a layer of dust. It lasted far beyond what anyone would have believed nearly 30 years ago. For the technology that finally broke away from the CRT controller chips of the early 1980s, it is fitting that VGA was killed by the technologies that replaced it: an analog standard, fundamentally incompatible with truly digital protocols like DisplayPort and HDMI. It had a storied history, but VGA has finally died.