Japan’s Forgotten Analog HDTV Standard Was Well Ahead Of Its Time

When we talk about HDTV, we’re typically talking about any one of a number of standards from when television made the paradigm switch from analog to digital transmission. At the dawn of the new millennium, high-definition TV was a step change for the medium, perhaps the biggest leap forward since color transmissions began in the middle of the 20th century.

However, a higher-resolution television format did indeed exist well before the TV world went digital. Over in Japan, television engineers had developed an analog HD format that promised quality far beyond regular old NTSC and PAL transmissions. All this, decades before flat screens and digital TV were ever seen in consumer households!

Resolution

Japan’s efforts to develop a better standard of analog television were pursued by the Science and Technical Research Laboratories of NHK, the national public broadcaster. Starting in the 1970s, research and development focused on how to deliver a higher-quality television signal, as well as how to best capture, store, and display it.

The higher resolution of Hi-Vision was seen to make viewing a larger, closer television more desirable. The figures chosen were based on an intended viewing distance of three times the height of the screen. Credit: NHK Handbook

This work led to the development of a standard known as Hi-Vision, which aimed to greatly improve the resolution and quality of broadcast television. At 1125 lines, it offered more than double the vertical resolution of the prevailing 60 Hz NTSC standard in Japan. The precise number was chosen to meet the minimum image-quality requirements for a viewer with good vision, while also forming convenient integer ratios with NTSC’s 525 lines (15:7) and PAL’s 625 lines (9:5). Hi-Vision also introduced a shift to the 16:9 aspect ratio from the more traditional 4:3 used in conventional analog television. The new standard also brought with it improved audio, with four independent channels—left, center, right, and rear—in what was termed “3-1 mode.” This was not unlike the layout used by Dolby Surround systems of the mid-1980s, though the NHK spec suggests using multiple speakers behind the viewers to deliver the single rear sound channel.
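Those ratios are easy to verify; a quick Python snippet (purely illustrative, using only the line counts quoted above) confirms the arithmetic:

from fractions import Fraction

# Hi-Vision's 1125 scan lines against the existing analog standards
print(Fraction(1125, 525))   # 15/7 -- the ratio to NTSC's 525 lines
print(Fraction(1125, 625))   # 9/5  -- the ratio to PAL's 625 lines
print(1125 / 525)            # ~2.14, a little over double NTSC's line count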

Hi-Vision offered improved sound, encoded with PCM. Credit: NHK Handbook

Hi-Vision referred most specifically to the video standard itself; the broadcast standard was called MUSE, short for Multiple sub-Nyquist Sampling Encoding. This was a method for dealing with the high bandwidth requirements of higher-quality television. Where an NTSC TV broadcast might only need 4.2 MHz of bandwidth, the Hi-Vision standard needed 20-25 MHz. That wasn’t practical to fit in alongside terrestrial broadcasts of the time, and even for satellite delivery, it was considered too great. Thus, MUSE offered a way to compress the high-resolution signal down into a more manageable 8.1 MHz, with a combination of dot interlacing and advanced multiplexing techniques. The method used meant that ultimately four frames were needed to make up a full image. Special motion-sensitive encoding techniques were also used to limit the blurring that the dot-interlaced method would otherwise cause during camera pans. Meanwhile, the four-channel digital audio stream was squeezed into the vertical blanking period.
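To get a feel for the dot-interlacing idea, here is a loose sketch in Python with NumPy. It is not the real MUSE encoder (the frame size and test pattern are invented for the example), but it shows how a static scene can be transmitted a quarter of its samples at a time and rebuilt in full over four frames:

import numpy as np

# Loose illustration of dot interlacing: each transmitted frame carries only
# one quarter of the pixels, chosen on a repeating 2x2 offset pattern, so a
# stationary image is rebuilt in full over four frames.
rng = np.random.default_rng(0)
full_image = rng.integers(0, 256, size=(8, 8))   # stand-in for a still scene

# Four sampling phases: (row offset, column offset) within each 2x2 cell
phases = [(0, 0), (1, 1), (0, 1), (1, 0)]

reconstruction = np.zeros_like(full_image)
for dy, dx in phases:                    # one transmitted "frame" per phase
    mask = np.zeros(full_image.shape, dtype=bool)
    mask[dy::2, dx::2] = True            # a quarter of the pixels this frame
    reconstruction += np.where(mask, full_image, 0)

assert np.array_equal(reconstruction, full_image)
print("static image fully recovered after four frames")

A moving object changes between those four passes, so the accumulated picture smears, which is exactly the artifact the motion-sensitive encoding was there to manage.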

MUSE broadcasts began on an experimental basis in 1989. NHK would eventually use the standard regularly on its BShi satellite service, with a handful of other Japanese broadcasters following suit. Broadcasts ran until 2007, when NHK finally shut down the service, by which time digital TV was well established.

 

An NHK station sign-on animation used from 1991 to 1994.

A station ident from NHK’s Hi-Vision broadcasts from 1995 to 1997. Note the 16:9 aspect ratio—then very unusual for TV. Credit: NHK

The technology wasn’t just limited to higher-quality broadcasts, either. Recorded media capable of delivering higher-resolution content also reached the Japanese market. W-VHS (Wide-VHS) hit the market in 1993 as a video cassette standard capable of recording Hi-Vision/MUSE broadcast material. The W moniker was initially chosen for its shorthand meaning of “double” in Japanese, since Hi-Vision’s 1125 lines were just over double the 525 lines of an NTSC broadcast.

Later, in 1994, Panasonic released its Hi-Vision LaserDisc player, with Pioneer and Sony eventually offering similar products. The discs likewise carried 1,125 lines (1,035 visible) of resolution in a native 16:9 aspect ratio, and were read using a shorter-wavelength laser than standard LaserDiscs, which also offered improved read performance and reliability.

Sample video from a MUSE Hi-Vision LaserDisc. Note the extreme level of detail visible in the makeup palettes and skin, and the motion trails in some of the lens flares.

The hope was that Hi-Vision would become an international standard for HDTV, supplanting the ugly mix of NTSC, PAL, and SECAM formats around the world. Unfortunately, that never came to pass. While Hi-Vision and MUSE did offer a better-quality image, there simply wasn’t much content actually broadcast in the standard. Only a few channels were available in Japan, creating little incentive for households to upgrade their existing sets. The amount of recorded media available was similarly limited. The bandwidth requirements were also too great; even with MUSE squishing the signals down, the 8.1 MHz required was still considered too much for practical use in the US market, where terrestrial channels were allocated just 6 MHz. Meanwhile, being based on a 60 Hz standard meant the European industry was not interested.

Further worsening the situation was that by 1996, DVD technology had been released, offering better picture quality than tape along with all the associated benefits of a digital medium. Digital television technology was not far behind, and buildouts began in countries around the world by the late 1990s. These transmissions offered higher quality and the ability to deliver more channels in the same bandwidth, and would ultimately take over.

Only a handful of Hi-Vision displays still exist in the world.

Hi-Vision and MUSE offered a huge step up in image quality, but their technical limitations and broadcast difficulties meant that they would never compete with the new digital technologies that were coming down the line. There was simply not enough time for the technology to find a foothold in the market before something better came along. Still, it’s quite something to look back on the content and hardware from the late 1980s and early 1990s that was able, in many ways, to measure up in quality to the digital flat screen TVs that wouldn’t arrive for another 15 years or so. Quite a technical feat indeed, even if it didn’t win the day!

9 thoughts on “Japan’s Forgotten Analog HDTV Standard Was Well Ahead Of Its Time”

  1. Pedantic side note: the name of the NHK channel mentioned in the article is indeed stylized “BShi”. If you guessed that it’s supposed to be pronounced like “BS high” (ビーエスハイ), since it broadcast in Hi-Vision, you catch on more quickly than I do.
    I spent an embarrassingly long time trying to figure out why satellite channel 3 would have been called “B-Shi”, which sounds like “B-four” (and especially the pronunciation of “four” that’s avoided because it sounds like the word for “death”). Apparently I’ve spent too much time looking at programming variables lately and just expect everything capitalized to be in camel or Pascal case.

    1. Well, that also depends on the country or region. Sure, it would affect transmission power requirements, but in many countries there were few available channels and most of the radio spectrum was unused. If I remember correctly, PAL used 6-8 MHz of total bandwidth, with 5-6 MHz for the video, so a slightly tweaked system would basically cut the number of available channels to a third, and for quite a lot of countries that wouldn’t be a huge issue, although simultaneous transmission of the basic signal along with HD would increase usage to four channels’ worth of bandwidth. The main target group to get something new and expensive like this actually used and invested in would be government broadcasters in countries with few channels. And it would have to work alongside the old system for at least a decade, if not more.

  2. Sub-Nyquist sampling was used by Sony in the late 1970s for their early experiments with digital recording of TV signals. It exploited the property of TV luminance (brightness) signals that their spectrum is crowded around harmonics of the line scan frequency. In NTSC video the color information is designed to fit into the gaps between these harmonics, but in PAL this information is offset to either side of those frequencies due to PAL’s alternation of the color burst phase. This, coupled with the relatively low bandwidth of chrominance (color) information, opened an opportunity to ‘fold’ the digitized spectrum without losing or corrupting information by sampling it at less than the Nyquist frequency. It’s a clever trick, but by 1979 it was redundant because Sony built a ‘component’ digital recorder, directly digitizing the video luminance and chrominance signals. This not only worked a lot better than the digitized composite signal but opened the door to processing video completely digitally, something we take for granted today. (It was just a research novelty back then because the parts simply didn’t exist; the prototype used a large box of discrete logic, some of which was being run at its operational limits. The on-tape recording streams, for example, were four parallel 50 Mbit/sec streams; these were serialized from memory a line at a time, but the DRAM available at the time had a 350uSec — yes, microsecond — cycle time!)
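    For anyone who wants to see the folding trick numerically, here is a rough Python sketch; the line frequency is NTSC’s, but the sample rate is chosen purely for illustration and is not one of the rates Sony actually used:

    f_line = 15_734.0            # NTSC line frequency in Hz (approximate)
    f_max = 4.2e6                # NTSC luminance bandwidth
    fs = 455 * f_line / 2        # illustrative sub-Nyquist rate (~3.58 MHz), an odd
                                 # multiple of half the line frequency, well below
                                 # the 8.4 MHz Nyquist rate for 4.2 MHz video

    def alias(f):
        """Apparent frequency of a component f after sampling at fs (0..fs/2)."""
        f = f % fs
        return fs - f if f > fs / 2 else f

    harmonics = [k * f_line for k in range(1, int(f_max // f_line) + 1)]
    folded = [alias(f) for f in harmonics if f > fs / 2]

    # Every folded component lands on a half-integer multiple of the line
    # frequency, i.e. in the gaps between the original harmonics, so the
    # aliased spectrum interleaves with the original instead of overlapping.
    assert all(abs((f / f_line) % 1 - 0.5) < 1e-6 for f in folded)
    print("all aliased components fall between the line-frequency harmonics")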

  3. And the EBU developed HD-MAC, a slightly higher-resolution analogue system, as a response to the proposals that the Japanese standard be adopted as a world standard.

    I saw it demonstrated at an IEEE talk about FLOF Teletext, really quite impressive but doomed.

  4. “being based on a 60 Hz standard meant the European industry was not interested”

    I’m curious about this. Surely, by the 1990s, new TV and video equipment didn’t depend on the mains clock for timing? Is there some other reason TVs in Europe couldn’t have worked with video fields at 60Hz? Bearing in mind, we’d already be talking about new hardware in every part of the system.

    Given the timing, with a digital transition being inevitable by then, I can see why this was DOA regardless. As amazing as those demos would have looked in 1989, we ended up with the same result plus more channels by waiting a decade, so it would’ve been a terrible investment.

    1. New hardware, sure, but realistically a change would be gradual over a decade or more, where the new and old had to work side by side, and some would keep their old systems if transmissions were shut down too soon. New sets would also have to accept the old standard from legacy VCRs and computers and such, so a converter box would have to downsample the new signal to the old one for those. And a TV receiver and screen that could work with both new and old is probably far easier and cheaper if the new standard is closer to the old, and there might be interference patterns between 50 Hz mains and 60 Hz equipment. Shielding and such works, but costs money.

      Unless you’re a dictator, change in a country or a region has to be smooth and happen over a long time to actually be adopted, and for a lot of applications, a completely new system is such a hurdle that competing technology might push that new tech aside if people have to buy new shit and change how they use it, like whether cable is worth the hassle when you’re used to only getting broadcasts. Or the internet suddenly comes along.

  5. Here in Europe the TV manufacturers tried to push analog HD TV, but it never got beyond plans because regulators pushed back and a standard was never adopted. I think this was in the ’90s. By then it was already pretty clear that it would not take too long before digital would be the future, and implementing analog HD TV for a period of 5 to 10 years would lead to an enormous transfer of money from consumers to equipment manufacturers. Good for the manufacturers, but not so good for the consumers, who would quickly see their expensive equipment go obsolete.

    In those days most people also still remembered the VHS versus Betamax war.

    It’s also quite amazing how short-lived some of the technologies were. DVDs were introduced in 1997, and it took a few years before they got reasonable adoption in the market; the peak of DVD sales was in 2005. And 10 years later it was back to a small niche market.
