Remember When Blockbuster Video Tried Burning Game Cartridges On Demand?

By the onset of the 1990s, one thing was clear: the future was digital. Analog music sales were down, CD sales were up, and it was evident, at least in the US, that people were bringing more computing devices into their homes. At the beginning of the decade, roughly one in three American households had a Nintendo Entertainment System, according to this Good Morning America segment.

With all those consoles out there, every shopping season became a contest of who could wait in line the longest to pick up the newest titles, leaving last-minute shoppers to take a rain check or return home empty-handed. Things didn’t have to be this way. The digital world had emerged; physical media just needed to catch up. It would take an unlikely alliance of two disparate companies for others to open their minds.

Continue reading “Remember When Blockbuster Video Tried Burning Game Cartridges On Demand?”

Soldering Like It’s 205 BC

Did you ever stop to think how unlikely the discovery of soldering is? It’s hard to imagine what sequence of events led to it; after all, metals heated to just the right temperature while applying an alloy of lead and tin in the right proportions in the presence of a proper fluxing agent doesn’t seem like something that would happen by accident.

Luckily, [Chris] at Clickspring is currently in the business of recreating the tools and technologies that would have been used in ancient times, and he’s made a wonderful video on precision soft soldering the old-fashioned way. The video below is part of a side series he’s been working on while he builds a replica of the Antikythera mechanism, that curious analog astronomical computer of antiquity.

Many parts in the mechanism were soldered, and [Chris] explores plausible methods using tools and materials known to have been available at the time the mechanism was constructed (reported by different historians as any time between 205 BC and 70 BC or so). His irons are forged copper blocks, his heat source is a charcoal fire, and his solder is a 60:40 mix of tin and lead, just as we use today. He vividly demonstrates how important both surface prep and flux are, and shows both active and passive fluxes. He settled on rosin for the final joints, which turned out silky smooth and perfect; we suspect it took quite a bit of practice to get the technique down, but as always, [Chris] makes it look easy.

If you’d like to dig a bit deeper into modern techniques, we’ve covered the physics of solder and fluxes in some depth. And if you need more of those sweet, sweet Clickspring videos, we’ve got you covered there as well.

Continue reading “Soldering Like It’s 205 BC”

The Pre-CRT Oscilloscope

Oscilloscopes are especially magical because they translate the abstract world of electronics into something you can visualize. These days, a scope is likely to use an LCD or another kind of flat electronic display, but for many years the gold standard was the ubiquitous CRT (cathode ray tube). Even so, CRTs were rare in the early days of electronics and radio. What we think of as a CRT didn’t really show up until 1931, although if you could draw a high vacuum and provide 30 kV, there were tubes as early as 1919. But a lot of electronics work was done well before that, so how did early scientists visualize electric current? You might think the answer is “they didn’t,” but that’s not true. We are spoiled today with high-resolution electronic displays, but our grandfathers were clever and used what they had to visualize electronics.

Keep in mind, you couldn’t even get an electronic amplifier until the early 1900s (something we’ve talked about before). The earliest way to get a visual idea of what was happening in a circuit was purely a manual process. You would make measurements and draw your readings on a piece of graph paper.

Continue reading “The Pre-CRT Oscilloscope”

NASA Shows Off Its Big Computer In 1986

Sometimes it is hard to remember just how far computers have come in the last three or four decades. An old NASA video (see below) has recently been restored with better sound and video, showing what passed for a giant computer in 1986. The Cray 2 ran at 250 MHz and had two gigabytes of memory (256 million 64-bit words).

Despite the breathless praise, history hasn’t been kind to the Cray 2. Based on ECL, it had 4 processors and, in theory, could reach 1,900 megaFLOPS (millions of floating-point operations per second). However, practical problems made it difficult to approach that theoretical maximum.
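It’s worth sanity-checking the numbers quoted above. A quick back-of-the-envelope calculation (using only the figures from this article, not an authoritative spec sheet) confirms that 256 million 64-bit words really is about two gigabytes, and shows what the peak rate works out to per processor:

```python
# Arithmetic check of the Cray 2 figures quoted above.
WORD_BITS = 64
WORDS = 256_000_000            # 256 million 64-bit words

bytes_total = WORDS * WORD_BITS // 8
print(f"{bytes_total / 1e9:.3f} GB")   # 2.048 GB -- "two gigabytes"

# Peak throughput: 1,900 megaFLOPS spread across 4 processors
peak_mflops = 1_900
per_cpu = peak_mflops / 4
print(f"{per_cpu:.0f} MFLOPS per processor")  # 475
```

For perspective, a single modern desktop CPU core sustains thousands of times that rate, which is the point the restored video drives home.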

Continue reading “NASA Shows Off Its Big Computer In 1986”

Dawn Of The First Digital Camera

Technology vanishes. It either succeeds and becomes ubiquitous or fails. For example, there was a time when networking and multimedia were computer buzzwords. Now they are just how computers work. On the other hand, when was the last time you thought about using a CueCat barcode reader to scan an advertisement? Then there are the things that have their time and vanish, like pagers. It is hard to decide which category digital cameras fall into. They are being absorbed into our phones and disappearing as a separate category for most consumers. But have you ever wondered about the first digital camera? The story isn’t what you would probably guess.

The first digital camera I ever had was a Sony that took a floppy disk. Surely that was the first, right? Turns out, no. There were some very early attempts that didn’t really have the technology to make them work. The Jet Propulsion Laboratory was using analog electronic imaging as early as 1961 (they had been developing film on the moon but certainly needed a better way). A TI engineer even patented the basic outline of an electronic camera in 1972, but it wasn’t strictly digital. None of these bore any practical fruit, especially relative to digital technology. It would take Eastman Kodak to create a portable digital camera, even though they were not the first to commercialize the technology.

Continue reading “Dawn Of The First Digital Camera”

Recorded Programming — Thanks To Bing Crosby

If you look up Bing Crosby in Wikipedia, the first thing you’ll notice is that his real name was Harry. The second thing you’ll read, though, is that he is considered the first “multimedia star.” In 1948, half of the recorded music played on the air was by Bing Crosby. He was also a major motion picture star and a top-selling recording artist. However, while you might remember Bing for songs like White Christmas, or for his orange juice commercials, or for accusations of poor treatment from his children, you probably don’t associate him with the use of magnetic tape.

In a way, Bing might have been akin to the Steve Jobs of the day. He didn’t create the technology for tape recording. But he did see the value of it, invested in it, and brought it to the market. Turns out Bing was quite the businessman. Want to know why he did all those Minute Maid commercials? He was a large shareholder in the company and was the west coast distributor for their products. He also owned part of the Pittsburgh Pirates baseball team and other businesses.

So how did Bing become instrumental in introducing magnetic tape recording? Because he was tired of doing live shows. You see, in 1936, Crosby became the host of a radio variety show, The Kraft Music Hall. This very popular program was live. That means you have to show up on time. If you go off on a tangent, you’ll run out of time. And if you make a mistake, there is no editing. Oh, and one other thing: you have to do a nationwide live show twice, once for the east coast and again for the west. This was cutting into Bing’s “family time” which, as far as we can ascertain, was a code phrase for golf.

Continue reading “Recorded Programming — Thanks To Bing Crosby”

Was The Self Driving Car Invented In The 1980s?

The news is full of self-driving cars, and while there is some bad news, most of it is pretty positive. It seems a foregone conclusion that it is just a matter of time before calling for an Uber doesn’t involve another person. But according to a recent article, [Ernst Dickmanns], a German aerospace engineer, built three autonomous vehicles starting in 1986, culminating in on-the-road demonstrations for Daimler in 1994.

It is hard to imagine what it took to get a self-driving car working in 1986. The article asserts that you need computer analysis of video at 10 frames per second minimum, yet in the 1980s, processing a single frame in 10 minutes was considered an accomplishment. [Dickmanns’] vehicles borrowed tricks from how humans drive: they focused on a small area at any one moment and tried to ignore things that were not relevant.
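The attention trick described above can be sketched in a few lines: instead of analyzing every pixel of every frame, crop a small region of interest (ROI) where the relevant features, such as lane markings, are expected, and process only that window. This is an illustrative sketch, not code from [Dickmanns’] actual system, and the names and sizes are invented for the example:

```python
# Hypothetical sketch: crop a small region of interest from a video
# frame so that only a fraction of the pixels need analysis.
def crop_roi(frame, center, size=32):
    """Return a size x size window of `frame` around (row, col) `center`."""
    r, c = center
    r0 = max(0, r - size // 2)
    c0 = max(0, c - size // 2)
    return [row[c0:c0 + size] for row in frame[r0:r0 + size]]

# A 480x640 "frame" of zeros stands in for a grayscale image.
frame = [[0] * 640 for _ in range(480)]

# A 32x32 window touches only about 0.3% of the pixels -- roughly the
# kind of saving that let a slow 1980s processor keep up with video.
roi = crop_roi(frame, center=(400, 320))
print(len(roi), len(roi[0]))  # 32 32
```

The design point is the ratio: 32 × 32 = 1,024 pixels analyzed out of 307,200, a roughly 300-fold reduction in work per frame.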

Continue reading “Was The Self Driving Car Invented In The 1980s?”