As technology advances, finding the culprit in a malfunctioning device has become somewhat more difficult. Troubleshooting an AM radio, for example, is pretty straightforward, and there are two basic strategies. First, you can inject a signal at successive stages until you can hear it, then work backwards to find the stage that is bad. The other way is to trace a signal through the radio using a signal tracer or an oscilloscope; when the signal disappears, you’ve found the bad stage. Of course, you still need to figure out what’s wrong with that stage, but that’s usually one or two transistors (or tubes) and a handful of components.
A common signal injector was often a square wave generator that produced audio frequencies along with harmonics reaching into the radio frequencies. It was common to inject at the volume control (easy to find) to determine first whether the problem was in the RF or the audio section. If you heard a buzz, you worked backwards into the RF stages; no buzz indicated an audio section problem.
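The inject-and-listen strategy is really just a divide-and-conquer search over the radio's signal chain. Here is a toy sketch of that idea (the stage numbering and the `signal_ok` probe are hypothetical, purely to illustrate the logic, not any real test gear):

```python
def find_bad_stage(n, signal_ok):
    """Locate the single bad stage in a chain of n stages
    (0 = antenna end, n-1 = speaker end) by signal injection.
    signal_ok(i) is True when a signal injected at the input of
    stage i is heard at the speaker, i.e. stages i..n-1 all work."""
    lo, hi = 0, n  # hi = n means "inject at the speaker itself", always heard
    while lo < hi:
        mid = (lo + hi) // 2
        if signal_ok(mid):
            hi = mid        # heard: the fault is upstream of mid
        else:
            lo = mid + 1    # not heard: mid or something after it is bad
    return lo - 1           # last stage where injection was NOT heard

# Example: a 6-stage radio where stage 3 is dead, so injection is
# only heard when we inject after it (stages 4 and 5).
print(find_bad_stage(6, lambda i: i > 3))
```

In practice a technician works linearly from the volume control outward, but each injection point eliminates half the remaining suspects in exactly this way.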
A signal tracer was nothing more than an audio amplifier with a diode demodulator. Starting at the volume control was still a good idea. If you heard radio stations through the signal tracer, the RF section was fine. Television knocked radio off of its pedestal as the primary form of information and entertainment in most households, and thus the TV repair industry was created.
Continue reading “Retrotechtacular: TV Troubleshooting”
I was buying a new laptop the other day and had to make a choice between 4 GB of memory and 8 GB. I can remember how big a deal it was when a TRS-80 went from 4K (that’s .000004 GB, if you are counting) to 48K. Today just about all RAM (at least in PCs) is dynamic–it relies on tiny capacitors to hold a charge. The downside is that the RAM is unavailable sometimes while the capacitors get refreshed. The upside is that you can inexpensively pack lots of bits into a small area. All of the common memory you plug into a PC motherboard–DDR, DDR2, SDRAM, RDRAM, and so on–are types of dynamic memory.
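The scale of that jump is easier to appreciate with a quick back-of-the-envelope calculation (the 8 GB laptop figure is just the one from the anecdote above):

```python
# How far we've come: TRS-80 RAM sizes versus a modern laptop.
KB = 1024
GB = 1024 ** 3

trs80_base = 4 * KB   # a base 4K TRS-80
trs80_max = 48 * KB   # the 48K upgrade
laptop = 8 * GB       # a modern 8 GB machine

print(f"4K expressed in GB: {trs80_base / GB:.9f}")  # roughly the .000004 GB quoted
print(f"8 GB holds {laptop // trs80_base:,} times the base TRS-80's RAM")
```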
The other kind of common RAM you see is static. This is more or less an array of flip flops. They don’t require refreshing, but a static RAM cell is much larger than an equivalent bit of dynamic memory, so static memory is much less dense than dynamic. Static RAM lives in your PC, too, as cache memory where speed is important.
For now, at least, these two types of RAM technology dominate the market for fast random-access read/write memory. Sure, there are a few new technologies that could gain wider usage. There are also things like flash memory that are useful, but can’t displace regular RAM because of speed, durability, or complex write cycles. However, computers didn’t always use static and dynamic RAM. In fact, they are relative newcomers to the scene. What did early computers use for fast read/write storage?
Continue reading “Thanks for the Memories: Touring the Awesome Random Access of Old”
This great old video (embedded below the break) from Tektronix in the mid-60s covers a topic that seems to confuse folks more than it should — transmission lines. We found it on Paul Carbone’s blog, a great site for aficionados of old analog scopes in its own right.
As with many of these older videos, the pacing is a bit slow by today’s standards, but the quality of the material eventually presented more than makes it worth the effort to rein in your ADHD. For a preview, you can skip to the end, where they review all the material.
They start off at 5:31 with a pulse travelling down a wire pair, and take a very real-world approach to figuring out the characteristic impedance of the line: if the pulse is created by a 9 V battery, how much current flows? If the DC resistance of the wire is zero, then Ohm’s law predicts an infinite current, and that’s clearly not happening. This motivates the standard analysis, where you break the wire down into distributed inductance and capacitance.
Of course they do the experiment where you inject a pulse into a long loop of coaxial cable and play around with the termination at the other end of the line. They also measure the velocity factor of the line. Our only gripe is that they don’t tap the line in different places to demonstrate standing waves. The good news is that we’ve got YouTube (and [w3aew]) for that.
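The distributed-element reasoning from the video boils down to two textbook relations: a lossless line's characteristic impedance is Z₀ = √(L/C), and the velocity factor falls out of timing a pulse's round trip through a known length of cable. A quick sketch, using typical RG-58 datasheet values and a made-up length/time measurement (none of these numbers come from the video itself):

```python
import math

# Characteristic impedance of a lossless line: Z0 = sqrt(L/C).
# Typical per-metre values for RG-58 coax:
L = 250e-9   # inductance per metre, H/m
C = 100e-12  # capacitance per metre, F/m
Z0 = math.sqrt(L / C)
print(f"Z0 = {Z0:.0f} ohms")  # 50 ohms

# Velocity factor from a pulse round trip: inject a pulse into a
# known length of unterminated cable, time the reflection's return.
c = 2.998e8      # speed of light, m/s
length = 30.0    # cable length, m (hypothetical test reel)
t = 0.303e-6     # measured round-trip time, s (hypothetical)
vf = (2 * length) / (t * c)
print(f"velocity factor = {vf:.2f}")  # ~0.66, typical for solid-PE coax
```

That 0.66 figure is why a quarter-wave stub of solid-polyethylene coax is physically only about two-thirds of a free-space quarter wavelength.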
If you’ve got 23 minutes to spare, and are curious about transmission lines or just enjoy the soothing voice of a trained radio announcer reading out values of various termination resistors, this old gem is just the ticket. Enjoy!
Continue reading “Retrotechtacular: Transmission Lines”
By 2016, it is evident the FAX machine has peaked. Sure, you still see a few. There are even services that will let you send and receive FAXes via the Internet–which could mean no FAX machine was involved at all. But looking back, you have to wonder where it all started. Most people had never seen a FAX machine until the late 1960s or early 1970s, and it was 1980 before there was a standard. Some, like hams and weather service employees, were using them even earlier. But would it surprise you to know that the first experimental FAX machine appeared in 1843?
Wait a minute. Bell didn’t even build a telephone until 1875 (the patent issued in 1876). Turns out the first FAX machines didn’t work with a phone. They worked over a telegraph wire.
Continue reading “FAXing in 1843”
In this short but intense classic of corporate cinematography, we get to watch as the Pacific Bell central office in Glendale, California is converted to electronic switching in a 47-second frenzy of cable cutting in 1984.
In the 1970s and 1980s, conversion of telephone central office (CO) switch gear from older technologies such as crossbar (XBar) switches or step-by-step (SxS) gear to electronic switching systems (ESS) was proceeding apace. Early versions of ESS were rolling out as early as the 1950s, but telcos were conservative entities that were slow to adopt change and even slower to make changes that might result in service outages. So when the time finally came for the 35,000-line Glendale CO to cutover from its aging SxS gear to ESS, Pacific Bell retained Western Electric for their “Speedy Cutover Service.”
Designed to reduce the network outage time to a minimum, cuts like these were intricately planned and rehearsed. Prep teams of technicians marked the cables to be cut and positioned them for easy access by the cutters. For this cut, scaffolding was assembled to support two tiers of cutters. It looks like the tall guys got the upper deck, and the shorter techs – with hard hats – worked under them.
At 11PM on this cut night, an emergency coordinator verified that no emergency calls were in progress, and the cut began. In an intense burst of activity, each of the 54 technicians cut about 20 cables. Smiles widened as the cut accelerated, and sparks actually flew at the 35.7 second mark. When done, each tech turned around and knelt down so the supervisors knew when everyone was done. At least one tech couldn’t help but whoop it up when the cut was done. Who could blame him? It must have been a blast.
Continue reading “Retrotechtacular: Cut All the Cables in this Speedy Telco Switch Upgrade”
Every generation thinks it has unique problems and, I suppose, sometimes it is true. My great-grandfather didn’t have to pick a cell phone plan. However, a lot of things you think are modern problems go back much further than you might think. Consider Kickstarter. Sure, there have been plenty of successful products on Kickstarter. There have also been some misleading duds. I don’t mean the stupid ones like the guy who wants to make a cake or potato salad. I mean the ones that are almost certainly vaporware like the induced dream headgear or the Bluetooth tag with no batteries.
Overpromising and underdelivering is hardly a new problem. In the ’30s, the McGregor Rejuvenator promised to reverse aging with magnetism, radio waves, and infrared and ultraviolet light. Presumably, this didn’t work. Sometimes products do work, but they don’t live up to their marketing hype. The Segway comes to mind: despite the hype that it would revolutionize transportation, the scooter is now a vehicle for tourists and mall cops.
One of my favorite examples of an overhyped product comes from World War II: The Norden Bomb Sight. What makes the Norden especially interesting is that even today it has a reputation for being highly accurate. However, if you look into it, the Norden–although a marvel for its day–didn’t always live up to its press.
Continue reading “Misleading Tech: Kickstarter, Bomb Sights, and Medical Rejuvenators”
We all know what Computer-Generated Imagery (CGI) is nowadays. It’s almost impossible to get away from it in any television show or movie. It’s gotten so good, that sometimes it can be difficult to tell the difference between the real world and the computer generated world when they are mixed together on-screen. Of course, it wasn’t always like this. This 1982 clip from BBC’s Tomorrow’s World shows what the wonders of CGI were capable of in a simpler time.
In the earliest days of CGI, digital computers weren’t even really a thing. [John Whitney] was an American animator and is widely considered to be the father of computer animation. In the 1940s, he and his brother [James] started to experiment with what they called “abstract animation”. They pieced together old analog computers and servos to make their own devices that were capable of controlling the motion of lights and lit objects. While this process may be a far cry from the CGI of today, it is still animation performed by a computer. One of [Whitney’s] best known works is the opening title sequence to [Alfred Hitchcock’s] 1958 film, Vertigo.
Later, in 1973, Westworld became the very first feature film to use CGI. The film was a science fiction western-thriller about amusement park robots that become evil. The studio wanted footage of the robots’ “computer vision”, but they would need an expert to get the job done right. They ultimately hired [John Whitney’s] son, [John Whitney Jr], to lead the project. The process first required color-separating each frame of the 70mm film, because [John Jr] did not have a color scanner. He then used a computer to digitally modify each image to create what we would now recognize as a “pixelated” effect. The computer processing took approximately eight hours for every ten seconds of footage.
Continue reading “Retrotechtacular: The Early Days of CGI”
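The core of that “pixelated” effect is block averaging: divide the frame into tiles and replace every pixel in a tile with the tile’s average. This is a minimal pure-Python sketch of the idea on a toy grayscale “frame” (it illustrates the technique, not Whitney’s actual pipeline):

```python
def pixelate(img, block):
    """Replace each block x block tile of a 2D grayscale image with
    its average brightness, giving the coarse mosaic look used for
    the robot's-eye view in Westworld."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Gather the tile's pixels (clipped at the image edges).
            tile = [img[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            # Flood the tile with its average value.
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

# A 4x4 "frame": pixelating with 2x2 blocks averages each quadrant.
frame = [[0, 10, 100, 110],
         [20, 30, 120, 130],
         [200, 210, 50, 60],
         [220, 230, 70, 80]]
print(pixelate(frame, 2))
```

Run per color separation and recombined, a pass like this over every frame of 70mm film makes the eight-hours-per-ten-seconds figure easy to believe for 1973 hardware.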