Just For Laughs: Charlie Douglass And The Laugh Track

I ran into an old episode of Hogan’s Heroes the other day that struck me as odd. It didn’t have a laugh track. Ironically, the show was one where two pilots were shown, one with and one without a laugh track. The resulting data ensured future shows would have fake laughter. This wasn’t the pilot, though, so I think it was just an error on the part of the streaming service.

However, it was very odd. Many of the jokes didn’t come off as funny without the laugh track. Many of them came off as cruel. That got me thinking about how they put laughter into these shows to begin with. I had my suspicions, but, boy, was I way off!

Well, to be honest, my suspicions were well-founded if you go back far enough. Bing Crosby was tired of running two live broadcasts, one for each coast, so he invested in tape recording, using German recorders Jack Mullin had brought back after World War II. Apparently, one week, Crosby’s guest was a comic named Bob Burns. He told some off-color stories, and the audience was howling. Of course, none of that would make it on the air in those days. But they saved the recording.

A few weeks later, either a bit of the show wasn’t as funny or the audience was in a bad mood. So they spliced in some of the laughs from the Burns performance. You could have guessed that would happen, and that’s the apparent birth of the laugh track. But that method didn’t last long before someone — Charlie Douglass — came up with something better. Continue reading “Just For Laughs: Charlie Douglass And The Laugh Track”

A Gentle Introduction To Ncurses For The Terminally Impatient

Considered by many to be just a dull output for sequential text, the command-line terminal is a veritable canvas to the creative software developer. With the cursor as the brush, entire user interfaces can be constructed, or even a basic text-based dashboard on which values are updated in place, all without redrawing the entire screen over and over or opting for a much heavier solution like a GUI.

Ncurses is one of the most well-known and most portable Terminal User Interface (TUI) libraries, with which such cursor control, and much more, can be achieved in a fairly painless manner. That said, for anyone coming from a graphical user interface framework, the concepts and terminology of ncurses and similar libraries can be confusingly different yet overlapping, so getting started can be somewhat harrowing.

In this article we’ll take a look at ncurses’ history, how to set it up, and how to use it with C and C++, as well as the many other languages supported via bindings.
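
To make the cursor-control idea concrete, here is a minimal sketch in C against the standard ncurses API: a one-value “dashboard” that updates a counter in place rather than reprinting the whole screen. The layout and timing values are just illustrative, not taken from the article.

```c
/* Minimal ncurses sketch: a one-value "dashboard" updated in place.
 * Build with: cc dashboard.c -lncurses */
#include <ncurses.h>

int main(void)
{
    initscr();               /* enter curses mode */
    cbreak();                /* deliver keypresses immediately, no line buffering */
    noecho();                /* don't echo typed characters */
    nodelay(stdscr, TRUE);   /* make getch() non-blocking */
    curs_set(0);             /* hide the cursor */

    mvprintw(0, 0, "Press 'q' to quit");
    long ticks = 0;
    while (getch() != 'q') {
        mvprintw(2, 0, "Ticks: %ld", ticks++); /* overwrite only this cell */
        refresh();           /* flush the changes to the terminal */
        napms(100);          /* sleep 100 ms between updates */
    }

    endwin();                /* restore normal terminal behavior */
    return 0;
}
```

Nothing outside the touched cells is ever redrawn, which is exactly what makes terminal dashboards so cheap to update.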

Continue reading “A Gentle Introduction To Ncurses For The Terminally Impatient”

Crowdsourcing SIGINT: Ham Radio At War

I often ask people: What’s the most important thing you need to have a successful fishing trip? I get a lot of different answers about bait, equipment, and boats. Some people tell me beer. But the best answer, in my opinion, is fish. Without fish, you are sure to come home empty-handed.

On a recent visit to Bletchley Park, I thought about this and how it relates to World War II codebreaking. All the computers and smart people in the world won’t help you decode messages if you don’t already have the messages. So while Alan Turing and the codebreakers at Bletchley are well-known, at least in our circles, fewer people know about Arkley View.

The problem was apparent to the British. The Axis powers were sending lots of radio traffic. It would take a literal army of radio operators to record it all. Colonel Adrian Simpson sent a report to the director of MI5 in 1938 explaining that the three listening stations were not enough. The proposal was to build a network of volunteers to handle radio traffic interception.

That was the start of the Radio Security Service (RSS), which started operating out of some unused cells at a prison in London. The volunteers? Experienced ham radio operators who used their own equipment, at first, with the particular goal of intercepting transmissions from enemy agents on home soil.

At the start of the war, ham operators had their transmitters impounded. However, they still had their receivers and, of course, could all read Morse code. Further, they were probably accustomed to pulling out Morse code messages under challenging radio conditions.

Over time, this volunteer army of hams would swell to about 1,500 members. The RSS also supplied some radio gear to help in the task. MI5 checked each potential member, and the local police would visit to ensure the applicant was trustworthy. Keep in mind that radio intercepts were also handled by servicemen and women (especially women), although many of them were engaged in reporting on voice communication or military communications.

Continue reading “Crowdsourcing SIGINT: Ham Radio At War”

Network Infrastructure And Demon-Slaying: Virtualization Expands What A Desktop Can Do

The original DOOM is famously portable — any computer made within at least the last two decades, including those in printers, heart monitors, passenger vehicles, and routers, is almost guaranteed to have a port of the iconic 1993 shooter. The more modern iterations in the series are a little trickier to port, though. Multi-core processors, discrete graphics cards, and gigabytes of memory are generally needed, and it’ll be a long time before something like an off-the-shelf router has all of these components.

But with a specialized distribution of Debian Linux called Proxmox and a healthy amount of configuration, it’s possible to flip this idea on its head: getting a desktop computer capable of playing modern video games to take over the network infrastructure for a LAN instead, all with minimal impact on the overall desktop experience. In effect, it’s possible to have a router that can not only play DOOM but play 2020’s DOOM Eternal, likely with hardware most of us already have on hand.

The key that makes a setup like this work is virtualization. Although modern software makes it seem otherwise, not every piece of software needs an eight-core processor and 32 GB of memory. With that in mind, virtualization software splits the cores of a modern multi-core processor into groups that can act as if they were independent computers. These virtual computers, or virtual machines (VMs), can be given exclusive use of single cores or groups of them, along with reserved portions of memory and other hardware like peripherals and disk drives.
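
As a rough sketch of what that carving-up looks like in practice, Proxmox’s qm tool can confine a VM to a slice of the machine. The VM ID, resource figures, and PCI address below are placeholders for illustration, not details from this particular build:

```sh
# Create a small VM for router duty: 2 cores, 2 GB RAM, one bridged NIC
qm create 100 --name router --cores 2 --memory 2048 \
    --net0 virtio,bridge=vmbr0

# Optionally pass a physical NIC straight through to the VM
qm set 100 --hostpci0 0000:02:00.0

qm start 100
```

The cores and memory not claimed here stay available for the desktop side of the machine, which is what keeps the gaming experience largely unaffected.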

Proxmox itself is a version of Debian with a number of tools that streamline this process, and it installs on a PC in essentially the same way as any other Linux distribution. Once installed, tools like LXC for containerization, KVM for full-fledged virtual machines, and an intuitive web interface let containers and VMs be quickly set up, deployed, backed up, removed, and even migrated to other Proxmox installations. Continue reading “Network Infrastructure And Demon-Slaying: Virtualization Expands What A Desktop Can Do”

Reconductoring: Building Tomorrow’s Grid Today

What happens when you build the largest machine in the world, but it’s still not big enough? That’s the situation the North American transmission system, the grid that connects power plants to substations and the distribution system, finds itself in right now. By some measures the largest machine ever constructed, the grid has seen more than a century of build-out, yet the towers and wires that stitch together a continent aren’t up to the task they were designed for, and that’s a huge problem for a society with a seemingly insatiable need for more electricity.

There are plenty of reasons for this burgeoning demand, including the rapid growth of data centers to support AI and other cloud services, and the move to wind and solar energy as the push to decarbonize the grid proceeds. The former introduces massive new loads to the grid with millions of hungry little GPUs, while the latter expands the supply side, as wind and solar plants are often located out of reach of existing transmission lines. Add in the anticipated expansion of the manufacturing base as industry seeks to re-shore factories, and the scale of the potential problem only grows.

The bottom line is that the grid needs to grow to support all of this, and while there is often no other solution than building new transmission lines, that’s not always feasible. Even when it is, the process can take decades. What’s needed is a quick win, a way to increase the capacity of the existing infrastructure without having to build new lines from the ground up. That’s exactly what reconductoring promises, and the way it gets there presents some interesting engineering challenges and opportunities.
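
A back-of-the-envelope calculation shows why conductor current ratings are the lever to pull here (the numbers below are illustrative, not from any specific project). The power a three-phase line can deliver scales with the current its conductors can safely carry:

$$P \approx \sqrt{3}\, V_{LL}\, I$$

At 230 kV, a conductor rated for 1,000 A moves roughly 400 MW; swap it for an advanced conductor rated for 1,600 A and the same towers and right-of-way can move about 640 MW.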

Continue reading “Reconductoring: Building Tomorrow’s Grid Today”

Is The Atomic Outboard An Idea Whose Time Has Come?

Everyone these days wants to talk about Small Modular Reactors (SMRs) when it comes to nuclear power. The industry seems to have pinned its hopes for a ‘nuclear renaissance’ on the exciting new concept. Exciting as it may be, it is not exactly new: small reactors date back to the heyday of the atomic era. There were a few prototypes, and a lot more paper projects that are easy to sneer at today. One in particular caught our eye, described in a write-up from Steve Wientz as something of an atomic outboard motor.

It started as an outgrowth of General Electric’s 1950s work on airborne nuclear reactors. GE’s proposal just screams “1950s” — a refractory, air-cooled reactor serving as the heat source for a large turboprop engine. Yes, complete with open-loop cooling. Those obviously didn’t fly (pun intended, as always), but to try to recoup some of its investment, GE proposed a slew of applications for this small, reactor-driven gas turbine. Rather than continue to push the idea of connecting it to a turboprop and spewing potentially radioactive exhaust directly into the atmosphere, GE proposed podding up the reactor with a closed-cycle gas turbine into one small, hermetically sealed module. Continue reading “Is The Atomic Outboard An Idea Whose Time Has Come?”

Information Density: Microfilm And Microfiche

Today, we think nothing of sticking thousands of pages of documents on a tiny SD card, or just pushing them out to some cloud service. But for decades, this wasn’t possible. Yet companies still generated huge piles of paper. What could be done? The short answer is: microfilm.

However, the long answer is quite a bit more complicated. Microfilm is, technically, the most common case of the more generic microform. A microform is a photographically reduced document on film. A bunch of pages on a reel of film is microfilm. If it is on a flat card — usually the size of an index card — that’s microfiche. On top of that, there were a few other incidental formats. Aperture cards were computer punch cards with a bit of microfilm included. Microcards were like microfiche, but printed on cardboard instead of film.
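
A quick worked example shows just how dense these formats were (standard figures, not drawn from any particular archive). At the common 24× reduction, a letter-size page shrinks to

$$\frac{216\ \text{mm} \times 279\ \text{mm}}{24} \approx 9\ \text{mm} \times 11.6\ \text{mm},$$

which is why a standard 105 mm × 148 mm microfiche can carry a 7 × 14 grid of 98 page images on a single card.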

In its heyday, people used specialized cameras, some made to photograph fanfold computer printer paper, to create microfilm. There were also computer output devices that could create microfilm directly.

Continue reading “Information Density: Microfilm And Microfiche”