Video Game Preservation Through Decompilation

Unlike computer games, which smoothly and continuously evolved along with the hardware that powered them, console games have up until very recently been constrained by a generational style of development. Sure, there were games that appeared on multiple platforms, and eventually newer consoles would feature backwards compatibility that allowed them to play select titles from previous generations of hardware. But in many cases, some of the best games ever made were stuck on the console they were designed for.

Now, for those following along as this happened, it wasn’t such a big deal. For gamers, it was simply a given that their favorite games from the Super Nintendo Entertainment System (SNES) wouldn’t play on the Nintendo 64, any more than their Genesis games could run on their Sony PlayStation. As such, it wasn’t uncommon to see several game consoles clustered under the family TV. If you wanted to go back and play those older titles, all you had to do was switch video inputs.

But gaming, and indeed the entertainment world in general, has changed vastly over the last couple of decades. Telling somebody today that the only way they can experience The Legend of Zelda: A Link to the Past is by dragging out some yellowed, thirty-odd-year-old console from the attic is like telling them the only way they can see a movie is by going to the theater.

These days, the expectation is that entertainment comes to you, not the other way around — and it’s an assumption that’s unlikely to change as technology marches on. Just like our TV shows and movies now appear on whatever device is convenient to us at the time, modern gamers don’t want to be limited to their consoles; they also want to play games on their phones and VR headsets.

But that leaves us with a bit of a problem. There are some games which are too significant, either technically or culturally, to just leave in the digital dust. Like any other form of art, there are pieces that deserve to be preserved for future generations to see and experience.

For the select few games that are deemed worth the effort, decompilation promises to offer a sort of digital immortality. As several recent projects have shown, breaking a game down to its original source code can allow it to adapt to new systems and technologies for as long as the community wishes to keep it updated.

Continue reading “Video Game Preservation Through Decompilation”

Eulogy For The Satellite Phone

We take it for granted that we almost always have cell service, no matter where we go around town. But there are places — the desert, the forest, or the ocean — where you might not have any coverage at all. In addition, there are certain jobs where you must be able to make a call even if the cell towers are down, such as after a hurricane. Recently, a combination of technological advancements has made it possible for your ordinary cell phone to connect to a satellite for at least some kind of service. But before that, you needed a satellite phone.

On TV and in movies, these are simple. You pull out your cell phone that has a bulkier-than-usual antenna, and you make a call. But the real-life version is quite different. While some satellite phones were fixed installations aboard something like a ship, for the purposes of this post I’m going to consider a satellite phone to be a handheld device that can make calls.

Continue reading “Eulogy For The Satellite Phone”

Hackaday Links: June 22, 2025

Hold onto your hats, everyone — there’s stunning news afoot. It’s hard to believe, but it looks like over-reliance on chatbots to do your homework can turn your brain into pudding. At least that seems to be the conclusion of a preprint paper out of the MIT Media Lab, which looked at 54 adults between the ages of 18 and 39, who were tasked with writing a series of essays. The researchers divided participants into three groups — one that used ChatGPT to help write the essays, one that was limited to using only Google search, and one that had to do everything the old-fashioned way. They used EEG to record the writers’ brain activity, in order to get an idea of how engaged their brains were with the task. The brain-only group had the greatest engagement, which stayed consistently high throughout the series, while the ChatGPT group had the least. More alarmingly, engagement for the chatbot group dropped even further with each essay written. The ChatGPT group produced essays that were very similar between writers and were judged “soulless” by two English teachers. Go figure.

Continue reading “Hackaday Links: June 22, 2025”

Just For Laughs: Charlie Douglass And The Laugh Track

I ran into an old episode of Hogan’s Heroes the other day that struck me as odd. It didn’t have a laugh track. Ironically, the show was one where two pilots were shown, one with and one without a laugh track. The resulting data ensured future shows would have fake laughter. This wasn’t the pilot, though, so I think it was just an error on the part of the streaming service.

However, it was very odd. Many of the jokes didn’t come off as funny without the laugh track. Many of them came off as cruel. That got me to thinking about how they had to put laughter in these shows to begin with. I had my suspicions, but was I way off?

Well, to be honest, my suspicions were well-founded if you go back far enough. Bing Crosby was tired of running two live broadcasts, one for each coast, so he invested in tape recording, using German recorders Jack Mullin had brought back after World War II. Apparently, one week, Crosby’s guest was a comic named Bob Burns. He told some off-color stories, and the audience was howling. Of course, none of that would make it on the air in those days. But they saved the recording.

A few weeks later, either a bit of the show wasn’t as funny or the audience was in a bad mood. So they spliced in some of the laughs from the Burns performance. You can guess what happened next, and that’s the apparent birth of the laugh track. But that method didn’t last long before someone — Charley Douglass — came up with something better.

Continue reading “Just For Laughs: Charlie Douglass And The Laugh Track”

A Gentle Introduction To Ncurses For The Terminally Impatient

Considered by many to be just a dull output device for sequential text, the command-line terminal is actually a veritable canvas for the creative software developer. With the cursor as the brush, entire graphical user interfaces can be constructed, or at the very least a basic text-based dashboard on which values can be updated without redrawing the entire screen over and over, and without opting for a much heavier solution like a GUI.

Ncurses is one of the best-known and most portable Terminal User Interface (TUI) libraries, with which such cursor control, and much more, can be achieved in a fairly painless manner. That said, for anyone coming from a graphical user interface framework, the concepts and terminology of ncurses and similar libraries can be confusingly different yet overlapping, so getting started can feel somewhat harrowing.

In this article we’ll take a look at ncurses’ history, how to set it up, and how to use it with C and C++, as well as the many more languages supported via bindings.
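To give a flavor of what that looks like in practice, here is a minimal, self-contained C sketch (our own illustration, not code from the article) that initializes ncurses, draws a small status line, updates it in place, and then restores the terminal:

    #include <ncurses.h>

    int main(void)
    {
        initscr();              /* initialize ncurses and take over the terminal */
        cbreak();               /* hand keypresses over without waiting for Enter */
        noecho();               /* don't echo typed characters to the screen */
        curs_set(0);            /* hide the cursor for a cleaner dashboard look */

        mvprintw(0, 0, "Status: %-12s", "initializing");
        refresh();              /* push pending changes to the real terminal */

        napms(1000);            /* pretend a second of work happens here */
        mvprintw(0, 0, "Status: %-12s", "ready");   /* overwrite just that field */
        mvprintw(2, 0, "Press any key to exit.");
        refresh();

        getch();                /* wait for a keypress */
        endwin();               /* restore the terminal to its normal state */
        return 0;
    }

Building it is typically just a matter of gcc hello.c -lncurses (the file name is ours); the point is that updating one field of the display never requires repainting everything else.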

Continue reading “A Gentle Introduction To Ncurses For The Terminally Impatient”

Crowdsourcing SIGINT: Ham Radio At War

I often ask people: What’s the most important thing you need to have a successful fishing trip? I get a lot of different answers about bait, equipment, and boats. Some people tell me beer. But the best answer, in my opinion, is fish. Without fish, you are sure to come home empty-handed.

On a recent visit to Bletchley Park, I thought about this and how it relates to World War II codebreaking. All the computers and smart people in the world won’t help you decode messages if you don’t already have the messages. So while Alan Turing and the codebreakers at Bletchley are well-known, at least in our circles, fewer people know about Arkley View.

The problem was apparent to the British. The Axis powers were sending lots of radio traffic. It would take a literal army of radio operators to record it all. Colonel Adrian Simpson sent a report to the director of MI5 in 1938 explaining that the three listening stations were not enough. The proposal was to build a network of volunteers to handle radio traffic interception.

That was the start of the Radio Security Service (RSS), which started operating out of some unused cells at a prison in London. The volunteers? Experienced ham radio operators who used their own equipment, at first, with the particular goal of intercepting transmissions from enemy agents on home soil.

At the start of the war, ham operators had their transmitters impounded. However, they still had their receivers and, of course, could all read Morse code. Further, they were probably accustomed to pulling out Morse code messages under challenging radio conditions.

Over time, this volunteer army of hams would swell to about 1,500 members. The RSS also supplied some radio gear to help in the task. MI5 checked each potential member, and the local police would visit to ensure the applicant was trustworthy. Keep in mind that radio intercepts were also done by servicemen and women (especially women), although many of them were engaged in reporting on voice communication or military communications.

Continue reading “Crowdsourcing SIGINT: Ham Radio At War”

Network Infrastructure And Demon-Slaying: Virtualization Expands What A Desktop Can Do

The original DOOM is famously portable — any computer made within at least the last two decades, including those in printers, heart monitors, passenger vehicles, and routers, is almost guaranteed to have a port of the iconic 1993 shooter. The more modern iterations in the series are a little trickier to port, though. Multi-core processors, discrete graphics cards, and gigabytes of memory are generally needed, and it’ll be a long time before something like an off-the-shelf router has all of these components.

But with a specialized distribution of Debian Linux called Proxmox and a healthy amount of configuration, it’s possible to flip this idea on its head: getting a desktop computer capable of playing modern video games to take over the network infrastructure for a LAN instead, all with minimal impact on the overall desktop experience. In effect, it’s possible to have a router that can not only play DOOM but play 2020’s DOOM Eternal, likely with hardware most of us already have on hand.

The key that makes a setup like this work is virtualization. Although modern software makes it seem otherwise, not every piece of software needs an eight-core processor and 32 GB of memory. With that in mind, virtualization software splits a modern multi-core processor into groups of cores that can act as if they were independent computers. These virtual computers, or virtual machines (VMs), can each be assigned not only individual processor cores or groups of them, but also reserved portions of memory and other hardware such as peripherals and disk drives.
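As a rough sketch of how those reservations are expressed (the VM ID, PCI address, MAC address, storage name, and sizes below are all hypothetical), a Proxmox virtual machine is described by a short text file under /etc/pve/qemu-server/, with one line per assigned resource. Carving out half of an eight-core desktop plus the discrete GPU for a gaming VM could look something like this:

    name: gaming-vm
    cores: 4
    cpu: host
    memory: 16384
    bios: ovmf
    machine: q35
    hostpci0: 0000:01:00,pcie=1,x-vga=1
    net0: virtio=DE:AD:BE:EF:00:01,bridge=vmbr0
    scsi0: local-lvm:vm-100-disk-0,size=128G

Here cores and memory set aside four cores and 16 GB of RAM for the guest, while hostpci0 passes the discrete graphics card straight through to it; anything not listed remains available to the host and to the other containers and VMs.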

Proxmox itself is a version of Debian with a number of tools that streamline this process, and it installs on PCs in essentially the same way as any other Linux distribution. Once installed, tools like LXC for containerization, KVM for full-fledged virtual machines, and an intuitive web interface make it easy to quickly set up, deploy, back up, and remove containers and VMs, and even send them to other Proxmox installations.

Continue reading “Network Infrastructure And Demon-Slaying: Virtualization Expands What A Desktop Can Do”