Unlike computer games, which evolved smoothly and continuously along with the hardware that powered them, console games have until very recently been constrained by a generational style of development. Sure, there were games that appeared on multiple platforms, and eventually newer consoles would feature backwards compatibility that allowed them to play select titles from previous generations of hardware. But in many cases, some of the best games ever made were stuck on the console they were designed for.
Now, for those following along as this happened, it wasn’t such a big deal. For gamers, it was simply a given that their favorite games from the Super Nintendo Entertainment System (SNES) wouldn’t play on the Nintendo 64, any more than their Genesis games could run on their Sony PlayStation. As such, it wasn’t uncommon to see several game consoles clustered under the family TV. If you wanted to go back and play those older titles, all you had to do was switch video inputs.
But gaming, and indeed the entertainment world in general, has changed vastly over the last couple of decades. Telling somebody today that the only way they can experience The Legend of Zelda: A Link to the Past is by dragging out some yellowed, thirty-odd-year-old console from the attic is like telling them the only way they can see a movie is by going to the theater.
These days, the expectation is that entertainment comes to you, not the other way around, and it’s an assumption that’s unlikely to change as technology marches on. Just like our TV shows and movies now appear on whatever device is convenient to us at the time, modern gamers don’t want to be limited to their consoles; they also want to play games on their phones and VR headsets.
But that leaves us with a bit of a problem. There are some games which are too significant, either technically or culturally, to just leave in the digital dust. Like any other form of art, there are pieces that deserve to be preserved for future generations to see and experience.
For the select few games that are deemed worth the effort, decompilation promises a sort of digital immortality. As several recent projects have shown, breaking a game down to its original source code can allow it to adapt to new systems and technologies for as long as the community wishes to keep it updated.
Emulation For Most, But Not All
Before we get into the subject of decompilation, we must first address a concept that many readers are likely familiar with already: emulation.
Using a console emulator to play an old game is not entirely unlike running an operating system through a virtual machine, except that the emulator has the added complication of replicating the unique hardware environment a given game was designed to run on. Given a modern computer, this usually isn’t a problem when it comes to the early consoles. But as you work your way through the console generations, the computational power required to emulate their hardware architectures rapidly increases.

The situation is often complicated by the fact that some games were painstakingly optimized for their respective console, making use of little-documented quirks of the hardware. Emulators often employ title-specific routines to try to make these games playable, but they aren’t always 100% successful. Even for games that aren’t particularly taxing, the general rule of emulation is to put performance ahead of accuracy.
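In practice, those title-specific routines often amount to little more than a lookup table keyed on an identifier in the ROM header, with per-title tweaks applied at load time. The sketch below is a generic illustration of the pattern, with made-up IDs and hack names, not any particular emulator’s code.

```c
#include <string.h>

/* Generic illustration of per-title hacks in an emulator core. */
struct game_hack {
    const char *game_id;    /* identifier read from the ROM header */
    void (*apply)(void);    /* quirk-specific workaround */
};

static void hack_tighten_timing(void)       { /* e.g. stricter sync */ }
static void hack_accurate_framebuffer(void) { /* e.g. slow, exact path */ }

/* Made-up IDs and mappings, purely for illustration. */
static const struct game_hack hacks[] = {
    { "GAME01", hack_tighten_timing },
    { "GAME02", hack_accurate_framebuffer },
};

void apply_title_hacks(const char *game_id)
{
    for (size_t i = 0; i < sizeof(hacks) / sizeof(hacks[0]); i++) {
        if (strcmp(hacks[i].game_id, game_id) == 0) {
            hacks[i].apply();
        }
    }
}
```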
Therein lies the key problem with emulation when it comes to preserving games as an artistic medium. While the need for ever-more powerful hardware is a concern, Moore’s Law will keep that largely in check. The bigger issue is accuracy. Simply running a game is one thing; running it exactly how it was meant to run when the developers released it is another story entirely.
It’s fairly common for games to look, sound, and even play slightly differently under emulation than they did on real hardware. In many cases, these issues are barely noticeable to the average player. The occasional sound effect playing out of sync, or a slightly shifted color palette, isn’t enough to ruin the experience. Other issues, like missing textures or malfunctioning game logic, can be bad enough that the game can’t be completed. There are even games, few as they may be, that simply don’t run at all under emulation.
Make no mistake, emulation is good enough for most games. Indeed, both Nintendo and Sony have used emulation in various capacities to bring their extensive back catalogs to newer generations. But the fact remains that there are some games which deserve, and sometimes even require, a more nuanced approach.
Chasing Perfection
In comparison, when a game is decompiled to the point that the community has the original C code it was built from, it’s possible to avoid many of the issues that come with emulation. The game can be compiled as a native executable for modern platforms, taking advantage of all the hardware and software improvements that come with them. It’s even possible to fix long-standing bugs, and generally present the game in its best form.
Those who’ve dabbled in reverse engineering will know that decompiling a program back into usable C code isn’t exactly a walk in the park. While there are automated tools that can get through a lot of the work, plenty of human intervention is still required. Even then, the original code will have been written to take advantage of the console’s unique hardware, so you’ll need to either patch your way around that or develop some kind of compatibility layer that maps the various calls over to something more modern and platform-agnostic. It’s a process that can easily take years to complete.
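As a minimal sketch of that compatibility-layer idea, assuming SDL2 as the host audio backend: the decompiled code keeps calling the same function it always did (here libultra’s osAiSetNextBuffer(), which hands the N64’s Audio Interface a buffer of samples), while the modern build links in a shim that forwards those samples to the host instead. The g_audio_device global and the shim’s internals are illustrative, not taken from any particular project.

```c
#include <stdint.h>
#include <SDL2/SDL.h>

typedef int32_t s32;    /* libultra-style fixed-width types */
typedef uint32_t u32;

extern SDL_AudioDeviceID g_audio_device; /* opened during platform init */

/* Same name and signature the decompiled game code expects. On real
 * hardware this pointed the Audio Interface's DMA at a buffer in RDRAM;
 * here we simply queue the PCM samples on the host's audio device. */
s32 osAiSetNextBuffer(void *vaddr, u32 nbytes)
{
    return (SDL_QueueAudio(g_audio_device, vaddr, nbytes) == 0) ? 0 : -1;
}
```

With enough of these shims in place, the game code itself can remain largely untouched, which is part of what keeps such ports faithful to the original.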
Because of this, decompilation efforts tend to be limited to the most critically acclaimed titles. For example, in 2021 we saw the first efforts to fully reverse engineer The Legend of Zelda: Ocarina of Time. Released in 1998 on the N64, it’s often hailed as one of the greatest video games ever made. Although the effort started with Ocarina, by 2024 the lessons learned during that project had led to the development of tools that can help decompile and reconstruct other N64 games.
Games as Living Documents
For the most part, an emulated game works the same way it did when it was first released. Of course, the emulator has full control over the virtual environment the game is running in, so there are a few tricks it can pull. Features such as cheats and save states are therefore common in most emulators. It’s even possible to swap out the original graphical assets for higher-resolution versions, which can greatly improve the look of some early 3D games.
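Save states in particular fall out almost for free, since the entire console is just data the emulator owns; a snapshot is little more than a copy of that state. Below is a deliberately simplified sketch with a made-up machine structure, not the layout of any real emulator.

```c
#include <stdint.h>
#include <string.h>

/* Toy stand-in for an emulated console's complete state. */
struct machine {
    uint32_t cpu_regs[32];
    uint8_t  ram[4 * 1024 * 1024];
    /* ...video, audio, and controller state would live here too... */
};

static struct machine live;      /* the running console */
static struct machine snapshot;  /* the save-state slot */

void save_state(void) { memcpy(&snapshot, &live, sizeof live); }
void load_state(void) { memcpy(&live, &snapshot, sizeof live); }
```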
But what if you wanted to take things further? That’s where having the source code makes all the difference. Once you’ve gotten the game running perfectly, you can create a fork that starts adding new features and quality-of-life improvements. As an example, the decompilation of Animal Crossing for the GameCube will allow developers to expand the in-game calendar beyond the year 2030, though it’s a change that will be implemented in a “deluxe” fork of the code so as to preserve how the original game functioned.
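As a purely hypothetical illustration of that pattern (the constant, flag, and function names below are made up, not taken from the actual Animal Crossing project), the extended behavior can be fenced behind a build-time switch, so that building without it reproduces the original game unchanged:

```c
/* Hypothetical sketch: the original limit stays the default, and the
 * "deluxe" fork opts into the extended calendar at build time. */
#ifdef DELUXE_BUILD
#define CALENDAR_MAX_YEAR 2099  /* fork: extended playable range */
#else
#define CALENDAR_MAX_YEAR 2030  /* original game's upper bound */
#endif

static int clamp_calendar_year(int year)
{
    return (year > CALENDAR_MAX_YEAR) ? CALENDAR_MAX_YEAR : year;
}
```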
At this point you’re beyond preservation, and you’ve turned the game into something that doesn’t just live on, but can actually grow with new generations of players.
Citation very much needed.
Foreword: I’ve worked in the emulation scene for the better part of 24 years at this point, and have even had some of my efforts featured here on HaD.
If you’re running some ancient emulator from the MS-DOS days, sure, definitely, but it ain’t 1999 anymore, and there are countless people pouring countless hours into delayering chips and figuring out how things worked on a per-cycle level.
The vast majority of consoles prior to the PSX/N64 era have that completely in the bag and have done for quite a few years at this point.
Arcade games languished for years, but the advent of FPGA-based emulation has given the delayering and silicon-reverse-engineering community a shot in the arm. Quite a few of those individuals (Jotego, Furrtek) contribute their findings back to projects like MAME, which then incorporate that information to make these various systems look-, sound-, and gameplay-identical to the originals as well.
It’s a particularly outlandish thing to say when heavily citing N64 game decompilation, as the N64 had certain graphical capabilities that simply couldn’t be replicated on commodity PC GPUs until the advent of programmable pixel shaders: whenever Mario is wearing the vanish cap, for example, the alpha channel of his model is modulated by LFSR-supplied noise, which is what produces the randomly-moving pixelation.
Unless the shim renderer provided by a decompilation goes as far as to identify certain RDP color-combiner setups in order to use specific shader setups on the user’s GPU, the most likely scenario is that the noise amplitude will simply be treated as a flat alpha-modulation value, just as it is in Nintendo’s own Virtual Console emulators.
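To make that concrete, here’s roughly the difference, with a generic 16-bit LFSR standing in for the RDP’s actual noise source (the polynomial here is not the real one):

```c
#include <stdint.h>

/* Generic 16-bit Galois LFSR (taps 16,14,13,11); a stand-in for the
 * RDP's noise generator, not its actual polynomial. */
static uint16_t lfsr_state = 0xACE1u;

static uint16_t lfsr_next(void)
{
    uint16_t lsb = lfsr_state & 1u;
    lfsr_state >>= 1;
    if (lsb) {
        lfsr_state ^= 0xB400u;
    }
    return lfsr_state;
}

/* Hardware-style: every pixel samples fresh noise, producing the
 * shimmering vanish-cap pixelation. */
uint8_t vanish_alpha_hw(uint8_t base_alpha)
{
    return (uint8_t)((base_alpha * (lfsr_next() & 0xFFu)) >> 8);
}

/* Naive-renderer style: the noise input collapses to one constant,
 * so the whole model just turns uniformly translucent. */
uint8_t vanish_alpha_flat(uint8_t base_alpha)
{
    return (uint8_t)(base_alpha >> 1); /* a fixed 50% modulation */
}
```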
Beyond that, for games which relied heavily on overall system timing (Rare Ltd.’s Blast Corps on the N64 is notorious for running too fast in emulation, as it is effectively only throttled by slowdown on a real N64), decompilation without corresponding alteration gets you effectively nothing. The code by necessity would have to be altered in order to not run even more out-of-control-fast on a modern PC.
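Even the most minimal port has to bolt on something like the frame limiter below (SDL2 shown purely as an example host API, with a ~60 Hz target assumed):

```c
#include <SDL2/SDL.h>

#define TARGET_FRAME_MS (1000 / 60)  /* assume a ~60 Hz original */

/* The original game ran as fast as the console let it, so a native
 * build must supply its own throttle around each frame. */
void run_frame_throttled(void (*game_frame)(void))
{
    Uint32 start = SDL_GetTicks();

    game_frame(); /* one tick of the decompiled game loop */

    Uint32 elapsed = SDL_GetTicks() - start;
    if (elapsed < TARGET_FRAME_MS) {
        SDL_Delay(TARGET_FRAME_MS - elapsed); /* sleep off the surplus */
    }
}
```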
Decompilation is a cool thing in and of itself. You don’t need to make it out to be something that it’s not, Tom.
I mean, I’ve personally seen it. I don’t know many people who have used emulators and not run into this problem…
It usually gets better with iteration and improvements to the emulators themselves, but it’s obviously a thing.
Nesticle vs modern NES emulators, or MAME2000 vs newer versions. Not all emulated hardware registers will hold the same values at a given moment across various emulators, even though they ideally should line up. DK on MAME2000, for instance, is not identical to the hardware, to such a degree that it isn’t usable for world records.
i’m not sure why you’re oppositional on this…of course emulators aren’t perfect. they’re very good, and i have tremendous respect, even awe, for what they have accomplished and how they have improved. but i have yet to see a game that wasn’t palpably different in emulation.
and of course decompiled code will need modification. it says so in the article! the point of decompiling it is that these modifications are easier, and the limitations on what kind of modifications are possible almost evaporate.
“I don’t know why you’re oppositional [to counterfactual statements]”? That’s really the tack you’re going with here, Gregory?
As much as I’ve tried, emulator filters just don’t look quite the same as a fuzzy CRT for NES or the ancient LCDs of my Gameboys.
As for sound, the perfect waveforms of emulators through hi-fi DACs just don’t have the same feeling as the scratchy analog output of a NES or a Gameboy’s tinny little speaker.
And as for play, even the best reproduction controllers haven’t been able to capture that feeling. For GameCube games I got a USB adapter so I could use original controllers, but it isn’t quite right. I think it’s latency or something, but while playing on hardware feels so fluid and natural, emulated feels… off somehow.
I’ve also never really gotten the same feeling with controllers. But I realized about six months ago that it’s the latency in the controller that’s the culprit. I found a chart where they had tested the latency of different kinds of controllers, and realized that one I had, connected over USB, was much faster. So I tested it on a MiSTer system, and realized that I’m very susceptible to controller latency.
And I don’t have the fastest controllers.
But it’s very hard to verify this (it may be your computer or emulator that introduces the latency), as you need extra hardware for that. And you can’t check what poll rate the controller uses.
I just hope that we can get good tools and good guides for setting up decomp projects.
Gave it a shot once but didn’t get far, and looking at other projects didn’t help much because they wouldn’t even compile (even the Docker containers, which you’d expect to be rather locked down).
I have some understanding of assembly and C/C++, but need some help with understanding how to strip out the assets, a bit of how the compiled ROMs are structured, and a good setup for quickly compiling and comparing code.
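Something like the toy sketch below is about as far as I got: dumping regions by hand from a hand-written (offset, size) table, with both the filenames and offsets made up. What I’m missing is how real projects derive those tables (apparently tools like splat generate them from a config) and wire them into the build.

```c
#include <stdio.h>
#include <stdlib.h>

struct asset { const char *name; long offset; long size; };

/* Made-up asset map; real projects generate something like this from
 * a config file instead of hardcoding it. */
static const struct asset assets[] = {
    { "title_screen.bin", 0x10A000, 0x8000 },
    { "bgm_main.seq",     0x1F2000, 0x3400 },
};

int main(void)
{
    FILE *rom = fopen("game.z64", "rb");
    if (!rom) { perror("game.z64"); return 1; }

    for (size_t i = 0; i < sizeof(assets) / sizeof(assets[0]); i++) {
        char *buf = malloc(assets[i].size);
        FILE *out = fopen(assets[i].name, "wb");
        if (!buf || !out) { return 1; }

        /* Copy one asset's region out of the ROM into its own file. */
        fseek(rom, assets[i].offset, SEEK_SET);
        fread(buf, 1, assets[i].size, rom);
        fwrite(buf, 1, assets[i].size, out);

        fclose(out);
        free(buf);
    }
    fclose(rom);
    return 0;
}
```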
just reminds me how much i loved the experience of running quake3 on modern hardware. the availability of source code, wherever it comes from, really is a huge boon for keeping things current
Upload the binaries to chatGPT and ask it for the source?
Might not be far off from AI at least being helpful for decomp, though nothing close to uploading binaries directly.
Like if you took messy decompiled C code and fed it in, using it to help name variables and unwrap what functions are doing, it might manage some of that and speed things up a tad.
Be careful, because software copyright lasts up to 75 years iirc
https://www.copyright.gov/help/faq/faq-duration.html
Life of the copyright holder plus 70 years.
Well, Nintendo, Sony, and Microsoft ain’t disappearing no time soon 🤷‍♂️