Quake In 276 KB Of RAM

Porting the original DOOM to various pieces of esoteric hardware is a rite of passage in some software circles. But in the modern world, we can get better performance than the 386 processor required to run the 1993 shooter for the cost of a dinner at a nice restaurant — with plenty of other embedded systems blowing these original minimum system requirements out of the water.

For a much tougher challenge, a group from Silicon Labs decided to port DOOM’s successor, Quake, to the Arduino Nano Matter Board platform instead, even though this platform has some pretty significant limitations for a game as advanced as Quake.

To tackle the memory problem, the group started from a port of Quake originally designed for Windows, which let them whittle down memory usage on a modern Windows machine before moving over to the hardware. They do have a flash memory module available as well, but that type of memory carries a speed penalty. To improve speed they did what any true gamer would do with their system: overclock the processor. That got them to around 10 frames per second, which is playable but not particularly enjoyable. Pushing the frame rate further required a much deeper dive, including generating lookup tables instead of computing values at runtime, optimizing some of the original C code, rewriting some functions in assembly, and only refreshing the sections of the screen that actually changed.
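
The write-up doesn’t spell out exactly which tables were generated, but the lookup-table trick itself is simple: pay for the slow math once at startup, then just index an array inside the render loop. Here’s a minimal sketch in C with an illustrative table size and fixed-point format (none of the names below come from the actual port):

    /* Precompute sine once at startup so the hot path never calls libm.
     * Table size and the 16.16 fixed-point format are illustrative only. */
    #include <math.h>
    #include <stdint.h>

    #define TWO_PI         6.28318530717958647692
    #define SIN_TABLE_SIZE 1024              /* power of two for cheap masking */
    #define FRACBITS       16                /* 16.16 fixed point */

    static int32_t sin_table[SIN_TABLE_SIZE];

    void init_sin_table(void)
    {
        int i;
        for (i = 0; i < SIN_TABLE_SIZE; i++)
            sin_table[i] = (int32_t)(sin(i * TWO_PI / SIN_TABLE_SIZE) * (1 << FRACBITS));
    }

    /* angle is a table index; the & replaces an expensive modulo/range check */
    static int32_t fast_sin(uint32_t angle)
    {
        return sin_table[angle & (SIN_TABLE_SIZE - 1)];
    }

On a small microcontroller, trading a few kilobytes of flash or RAM for a table like this is usually a clear win over calling sin() in an inner loop.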

On a technical level, Quake was a dramatic improvement over DOOM, allowing for things like real-time 3D rendering, polygonal models instead of sprites, and much more intricate level design. As a result, ports of this game tend to rely on much more powerful processors than DOOM ports, and pulling off a build on a system with these limitations shows real mastery of the hardware. Other Quake ports we’ve seen, like this one running on an iPod Classic, demand a similar depth of knowledge of the code and a willingness to drop into assembly for optimizations.

Thanks to [Nicola] for the tip!

20 thoughts on “Quake In 276 KB Of RAM”

    1. Wouldn’t mind seeing a deep dive into QuakeC. IIRC it was the first time the game code lived in a separate place from the main engine, which gave modders an order of magnitude more leverage in what they could do (without disassembling the engine). Later you had UnrealScript, and a lot of games started to use Lua (which always seemed preferred over Python because of how small the interpreter is). Anyway, it was a key step towards the modern game engine.

      Also want to know how they are getting around the floating-point requirements. I remember the requirements said 486 and absolutely required a math coprocessor. Games up to that point had been using fixed-point math for speed.

      1. “Also want to know how they are getting around the floating-point requirements. I remember the requirements said 486 and absolutely required a math coprocessor. Games up to that point had been using fixed-point math for speed.”

        To my knowledge, the x87 FPUs could do fixed-point math as well.
        That feature just was seldom used. (There’s a small sketch of 16.16 fixed point after this thread.)

        If only programmers back then had bothered to ship two binaries, pure x86 and x86+x87.
        It wouldn’t have caused much extra work; it was just a matter of setting a compiler switch.

      2. Matches my recollection.
        From Graphics Programming Black Book by Michael Abrash:

        To speed things up a little more, the FDIV to calculate the reciprocal of 1/z is overlapped with drawing 16 pixels, taking advantage of the Pentium’s ability to perform floating-point in parallel with integer instructions, so the FDIV effectively takes only one cycle.
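
For anyone who hasn’t run into it, here’s a minimal sketch of the 16.16 fixed-point arithmetic mentioned in the thread above, the style of integer math DOOM-era engines used so they didn’t need a coprocessor (names are illustrative):

    /* 16.16 fixed point: 16 integer bits, 16 fractional bits, stored in an int32. */
    #include <stdint.h>

    typedef int32_t fixed_t;
    #define FRACBITS 16
    #define FRACUNIT (1 << FRACBITS)          /* 1.0 in fixed point */

    /* widen to 64 bits so the intermediate product doesn't overflow */
    static fixed_t fixed_mul(fixed_t a, fixed_t b)
    {
        return (fixed_t)(((int64_t)a * b) >> FRACBITS);
    }

    /* pre-shift the dividend so the quotient keeps its fractional bits */
    static fixed_t fixed_div(fixed_t a, fixed_t b)
    {
        return (fixed_t)(((int64_t)a << FRACBITS) / b);
    }

    /* example: 1.5 * 2.25 == 3.375, i.e. 0x00036000 in 16.16 */

On a 386 the multiply maps onto a single 32×32→64-bit imul plus a shift, so none of this needs the x87 at all.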

  1. If it runs Quake, then it could be made to run Half-Life or, better yet, the CS 1.6 patch. Just think about it: how cool would it be to play de_dust2 on a microcontroller cheaper than your bog-standard toilet seat?

    1. Ah yes, good ol’ CS. It caused the whole “killer games” debate in my country.
      I remember how my friends urged me to play this game.
      That was in the early 2000s, when the Iraq war was happening and people were dying on the news.
      I remember how sick I got each time I played this game. Ah yes, good bad old memories.

  2. “But in the modern world, we can get better performance than the 386 processor required to run the 1993 shooter for the cost of a dinner at a nice restaurant ”

    The 386 was a milestone, really.
    But the bottleneck back then when running the somewhat primitive DOOM wasn’t the processor, it was the graphics system.

    On DOS, there’s no* graphics acceleration available, and the 386 processor has to push the pixels into video memory, often over a slow ISA bus.
    That’s mainly what slowed things down. (A bare-bones sketch of that pixel pushing follows this thread.)

    In addition, many users were cheap on RAM expansion in the early ’90s, which in turn also affected cache RAM expansion (no cache or a modest 64 KB cache was common).

    But if the processor doesn’t have enough cache and has to rely on slow 30-pin SIMM memory (slow both in access time in ns and in throughput in MB/s), it has no choice but to wait. 🤷
    So it’s not the 386 that was slow here, per se.

    It’s rather that the more popular 486 was in use by the time VLB graphics cards, 72-pin PS/2 SIMMs, and more cache memory had become common.
    The performance increase was also noticeable in other rather dumb applications that didn’t need much computing power, such as GUIs.

    (*With exceptions. PGC, TIGA, 8514/A, XGA video standards and accelerators like Voodoo, S3 and Cirrus chips, ET4000W32, WD90C31, NV1, etc)

    1. I suppose you could run Quake on an older system without graphics by using the ncurses fork; however, I doubt anyone is going to put in the labour necessary for such a poor playing experience.
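
To make the “push the pixels into video memory” point above concrete, here’s a bare-bones VGA mode 13h sketch in Borland-style real-mode DOS C. The dos.h/conio.h helpers (int86, MK_FP, getch) vary between old DOS compilers, so treat this as an illustration rather than portable code:

    /* Mode 13h: 320x200, 256 colors, one byte per pixel at segment A000h. */
    #include <dos.h>
    #include <conio.h>

    #define VGA ((unsigned char far *)MK_FP(0xA000, 0))

    static void set_mode(unsigned int mode)
    {
        union REGS r;
        r.x.ax = mode;            /* AH=00h (set video mode), AL=mode number */
        int86(0x10, &r, &r);
    }

    int main(void)
    {
        unsigned int i;
        set_mode(0x13);
        /* one full "frame" is 64,000 bytes the CPU has to shove across the bus */
        for (i = 0; i < 64000u; i++)
            VGA[i] = (unsigned char)(i & 0xFF);
        getch();                  /* wait for a key so the pattern is visible */
        set_mode(0x03);           /* back to text mode */
        return 0;
    }

Writing those 64,000 bytes every frame, on top of actually rendering the scene, is exactly the traffic a slow ISA-bus VGA card and a cache-starved 386 choke on.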

  3. In my memory Quake was the last of the pure software rasterizers. “Graphics Programming Black Book” by Michael Abrash detailed some of it, including the need for a few divide instructions.

    But Quake got hardware acceleration almost immediately. The first “OpenGL” drivers worked very well, as long as your code used them exactly the same way Quake did.
    And after playing in 640×480, you’re not going back to 320×240… bye bye CPU graphics.

    1. Both 800×600 in 16c (planar) and 640×400 in 256c (one pixel per byte) had been technically possible since 1988, but most DOS programmers stuck to mode 13h (320×200, 256c) all their lives.

      I’m not kidding. The earliest ISA-bus VGA cards with as little as 256 KB of video RAM were capable of this.
      Adventures and simulators were using 800×600 (mode 6Ah, 102h) early on, but 640×400 was rarely used.

      Perhaps because it involved the use of VESA VBE (mode 100h) or direct chipset support on the programmer’s side.
      800×600 in 16c used a planar memory layout like standard VGA does in 640×480 16c (mode 12h); it just used a different video mode number.

      So the game needed a mode list for different VGA cards (Super VGA cards), in addition to the known VBE mode numbers.
      Though at least SVGA/VBE mode 6Ah was kind of popular before VBE was adopted as a whole.

      So yeah, 3D accelerators were all nice, but a strong 486 PC could have done it in software in the early ’90s.
      You can double-check using the PC Player Benchmark; it uses 640×400 256c by default.

      Ok, maybe the frame rate wouldn’t have been stellar on an average PC of the day, but it would have been playable.
      Let’s remember, back in the day, 12 FPS was considered the minimum for smooth gameplay by the games industry.

      So playing 3D games at 7-8 FPS wasn’t too uncommon back then; remember StarWing or Virtua Racing.

      We all had different PCs, and gamers being gamers were always into the latest hardware and never really satisfied.
      To run 3D games truly smoothly, we had to wait for the Pentium, or use an overclocked 486DX2, 486DX4, or 5×86.

      3D games like Magic Carpet, Descent, and Terminal Velocity could do resolutions higher than 320×200 without an accelerator card (though they eventually got patched versions with special support for the S3 ViRGE 325, Voodoo, and other cards).

      1. Software rasterizers don’t scale with resolution. A CPU is not a good device to push memory around.

        I love what Carmack did with Wolfenstein, Doom, and Quake. If the ray-casting concept had been conceived in the ’80s, arcade machines and maybe the Amiga could have made use of it. It was such an awesome idea on his part. I can see how that idea grew out of the Mode X video mode (320×240); it required column-based rendering. (A toy sketch of the ray-casting loop follows these comments.)

        It’s just that after Quake things changed. You could either work 80 hours per week in a game sweatshop or work for ATI/nVidia (or be one of the thousand devs trying to peddle a game engine, sometimes even with a Maya plugin). The magic of having artists and engineers in the same company was gone. Division of labor.
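
Since the ray-casting idea comes up above, here’s a toy, float-based sketch of the core loop: march one ray per screen column through a 2D grid map until it hits a wall, then size the drawn column by the inverse of the hit distance. A real DOS-era engine would use fixed point and an exact cell-by-cell grid step instead of this brute-force march, so this only shows the shape of the algorithm:

    /* Toy ray caster: one ray per screen column through a tiny ASCII map. */
    #include <math.h>
    #include <stdio.h>

    #define PI        3.14159265358979323846
    #define MAP_W     8
    #define MAP_H     8
    #define SCREEN_W  40      /* tiny "screen" so the output fits a terminal */

    static const char map[MAP_H][MAP_W + 1] = {
        "########",
        "#......#",
        "#..##..#",
        "#......#",
        "#......#",
        "#.##...#",
        "#......#",
        "########",
    };

    int main(void)
    {
        double px = 2.5, py = 2.5;       /* player position inside the map */
        double dir = 0.7;                /* view direction in radians */
        double fov = PI / 3.0;           /* 60 degree field of view */
        int col;

        for (col = 0; col < SCREEN_W; col++) {
            double a = dir - fov / 2.0 + fov * col / SCREEN_W;
            double dist = 0.0;

            /* brute-force march until the ray enters a wall cell */
            while (dist < 20.0) {
                dist += 0.01;
                if (map[(int)(py + sin(a) * dist)][(int)(px + cos(a) * dist)] == '#')
                    break;
            }

            /* closer hit -> taller wall column on screen */
            printf("col %2d: dist %5.2f -> wall height %d\n",
                   col, dist, (int)(10.0 / dist));
        }
        return 0;
    }

Wolfenstein-style engines do the grid stepping exactly, one cell at a time and in fixed point, which is what made the idea cheap enough for the CPUs of the day; the brute-force march here is just easier to read.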
