Zork Running On 4-Bit Intel Computer

Long before DOOM earned its reputation for running on nearly anything, the game with that distinction was Zork. This was a text-based adventure game first published in the late 70s that could run on a number of platforms thanks to a virtual machine that interpreted the game code. This let the programmers write a new VM for each platform rather than porting the game itself every time. [smbakeryt] wanted to see how far he could push this design, and got the classic game running on one of the oldest microprocessors ever produced.

The computer in question is built around the ubiquitous Intel 4004, the first commercially available general-purpose microprocessor. This was a four-bit machine, and it predates the release of Zork by about eight years. As discussed earlier, though, the only thing needed to get Zork running on any machine is a Z-machine implementation for that platform, so [smbakeryt] got to work. He’s working with a Heathkit H9 terminal, and the main limitation here is the amount of RAM needed to run the game. He was able to extend the address bus to increase the available memory in hardware, but getting the Z-machine running in software took some effort as well. There are a number of layers of software abstraction here, which is a bit surprising for 70s-era computing but makes this an extremely interesting challenge and project.
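The real Z-machine has a far richer instruction set and memory model than anything shown here, but the core idea, a portable game encoded as bytecode with only a small interpreter rewritten per platform, can be sketched with a toy VM. The opcodes and encoding below are invented for illustration:

```python
# Minimal sketch of the VM idea behind the Z-machine: the game is data
# (bytecode), and only this small interpreter loop must be rewritten for
# each platform. The three opcodes here are invented for illustration.

PRINT, GOTO, HALT = 0, 1, 2

def run(bytecode):
    """Interpret a list of (opcode, operand) pairs; return printed output."""
    output = []
    pc = 0  # program counter
    while True:
        op, arg = bytecode[pc]
        if op == PRINT:      # emit a string literal
            output.append(arg)
            pc += 1
        elif op == GOTO:     # unconditional jump to instruction index
            pc = arg
        elif op == HALT:
            return output

program = [
    (PRINT, "West of House"),
    (GOTO, 3),
    (PRINT, "never reached"),
    (HALT, None),
]

print(run(program))  # → ['West of House']
```

Porting this "game" to a new machine means reimplementing only `run()`, which is exactly why Infocom's titles spread across so many incompatible systems.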

As far as [smbakeryt]’s goal of finding the “least amount of computer” that would play Zork, we’d have a hard time thinking of anything predating the 4004 that would have any reasonable user experience, but we’d always encourage others to challenge this thought and [smbakeryt]’s milestone. Similarly, DOOM has a history of running on machines far below the original recommended minimum system requirements, and one of our favorites was getting it to run on the NES.

Continue reading “Zork Running On 4-Bit Intel Computer”

A Steam Machine Clone For An Indeterminate But Possibly Low Cost

For various reasons, crypto mining has fallen by the wayside in recent years. Partially because it was never useful as anything other than a speculative investment, and partially because other speculative investments have been more popular lately, all kinds of old mining hardware are available at bargain prices. One of those is the ASRock AMD BC250, which is essentially a cut-down PlayStation 5 but has almost everything a gaming PC needs to run Steam, and [ETA PRIME] shows us how to get this system set up.

The first steps are to provide the computer with power, an SSD, and a fan for cooling. It’s meant to live in a server rack, so this part at least is pretty straightforward. After getting it powered up there are a few changes to make in the BIOS, mostly related to memory management. [ETA PRIME] is using Bazzite as an operating system, which helps get games up and running easily. It plays modern games, even AAA titles, at respectable resolutions and framerates almost out of the box, which perhaps shouldn’t be surprising since this APU pairs a six-core Zen 2 processor with a fairly powerful RDNA 2 GPU, all on one board.

It’s worth noting that this build is a few weeks old now, and the video has gotten popular enough that the BC250 cards [ETA PRIME] was able to find for $100 are reported to be much more expensive now. Even at double or triple the price, though, this might still be an attractive option for a self-contained, fun, small computer that lets you game relatively easily and resembles the Steam Machine in concept. There are plenty of other builds based on old mining hardware as well, so don’t limit yourself to this one popular piece of hardware. This old mining rig, for example, made an excellent media server.

Continue reading “A Steam Machine Clone For An Indeterminate But Possibly Low Cost”

Low-Cost, Portable Streaming Server

Thanks to the Raspberry Pi, we have easy access to extremely inexpensive machines running Linux that have all kinds of GPIO as well as various networking protocols. And as the platform has improved over the years, we’ve seen more demanding applications on them as well as applications that use an incredibly small amount of power. This project combines all of these improvements and implements a media streaming server on a Raspberry Pi that uses a tiny amount of energy, something that wouldn’t have been possible on the first generations of Pi.

Part of the reason this server uses so little power, coming in at around two watts, is that it’s based on the Pi Zero 2 W. It’s running a piece of software called Mini-Pi Media Server, which turns the Pi into a DLNA server capable of streaming media over the network, in this case over WiFi. Samba is used to share files, and Cockpit is onboard for easy web administration. In testing, the server was capable of streaming video to four wireless devices simultaneously, all while plugged into a small USB power supply.
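The post doesn’t detail Mini-Pi Media Server’s internals, but a DLNA server on a Pi typically boils down to a small daemon and a short config. As a hypothetical sketch using the common minidlna daemon (the paths and name here are assumptions, not taken from the project):

```ini
# /etc/minidlna.conf -- illustrative fragment, paths are assumptions
media_dir=V,/media/storage/video   # V restricts this directory to video
media_dir=A,/media/storage/music   # A restricts this one to audio
friendly_name=PiStreamer           # name clients see on the network
port=8200                          # minidlna's default HTTP port
inotify=yes                        # pick up newly added files automatically
```

Clients like smart TVs and phone apps discover the server automatically over the LAN, which is what makes DLNA such a good fit for a headless, low-power box like this.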

For anyone who wants to try this out, the files as well as instructions are available on a GitHub page. We can think of a number of ways this would be useful over a more traditional streaming setup, specifically in situations where power demand must stay low, such as on a long car trip or while off grid. We also don’t imagine the Pi will be doing much transcoding or 4K streaming given its power and processing limitations, but expecting that from hardware like this would be unreasonable anyway; for that you’d need something more powerful.

Continue reading “Low-Cost, Portable Streaming Server”

The Many-Sprites Interpretation Of Amiga Mechanics

The invention of sprites triggered a major shift in video game design, enabling games with independent moving objects and richer graphics despite the limitations of early video gaming hardware. As a result, hardware was specifically designed to manipulate sprites, and generally each new generation could display more of them. But [Coding Secrets], who published games for the Commodore Amiga, used an interesting method to get this system to display far more sprites at once than the hardware officially supported.

This hack is demonstrated with [Coding Secrets]’s first published game on the Amiga, Leander. Normally the Amiga can only display up to eight sprites at once, but there is a coprocessor in the computer, the Copper, that allows sprites to be re-drawn in different areas of the screen. It can wait for specific vertical and horizontal beam positions and then execute instructions, such as writes to the sprite hardware registers. This doesn’t allow unlimited sprites, but as long as no more than eight appear on any given scanline the effect is much the same. [Coding Secrets] used this trick to draw the information bar with sprites, as well as many backgrounds, all simultaneously with the characters and enemies we’d normally recognize as sprites.
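The Copper itself executes WAIT/MOVE instructions rather than running code like this, but the underlying multiplexing idea, reusing the same eight hardware sprite channels for objects that never share a scanline, can be sketched in Python. The data structures here are invented for illustration:

```python
# Sketch of sprite multiplexing: objects that don't overlap vertically can
# share one of the eight hardware sprite channels. On a real Amiga, a Copper
# list would WAIT for a beam position past the previous sprite and then MOVE
# new data into that channel's registers.

def assign_channels(sprites, channels=8):
    """sprites: list of (top, bottom) vertical extents, in scanlines.
    Returns {sprite_index: channel}, or raises if too many overlap."""
    ordered = sorted(enumerate(sprites), key=lambda s: s[1][0])
    free_at = [0] * channels          # scanline at which each channel frees up
    assignment = {}
    for idx, (top, bottom) in ordered:
        # find a channel whose previous sprite has already ended
        for ch in range(channels):
            if free_at[ch] <= top:
                free_at[ch] = bottom + 1
                assignment[idx] = ch
                break
        else:
            raise RuntimeError("more than 8 sprites on one scanline")
    return assignment

# Ten sprites that each overlap only their neighbors: two channels suffice,
# alternating 0, 1, 0, 1, ... down the screen.
extents = [(i * 10, i * 10 + 15) for i in range(10)]
print(assign_channels(extents))
```

This is why the status bar and backgrounds could all be sprites at once: as long as any single scanline crosses at most eight of them, the hardware never notices.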

Of course, using built-in hardware to do something the computer was designed to do isn’t necessarily a hack, but it does demonstrate how intimate knowledge of the system could result in a much more in-depth and immersive experience even on hardware that was otherwise limited. It also wasn’t free to use this coprocessor; it stole processing time away from other tasks the game might otherwise have to perform, so it did take finesse as well. We’ve seen similar programming feats in other gaming projects like this one which gets Tetris running with only 1000 lines of code.

Continue reading “The Many-Sprites Interpretation Of Amiga Mechanics”

Measuring Caffeine Content At Home

By far, the most widely used psychoactive substance in the world is caffeine. It’s farmed around the world, most commonly from coffee plants, tea plants, and cacao trees, but it is also found in less common plants like the yaupon holly in the southeastern United States and yerba maté, another holly, in South America. For how common it is and how long humans have been consuming it, it’s always been a bit difficult to quantify exactly how much is in any given beverage, but [Johnowhitaker] has a solution to that.

This build uses a technique called thin-layer chromatography, which separates the components of a mixture by letting them travel at different rates across a thin adsorbent layer, carried by a solvent. Different components end up at different places, allowing them to be measured individually. In this case the solvent is ethyl acetate, and when samples of various beverages are run on a thin strip, the caffeine moves to a predictable location and shows up as a dark smudge under UV light. The smudge’s dimensions can then be measured and compared against known reference samples to estimate the caffeine quantity.
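Comparing a smudge against references is essentially a calibration-curve fit: measure spots of known caffeine mass, fit a line, and invert it for the unknown. A minimal sketch of that math follows; all the numbers are invented for illustration, not taken from [Johnowhitaker]’s data:

```python
# Sketch of TLC quantification by calibration curve: fit spot size against
# known caffeine mass for reference samples, then invert the fit to estimate
# an unknown. All numbers here are invented for illustration.

def fit_line(xs, ys):
    """Least-squares slope and intercept for y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

known_mg  = [10, 20, 40, 80]       # caffeine in the reference spots
spot_area = [1.1, 2.0, 3.9, 8.1]   # measured smudge areas (arbitrary units)

m, b = fit_line(known_mg, spot_area)
unknown_area = 3.0
estimated_mg = (unknown_area - b) / m   # invert the calibration line
print(round(estimated_mg, 1))           # → roughly 30 mg
```

Real quantitative TLC would use spot intensity (densitometry) rather than raw area alone, but the fit-and-invert logic is the same.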

Although this build does require a few specialized compounds and some equipment, it’s by far a simpler and less expensive way of figuring out how much caffeine is in a product than methods like high-performance liquid chromatography or gas chromatography, both of which can require extremely expensive setups. Plus, [Johnowhitaker]’s results match both the pure reference samples and the amounts reported for various beverages, so he’s pretty confident in his experimental results on beverages that don’t provide that information directly.

If you need a sample for your own lab, we covered a method on how to make pure caffeine at home a while back.

Continue reading “Measuring Caffeine Content At Home”

Modernizing A Classic Datsun Engine

Although Nissan has been in the doldrums ever since Renault took a controlling stake around the turn of the millennium, it once had a reputation as a car company on the cutting edge of technology. Nissan was generally well ahead of its peers in bringing technologies like variable valve timing, turbocharging, fuel injection, and adjustable suspension to affordable, reliable vehicles meant for everyday use. Of course, a lot of this was done before computers were as powerful as they are today, so [Ronald] set out to modernize some of these features on his 1978 Datsun 280Z.

Of course, there are outright engine swaps that could bring a car like this up to semi-modern standards of power and efficiency, but he wanted to keep everything fully reversible in case he ever reverts to stock, and didn’t want to touch the engine’s internals. The first step was to remove the complicated mechanical throttle linkage and replace it with an electronic throttle body, a fly-by-wire system, and a more powerful computer. The next step was removing the distributor-based ignition in favor of individual coil packs and electronic ignition control, also managed by the new computer. This was perhaps the most complicated part of the build, as it involved a custom-made hall effect sensor on the original distributor shaft to tell the computer where the engine was in its rotation.
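The actual ECU firmware isn’t published in the write-up, but the basic job of a crank-position input, turning hall-sensor pulse timestamps into engine speed and an estimated crank angle, can be sketched like this. One pulse per revolution is an assumption here for simplicity; real setups often use toothed wheels with many pulses per turn:

```python
# Sketch of deriving engine speed and angle from hall-effect pulse
# timestamps, assuming one pulse per crankshaft revolution. A real ECU
# would do this in an interrupt handler against a hardware timer.

def rpm_from_pulses(timestamps_ms):
    """Engine speed from the last two pulse times (milliseconds)."""
    period_ms = timestamps_ms[-1] - timestamps_ms[-2]  # ms per revolution
    return 60_000.0 / period_ms

def crank_angle(now_ms, last_pulse_ms, rpm):
    """Estimated crank angle (degrees), extrapolated from the last pulse."""
    revs_per_ms = rpm / 60_000.0
    return (360.0 * revs_per_ms * (now_ms - last_pulse_ms)) % 360.0

pulses = [0, 20, 40]               # 20 ms per revolution -> 3000 RPM
rpm = rpm_from_pulses(pulses)
angle = crank_angle(45, 40, rpm)   # 5 ms after a pulse at 3000 RPM -> 90 deg
print(rpm, angle)
```

With speed and angle in hand, the computer can schedule coil dwell and spark at the right moment, which is exactly what the coil packs replacing the distributor require.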

The final part of this engine modernization was upgrading the fuel delivery. The original fuel injection system fired all of the injectors together in batches regardless of which cylinder actually needed fuel, wasting some of it, but the new sequential system fires each injector only when its cylinder needs fuel. This ended up improving gas mileage dramatically, and dyno tests showed these modifications improved power significantly as well. Nissan hasn’t been completely whiffing since the Renault takeover, either. Their electric Leaf was one of the first mass-market modern EVs and is hugely popular in all kinds of projects, like this build which uses a Leaf powertrain in a Nissan Frontier.
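The batch-versus-sequential distinction is easy to illustrate: with a known firing order, the computer fires only the injector for the cylinder currently due for fuel. A simplified sketch for an inline-six follows; the even 120-degree spacing is an idealization, and this is not [Ronald]’s actual code:

```python
# Sketch of sequential injection scheduling: instead of batch-firing all
# injectors, pick the one cylinder due for fuel at the current point in the
# 720-degree four-stroke cycle. 1-5-3-6-2-4 is the classic inline-six firing
# order; six events over 720 degrees gives one every 120 degrees.

FIRING_ORDER = [1, 5, 3, 6, 2, 4]

def injector_for_angle(cycle_angle_deg):
    """Cylinder to inject for, given an angle 0-719 into the cycle."""
    slot = int(cycle_angle_deg % 720) // 120
    return FIRING_ORDER[slot]

for angle in (0, 120, 240, 360, 480, 600):
    print(angle, "-> cylinder", injector_for_angle(angle))
```

The old batch system would have opened all six injectors at several of these points; firing one at a time is where the fuel savings come from.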

Continue reading “Modernizing A Classic Datsun Engine”

Different Algorithms Sort Christmas Lights

Sorting algorithms are a common exercise for new programmers, and for good reason: they introduce many programming fundamentals at once, including loops and conditionals, arrays and lists, comparisons, algorithmic complexity, and the tradeoffs between simplicity and performance. As a fun Christmas project, [Scripsi] set out to implement twelve different sorting algorithms over twelve days, using Christmas lights as the sorting medium.

The lights in use here are strings of WS2812 addressable LEDs, with the program assigning a random hue to each light in the string. From there, an RP2040-based board steps through the array of lights and runs the day’s sorting algorithm of choice. When the algorithm is operating on an element, that light’s saturation is turned all the way up, showing exactly what it’s doing at any given moment. When the sort finishes, the microcontroller randomizes the lights and starts the process all over again.
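The visualization pattern, running one comparison at a time and highlighting the active element, maps naturally onto a generator that yields after every step. A minimal sketch in Python rather than the actual RP2040 firmware, with hues as plain numbers instead of LED colors:

```python
# Sketch of a visualized sort: yield the array plus the active index after
# every step, so a display loop can repaint the LEDs and boost the active
# light's saturation. Bubble sort shown; any in-place sort works the same way.

def bubble_sort_steps(hues):
    hues = list(hues)
    for end in range(len(hues) - 1, 0, -1):
        for i in range(end):
            yield hues, i               # display hook: highlight light i
            if hues[i] > hues[i + 1]:
                hues[i], hues[i + 1] = hues[i + 1], hues[i]
    yield hues, None                    # finished: nothing highlighted

for state, active in bubble_sort_steps([200, 40, 310, 90]):
    pass                                # a real build would repaint LEDs here
print(state)                            # → [40, 90, 200, 310]
```

Swapping in a different algorithm for each of the twelve days only means replacing the generator; the display loop never changes.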

For each of the twelve days of Christmas, [Scripsi] has chosen one of their twelve favorite sorting algorithms. There are a few oddballs like Bogosort, a guess-and-check algorithm that might never sort the lights before next Christmas (although if you want to try to speed it up, you can always try an FPGA), alongside some old favorites and some more esoteric choices. It’s a great way to visualize how sorting algorithms work, learn a bit about programming fundamentals, and get into the holiday spirit as well.