Turning A Cast-Iron Radiator Into A Water-Cooled PC

Bottom of the cast-iron radiator gaming PC during plumbing. (Credit: Billet Labs, YouTube)

Water-cooled PCs all have one thing in common: a radiator somewhere in the loop. Yet nobody said that you can’t build the PC into the radiator itself. Something like a genuine Victorian-era cast-iron radiator, for example. For the folk over at [Billet Labs], this is just your typical project, of course, even if it took a solid three months to make it all work.

Their previous project was also a water-cooled PC, but in the form of a steampunk-esque wall-mounted installation. What differentiates this new build is that it’s trying to be more of a sleeper PC, as long as you ignore some copper tubing and the like running around the outside of this vintage radiator.

Of course, by using a vintage cast-iron radiator like this, you’re also dealing with all the disadvantages of cast iron, such as the countless impurities in the metal and its immense weight. With water in the loop, the entire build comes in at about 99 kilograms, and cleaning the radiator of the particulates released inside it, including rust, was a challenge.

Continue reading “Turning A Cast-Iron Radiator Into A Water-Cooled PC”

Running A Desktop PC Off AA Alkaline Cells

Everyone is probably familiar with the concept of battery-powered devices, but generally, this involves a laptop with a beefy battery pack and hardware optimized for low power draw. You could also do the complete opposite and try to run a desktop PC off alkaline AA cells, as [ScuffedBits] recently did out of morbid curiosity. Exactly how many alkaline cells does it take to run a desktop PC for any reasonable amount of time?

One nice thing about using batteries with a desktop PC is that you can ditch the entire AC-DC power conversion step and instead use a DC-DC adapter like the well-known PicoATX and its many clones. These take a nominal 12 VDC input and tend to tolerate a fairly wide input voltage range, which is useful when your batteries begin to run out of juice. In this case, just above 10 VDC seemed to be the cut-off point for the DC-DC adapter used.

In the end, [ScuffedBits] used what looks like 56 alkaline AA cells connected in a mix of series and parallel, along with two series-connected 6,800 µF, 40 V electrolytic capacitors to buffer spikes in power demand, after early experiments showed that the cells simply cannot deliver power that quickly. Admittedly, the initial thin wiring didn’t help either. With alkaline rather than zinc-carbon AA cells, improved wiring, and some buffer capacitors, it turns out that you can indeed run a desktop PC off AA cells, if only just long enough for a quick game of Minesweeper.
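As a back-of-the-envelope sanity check, the series-parallel arithmetic works out neatly if you assume eight cells per string (for a nominal 12 V) and seven strings in parallel. To be clear, the exact topology, cell capacity, and power draw below are our assumptions for illustration, not figures from the video:

```python
# Rough runtime estimate for a 56-cell alkaline AA pack.
# Assumed layout: 8 cells in series (8 x 1.5 V = 12 V nominal),
# 7 strings in parallel -- the video doesn't state the exact topology.
CELLS_SERIES = 8
STRINGS_PARALLEL = 7
CELL_VOLTAGE = 1.5        # V, nominal alkaline
CELL_CAPACITY_WH = 2.0    # Wh per cell; pessimistic guess at high drain

pack_voltage = CELLS_SERIES * CELL_VOLTAGE
total_cells = CELLS_SERIES * STRINGS_PARALLEL
pack_energy_wh = total_cells * CELL_CAPACITY_WH

pc_draw_w = 60.0          # assumed average draw at the DC input
runtime_min = pack_energy_wh / pc_draw_w * 60

print(f"{total_cells} cells, {pack_voltage:.0f} V nominal")
print(f"~{pack_energy_wh:.0f} Wh -> roughly {runtime_min:.0f} min at {pc_draw_w:.0f} W")
```

Even with these generous assumptions, you get well under two hours before alkaline voltage sag drags the pack below the adapter’s 10 VDC cutoff, which lines up with the short Minesweeper session.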

Amusingly, the small LCD monitor used in the experiment drew so little power that it happily ran on eight NiMH cells for much longer, highlighting just how important power conservation is for battery-powered devices. We wonder if you could marry this project to a battery project we saw and end up with something practically portable.

Continue reading “Running A Desktop PC Off AA Alkaline Cells”

Bionode Is A Hand Truck Transformed Into A Mobile Computing Lab

[Steven K. Roberts] is the original digital nomad, having designed and built mobile computing for his own use since the 80s. His latest project is Bionode, a portable computing lab built into a hand truck that can accommodate a wide spectrum of needs for a person on the go.

Far more than just a portable computer with wheels and a handle, Bionode is an integrated collection of systems with power management, a sensor suite, multiple computers, NAS for storage, networking, video production tools, and even the ability to be solar charged. [Steven] also uses a laptop, and Bionode complements it by being everything else.

If one truly wishes to be mobile and modular as well as effective, then size and weight begin to be just as important as usability. Everything in Bionode has a purpose, and it currently contains a PC with a GPU for local AI and machine learning work, a NAS with 14 TB of storage, an Ubuntu machine, a Raspberry Pi 5 running Home Assistant, another Raspberry Pi 5 for development work, a Raspberry Pi 3 for running his 3D printer, and a Raspberry Pi 4 for SDR (software-defined radio) work. A smart KVM means a single keyboard, mouse, and display can be shared among machines as needed, and additional hardware in a thoughtful layout makes audio and video projects workable. Everything is integrated with sensors and Home Assistant with local AI monitoring, which [Steven] likes to think of as the unit’s nervous system.

Bionode is therefore more than just a collection of computers crammed into a hand truck; it’s a carefully-selected array of hardware that provides whatever [Steven] needs.

Give it a look if you want to see what such a system looks like when it’s been designed and assembled by someone who’s “been there, done that” when it comes to mobile computing. Bionode would complement something like a mobile workshop quite nicely; something [Steven] has also done before.


Thanks [Paul] for the tip!

A screenshot of the world's first 64kB boomer shooter

QUOD Is A Quake-Like In Only 64kB

The demoscene is still alive and well, and the proof is in this truly awe-inspiring game demo by [daivuk]: a Quake-like “boomer shooter” squeezed into a Windows executable of only 64kB, which he calls “QUOD”. We’ve included the full explanation video below, but before you check out all the technical details, consider playing the game. It’ll make his explanations even more impressive.

OK, what’s so impressive? Well, aside from the fact that this is a playable 3D shooter in 64kB, with multiple enemies, multiple levels, oodles of textures, running, jumping, et cetera, it’s so Quake-like that he’s using TrenchBroom to make the levels. Of course, he’s reprocessing them into a more space-efficient, optimized format. Yeah, unlike the famous .kkrieger and a lot of other demos in the 64kB space, this isn’t all procedurally generated. [daivuk] did make his own image editing program for procedurally generated textures, though. Which makes sense: as a PNG, the QUOD logo alone would probably be half the size of the (compressed) executable.

The low-poly models are created in Blender, and all of them are built to be symmetric: having the engine mirror the meshes saves 50% of the vertex data, so Blender only exports half of each low-poly mesh. Just as he wrote his own image editor, he has his own bespoke model tool. This allows tiling model elements, as well as handling bones and poses to keyframe the model’s animation.
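The mirroring trick is simple to illustrate (with made-up data, not [daivuk]’s actual mesh format): store only the vertices on one side of the mirror plane, then reconstruct the full mesh at load time by duplicating each one with its X coordinate negated:

```python
# Reconstruct a symmetric mesh from half its vertex data by mirroring
# across the X=0 plane. Vertices lying on the plane itself are shared
# by both halves, so they're stored once and never duplicated.
def mirror_mesh(half_vertices):
    full = list(half_vertices)
    for x, y, z in half_vertices:
        if x != 0.0:                 # plane vertices stay shared
            full.append((-x, y, z))
    return full

# Toy half-mesh: two on-plane vertices and one off-plane vertex.
half = [(0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (1.0, 0.5, 0.5)]
full = mirror_mesh(half)
```

A real engine would mirror triangle indices and flip winding order too, but the storage win is the same: only off-plane vertices cost bytes twice.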

Audio is treated similarly to textures and meshes: built up at runtime from stored data and a layered series of effects. Once you realize that all the sounds were put together in his sound tool from nothing but square and sine waves, it gets very impressive. He’s also got an old-style tracker to create the music. All of these tools output byte arrays that get embedded directly in the game code.
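In spirit, that runtime synthesis might look something like this toy sketch (our own illustration, not the actual QUOD sound tool): layer a square wave over a sine wave, apply a decay envelope, and quantize the result into a byte array ready for embedding:

```python
import math

SAMPLE_RATE = 8000  # Hz; a real tool would likely use 44100

def synth_blip(freq, duration, mix=0.5):
    """Layer a square and a sine wave at the same frequency, apply an
    exponential decay, and return unsigned 8-bit samples as bytes."""
    n = int(SAMPLE_RATE * duration)
    out = bytearray()
    for i in range(n):
        t = i / SAMPLE_RATE
        sine = math.sin(2 * math.pi * freq * t)
        square = 1.0 if sine >= 0 else -1.0
        sample = (mix * square + (1 - mix) * sine) * math.exp(-6 * t)
        out.append(int((sample * 0.5 + 0.5) * 255))  # map [-1,1] -> [0,255]
    return bytes(out)

blip = synth_blip(440.0, 0.25)  # quarter-second 440 Hz blip
```

Because only the generator parameters need to be stored, each sound costs a handful of bytes in the executable rather than kilobytes of PCM data.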

The video also gets into some of his optimization techniques; we like his use of a map file, analyzed with a Python tool, to find the exact size of game elements and verify his optimizations. One thing he notes is that his optimizations are all for space, not for speed. Except, perhaps, for one thing: [daivuk] created a new language and virtual machine for the game, which seems downright extravagant. It actually makes sense, though, as the virtual machine can be optimized for the limits of the game, as he explains starting at about 20 minutes into the video. Apparently it saved a whole 2kB, which seems like nothing these days but actually let [daivuk] fit an extra level into his 64kB limit. Sure, it’s still bigger than Quake13k (and how did we never cover that?), but you get a lot more game, too.
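The idea of mining a linker map file is easy to sketch. The format below is a simplified, made-up stand-in for what a real linker emits, but the principle is the same: group symbol sizes by object file so you can see exactly where the bytes go:

```python
import re
from collections import defaultdict

# Simplified, made-up map-file lines in "symbol size object" form.
SAMPLE_MAP = """\
level1_data 1800 levels.obj
level2_data 2200 levels.obj
draw_mesh    950 render.obj
vm_step      300 vm.obj
"""

def sizes_by_object(map_text):
    """Sum symbol sizes per object file from a linker map dump."""
    totals = defaultdict(int)
    for line in map_text.splitlines():
        m = re.match(r"(\S+)\s+(\d+)\s+(\S+)", line)
        if m:
            totals[m.group(3)] += int(m.group(2))
    return dict(totals)

report = sizes_by_object(SAMPLE_MAP)
for obj, size in sorted(report.items(), key=lambda kv: -kv[1]):
    print(f"{obj:12} {size:6} bytes")
```

Run before and after a change, a diff of two such reports tells you immediately whether an optimization actually bought back any of the 64kB budget.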

So, to recap: [daivuk] didn’t just make a game with an impressively tiny size on disk, he made the entire toolchain, and a language for it to boot. If you think this is overoptimized, check out Wolfenstein in 600 lines of AWK. Of course, in spite of the 1980s file size, this needs modern hardware to run. You can get surprising graphics performance from a fraction of that, like this ATtiny sprite engine.

Thanks to [Keith Olson] for the tip, which probably took up more than 64kB on our tips line.

Continue reading “QUOD Is A Quake-Like In Only 64kB”

NextSilicon’s Maverick-2: The Future Of High-Performance Computing?

A few months back, Sandia National Laboratories announced they had acquired a new supercomputer. It wasn’t the biggest, but in their eyes it still offered something unique. This particular supercomputer contains NextSilicon’s much-hyped Maverick-2 ‘dataflow accelerator’ chips. Targeting the high-performance computing (HPC) market, these chips are claimed to hold a 10x advantage over the best GPU designs.

NextSilicon Maverick-2 OAM-2 module. (Credit: NextSilicon)

The strategy here appears to be somewhat of a mixture between VLIW, FPGAs and Sony’s Cell architecture, with a dedicated compiler that determines the best mapping of a particular calculation across the compute elements inside the chip. Naturally, the exact details about the internals are a closely held secret by NextSilicon and its partners (like Sandia), so we basically have only the public claims and PR material to go by.

Last year, The Register covered this architecture and took a more in-depth look. What we can surmise from this is that it should perform pretty well for just about all applications, except those that depend on single-threaded performance. Of course, as a dedicated accelerator it cannot do CPU things, which is where NextSilicon’s less spectacular RISC-V-based CPU comes into the picture.

What’s apparent from glancing at the product renders on the NextSilicon site is that these Maverick-2 chips have absolutely massive dies, so they’re certainly not cheap to manufacture. Whether they’ll make more of a splash than Intel’s Itanium or NVIDIA’s brute-force GPU approach remains to be seen.

Cheap Writing Deck Eschews Distractions

A modern computer can be a great productivity tool. It can also be a great source of distractions. To solve that issue, [Quackieduckie] built the e-typer—a device for writing without distraction.

[Quackieduckie] refers to the device as a “low-cost e-ink typewriter,” which lays out the basic mode of operation. It consists of a 4.2-inch e-ink screen combined with an Orange Pi Zero 2W running the Armbian operating system. It’s set up to boot straight into a document editor, so there’s no messing around with other software that could get in the way of productivity. The components are all wrapped up in a tidy 3D-printed housing, which includes a foldable stand so you can prop the screen up wherever you happen to be working. [Quackieduckie] built the device to work with any USB-C keyboard, probably figuring that those eager to maximize productivity will already have the typing device of their dreams on hand. Code for the project is available on GitHub for those eager to replicate the build.
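One common way to get that “boot straight into the editor” behavior on a Linux board is sketched below. This is a generic approach under our own assumptions (a hypothetical `writer` user and `nano` as the editor); the project’s actual setup lives in its GitHub repo:

```shell
# Drop-in at /etc/systemd/system/getty@tty1.service.d/autologin.conf,
# which auto-logs-in the 'writer' user on the first virtual console:
#
#   [Service]
#   ExecStart=
#   ExecStart=-/sbin/agetty --autologin writer --noclear %I $TERM

# Then, appended to ~/.bash_profile for that user: replace the login
# shell with the editor, so quitting the editor logs out cleanly.
[ "$(tty)" = "/dev/tty1" ] && exec nano ~/draft.txt
```

Using `exec` here means there is no shell to fall back to, which is exactly the point for a distraction-free appliance.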

We’ve featured similar builds in the past, often referred to as “writing decks.” They’re becoming increasingly popular as people look for distraction-free, ad-free tech experiences. A great example is this clamshell design with an integrated keyboard. If you’re building your own productivity aids in your home lab, don’t hesitate to notify the tipsline!

Upgrading An Old Macbook With An Old Processor

The Core Duo processor from Intel may not have been the first multi-core processor available to consumers, but it was arguably the one that brought multi-core computing to the masses. Unfortunately, the first Core Duo chips were limited to 32-bit operation at a time when the industry was shifting toward 64-bit. The Core 2 Duo eventually filled this gap, and [dosdude1] recently completed an upgrade he had always wanted to do: replacing the original Core Duo processor in a Macbook Pro with a Core 2 Duo from a dead motherboard.

The upgrade does require a bit more tooling than many of us may have access to, but the process isn’t completely out of reach. It centers around desoldering the donor processor and making sure the motherboard is heated appropriately when removing the old chip and installing the new one. These motherboards had an issue with moisture ingress, which adds a pre-heating step whose omission had been the cause of [dosdude1]’s failures in previous attempts. But with the new chip cleaned up, prepared with fresh solder balls, and placed on the motherboard, it was ready to be soldered into its new home.

Upon booting the upgraded machine, the only hiccup seemed to be that the system wasn’t correctly identifying the clock speed. A firmware update solved this problem, though, and the machine is ready for use. For those wondering why one would do something like this given the obsolete hardware, we’d note that beyond the satisfaction of doing it for its own sake, these older Macbooks are among the few machines that can run free and open firmware, and that Macbooks a decade or more old can easily make excellent Linux machines even given their hardware limitations.

Continue reading “Upgrading An Old Macbook With An Old Processor”