The Vacuum Tube’s Last Stand(s)

When most people think about vacuum tubes, they picture big glass bottles glowing inside antique radios or early computers. History often treats tubes as a dead-end technology that was suddenly swept away by the transistor in the 1950s. But the reality is much more interesting. Vacuum tube technology did not simply stop evolving when the transistor appeared. In fact, some of the most sophisticated and technically impressive tube designs emerged after the transistor had already been invented.

During the final decades of mainstream tube development, manufacturers pushed the technology in remarkable directions. Tubes became smaller, faster, quieter, more rugged, and more specialized. Designers experimented with exotic geometries, ceramic construction, metal envelopes, ultra-high-frequency operation, and even hybrid tube-semiconductor systems. Devices such as acorn tubes, lighthouse tubes, compactrons, and nuvistors represented a last gasp of thermionic electronics.

Ironically, many of these innovations arrived just as solid-state electronics were becoming commercially practical. Vacuum tubes were improving rapidly right up until the market abandoned them.

The Pressure to Improve

By the 1930s and 1940s, vacuum tubes dominated electronics. Radios, radar systems, military communications, industrial controls, and the first digital computers all depended on them. But everyone was painfully aware of their problems.

Traditional tubes were fragile, generated heat, consumed significant power, and suffered from limitations at high frequencies. Internal lead lengths created parasitic inductance and capacitance. At radio frequencies and especially microwave frequencies, those unwanted effects made design difficult.

Continue reading “The Vacuum Tube’s Last Stand(s)”

Hackaday Links: May 10, 2026

While Artemis II was primarily a demonstration flight of the architecture NASA plans to use for future lunar missions, it was also an excellent excuse for the crew to snap some photos of the Moon and Earth with the benefit of modern camera technology. If you’ve been looking forward to seeing more of the crew’s images, you’re in luck, as thousands of new images have recently been released.

Now we don’t mean to beat up on the folks at NASA, but browsing through these images, we couldn’t help but be reminded of an article we saw on PetaPixel that discussed the space agency’s haphazard approach to sharing images online.

It’s really more like an unsorted file dump than anything, made worse by the fact that you have to access it through a government website that looks and performs like it was designed in the early 2000s. There’s even a prominent button that attempts to load a gallery feature that relies on the long-deprecated Adobe Flash. It would be nice to see the situation improved by the time astronauts actually touch down on the lunar surface, but we wouldn’t count on it.

Speaking of old tech, we’ve been following the resurgence of keyboard-equipped smartphones with great interest, as we imagine many of you have been. A recent CNBC article addresses the trend, although it didn’t quite take the nerd contingent into account. We want physical keys so we can work in the terminal and write code without fighting an on-screen keyboard, but of course, that’s not exactly what your average consumer is looking for.

It’s quite the opposite, in fact. A 20-something user referenced in the article explained how the younger generations see the physical keyboard as a way to be less connected to their phones, describing it as “an extra barrier of inconvenience that adds more steps into the thinking process.” If you need us, we’ll be collecting dust in the corner.

Continue reading “Hackaday Links: May 10, 2026”

Hackaday Podcast Episode 369: IR, E-Ink, And Avgas

In this episode, Hackaday Editors Elliot Williams and Tom Nardi start things off by discussing the latest reason that cheap PCB fabrication isn’t quite as cheap as it once was. The conversation then moves on to hacking electronic shelf labels, stylish e-ink status displays, cutting metal at home with high current and a bit of water, a solarpunk message board hiding in an IKEA-style lantern, and pushing NFC out of its comfort zone. From there you’ll hear about matching transistors, taking pictures of the International Space Station, and Linux on the PS5. They’ll wrap up this week’s episode by going over the surprisingly simple concept behind flow batteries, and learning who’s still using leaded gasoline and why.

Check out the links if you want to follow along, and as always, tell us what you think about this episode in the comments!

Direct download in DRM-free MP3.

Continue reading “Hackaday Podcast Episode 369: IR, E-Ink, And Avgas”

This Week In Security: Another Linux Exploit, Ubuntu Knocked Offline, Finals Interrupted, And Backdoored Tools

After last week’s CopyFail vulnerability, which gave root access to any user on almost all distributions, this week we’ve got DirtyFrag. It chains the xfrm-ESP vulnerability from CopyFail with a new vulnerability in an RPC function that allows similar overwriting of the page cache.

Both vulnerabilities manipulate the Linux page cache, where data from disk is stored for rapid access. The kernel will always prefer the cached version of a file, which means that anything able to manipulate the contents of the cache can effectively replace the contents of the file. Both vulnerabilities leverage a similar mechanism – picking a binary that is flagged to run as root, such as su, and replacing the code that would prompt for the user’s password with a launcher that immediately runs a shell.

Like CopyFail, DirtyFrag requires the ability to execute code on the target in the first place, but turning almost any code or command execution vulnerability in a network service into root raises the impact significantly, allowing an attacker to break out of containers and privileged environments, or establish a persistent presence on the system even after the original vulnerability is discovered and closed.

The previous mitigations, which blocked specific kernel modules related to CopyFail, are not sufficient to block the new vulnerabilities. At the time of writing there are no patches available from the distributions; however, the vulnerable kernel modules can be temporarily disabled.
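As a rough sketch of what that temporary mitigation might look like: on most distributions a kernel module can be prevented from loading with a modprobe configuration fragment. The module names below (esp4, esp6) are our own guesses based on the xfrm-ESP description, not names taken from any advisory – check the actual advisory before applying anything like this, and be aware that disabling ESP will break IPsec VPNs that depend on it.

```
# /etc/modprobe.d/disable-esp.conf  (hypothetical file name, guessed module names)
# "install <module> /bin/false" makes any attempt to load the module fail,
# which is stronger than "blacklist", which only stops alias-based autoloading.
install esp4 /bin/false
install esp6 /bin/false
```

Modules that are already loaded still need to be unloaded once, e.g. with sudo modprobe -r esp4 esp6, and the change should of course be reverted as soon as a proper kernel patch lands.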

CopyFail added to KEV

CISA (the United States cybersecurity agency) has added CopyFail to the KEV, or Known Exploited Vulnerabilities list. Vulnerabilities on the KEV have been observed under active exploitation, which in the case of CopyFail is hardly a surprise.

The KEV is designed as a tool to allow security teams in government and commercial industry to prioritize the highest-risk vulnerabilities – or at least to give them another source of data to point at when saying “we really need to patch this now”.

Prolonged Ubuntu DDOS

On the heels of the CopyFail vulnerability impacting almost all distributions, Ubuntu has had to weather a prolonged distributed denial-of-service (DDoS) attack against its main infrastructure. Ars Technica reported on the attack as it began, and after several days, services appear to have been restored. In the meantime, core services such as package updates, the main repositories, and even the Ubuntu and Canonical websites were largely unreachable.

An Iraqi group claims responsibility for the attack, but it is unclear if they were the actual perpetrators – or why. The timing with the CopyFail vulnerability seems like an opportune moment to cause chaos by taking the update mechanisms of a major distribution offline, but in the era of modern Internet behavior, it could also just have been a Tuesday.

Continue reading “This Week In Security: Another Linux Exploit, Ubuntu Knocked Offline, Finals Interrupted, And Backdoored Tools”

Congratulations To The Green Powered Challenge Winners!

For this challenge, we asked you to show off your hacks that power themselves sustainably from the environment around them. After all, nobody likes wires, and changing batteries is just a hassle. What’s better than an autonomous gizmo? Nothing.

Because this is Hackaday, we expected to see some finished-looking projects, some absolutely zany concepts, and basically everything in-between, and you did not disappoint! So without further ado, let’s have a look at the 2026 Green Powered Challenge winners, each of whom will be going on a $150 shopping spree at DigiKey, our contest’s sponsor.

Continue reading “Congratulations To The Green Powered Challenge Winners!”

There’s More To Global Positioning Than Just GPS

The Global Positioning System (GPS) was developed by the United States military in the 1970s, but it wasn’t long before civilians all over the planet started using it. By the early 2000s the technology was popping up in consumer devices such as mobile phones, and since then it’s become absolutely integral to our modern way of life.

But although support for GPS in our gadgets is nearly ubiquitous, it’s not the only option when it comes to figuring out where you are on the globe. As you might imagine, not everyone was thrilled with building their infrastructure around one of Uncle Sam’s pet projects, and so today there are several homegrown regional and global satellite navigation systems in operation.

As a follow-up to our recent dive into the ongoing GPS upgrades, let’s take a look at some of the other satellite positioning systems and who operates them.

Continue reading “There’s More To Global Positioning Than Just GPS”

AI On Every Machine: The LLM You Probably Didn’t Want

It’s been a story of the last week or so, if you follow the kind of news channels a Hackaday scribe does, that Google have quietly installed an LLM as part of the Chrome browser. Reports vary as to when they did this, because there’s plenty of confusion with the cloud-based Gemini features also present in the browser, but it seems Chrome users are noticing its effect through slower performance and hefty disk access. Given that Chrome is by far the most popular web browser, this means that billions of users will have downloaded the four-gigabyte Gemini Nano model, and now have an LLM they didn’t know about. It will be used to provide advanced auto-correct and other text suggestion features that would presumably overburden the online version of Gemini, and since it’s available through a set of in-browser APIs, we expect it will find its way into a lot of websites, online applications, and plugins.

It’s caused a bit of a fuss in some circles, and we think with some justification. When billions of computers unwittingly install an extremely energy-intensive software component, the effect on global power consumption will be significant, with a consequent uptick in the carbon footprint of computing. It’s not a phenomenon restricted to Chrome; Siri, for example, has used a local LLM on Apple devices for a while now. We’ve seen rumblings of discontent and talk of getting European climate regulators involved, but perhaps instead it’s time to have a conversation about local AI models. The key is not whether they are a good thing to have, but when and how they operate.

While many of us are sick to death of AI slop and have not been lured into AI psychosis by an over-reinforcing chatbot, the fact remains that LLMs can do some useful things, they’re here to stay whether we like it or not, and having one under your control on your own computer doesn’t have to be a bad thing. Install Llama.cpp on your machine, and you’ve got an LLM of your very own: your usage data isn’t going to be sold, and your content isn’t going to reinforce the finest plagiarism device the world has ever seen.

Opt-In and Opt-Out

The concerning development with the Chrome LLM is that not only has it been installed without the user’s consent, it runs without their consent too, and they can’t use it for anything except what Google Chrome wants it to be used for. Unlike the Llama.cpp mentioned above, it’s not under their control, instead it’s a compute-hungry monster ultimately controlled by Google. The prospect of a future in which multiple pieces of everyday software install their own similarly out-of-control multi-gigabyte CPU-munchers is a concerning one. Anyone who remembers Microsoft’s Clippy grabbing all the resources in a 1990s desktop as its stuttering animation played its course will know where this is going.

If local LLMs are an inevitability, what’s needed is a way to make them like any other application, one that the user chooses and installs themselves. Such an LLM could make its services available to applications such as a web browser if the user allows it to, but not run unless asked. It’s fairly obvious that installing Llama.cpp or similar is beyond many users, but it shouldn’t lie beyond the bounds of possibility to package something like it as an application they can install.

We know that the previous paragraph is pie-in-the-sky wishful thinking, and that as the person who knows computers in your family your next few Christmases will be spent wrestling with six different LLMs running on some elderly family member’s PC. But perhaps in Clippy lies the answer. If the consumer can learn to associate built-in AI features with their computer grinding to a halt just as they did with an office assistant thirty years ago, then perhaps they’ll demand change. We can hope.