I, Integrated Circuit

In 1958, the American free-market economist Leonard E. Read published his famous essay I, Pencil, in which he made his point about the interconnected nature of free-market economics by following everything, and we mean everything, that went into the manufacture of the humble writing instrument.

I thought about the essay last week when I wrote a piece about a new Chinese microcontroller with an integrated driver for small motors, because a commenter asked me why I was featuring a non-American part. As a Brit, I remarked that it would look a bit silly were I to feature only parts made in dear old Blighty — yes, we do still make some semiconductors! — and it made more sense to feature cool parts wherever I found them. But it left me musing about the nature of semiconductors, and whether it’s possible for any of them to truly only come from one country. So here follows a much more functional I, Chip than Read’s original, trying to work out just where your integrated circuit really comes from. It almost certainly takes great liberties with the details of the processes involved, but the countries of manufacture and extraction are accurate. Continue reading “I, Integrated Circuit”

After 30 Years, Virtual Boy Gets Its Chance To Shine

When looking back on classic gaming, there’s plenty of room for debate. What was the best Atari game? Which was the superior 16-bit console, the Genesis or the Super NES? Would the N64 have been more commercially successful if it had used CDs over cartridges? It goes on and on. Many of these questions are subjective, and have no definitive answer.

But even with so many opinions swirling around, there’s at least one point that anyone with even a passing knowledge of gaming history will agree with — the Virtual Boy is unquestionably the worst gaming system Nintendo ever produced. Which is what makes its return in 2026 all the more unexpected.

Released in Japan and North America in 1995, the Virtual Boy was touted as a revolution in gaming. It was the first mainstream consumer device capable of showing stereoscopic 3D imagery, powered by a 20 MHz 32-bit RISC CPU and a custom graphics processor developed by Nintendo to meet the unique challenges of rendering gameplay from two different perspectives simultaneously.
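The core geometric idea behind rendering those two perspectives can be sketched in a few lines. This is a simplified illustration of stereoscopic disparity, not Nintendo's actual rendering pipeline, and the eye separation, focal length, and screen dimensions below are made-up illustrative values (the per-eye resolution of 384×224 is the real Virtual Boy figure):

```python
# Illustrative sketch of stereoscopic disparity: a point in the scene is
# projected from two horizontally offset eye positions, and the resulting
# screen-space shift between the two images shrinks as depth increases.
# All parameters except the 384-pixel display width are assumed values.

def stereo_disparity(depth_m, eye_separation=0.065, focal_length=0.05,
                     screen_width_m=0.1, screen_width_px=384):
    """Horizontal pixel disparity between the left and right views for a
    point `depth_m` meters away (simple pinhole-camera model)."""
    disparity_m = eye_separation * focal_length / depth_m
    return disparity_m * (screen_width_px / screen_width_m)

near = stereo_disparity(0.5)   # an object half a meter away
far = stereo_disparity(5.0)    # the same object five meters away
assert near > far              # closer objects shift more between the eyes
```

The hardware's job, roughly speaking, was to apply exactly this kind of per-eye horizontal offset to every element of the scene, twice per frame, fast enough to keep both displays in sync.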

In many ways it’s the forebear of modern virtual reality (VR) headsets, but its high cost, small library of games, and the technical limitations of its unique display technology ultimately led to it being pulled from shelves after less than a year on the market.

Now, 30 years after its disappointing debut, this groundbreaking system is getting a second chance. Later this month, Nintendo will be releasing a replica of the Virtual Boy into which players can insert their Switch or Switch 2 console. The device essentially works like Google Cardboard, and with the release of an official emulator, users will be able to play Virtual Boy games complete with the 3D effect the system was known for.

This is an exciting opportunity for those with an interest in classic gaming, as the relative rarity of the Virtual Boy has made it difficult to experience these games in the way they were meant to be played. It’s also reviving interest in this unique piece of hardware, and although we can’t turn back the clock on the financial failure of the Virtual Boy, perhaps a new generation can at least appreciate the engineering that made it possible.

Continue reading “After 30 Years, Virtual Boy Gets Its Chance To Shine”

How Vibe Coding Is Killing Open Source

Does vibe coding risk destroying the Open Source ecosystem? According to a pre-print paper by a number of high-profile researchers, this might indeed be the case based on observed patterns and some modelling. Their warnings mostly center around the way that user interaction is pulled away from OSS projects, while also making starting a new OSS project significantly harder.

“Vibe coding” here is defined as software development that is assisted by an LLM-backed chatbot, where the developer asks the chatbot to effectively write the code for them. Arguably this turns the developer into more of a customer/client of the chatbot, with no requirement for the former to understand what the latter’s code does, just that the generated code does what the chatbot was asked to create.

This also replaces the typically organic selection process for libraries and tooling with whatever was most prevalent in the LLM’s training data. Even for popular projects, visits to their websites decrease as downloads and documentation are replaced by LLM chatbot interactions, reducing the opportunity to promote commercial plans, sponsorships, and community forums. Much of this is also reflected in the plummeting usage of community forums like Stack Overflow.

Continue reading “How Vibe Coding Is Killing Open Source”

Thomas Edison May Have Discovered Graphene

Thomas Edison is well known for his inventions (even if you don’t agree he invented all of them). However, he also occasionally invented things he didn’t understand, so they had to be reinvented later. The latest example comes from researchers at Rice University. While building a replica light bulb, they found that Thomas Edison may have accidentally created graphene while testing the original article.

Today, we know that applying a voltage to a carbon-based resistor and heating it up to over 2,000 °C can create turbostratic graphene. Edison used a carbon-based filament and could heat it to over 2,000 °C.
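A back-of-envelope calculation shows how plausible those temperatures are. Treating the filament as a thin cylinder in vacuum that radiates away all the electrical power fed into it, the Stefan-Boltzmann law fixes its equilibrium temperature. The filament dimensions and power below are assumed illustrative values, not measurements of Edison's actual bulb:

```python
import math

# Back-of-envelope sketch: at steady state, a filament in vacuum radiates
# away the electrical power dissipated in it, so P = eps * sigma * A * T^4
# determines its temperature. Dimensions and power are assumed values.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def filament_temperature(power_w, length_m, diameter_m, emissivity=0.8):
    """Equilibrium temperature in kelvin of a cylindrical filament
    radiating `power_w` watts from its side surface."""
    area = math.pi * diameter_m * length_m  # lateral surface of cylinder
    return (power_w / (emissivity * SIGMA * area)) ** 0.25

# A thin carbonized filament dissipating about 40 W:
t_k = filament_temperature(40, length_m=0.2, diameter_m=0.05e-3)
print(f"{t_k - 273.15:.0f} degrees C")
```

With these rough numbers the estimate lands right around the 2,000 °C mark, which is exactly the regime where turbostratic graphene can form.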

This reminds us of how, in the 1880s, Edison observed current flowing in one direction through a test light bulb that included a plate. However, he thought it was just a curiosity. It would be up to Fleming, in 1904, to figure it out and understand what could be done with it.

Naturally, Edison wouldn’t have known to look for graphene, how to look for it, or what to do with it if he found it. But it does boggle the mind to think about graphene appearing many decades earlier. Or maybe it would still be looking for a killer use. Certainly, as the Rice researchers note, this is one of the easier ways to make graphene.

Building Natural Seawalls To Fight Off The Rising Tide

These days, the conversation around climate change so often focuses on matters of soaring temperatures and extreme weather events. While they no longer dominate the discourse, rising sea levels will nonetheless be a major issue to face as global average temperatures continue to rise.

This poses unique challenges in coastal areas. Municipalities must figure out how to defend their shorelines, or decide which areas they’re willing to lose. The City of Palo Alto is facing just this challenge, and is building a natural kind of seawall to keep the rising tides at bay.

Continue reading “Building Natural Seawalls To Fight Off The Rising Tide”

Ask Hackaday: How Do You Digitize Your Documents?

Like many of you, I have a hard time getting rid of stuff. I’ve got boxes and boxes of weirdo bits and bobs, and piles of devices that I’ll eventually get around to stripping down into even more bits and bobs. Despite regular purges — I try to bring a car-load of crap treasure to local hackerspaces and meetups at least a couple times a year — the pile only continues to grow.

But the problem isn’t limited to hardware components. There’s all sorts of things that the logical part of me understands I’ll almost certainly never need, and yet I can’t bring myself to dispose of. One of those things just so happens to be documents. Anything printed is fair game. Could be the notes from my last appointment with the doctor, or fliers for events I attended years ago. Doesn’t matter, the stacks keep building up until I end up cramming it all into a box and the whole process starts over again.

I’ve largely convinced myself that the perennial accumulation of electronic bric-à-brac is an occupational hazard, and have come to terms with it. But I think there’s a good chance of moving the needle on the document situation, and if that involves a bit of high-tech overengineering, even better. As such, I’ve spent the last couple of weeks investigating digitizing the documents that have information worth retaining so that the originals can be sent along to Valhalla in my fire pit.

The following represents some of my observations thus far, in the hopes that others going down a similar path may find them useful. But what I’m really interested in is hearing from the Hackaday community. Surely I’m not the only one trying to save some storage space by turning piles of paper into ones and zeros.

Continue reading “Ask Hackaday: How Do You Digitize Your Documents?”

The Amazing Maser

While it has become a word, laser used to be an acronym: “light amplification by stimulated emission of radiation”. But there is an even older technology called a maser, which is the same acronym but with light switched out for microwaves. If you’ve never heard of masers, you might be tempted to dismiss them as early proto-lasers that are obsolete. But you’d be wrong! Masers keep showing up in places you’d never expect: radio telescopes, atomic clocks, deep-space tracking, and even some bleeding-edge quantum experiments. And depending on how a few materials and microwave engineering problems shake out, masers might be headed for a second golden age.

Simplistically, the maser is a “lower-frequency laser.” Just like a laser, stimulated emission is what makes it work. You prepare a bunch of atoms or molecules in an excited energy state (a population inversion), and then a passing photon of the right frequency triggers them to drop to a lower state while emitting a second photon that matches the first with the same frequency, phase, and direction. Do that in a resonant cavity and you’ve got gain, coherence, and a remarkably clean signal.
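The gain mechanism described above can be captured in a toy rate-equation model: as long as the inversion is maintained, each photon in the cavity stimulates further emission, so the photon number grows exponentially until the excited population is depleted. The rate constants below are made up purely to illustrate the dynamics, not taken from any real maser:

```python
# Toy rate-equation model of stimulated emission in a cavity. With a
# population inversion, stimulated emission adds photons faster than
# cavity losses remove them; once the inversion is consumed, the gain
# falls below the loss and the photon number decays. All rate constants
# are assumed illustrative values.

def simulate(n_excited=1e6, n_ground=0.0, photons=1.0,
             gain_rate=1e-6, loss_rate=0.05, dt=0.01, steps=5000):
    history = []
    for _ in range(steps):
        stim = gain_rate * n_excited * photons       # stimulated emission rate
        photons += (stim - loss_rate * photons) * dt # gain minus cavity loss
        n_excited -= stim * dt                       # inversion is consumed...
        n_ground += stim * dt                        # ...as atoms drop down
        history.append(photons)
    return history

h = simulate()
assert max(h) > h[0]      # coherent amplification: photon number grew
assert h[-1] < max(h)     # then decayed once the inversion was depleted
```

This simple Euler integration shows the characteristic behavior of any maser or laser: exponential buildup from a single seed photon, followed by saturation when the pump (here, the fixed initial inversion) runs out. Real devices re-pump the inversion continuously to sustain the output.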

Continue reading “The Amazing Maser”