A New Mechanical Keyboard For An Old Computer

As computers age, a dedicated few work towards keeping some of the more interesting ones running. This is often a losing battle of sorts, as the relentless march of time comes for us all, human and machine alike. So as fewer and fewer of these machines remain, new methods are needed to keep them running as best they can. [CallousCoder] demonstrates a way of building a new keyboard for a Commodore 64 which preserves the original look and feel of the retro computer while also adding some modern touches.

One of the main design differences between many computers of the 80s and modern machines is that the keyboard was often built into the case of the computer itself. For this project, that means a custom 3D-printed plate that attaches to the points where the original keyboard would have been mounted inside the Commodore’s case. [CallousCoder] is using a print from [Wolfgang] to get this done, and with the plate printed and a PCB for the keys, it was time to start soldering. The keyboard uses modern switches and goes together much like any modern keyboard; aside from the unique layout of some of the C64 keys, including a latching shift key, the build will be fairly recognizable to anyone who has put together a mechanical keyboard before.

[CallousCoder] is using the original keycaps from a Commodore 64, so there is an additional step of adding a small adapter between the new switches and the old keycaps. But with that done and some amount of configuring, he has a modern keyboard that looks like the original. If you’re more of a fan of the original hardware, though, you can always take an original C64 keyboard and convert it to USB to use it on your modern machines instead.
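
If you do go the USB route, the conversion boils down to the classic technique of strobing the C64 keyboard’s 8×8 matrix from a USB-capable microcontroller and reporting changes as HID key events. Below is a minimal sketch of that scan loop; the pin access and the actual HID report are stubbed out with printouts, since the real calls depend entirely on which microcontroller and USB stack you choose, and the keycode mapping table is left out.

```c
/* Sketch of scanning an 8x8 key matrix (the C64 keyboard's layout,
 * ignoring the separate RESTORE line) and reporting changes.  The pin
 * reads and the USB HID report are stand-in stubs, not a real API. */
#include <stdint.h>
#include <stdio.h>

#define ROWS 8
#define COLS 8

/* Stub: pull one row line low and read the 8 column lines (1 = key down). */
static uint8_t read_columns(int row)
{
    (void)row;
    return 0;               /* no keys pressed in this dry run */
}

/* Stub: where a real converter would send or clear a USB HID keycode. */
static void report_key(int row, int col, int down)
{
    printf("key (%d,%d) %s\n", row, col, down ? "down" : "up");
}

int main(void)
{
    uint8_t previous[ROWS] = { 0 };

    for (int pass = 0; pass < 100; pass++) {        /* endless loop on real hardware */
        for (int row = 0; row < ROWS; row++) {
            uint8_t now = read_columns(row);
            uint8_t changed = now ^ previous[row];
            for (int col = 0; col < COLS; col++)
                if (changed & (1u << col))
                    report_key(row, col, (now >> col) & 1);
            previous[row] = now;
        }
    }
    return 0;
}
```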

Continue reading “A New Mechanical Keyboard For An Old Computer”

Improving Magnetoplasmadynamic Ion Thrusters With Superconductors

Ion thrusters are an amazing spacecraft propulsion technology, providing very high efficiency with relatively little fuel. Yet getting one to produce more thrust than it takes to lift a sheet of A4 paper requires a lot of electricity. This is why they have only been used for applications where sustained thrust and extremely low fuel usage are important, such as station-keeping and attitude control for satellites and other spacecraft. Now researchers in New Zealand have created a prototype magnetoplasmadynamic (MPD) thruster with a superconducting electromagnet that is claimed to reduce the required input power by 99% while generating a magnetic field three times as strong.
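
To put that sheet of paper in perspective, here’s some quick arithmetic of our own (not a figure from the researchers): an A4 sheet of 80 g/m² office paper has a mass of about five grams, so holding it up against gravity takes roughly 50 millinewtons, which is in the same ballpark as the thrust of many flight ion and Hall-effect thrusters.

```c
/* Back-of-the-envelope: the force needed to hold up one sheet of A4
 * office paper, the yardstick used above.  Our own numbers. */
#include <stdio.h>

int main(void)
{
    const double area_m2  = 0.210 * 0.297;   /* A4: 210 mm x 297 mm */
    const double grammage = 0.080;           /* 80 g/m^2, expressed in kg/m^2 */
    const double g        = 9.81;            /* m/s^2 */

    double mass_kg = area_m2 * grammage;
    double force_n = mass_kg * g;

    printf("A4 sheet: %.1f g, weight %.0f mN\n", mass_kg * 1e3, force_n * 1e3);
    return 0;
}
```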

Although MPD thrusters have been researched since the 1970s – much like their electrostatic cousins, Hall-effect thrusters – the power available on the average spacecraft has constrained their mission profiles. Through the use of a high-temperature superconducting electromagnet with an integrated cryocooler, the MPD thruster should be able to generate a very strong field while only sipping power. Whether it works as well and as reliably as hoped will be tested this year, when the prototype thruster is installed on the ISS for experiments.

Ask Hackaday: What’s A Sun-Like Star?

Is a bicycle like a motorcycle? Of course, the answer is that it is and it isn’t. Saying something is “like” something else presupposes a lot of hidden assumptions. In the category “things with two wheels,” we have a winner. In the category “things that require gasoline,” not so much. We’ve noticed before that news stories about astronomy often talk about “sun-like stars” or “Earth-like planets.” But what does that really mean? [Paul Gilster] had the same questions, and has written up his thoughts if you want to read them.

[Paul] mentions that even textbooks can’t agree. He found one that said Centauri A was “sun-like,” while Centauri B was sometimes considered sun-like and other times not. So while [Paul] was looking at examples from press releases and trying to make sense of it all, we thought we’d just ask you. What makes a star like our sun? What makes a planet like our planet?

Continue reading “Ask Hackaday: What’s A Sun-Like Star?”

Clever Engineering Leaves Appliance Useless

Around these parts, we generally celebrate clever hacks that let you do more with less. So if somebody wrote in to tell us how they used multiplexing to drive the front panel of their latest gadget with fewer pins on the microcontroller than would normally be required, we’d be all over it. But what if that same hack ended up leading to a common failure in a piece of consumer hardware?

As [Jim] recently found out, that’s precisely what seems to be ailing the Meaco Arete dehumidifier. When his stopped working, some Internet searching uncovered the cause of the failure: if a segment in the cheap LED display dies and shorts out, the multiplexing scheme used to interface with the front panel essentially reads that as a stuck button and causes the microcontroller to lock up. He passed the info along to us as a cautionary tale of how over-optimization can come with a hidden cost down the line.
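
To make the failure mode concrete, here’s a small self-contained model of one plausible arrangement (our own sketch, not the Meaco firmware): the strobe lines that light the display digits are shared with the button scan, and a button is considered pressed when the shared sense line reads active during its strobe. A shorted segment pulls that line during its strobe too, so the firmware can’t tell the difference, and code that waits for the “button” to be released never moves on.

```c
/* Minimal model of a multiplexed front panel where display strobe lines
 * double as button scan lines.  A shorted LED segment pulls the shared
 * sense line during its strobe, indistinguishable from a held button. */
#include <stdbool.h>
#include <stdio.h>

#define STROBES 4   /* digit/strobe lines shared by the display and the keys */

static bool segment_shorted[STROBES] = { false, false, true,  false };
static bool button_pressed[STROBES]  = { false, false, false, false };

/* Stand-in for reading the shared sense line while strobe n is active. */
static bool sense_line_active(int n)
{
    return button_pressed[n] || segment_shorted[n];   /* a short looks like a press */
}

int main(void)
{
    for (int n = 0; n < STROBES; n++)
        if (sense_line_active(n))
            printf("strobe %d: key event (real press? %s)\n",
                   n, button_pressed[n] ? "yes" : "no, shorted segment");
    /* Firmware that waits here for the key to be released locks up. */
    return 0;
}
```

The usual defences are to give the buttons their own sense line, or to have the firmware time out on an implausibly long press instead of waiting forever.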

Continue reading “Clever Engineering Leaves Appliance Useless”

Everyone’s Talking GPMI, Should You?

The tech press has been full of announcements over the last day or two regarding GPMI, a new standard for a high-speed digital video interface to rival HDMI, backed by a range of Chinese hardware companies. The Chinese semiconductor company HiSilicon have a whitepaper on the subject (Chinese language, Google Translate link), promising a tremendously higher data rate than HDMI, power delivery well exceeding that of USB-C, and, interestingly, bi-directional data transfer. Is HDMI dead? Probably not, but the next few years will bring us some interesting hardware as HDMI’s backers respond to this upstart.

Reading through pages of marketing from all over the web, we gather that GPMI is an early part of the push for 8K video content. A small part of us wonders just how far display resolution can be pushed beyond what our eyes can resolve before it becomes a marketing gimmick, but it is true to say that there is demand for higher-bandwidth interfaces. Reports mention two plug styles, a GPMI-specific one and a USB-C one, and we expect the latter to dominate. As for adoption, and whether users might find themselves left behind with the wrong interface: far from needing to buy new equipment, we expect support to arrive gradually, with fallback to existing standards such as DisplayPort over USB-C, so that we hardly notice the transition.
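
For a rough sense of why 8K is driving new interfaces, here’s our own back-of-the-envelope arithmetic (not figures from the GPMI whitepaper): an uncompressed 10-bit RGB stream at 8K60 already needs more than the 48 Gbps of an HDMI 2.1 link before any link coding or protocol overhead is counted, which is why HDMI 2.1 leans on DSC compression for those modes.

```c
/* Raw, uncompressed video bitrates in Gbps, ignoring blanking and
 * link-layer overhead -- a rough illustration only. */
#include <stdio.h>

static double raw_gbps(double w, double h, double fps, double bits_per_pixel)
{
    return w * h * fps * bits_per_pixel / 1e9;
}

int main(void)
{
    printf("4K60  10-bit RGB: %5.1f Gbps\n", raw_gbps(3840, 2160,  60, 30));
    printf("8K60  10-bit RGB: %5.1f Gbps\n", raw_gbps(7680, 4320,  60, 30));
    printf("8K120 10-bit RGB: %5.1f Gbps\n", raw_gbps(7680, 4320, 120, 30));
    return 0;
}
```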

Nearly a decade ago we marked the passing of VGA. We don’t expect to be doing the same for HDMI any time soon in the light of GPMI.

Making Liquid Oxygen: Far From Easy But Worth The Effort

Normally, videos over at The Signal Path channel on YouTube have a certain vibe, namely teardowns and deep dives into high-end test equipment for the microwave realm. And while we always love to see that kind of content, this hop into the world of cryogenics and liquid oxygen production shows that [Shahriar] has other interests, too.

Of course, to make liquid oxygen, one must first have oxygen. While it would be easy enough to get a tank of the stuff from a gas supplier, where’s the fun in that? So [Shahriar] started his quest with a cheap-ish off-the-shelf oxygen concentrator, one that uses the pressure-swing adsorption cycle we saw used to great effect with DIY O2 concentrators in the early days of the pandemic. Although analysis of the machine’s output revealed it wasn’t quite as capable as advertised, it still put out enough reasonably pure oxygen for the job at hand.
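
For the curious, pressure-swing adsorption comes down to a valve-sequencing dance between two zeolite beds: one is pressurized so the zeolite adsorbs nitrogen and passes oxygen-rich gas, while the other is vented to purge the trapped nitrogen, and the roles swap every few seconds. A toy timing loop, with timings that are our own guess rather than anything measured from this machine, looks something like this:

```c
/* Toy two-bed pressure-swing adsorption sequencer.  Timings and the
 * printed "valve" states are illustrative guesses, not real firmware. */
#include <stdio.h>

#define HALF_CYCLE_S 5   /* assumed few-second half cycle */

static void set_valves(int bed_a_pressurized)
{
    printf("bed A: %-17s | bed B: %s\n",
           bed_a_pressurized ? "pressurize/adsorb" : "vent/purge",
           bed_a_pressurized ? "vent/purge"        : "pressurize/adsorb");
}

int main(void)
{
    for (int t = 0; t < 30; t += HALF_CYCLE_S)   /* 30 s of simulated cycling */
        set_valves((t / HALF_CYCLE_S) % 2 == 0);
    return 0;
}
```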

The next step in making liquid oxygen is cooling it, and for that job [Shahriar] turned to the cryocooler from a superconducting RF filter, a toy we’re keen to see more about in the future. For now, he was able to harvest the Stirling-cycle cryocooler and rig it up in a test stand with ample forced-air cooling for the heat rejection end and a manifold to supply a constant flow of oxygen from the concentrator. Strategically placed diodes were used to monitor the temperature at the cold end, a technique we can’t recall seeing before. Once powered up, the cryocooler got down to the 77 Kelvin range quite quickly, and within an hour, [Shahriar] had at least a hundred milliliters of lovely pale blue fluid that passed all the usual tests.
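
The diode trick works because a silicon junction’s forward voltage at a constant bias current climbs as it gets colder, by roughly 2 mV per kelvin over much of the range, which is exactly why many commercial cryogenic temperature sensors are simply calibrated diodes. A crude two-point linear estimate (our own illustration, with an assumed room-temperature reading, not [Shahriar]’s numbers) looks like this; a real setup would use the sensor’s published voltage-temperature table, especially below about 30 K where the curve steepens sharply.

```c
/* Rough cold-end temperature estimate from a silicon diode's forward
 * voltage, assuming constant-current bias and a linear ~-2 mV/K slope.
 * Reference values are assumptions for illustration, not measurements. */
#include <stdio.h>

#define T_REF_K       295.0     /* calibration point: room temperature */
#define V_REF_VOLTS   0.60      /* assumed forward voltage at T_REF */
#define SLOPE_V_PER_K (-0.002)  /* ~-2 mV/K for a silicon junction */

static double diode_temp_k(double v_forward)
{
    return T_REF_K + (v_forward - V_REF_VOLTS) / SLOPE_V_PER_K;
}

int main(void)
{
    double v = 1.04;   /* example reading near the cold end */
    printf("Vf = %.2f V  ->  T ~ %.0f K\n", v, diode_temp_k(v));
    return 0;
}
```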

While we’ve seen a few attempts to make liquid nitrogen before, this might be the first time we’ve seen anyone make liquid oxygen. Hats off to [Shahriar] for the effort.

Continue reading “Making Liquid Oxygen: Far From Easy But Worth The Effort”

Ask Hackaday: Vibe Coding

Vibe coding is the buzzword of the moment. What is it? The practice of writing software by describing the problem to an AI large language model and using the code it generates. It’s not quite as simple as letting the AI do your work for you, because the developer is supposed to spend time honing and testing the result; its proponents claim it gives a much more interactive and less tedious coding experience. Here at Hackaday, we are pleased to see the rest of the world catch up, because back in 2023, we were the first mainstream hardware hacking news website to embrace it, to deal with a breakfast-related emergency.

Jokes aside, though, the fad for vibe coding is something which should be taken seriously, because it’s seemingly being used in enough places that vibe-coded software will inevitably affect our lives. So here’s the Ask Hackaday: is this a clever and useful tool for making better software more quickly, or a dangerous tool for creating software nobody quite understands, containing bugs which could cause a disaster?

Our approach to writing software has always been one of incrementally building something from the ground up until it satisfies the need. Readers will know that feeling of being in touch with how a project works at all levels, and of having a nose for immediately diagnosing any problems that might occur. If an AI writes the code for us, the worry is that we might lose that connection, and that it will inevitably lead to less experienced coders quickly getting out of their depth. Is this pessimism, or the grizzled voice of experience? We’d love to know your views in the comments. Are our new AI overlords the new senior developers? Or are they the worst summer interns ever?