Things Are Getting Rusty In Kernel Land

There is gathering momentum around the idea of adding Rust to the Linux kernel. Why exactly is that a big deal, and what does this mean for the rest of us? The Linux kernel has been just C and assembly for its entire lifetime. A big project like the kernel has a great deal of shared tooling around making its languages work, so adding another one is quite an undertaking. There’s also the project culture developed around the language choice. So why exactly are the grey-beards of kernel development even entertaining the idea of adding Rust? To answer in a single line, it’s because C was designed in 1971, to run on the minicomputers at Bell Labs. If you want to shoot yourself in the foot, C will hand you the loaded firearm.

On the other hand, if you want to write a kernel, C is a great language for doing low-level coding. Direct memory access? Yep. Inline assembly? Sure. Runs directly on the metal, with no garbage collection or virtual machines in the way? Absolutely. But all the things that make C great for kernel programming also make C dangerous for kernel programming.
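To be fair, Rust doesn't give up that bare-metal access, it just quarantines it. Below is a minimal sketch (a plain userspace program rather than kernel code, with the pointer target, the 0xDEAD_BEEF constant, and the no-op assembly purely illustrative) showing raw pointer writes and inline assembly tucked behind an unsafe block:

```rust
use std::arch::asm;

fn main() {
    let mut value: u64 = 0;

    unsafe {
        // Raw pointer access: in a kernel this would be a hardware register;
        // here we just point at a local so the sketch actually runs.
        let ptr = &mut value as *mut u64;
        ptr.write_volatile(0xDEAD_BEEF);

        // Inline assembly, stable since Rust 1.59; this `mov` is a no-op,
        // included only to show the syntax (x86_64 only).
        #[cfg(target_arch = "x86_64")]
        asm!("mov {0}, {0}", inout(reg) value);
    }

    println!("value = {value:#x}"); // prints: value = 0xdeadbeef
}
```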

Now I hear your collective keyboards clacking in consternation: “It’s possible to write safe C code!” Yes, yes it is possible. It’s just very easy to mess up, and when you mess up in a kernel, you have security vulnerabilities. There are also some things that are objectively terrible about C, like undefined behavior. C compilers do their best to do the right thing with cursed code like i++ + i++; or a[i] = i++;. But that’s almost certainly not going to do what you want it to, and even worse, it may sometimes appear to do the right thing, hiding the bug until it bites.
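For contrast, here is a minimal sketch of roughly the same expression in Rust (post_inc is a made-up helper standing in for C’s i++, which Rust doesn’t have). Rust specifies that operands are evaluated left to right, so both the side effects and the result are fully defined:

```rust
// A stand-in for C's post-increment: return the old value, then bump it.
fn post_inc(i: &mut i32) -> i32 {
    let old = *i;
    *i += 1;
    old
}

fn main() {
    let mut i = 0;

    // The moral equivalent of `i++ + i++`. Left-to-right evaluation order
    // is guaranteed, so this gives the same answer on every compiler.
    let sum = post_inc(&mut i) + post_inc(&mut i);

    println!("sum = {sum}, i = {i}"); // always prints: sum = 1, i = 2
}
```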

Rust seems to be gaining popularity. There are some ambitious projects out there, like rewriting coreutils in Rust. Many other standard applications are getting a Rust rewrite. It was fairly inevitable that Rust developers would start to ask: could we invade the kernel next? The idea was pitched for a Linux Plumbers Conference, and the mailing list response was cautiously optimistic. If Rust could be added without breaking things, and without losing the very things that make Rust useful, then yes, it would be interesting. Continue reading “Things Are Getting Rusty In Kernel Land”

NVIDIA Releases Drivers With Openness Flavor

This year, we’ve already seen sizeable leaks of NVIDIA source code, and a release of open-source drivers for NVIDIA Tegra. It seems NVIDIA decided to amp it up, and just released open-source GPU kernel modules for Linux. The GitHub repository, named open-gpu-kernel-modules, has people rejoicing, and we are already testing the code out, making memes and speculating about the future. The driver is currently described as experimental, and “production-ready” only for datacenter cards – but you can already try it out!

The Driver’s Present State

Of course, there’s nuance. This is new code, unrelated to the well-known proprietary driver. It will only work on cards from the RTX 2000 and Quadro RTX series onward (aka Turing and newer). The good news is that performance is comparable to the closed-source driver, even at this point! A peculiarity of this project is that a good portion of the features that AMD and Intel drivers implement in the Linux kernel are instead provided by a binary blob that runs inside the GPU. This blob runs on the GSP, a RISC-V core only present on Turing and newer GPUs – hence the series limitation. Now, every GPU loads a piece of firmware, but this one’s hefty!

That aside, this driver already provides more coherent integration into the Linux kernel, with benefits that will only grow going forward. Not everything’s open yet – NVIDIA’s userspace libraries and its OpenGL, Vulkan, OpenCL and CUDA drivers remain closed, for now. The same goes for the old NVIDIA proprietary driver, which, I’d guess, will be left to rot – fitting, as “leaving to rot” is what that driver has previously done to generations of old but perfectly usable cards. Continue reading “NVIDIA Releases Drivers With Openness Flavor”

With Rocket Lab’s Daring Midair Catch, Reusable Rockets Go Mainstream

We’ve all marveled at the videos of SpaceX rockets returning to their point of origin and landing on their spindly deployable legs, looking for all the world like something pulled from a 1950s science fiction film. On countless occasions founder Elon Musk and president Gwynne Shotwell have extolled the virtues of reusable rockets, such as lower operating cost and the higher reliability that comes with each booster having a flight heritage. At this point, even NASA feels confident enough to fly their missions and astronauts on reused SpaceX hardware.

Even so, SpaceX’s reusability program has remained an outlier, as all other launch providers have stayed the course and continue to offer only expendable booster rockets. Competitors such as United Launch Alliance and Blue Origin have teased varying degrees of reusability for their future vehicles, but to date have nothing to show for it beyond some flashy computer-generated imagery. All the while SpaceX continues to streamline their process, reducing turnaround time and refurbishment costs with each successful reuse of a Falcon 9 booster.

But that changed earlier this month, when a helicopter successfully caught one of Rocket Lab’s Electron boosters in midair as it fell back down to Earth under a parachute. While calling the two companies outright competitors might be a stretch given the relative sizes and capabilities of their boosters, SpaceX finally has a sparring partner when it comes to the science of reusability. The Falcon 9 has already smashed the Space Shuttle’s record turnaround time, but perhaps Rocket Lab will be the first to achieve Elon Musk’s stated goal of re-flying a rocket within 24 hours of its recovery.

Continue reading “With Rocket Lab’s Daring Midair Catch, Reusable Rockets Go Mainstream”

Large Scale Carbon Capture Without The Technology

We humans are in something of a pickle, as we’ve put too much carbon dioxide in the atmosphere and caused climate change that might even wipe us out. There may still be people to whom that’s a controversial statement, but you don’t have to be a climate change activist gluing yourself to the gates of a refinery to accept that something needs to be done about it.

It’s obvious that we can reduce our CO2 emissions to tackle the problem, but that’s not the only way that atmospheric CO2 can be reduced. How about removing it from the air? It’s an approach that’s being taken seriously enough for a number of industrial carbon capture solutions to be proposed, and even for a pilot plant to be constructed in Iceland. The most promising idea is that CO2 from power stations can be injected into porous basalt rock where it can react to form calcium carbonate. All of which is very impressive, but is there not a way that this can be achieved without resorting to too much technology? Time for Hackaday to pull out the back-of-envelope calculator, and take a look. Continue reading “Large Scale Carbon Capture Without The Technology”
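As a rough illustration of the chemistry being leaned on here (a textbook simplification that treats basalt’s reactive minerals as a generic calcium silicate, not the article’s own figures), the mineralization step is often written as:

```latex
\mathrm{CaSiO_3 + CO_2 \;\longrightarrow\; CaCO_3 + SiO_2}
```

With molar masses of roughly 44 g/mol for CO2 and 100 g/mol for CaCO3, every tonne of captured CO2 ends up locked away as a bit over two tonnes of carbonate rock.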

Vintage Computer Festival East Raises The Bar Again

When I arrived at the InfoAge Science and History Museum for this year’s Vintage Computer Festival East, I fully expected it to be a reduced event compared to last year. After all, how could it not? Due to the schedule getting shifted around by COVID, show runner Jeffrey Brace and his team had just six months to put together an event that usually gets planned over the course of an entire year. With such a truncated preparation time, they more than deserved a little slack.

But as anyone who attended VCF East 2022 can attest, they didn’t need it. Not only did the event meet the high expectations set by last year’s Festival, it managed to exceed them. There were more workshops, more talks, more vendors, more consignment rooms, more live streams, more…well, everything. This year’s program even got a splash of glossy color compared to the grayscale handout attendees received in October. It was, by any metric you care to use, better than ever.

It does, however, leave me in a somewhat unenviable position. As we’ve learned during the pandemic, a virtual representation of an event as extensive as VCF can give you a taste of what’s offered, but all the nuance is lost. Looking at pictures of somebody’s passion project can’t compare to actually meeting the person and seeing that glint of pride in their eye as they walk you through all the details.

So bear that in mind through this rundown of some of the projects that caught my eye. This isn’t a “best of” list, and the Festival is certainly not a competition. But each attendee will invariably come away with their own handful of favorite memories, so I’ll document mine here. If you’d like to make your own memories, I’d strongly suggest making the trek out to the Jersey Shore come April 2023 for the next Vintage Computer Festival East.

Continue reading “Vintage Computer Festival East Raises The Bar Again”

The State Of Play In Solid State Batteries

Electric vehicles are slowly but surely snatching market share from their combustion-engined forebears. However, range and charging speed remain major sticking points for customers, and are prime selling points for any modern EV. Battery technology is front and center when it comes to improving these numbers.

Solid-state batteries could mark a step-change in performance in these areas, and the race to get them to market is starting to heat up. Let’s take a look at the current state of play.

Continue reading “The State Of Play In Solid State Batteries”

Axiom’s Private ISS Mission Was No Space Vacation

In an era where anyone with deep enough pockets can hitch a ride to the edge of space and back, you’d be forgiven for thinking that Axiom’s Ax-1 mission to the International Space Station was little more than a pleasure cruise for the four crew members. Granted, it’s a higher and faster flight than the suborbital hops that the likes of William Shatner and Jeff Bezos have been embarking on, but surely it was still just a publicity stunt organized by folks with more money than they know what to do with?

Thankfully, there’s a bit more to it than that. While the mission was privately funded, the Ax-1 crew weren’t just orbital sightseers. For one thing, there was plenty of real-world experience packed into the SpaceX Dragon: the mission was commanded by Michael López-Alegría, a veteran NASA astronaut, and crew members Larry Connor and Eytan Stibbe are both accomplished pilots, with the latter logging thousands of hours in various fighter jets during his time with the Israeli Air Force.

But more importantly, they had work to do. Each member of the crew was assigned a list of experiments they were to conduct, ranging from medical observations to the testing of new hardware. Of course there was some downtime — after all, if you spent $50 million on a ticket to space, you’d expect to have at least a little fun — but this wasn’t just a photo op: Axiom was looking for results. There was no hiding from the boss either, as López-Alegría is not just the Mission Commander, he’s also Axiom’s Vice President of Business Development.

Which makes sense when you consider the company’s ultimate goal is to use the ISS as a springboard to accelerate the development of their own commercial space station. The data collected during Ax-1 is going to be critical to Axiom’s path forward, and with their first module already under construction and expected to launch by 2025, there’s no time to waste.

So what did the crew members of this privately funded mission to the International Space Station accomplish? Let’s take a look at a few of the more interesting entries from the docket.

Continue reading “Axiom’s Private ISS Mission Was No Space Vacation”