
Hackaday Links: July 3, 2022

Looks like we might have been a bit premature in our dismissal last week of the Sun’s potential for throwing a temper tantrum, as that’s exactly what happened when a G1 geomagnetic storm hit the planet early last week. To be fair, the storm was very minor — aurora visible down to the latitude of Calgary isn’t terribly unusual — but the odd thing about this storm was that it sort of snuck up on us. Solar scientists first thought it was a coronal mass ejection (CME), possibly related to the “monster sunspot” that had rapidly tripled in size and was being hyped up as some kind of planet killer. But it appears this sneak attack came from another, less-studied phenomenon, a co-rotating interaction region, or CIR. These sound a bit like eddy currents in the solar wind, which can bunch up plasma that then suddenly bursts forth from the Sun, all without the usual telltale sunspots.

Then again, even people who study the Sun for a living don’t always seem to agree on what’s going on up there. Back at the beginning of Solar Cycle 25, NASA and NOAA, the National Oceanic and Atmospheric Administration, were calling for a relatively weak showing during our star’s eleven-year cycle, as recorded by the number of sunspots observed. But another model, developed by heliophysicists at the U.S. National Center for Atmospheric Research, predicted that Solar Cycle 25 could be among the strongest ever recorded. And so far, it looks like the latter group might be right. Where the NASA/NOAA model called for 37 sunspots in May of 2022, for example, the Sun actually threw up 97 — much more in line with what the NCAR model predicted. If the trend holds, the peak of the eleven-year cycle in April of 2025 might see over 200 sunspots a month.

So, good news and bad news from the cryptocurrency world lately. The bad news is that cryptocurrency markets are crashing, with the flagship Bitcoin falling from its high of around $67,000 down to $20,000 or so, and looking like it might fall even further. But the good news is that this has put a bit of a crimp in the demand for NVIDIA graphics cards, as the economics of turning electricity into hashes start to look a little less attractive. So if you’re trying to upgrade your gaming rig, that means there’ll soon be a glut of GPUs, right? Not so fast, maybe: at least one analyst has a different view, based mainly on the distribution of AMD and NVIDIA GPU chips in the market, as well as how much revenue each draws from crypto rather than from traditional uses of the chips. The analysis matters mainly to investors, though, so it shouldn’t make much difference if you’re just looking for a graphics card on the cheap.

Speaking of businesses, things are not looking too good for MakerGear. According to a banner announcement on their website, the supplier of 3D printers, parts, and accessories is scaling back operations, to the point where everything is being sold on an “as-is” basis with no returns. In a long post on “The Future of MakerGear,” founder and CEO Rick Pollack says the problem basically boils down to supply chain and COVID issues — they can’t get the parts they need to make printers. And so the company is looking for a buyer. We find this sad but understandable, and wish Rick and everyone at MakerGear the best of luck as they try to keep the lights on.

And finally, if there’s one thing Elon Musk is good at, it’s keeping his many businesses in the public eye. And so it is this week with SpaceX, which is recruiting Starlink customers to write nasty-grams to the Federal Communications Commission regarding Dish Network’s plan to gobble up a bunch of spectrum in the 12-GHz band for its 5G expansion. The 3,000 or so newly minted experts on spectrum allocation wrote to tell FCC commissioners how much Dish sucks, and how much they love and depend on Starlink. It looks like they may have a point — Starlink uses the lowest part of the Ku band (12 GHz – 18 GHz) for data downlinks to user terminals, along with big chunks of about half a dozen other bands. It’ll be interesting to watch this one play out.

Asahi GPU Hacking

[Alyssa Rosenzweig] has been tirelessly working on reverse engineering the GPU built into Apple’s M1 architecture as part of the Asahi Linux effort. If you’re not familiar, that’s the project adding support for the Apple M1 line of products to the Linux kernel and userspace. She has made great progress, even getting primitive rendering working with her own open-source code just over a year ago.

Trying to mature the driver, however, has hit a snag. For complex rendering, something in the GPU breaks, and the frame is simply missing chunks of content. Some clever testing discovered the exact failure trigger — too much total vertex data. Put simply, it’s “the number of vertices (geometry complexity) times amount of data per vertex (‘shading’ complexity).” That… almost sounds like a buffer filling up, but on the GPU itself. This isn’t a buffer that the driver directly interacts with, so all of this sleuthing has to be done blindly. The Apple driver doesn’t have corrupted renders like this, so what’s going on?
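
To make that trigger concrete, here’s a quick back-of-the-envelope sketch in TypeScript. Every number in it is invented (remember, the real buffer isn’t even visible to the driver); the point is just that it’s the product of the two factors, not either one alone, that overflows.

```ts
// Hypothetical illustration of the failure condition: total vertex data
// is vertex count times bytes of shader output per vertex. All numbers
// here are made up for the example; the real on-GPU limit is unknown.
const vertexCount = 50_000;              // geometry complexity
const bytesPerVertex = 48;               // "shading" complexity (position + varyings)
const hypotheticalGpuBuffer = 2 * 1024 * 1024; // pretend limit: 2 MiB

const totalVertexData = vertexCount * bytesPerVertex; // 2,400,000 bytes
if (totalVertexData > hypotheticalGpuBuffer) {
  // Past this point, renders come back with missing chunks.
  console.log(`overflow by ${totalVertexData - hypotheticalGpuBuffer} bytes`);
}
```
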
Continue reading “Asahi GPU Hacking”

NVIDIA Releases Drivers With Openness Flavor

This year, we’ve already seen sizeable leaks of NVIDIA source code, and a release of open-source drivers for NVIDIA Tegra. It seems NVIDIA decided to amp things up, and has just released open-source GPU kernel modules for Linux. The GitHub repository, open-gpu-kernel-modules, has people rejoicing, and we are already testing the code out, making memes, and speculating about the future. NVIDIA currently calls this driver experimental and “production-ready” only for datacenter cards – but you can already try it out!

The Driver’s Present State

Of course, there’s nuance. This is new code, unrelated to the well-known proprietary driver, and it will only work on cards from the RTX 2000 and Quadro RTX series onward (aka Turing and newer). The good news is that performance is comparable to the closed-source driver, even at this point! A peculiarity of this project: a good portion of the features that AMD and Intel drivers implement in the Linux kernel are instead provided by a binary blob that runs inside the GPU. This blob runs on the GSP, a RISC-V core that’s only present on Turing GPUs and younger – hence the series limitation. Now, every GPU loads a piece of firmware, but this one’s hefty!

Caveats aside, this driver already provides more coherent integration into the Linux kernel, with massive benefits that will only increase going forward. Not everything’s open yet – NVIDIA’s userspace libraries and the OpenGL, Vulkan, OpenCL, and CUDA drivers remain closed, for now. The same goes for the old NVIDIA proprietary driver, which, I’d guess, will be left to rot – fitting, as “leaving to rot” is what that driver has previously done to generations of old but perfectly usable cards. Continue reading “NVIDIA Releases Drivers With Openness Flavor”

A Real GPU On The Raspberry Pi — Barely.

[Jeff Geerling] saw the Raspberry Pi Compute Module 4 and its exposed PCI-Express x1 connection, and just naturally wondered whether he could plug a GPU into that slot and get it to work. It didn’t, for a few reasons: the limited Base Address Register (BAR) space, and drivers that just weren’t written for ARM hardware. A bit of help from the Raspberry Pi software engineers and other Linux kernel hackers fixed those issues, but one big hurdle remained in the CPU itself: the Broadcom BCM2711 in the Pi 4 has a broken PCIe implementation.

There has finally been a breakthrough: thanks to the dedicated community that has sprung up around this topic, a set of kernel patches manages to work around the hardware issues. It’s now possible to run a Radeon HD 5000/6000/7000 series card on the Raspberry Pi Compute Module 4. There are still glitches, and the kernel patches that make this work will likely never land upstream. That said, it’s possible to run a desktop environment on the Radeon GPU on a Pi, and even a few simple benchmarks. The results… aren’t particularly inspiring, but that was never really the point. You may be asking what real-world use there is for a full-size GPU on the Pi. Sure, maybe crypto-mining or emulation, or being able to drive more monitors for digital signage. More than that, it might help ensure the next Pi has a working PCIe implementation. But like many things we cover here, the real reason is that it’s a challenge that a group of enthusiasts couldn’t leave alone.

Continue reading “A Real GPU On The Raspberry Pi — Barely.”

WebGPU… Better Than WebGL?

As the browser becomes more like an operating system, we are seeing more deep features built into it. For example, you can now run a form of assembly language (WebAssembly) in the browser. Sophisticated graphics have been possible using WebGL since around 2011, but some people find it hard to use. [Surma] was one of those people, and tried a newly surfaced method of doing the same thing: WebGPU.

[Surma] liked it better, and shares a lot of information in the post. Oddly, the post doesn’t use WebGPU for graphics very much; instead, it focuses on using GPU cores for fast computation, something else you can do with WebGPU. If your goal is to draw on the screen, though, you’ll need to know the basics, and the post links to a site with examples of doing just that.
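
If you haven’t seen what GPU compute looks like from the browser, here’s a minimal TypeScript sketch of the flow: create a storage buffer, run a trivial WGSL kernel over it, and read the result back. The doubling kernel and every name in it are our own illustration (assuming a browser with WebGPU enabled, plus @webgpu/types for TypeScript), not code from [Surma]’s post.

```ts
// Minimal WebGPU compute sketch: double every element of an array on the GPU.
async function doubleOnGpu(input: Float32Array): Promise<Float32Array> {
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) throw new Error("WebGPU not available");
  const device = await adapter.requestDevice();

  // Storage buffer the shader reads and writes in place.
  const buffer = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
    mappedAtCreation: true,
  });
  new Float32Array(buffer.getMappedRange()).set(input);
  buffer.unmap();

  // A trivial WGSL kernel: one invocation per element.
  const module = device.createShaderModule({
    code: `
      @group(0) @binding(0) var<storage, read_write> data: array<f32>;
      @compute @workgroup_size(64)
      fn main(@builtin(global_invocation_id) id: vec3<u32>) {
        if (id.x < arrayLength(&data)) {
          data[id.x] = data[id.x] * 2.0;
        }
      }`,
  });
  const pipeline = device.createComputePipeline({
    layout: "auto",
    compute: { module, entryPoint: "main" },
  });
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer } }],
  });

  // Separate readback buffer, since MAP_READ can't combine with STORAGE.
  const readback = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ,
  });

  // Record the dispatch and the copy-out, then submit.
  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(input.length / 64));
  pass.end();
  encoder.copyBufferToBuffer(buffer, 0, readback, 0, input.byteLength);
  device.queue.submit([encoder.finish()]);

  await readback.mapAsync(GPUMapMode.READ);
  const result = new Float32Array(readback.getMappedRange().slice(0));
  readback.unmap();
  return result;
}
```

The boilerplate-to-shader ratio is real, but notice how little of it is graphics: no canvas, no render pipeline, just buffers and a dispatch.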

Continue reading “WebGPU… Better Than WebGL?”

90s PC With Modern Parts Throws Many Off Track

When building a desktop computer, the budget is usually the limiting factor. Making sacrifices on one part in order to improve another without breaking the bank is part of the delicate balance of putting together a capable PC. If you’re lucky enough to have the sponsors that [Shank] has, though, caution can be thrown to the wind on price for some blisteringly fast parts. Putting them in a ’90s Hot Wheels case to build the ultimate sleeper PC is just the icing on the cake.

This isn’t quite as simple as swapping a motherboard into a modern PC case, though. The Hot Wheels PC used the mini-ITX standard and is quite a bit smaller than most modern computers outside of something like a Mac Mini. To fit the RTX 3060 GPU into the computer, its shrouds had to be removed to save space, and an unusual 92 mm form factor liquid CPU cooler had to be installed. An equally obscure power supply rounds out the Ryzen 9 build, and after a lot of tinkering all the parts were eventually fitted into this retro case, including the original, working floppy disk drive. After that came some additional case mods, such as RGB lighting, wheels with spinning rims, a spoiler, and an exhaust pipe.

The main issue with this build was temperatures: both the CPU and GPU were topping out dangerously high until [Shank] installed a terrifying 11,000 RPM case fan. With a series of original CRT monitors to go along with the sleeper PC, he can run up to nine displays, with surprisingly high video quality thanks to the fundamental properties of CRTs. The video is definitely worth a watch, and falls right in line with some of the other console mods [Shank] is famous for, such as this handheld Virtual Boy.

Thanks to [Fast Rock Productions] for the tip!

Continue reading “90s PC With Modern Parts Throws Many Off Track”


GPU RAM Upgrades Are Closer Than You Think

We’re all used to swapping RAM in our desktops and laptops. What about a GPU, though? [dosdude1] teaches us that soldered-on RAM is merely a frontier to be conquered. Of course, there’s gotta be a good reason to undertake such an effort – in his case, he couldn’t find the specific variant of the NVIDIA GT 640 that could be flashed with an Apple BIOS to make his Xserve machine output the Apple boot screen properly. All he could find were 1 GB versions, and the Apple BIOS could only be flashed onto a 2 GB version. Getting 2 GB worth of DDR chips on AliExpress was way too tempting!

The video goes through the entire replacement process, to the point where you could repeat it yourself — as long as you have access to a preheater, which is a must for reworking relatively large PCBs, as well as a set of the usual tools for replacing BGA chips. In the end, the card booted up and, flashed with a new BIOS, successfully displayed the Apple boot logo that would normally be missing without the special Apple VBIOS sauce. If you ever want to try such a repair, now you have one less excuse — and, with the GT 640 being a relatively old card, you don’t even risk all that much!

This is not the first soldered-in RAM replacement journey we’ve covered recently — here’s our write-up of [Greg Davill] upgrading the soldered-in RAM on his Dell XPS! You can upgrade CPUs this way, too. While it’s standard procedure in sufficiently advanced laptop repair shops, even hobbyists can manage it with proper equipment and a good amount of luck, as this Eee PC CPU upgrade illustrates. BGA work and Apple computers getting a second life go hand in hand — just two years ago, we covered this BGA-drilling hack to bypass a dead GPU in a MacBook, and before that, a MacBook water-damage revival story.

Continue reading “GPU RAM Upgrades Are Closer Than You Think”