The iMac GPU Becomes Upgradeable, With PCIe

Over its long lifetime, the Apple iMac all-in-one computer has morphed from the early CRT models through those odd table-lamp machines into today’s sleek, beautiful affairs. They look pretty, but is there anything that can be done to upgrade them? Maybe not the latest models, but those from the mid-2000s can be given some surprising new life. [LowEndMac] have featured a 2006 24″ model that’s received a much more powerful GPU, something we’d have thought impossible.

The iMacs from that era resemble a monitor with a slightly chunkier back, in which reside the guts of the computer. By then the company was producing machines with x86 processors, and their internals share a lot of similarities with a laptop of the period. The card is a Mac Radeon model newer than anything the machine was ever sold with, and it sits at the end of a chain of mini PCIe to PCIe adapters. Even then it can’t drive the original screen, so a replacement panel and power supply are taken from another monitor and grafted into the iMac case. This, along with RAM and SSD upgrades, makes it about the most upgraded a 2006 iMac could be.

Of course, another approach is to simply replace the whole lot with an Intel NUC.

A Real GPU On The Raspberry Pi — Barely.

[Jeff Geerling] saw the Raspberry Pi Compute Module 4 and its exposed PCI-Express 1x connection, and just naturally wondered whether he could plug a GPU into that slot and get it to work. It didn’t. There were a few reasons why, such as the limited Base Address Register (BAR) space and drivers that just weren’t written for ARM hardware. A bit of help from the Raspberry Pi software engineers and other Linux kernel hackers fixed those issues, but one big hurdle remained in the CPU itself: the Broadcom BCM2711 in the Pi 4 has a broken PCIe implementation.

There has finally been a breakthrough: thanks to the dedicated community that has sprung up around this topic, a set of kernel patches manages to work around the hardware issues. It’s now possible to run a Radeon HD 5000/6000/7000 card on the Raspberry Pi 4 Compute Module. There are still glitches, and the kernel patches that make this work will likely never land upstream. That said, it’s possible to run a desktop environment on the Radeon GPU on a Pi, and even a few simple benchmarks. The results… aren’t particularly inspiring, but that wasn’t really ever the point. You may be asking what real-world use there is for a full-size GPU on the Pi. Sure, maybe crypto-mining or emulation, or being able to run more monitors for digital signage. More than that, it might help ensure the next Pi has a working PCIe implementation. But like many things we cover here, the real reason is that it’s a challenge that a group of enthusiasts couldn’t leave alone.
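For the curious, the BAR problem is easy to visualize: every PCIe device advertises how much address space each of its Base Address Registers needs, and a GPU asks for far more than most peripherals. Here’s a minimal sketch, ours rather than the community’s tooling, that reads the standard Linux sysfs resource files and prints each device’s BAR sizes:

```python
from pathlib import Path

# Print the size of each PCIe BAR using Linux's standard sysfs interface.
# Big GPUs typically request a large prefetchable BAR, exactly the kind of
# mapping early CM4 kernels couldn't handle. Works on any Linux machine
# with PCIe devices; nothing here is Pi-specific beyond sysfs itself.

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    lines = (dev / "resource").read_text().splitlines()
    for bar, line in enumerate(lines[:6]):  # entries 0-5 are the BARs
        start, end, _flags = (int(field, 16) for field in line.split())
        if end <= start:                    # unused BAR
            continue
        size_mib = (end - start + 1) / (1 << 20)
        print(f"{dev.name} BAR{bar}: {size_mib:10.3f} MiB")
```

Run that next to a desktop GPU and the multi-hundred-megabyte BAR jumps right out.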


A Dead MacBook GPU Shouldn’t Stop You, With This BGA Soldering Hack

On some 2011 MacBook Pro models, there is a tendency for the Radeon GPU to fail. This should mean game over for the computer, but surprisingly, salvation is offered by its having not one but two GPUs on board. The Intel processor also has a GPU, and Apple use a pile of logic in an FPGA to switch at will between them. The community have produced fresh FPGA code to revive a dead Mac on its Intel GPU, but at the expense of losing brightness control. [Ayilm1] has brought back the brightness with a clever BGA reworking hack that gains access to a brightness control line present on the Intel BD82HM65 Platform Controller Hub chip but not used in the MacBook.

We’re used to impressive soldering work here at Hackaday, and we’ve seen our share of wiring directly to the balls on an upturned BGA chip. This is a similar idea but taken to another level, as a section of the top insulation on an in-place BGA is removed to expose the microvia above the ball carrying the required signal. A tiny wire is soldered to the exposed pad and taken to a piece of copper tape stuck down to provide mechanical strength, and a piece of enameled copper wire is run from that to the other side of the PCB, where its destination lies. It comes with FPGA code to take advantage of it, but even for non-MacBook owners, it’s an extremely impressive piece of work. It’s not the first fine-soldering MacBook fix we’ve seen, either.

Thanks [lightpink784] for the tip.

GPU Turned Into Radio Transmitter To Defeat Air-Gapped PC

Another week, another exploit against an air-gapped computer. And this time, the attack is particularly clever and pernicious: turning a GPU into a radio transmitter.

The first part of [Mikhail Davidov] and [Baron Oldenburg]’s article is a review of some of the basics of exploring the RF emissions of computers using software-defined radio (SDR) dongles. Most readers can safely skip ahead a bit to section 9, which gets into the process they used to sniff for potentially compromising RF leaks from an air-gapped test computer. After finding a few weak signals in the gigahertz range and dismissing them as attack vectors due to their limited penetration potential, they settled on the GPU card, a Radeon Pro WX 3100, and specifically on the power management features of its ATI chipset.
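If you fancy trying the sniffing stage yourself, a cheap RTL-SDR dongle and a few lines of Python make a reasonable first pass. This is a minimal sketch rather than the authors’ tooling: it steps across a band, takes an FFT, and flags anything well above the noise floor. It assumes the pyrtlsdr package and an attached dongle; the band, threshold, and capture size are arbitrary starting points.

```python
import numpy as np
from rtlsdr import RtlSdr  # pip install pyrtlsdr; needs an RTL-SDR dongle

# Sweep a band in 2 MHz hops and report FFT bins that sit well above the
# median noise floor -- a crude but serviceable leak hunt.

sdr = RtlSdr()
sdr.sample_rate = 2.4e6
sdr.gain = "auto"

for center in np.arange(400e6, 500e6, 2e6):
    sdr.center_freq = center
    samples = sdr.read_samples(256 * 1024)
    power_db = 20 * np.log10(np.abs(np.fft.fftshift(np.fft.fft(samples))) + 1e-12)
    floor = np.median(power_db)
    peaks = np.where(power_db > floor + 20)[0]  # 20 dB over the floor
    if peaks.size:
        freqs = center + (peaks / len(samples) - 0.5) * sdr.sample_rate
        print(f"{center / 1e6:.1f} MHz: hits near {np.round(freqs / 1e6, 2)} MHz")

sdr.close()
```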

With a GPU benchmarking program running, they switched the graphics card’s shader clock between its two lowest power settings, which produced a strong signal on the SDR waterfall at 428 MHz. They were able to receive this signal up to 50 feet (15 meters) away, perhaps to the annoyance of nearby hams, as this is plunk in the middle of the 70-cm band. This is theoretically enough to exfiltrate data, but at a painfully low bitrate. So they improved the exploit by forcing the GPU driver to vary the shader clock frequency in one-megahertz steps, allowing them to implement higher-throughput encoding schemes. You can hear the change in signal caused by different graphics being displayed in the video below; one doesn’t need much imagination to see how malware could leverage this to exfiltrate pretty much anything on the computer.
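It’s worth spelling out just how simple the modulation can be. Below is a conceptual Python sketch of keying bits onto that leakage by hopping the shader clock between two states, essentially 2-FSK; the set_shader_clock() function and the clock values are placeholders, since the real exploit drove the card’s power-management interface directly.

```python
import time

# Conceptual sketch of the exfiltration scheme: hop the GPU shader clock
# between two power states to shift the leaked RF, one state per bit value.
# set_shader_clock() is a placeholder; there is no portable API for this.

SYMBOL_CLOCKS = {0: 214, 1: 215}  # illustrative shader clocks in MHz
SYMBOL_TIME = 0.05                # dwell per bit, so bitrate = 1/SYMBOL_TIME

def set_shader_clock(mhz: int) -> None:
    print(f"shader clock -> {mhz} MHz")  # stand-in for the driver poke

def transmit(data: bytes) -> None:
    for byte in data:
        for bit in range(7, -1, -1):  # MSB first
            set_shader_clock(SYMBOL_CLOCKS[(byte >> bit) & 1])
            time.sleep(SYMBOL_TIME)

transmit(b"hi")  # an SDR tuned near the leakage frequency sees the shifts
```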

It’s a fascinating hack, and hats off to [Davidov] and [Oldenburg] for revealing this weakness. We’ll have to throw this on the pile with all the other side-channel attacks [Samy Kamkar] covered in his 2019 Supercon talk.


Ethereum: GPU Mining Is Back But For How Long?

By now, everyone and their dog has at least heard of Bitcoin. While no government will accept tax payments in Bitcoin just yet, it’s ridiculously close to being real money. We’ve even paid for pizza delivery in Bitcoin. But it’s not the only cryptocurrency in town.

Ethereum, initially launched in 2015, is open source, and it has been making headway among the 900 or so Bitcoin clones: it’s the number-two cryptocurrency in the world, with only Bitcoin beating it in value. This year alone, Ether has risen in value by around 4000%, and at the time of writing is worth $375 per coin. And while the Bitcoin world is dominated by professional, purpose-built mining rigs, there is still room in the Ethereum ecosystem for the little guy or gal.

Ethereum is for Hackers

There may be many factors behind Ethereum’s popularity, but one reason is that the algorithm is designed to be resistant to ASIC mining. Unlike Bitcoin, anyone with a half-decent graphics card or a decent gaming rig can mine Ether, giving them the chance to make some digital currency. This is largely because mining Ethereum coins requires lots of high-speed memory, which ASICs lack. The algorithm also has built-in ASIC detection and will refuse to mine properly on them.
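To see why memory-hardness frustrates ASICs, consider a toy version of the idea. The sketch below is a deliberately simplified illustration and emphatically not the real Ethash algorithm: each hash depends on a serial chain of pseudorandom reads from a large dataset, so memory bandwidth, a GPU strength, dominates raw hashing speed, an ASIC strength.

```python
import hashlib
import os

# Toy memory-hard proof-of-work in the spirit of Ethash; NOT the real
# algorithm. Each round's read address depends on the previous mix value,
# forcing unpredictable, cache-unfriendly lookups across the dataset.

DATASET_WORDS = 1 << 16  # toy size; real Ethash DAGs run to gigabytes
dataset = [
    int.from_bytes(hashlib.sha256(i.to_bytes(4, "little")).digest()[:8], "little")
    for i in range(DATASET_WORDS)
]

def u64(data: bytes) -> int:
    # First 8 bytes of SHA-256 as a little-endian integer.
    return int.from_bytes(hashlib.sha256(data).digest()[:8], "little")

def memory_hard_hash(header: bytes, nonce: int, rounds: int = 64) -> int:
    mix = u64(header + nonce.to_bytes(8, "little"))
    for _ in range(rounds):
        mix ^= dataset[mix % DATASET_WORDS]  # data-dependent memory read
        mix = u64(mix.to_bytes(8, "little"))
    return mix

def mine(header: bytes, difficulty: int) -> int:
    nonce = 0
    while memory_hard_hash(header, nonce) > (1 << 64) // difficulty:
        nonce += 1
    return nonce

print("found nonce:", mine(os.urandom(32), 1000))
```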

Small-scale Bitcoin miners were stung when the mining technology jumped from GPUs to ASICs. ASIC-based miners simply outperformed the home gamer, and individuals suddenly discovered that their rigs weren’t worth much, since there was a stampede of people trying to sell off their high-end GPUs all at once. Some would go on to buy or build an ASIC, but the vast majority just stopped mining. They were out of the game: they couldn’t compete with ASICs and stay profitable, since mining itself uses huge amounts of electricity.

Economies of scale like those in Bitcoin mining tend to favor a small number of very large players, which is in tension with the distributed nature of cryptocurrencies, which rely on consensus to validate transactions. It’s much easier to imagine a small number of large players colluding to manipulate the currency, for instance. Ethereum, on the other hand, hopes to keep its miners GPU-based to avoid huge mining farms and give the average Joe a chance at scoring big by discovering a coin on their own computer.

Ethereum Matters

Ethereum’s rise to popularity has basically undone Bitcoin’s move to ASICs, at least in the gamer and graphics card markets. Suddenly, used high-end graphics cards are worth something again. And there are effects in the new-equipment market too. For instance, AMD cards seem to outperform other cards at the moment, and the company is taking advantage of this with the release of mining-specific GPU drivers for its new Vega architecture. Indeed, even though AMD bundled its hottest RX Vega 64 GPU with two games, a motherboard, and a CPU in an attempt to make the package more appealing to gamers than miners, AMD’s Radeon RX Vega 56 sold out in five minutes, with Ethereum miners being blamed.

Besides creating ripples in the market for high-end gaming computers, cryptocurrencies are probably going to be relevant in the broader economy, and Ethereum is number two for now. In a world where even banks are starting to take out patents on blockchain technology in an attempt to get in on the action, cryptocurrencies aren’t as much of a fringe pursuit as they were a few years ago. Ethereum’s ASIC resistance is perhaps its killer feature, preventing centralization of control and keeping the little hacker in the mining game. Only time will tell if it’s going to be a Bitcoin contender, but it’s certainly worth keeping your eye on.

Paddle Controller For GPU Overclocking

[Fred] likes to squeeze every cycle possible out of his graphics card. But sometimes pushing the clock speed too high causes corruption. He figured out a way to adjust the clock speed with the turn of a knob while applications are still running.

The actuator seen above is a Griffin PowerMate 3.0. It’s a USB peripheral which is meant to be used for anything you can imagine. [Fred] uses an AutoHotKey script that he wrote to capture the input from the spinner, process that information, then adjust GPU clock speed in the background. Since the clock on his ATI Radeon 5800 can be adjusted using the AMD GPU Clock Tool, it’s an easy choice for this application. Now better graphics are at the tips of his fingers. See for yourself in the video after the break.
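[Fred]’s script is AutoHotKey, but the control loop is simple enough to sketch in Python. Everything below is illustrative: the gpuclocktool command, its --core flag, and the clock numbers are placeholders standing in for the AMD GPU Clock Tool and a real input driver for the knob.

```python
import subprocess

# Turn knob detents into incremental GPU core-clock changes, clamped to a
# range the card survives. All names and numbers here are placeholders.

BASE_CLOCK_MHZ = 725           # illustrative stock core clock
STEP_MHZ = 5                   # change per knob detent
MIN_MHZ, MAX_MHZ = 600, 900    # stay inside a sane range

def read_knob_delta() -> int:
    # Stand-in for real input: '+' or '-' on stdin simulates a detent.
    # The actual hack read the Griffin PowerMate via AutoHotKey.
    ch = input("knob> ").strip()
    return 1 if ch == "+" else -1 if ch == "-" else 0

def set_core_clock(mhz: int) -> None:
    # Placeholder external tool; [Fred] used the AMD GPU Clock Tool here.
    subprocess.run(["gpuclocktool", "--core", str(mhz)], check=True)

clock = BASE_CLOCK_MHZ
while True:
    clock = max(MIN_MHZ, min(MAX_MHZ, clock + STEP_MHZ * read_knob_delta()))
    set_core_clock(clock)
```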

Of course, if you don’t want to shell out for the fancy hardware, you could always build your own paddle controller.


Gaming On A TP-Link TL-WDR4900 Wireless Router

When you look at your home router, the first thought that comes to mind probably isn’t about playing games on it. But that doesn’t stop [Manawyrm] and [tSYS] from taking on the task of turning the 2013-era TP-Link TL-WDR4900 router into a proper gaming machine using an external GPU. This is made possible by the PCIe lanes on the mainboard, courtesy of the PowerPC-based SoC (NXP QorIQ P1014) and remappable Base Address Registers (BARs). This router has been an OpenWrt favorite for years due to its powerful hardware and feature set.

This mod required a custom mini PCIe PCB connected to the SoC’s PCIe traces (after cutting the connection to the Atheros WiFi chipset). This allowed an external AMD Radeon HD 7470 GPU to be connected to the system, which duly showed up in OpenWrt. Making full use of the hardware meant getting access to the AMD GPU driver, which required full Debian Linux. Fortunately, the distro had a special PowerPCSPE port that supports the e500v2 CPU core in the SoC. After this, it was found that the amdgpu driver has issues on 32-bit platforms, for which an issue ticket was filed.

Using the legacy Radeon driver helped to overcome this issue, but then it was found that the big-endian nature of the CPU tripped up Grand Theft Auto: Vice City, whose game code was never written with big-endian systems in mind. It took a lot of code patching to fix, but eventually the game was up and running, albeit with graphical glitches. Whatever the cause of those glitches was will remain unknown, as after updating everything, things began to work normally.
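The endianness problem is the classic one: code and file formats written with little-endian x86 in mind silently break when the same bytes are interpreted on a big-endian CPU. A tiny illustration of the failure mode, ours rather than the actual Vice City patch:

```python
import struct

# Data written little-endian (the x86 assumption baked into the game's
# formats) read back on a big-endian CPU such as the e500v2.

payload = struct.pack("<I", 0x12345678)  # little-endian on-disk value

native = struct.unpack("=I", payload)[0]    # native order: wrong on BE
portable = struct.unpack("<I", payload)[0]  # explicit order: right anywhere

print(hex(native), hex(portable))
# On x86 both print 0x12345678; on big-endian hardware the naive read
# yields 0x78563412, the kind of mismatch that needed patching out.
```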

So now it’s possible to convert a 2013-era router into a gaming console by patching in an external GPU, which could actually be useful in keeping more potential e-waste out of landfills.
