A Dedicated GPU For Your Favorite SBC

The Raspberry Pi is famous for its low cost, its versatile and open Linux environment, and its plentiful I/O, making it a perfect device not only for its originally-intended educational purposes but for basically every hobbyist from gardeners to roboticists to amateur radio operators. Most builds tend to make use of the GPIO pins, which allow easy connections to various peripherals and sensors, but the Pi also supports PCI Express devices, which means that, in theory, it could use a GPU in much the same way a modern computer would. After plenty of testing and development, [Jeff Geerling] brings us this custom graphics card interface for the Raspberry Pi.

The testing for all of these graphics cards has been done with a Pi Compute Module 4, and the end result is an interface device that looks much like a graphics card itself. It breaks the Compute Module’s PCIe bus out onto a more familiar x16 slot connector and adds physical connections for power, USB, and Ethernet. When plugged into the carrier board, the Compute Module can be attached to any of a number of graphics cards, including the latest and highest-end Nvidia and AMD offerings.

Perhaps unsurprisingly, though, the 4090 and 7900 cards don’t work with the Raspberry Pi. This is partly due to the 32-bit limitations of the Pi and other memory-mapping issues. Even after attempting some workarounds, Nvidia’s cards aren’t open-source enough to troubleshoot properly (although the card is recognized by the Pi), and AMD’s drivers crash the system even after compiling a custom kernel. [Jeff] did find one Nvidia card that worked, although it requires using the USB interface, and second-hand cards sell for around $3,000 USD. For a more economical choice, there are some other graphics cards he was eventually able to get working, albeit not with perfect performance, including some of the ones we’ve seen him test already.
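If you’re tinkering along at home, the first question with any card is whether the Pi enumerates it at all. Walking sysfs is essentially all that lspci does under the hood, so a minimal Python sketch (standard Linux sysfs paths, nothing Pi-specific; the IDs in the comments are just examples) gets you the same answer:

```python
from pathlib import Path

pci_devices = Path("/sys/bus/pci/devices")

if not pci_devices.exists():
    print("No PCI bus visible; is PCIe enabled on this kernel?")
else:
    for dev in sorted(pci_devices.iterdir()):
        vendor = (dev / "vendor").read_text().strip()    # e.g. 0x10de is Nvidia
        device = (dev / "device").read_text().strip()
        pci_class = (dev / "class").read_text().strip()  # 0x03xxxx is a display controller
        print(f"{dev.name}  vendor={vendor} device={device} class={pci_class}")
```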

Continue reading “A Dedicated GPU For Your Favorite SBC”

What Kind Of GPU Are You?

In the old days, big computers often had some form of external array processor. The idea was that you could load a bunch of numbers into the processor and then do math operations on all of them in parallel. These days, you are more likely to turn to your graphics card for number-crunching support. You’ll usually use some library to help you do that, but things are always better when you understand what’s going on under the hood. That’s why we enjoyed [RasterGrid]’s post on GPU architecture types.
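To see how little code that takes today, here’s a minimal sketch of library-assisted GPU number crunching; it assumes a CUDA-capable card and the CuPy library, whose API deliberately mirrors NumPy:

```python
import cupy as cp  # assumes CuPy is installed and a CUDA GPU is present

# Load a bunch of numbers into the processor...
a = cp.random.random((4096, 4096))
b = cp.random.random((4096, 4096))

# ...and do math on all of them in parallel, on the GPU.
c = a @ b                # matrix multiply runs on the card
total = float(c.sum())   # reduce on the card; one scalar returns to the host
print(total)
```

The library hides the kernel launches and memory transfers, which is exactly why it pays to know what the hardware is doing underneath.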

If you can already tell the difference between IMR (immediate mode rendering) and TBR (tile-based rendering), this might not be the post for you. But while we knew the terms, we found a lot of interesting detail, including some graphics and pseudocode that clarified the key differences.
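As a taste of the distinction, here’s our own toy software rasterizer, not [RasterGrid]’s code: the immediate-mode path draws each primitive straight into the full framebuffer as it arrives, while the tile-based path first bins primitives into screen tiles, then shades one small tile at a time (in what would be fast on-chip memory on real hardware) and writes each finished tile out once.

```python
# Toy comparison of IMR vs. TBR. Primitives are axis-aligned colored
# rectangles on an 8x8 "screen"; real GPUs deal in triangles, of course.
W, H, TILE = 8, 8, 4
rects = [((0, 0, 5, 5), "A"), ((3, 3, 8, 8), "B")]  # (x0, y0, x1, y1), color

def render_immediate(prims):
    # IMR: every draw is rasterized straight into the full framebuffer.
    fb = [["." for _ in range(W)] for _ in range(H)]
    for (x0, y0, x1, y1), color in prims:
        for y in range(y0, y1):
            for x in range(x0, x1):
                fb[y][x] = color
    return fb

def render_tiled(prims):
    # TBR pass 1: bin each primitive into every screen tile it overlaps.
    tiles = [(tx, ty) for ty in range(0, H, TILE) for tx in range(0, W, TILE)]
    bins = {t: [] for t in tiles}
    for prim in prims:
        (x0, y0, x1, y1), _ = prim
        for (tx, ty) in tiles:
            if x0 < tx + TILE and x1 > tx and y0 < ty + TILE and y1 > ty:
                bins[(tx, ty)].append(prim)
    # Pass 2: shade one tile at a time in a small "on-chip" buffer, then
    # write the finished tile back to the framebuffer in one go.
    fb = [["." for _ in range(W)] for _ in range(H)]
    for (tx, ty), binned in bins.items():
        tile_buf = [["." for _ in range(TILE)] for _ in range(TILE)]
        for (x0, y0, x1, y1), color in binned:
            for y in range(max(y0, ty), min(y1, ty + TILE)):
                for x in range(max(x0, tx), min(x1, tx + TILE)):
                    tile_buf[y - ty][x - tx] = color
        for y in range(TILE):
            for x in range(TILE):
                fb[ty + y][tx + x] = tile_buf[y][x]
    return fb

assert render_immediate(rects) == render_tiled(rects)  # same image either way
```

The payoff of the tiled version is that all the overdraw happens in the tiny tile buffer; the big, slow framebuffer gets written exactly once per tile.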

Continue reading “What Kind Of GPU Are You?”

Modder Puts Computer Inside A Power Supply

When building a custom computer rig, most people put the SMPS power supply inside the computer case. [James], a.k.a. [Aibohphobia], a.k.a. [fearofpalindromes], turned it inside out and built the STX160.0: a full-fledged gaming computer stuffed inside an ATX power supply enclosure. While Small Form Factor (SFF) computers are nothing new, his build packs a powerful punch in a small enclosure and is a great example of computer modding, hacker ingenuity, and engineering. The finished computer uses a Mini-ITX form-factor motherboard with an Intel Core i5-6500T quad-core 2.2 GHz processor, an EVGA GTX 1060 SC graphics card, 16 GB of DDR4 RAM, a 250 GB SSD, a WiFi card, and two USB ports, all powered from a 160 W AC-DC converter. Its external dimensions are the same as an ATX-EPS power supply: 150 L x 86 H x 230 D mm. The STX160.0 is powered directly from the mains and not from an external brick, which [James] feels would have been cheating.

For those who would like a quick, TL;DR pictorial review, head over to his photo album on Imgur first to feast on pictures of the completed computer and its innards. But the devil is in the details, so check out the forum thread for a ton of interesting build information, component sources, tricks, and trivia. For example, to connect the graphics card to the motherboard, he used an “M.2 to powered PCIe x4 adapter” coupled with a flexible cable extender from a quaint company called Adex Electronics, which still prefers to do business the old-fashioned way and whose website might remind you of the days when Netscape Navigator was the dominant browser.

As a benchmark, [James] reports that “with the cover panel on, at full load (Prime95 Blend @ 2 threads and FurMark 1080p 4x AA) the CPU is around 65°C with the CPU fan going at 1700RPM, and the GPU is at 64°C at 48% fan speed.” Fairly impressive for what could be passed off at first glance as a power supply.

There are two really interesting takeaways for us in this project. The first is his meticulous research to find specific parts that met his requirements from among the vast number of available choices. The second is his extremely detailed notes on designing the custom enclosure and making it DFM (design for manufacturing) friendly so it could be mass-produced; just take a look at his “Table of Contents” for a taste of the amount of ground he covers. If you are interested in custom builds and computer modding, there is a huge amount of useful information embedded in there for you.

Thanks to [Arsenio Dev], who posted a link to this hilarious thread on Reddit discussing the STX160.0. Check out a full teardown and review of the STX160.0 by [Not for Concentrate] in the video after the break.

Continue reading “Modder Puts Computer Inside A Power Supply”

10 Year Old Bug Crushed By Hacker On A Mission

PCI pass-through is the ability of a virtualized guest system to directly access PCI hardware. Pass-through for dedicated GPUs has just recently been added to the Linux Kernel-based Virtual Machine (KVM). Soon afterward, users began to find that switching on nested page tables (NPT), a technology intended to provide hardware acceleration for virtual machines, had the opposite effect on AMD platforms and slowed frame rates to a crawl.

Annoyed by this, [gnif] set out to fix the problem. His first step was to run graphics benchmarks to isolate the source of the problem. Having identified the GPU as the culprit, [gnif] began to read up on the involved technology stack. Three days of wrapping his head around the technical docs allowed [gnif] to find the single line of code that resulted in a faulty memory setup, and to implement a basic fix. He then passed the work on to [Paolo Bonzini] at patchwork.kernel.org, who released a more refined patch.
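The fix eventually made its way upstream. If you’re curious whether your own AMD box has NPT switched on, the kvm_amd module exposes the flag through sysfs; a quick check (a sketch, assuming a Linux host with the module loaded) looks like this:

```python
from pathlib import Path

# kvm_amd exposes its nested-page-table setting as a module parameter.
npt = Path("/sys/module/kvm_amd/parameters/npt")
if npt.exists():
    enabled = npt.read_text().strip() in ("1", "Y")
    print("NPT is", "enabled" if enabled else "disabled")
else:
    print("kvm_amd module not loaded")
```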

The bug affecting PCI pass-through had been around for ten years and had received little attention from the manufacturer; it gained prominence only when graphics cards were affected. In the end, it took one very dedicated user three days to fix it, and then another day to roll out a patch for open source operating systems. In his notes, [gnif] points out how helpful AMD’s documentation was. With right to repair under debate, technical docs locked up with DRM, and standards hidden behind paywalls, [gnif]’s story is a reminder of the importance of accessible, quality documentation.

Ethereum: GPU Mining Is Back But For How Long?

By now, everyone and their dog has at least heard of Bitcoin. While no government will accept tax payments in Bitcoin just yet, it’s ridiculously close to being real money. We’ve even paid for pizza delivery in Bitcoin. But it’s not the only cryptocurrency in town.

Ethereum, initially launched in 2015, is open source and has been making headway among the 900 or so Bitcoin clones; it is the number two cryptocurrency in the world, with only Bitcoin beating it in value. This year alone, Ether has risen in value by around 4,000%, and at the time of writing is worth $375 per coin. And while the Bitcoin world is dominated by professional, purpose-built mining rigs, there is still room in the Ethereum ecosystem for the little guy or gal.

Ethereum is for Hackers

There may be many factors behind Ethereum’s popularity, but one reason is that the algorithm is designed to be resistant to ASIC mining. Unlike with Bitcoin, anyone with a half-decent graphics card or a decent gaming rig can mine Ether, giving them the chance to make some digital currency. This is largely because mining Ethereum coins requires lots of high-speed memory, which ASICs lack. The algorithm also has built-in ASIC detection and will refuse to mine properly on them.
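The real algorithm (Ethash) is considerably more involved, but the memory-hard idea fits in a few lines: each hash attempt is dragged through a chain of dependent, pseudo-random reads from a dataset far too large for an ASIC’s on-die memory, so memory bandwidth, not clever logic, sets the hash rate. A toy illustration, emphatically not the real thing:

```python
import hashlib

# Toy memory-hard hash. The real Ethash dataset (the DAG) is gigabytes
# and grows over time; this stand-in is just big enough to make the point.
DATASET_WORDS = 1 << 20
dataset = [i * 2654435761 % (1 << 32) for i in range(DATASET_WORDS)]

def toy_hash(nonce: int, rounds: int = 64) -> int:
    mix = nonce
    for _ in range(rounds):
        # Each read address depends on the previous round's result, so
        # the memory lookups can't be precomputed or pipelined away.
        mix = int.from_bytes(
            hashlib.sha256(mix.to_bytes(8, "little")).digest()[:8], "little")
        mix ^= dataset[mix % DATASET_WORDS]
    return mix

print(hex(toy_hash(42)))
```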

Small-scale Bitcoin miners were stung when the mining technology jumped from GPUs to ASICs. ASIC-based miners simply outperformed the home gamer, and individuals suddenly discovered that their rigs were not worth much, since there was a stampede of people trying to sell off their high-end GPUs all at once. Some would go on to buy or build an ASIC, but the vast majority just stopped mining. They were out of the game: they couldn’t compete with ASICs and stay profitable, since mining in itself uses huge amounts of electricity.

Economies of scale like those in Bitcoin mining tend to favor a small number of very large players, which is in tension with the distributed nature of cryptocurrencies, which rely on consensus to validate transactions. It’s much easier to imagine a small number of large players colluding to manipulate the currency, for instance. Ethereum, on the other hand, hopes to keep its miners GPU-based to avoid huge mining farms and give the average Joe a chance at scoring big and discovering a coin on their own computer.

Ethereum Matters

Ethereum’s rise to popularity has basically undone Bitcoin’s move to ASICs, at least in the gamer and graphics card markets. Suddenly, used high-end graphics cards are worth something again. And there are effects in the new-equipment market too. For instance, AMD cards seem to outperform other cards at mining right now, and AMD is taking advantage of this with the release of mining-specific GPU drivers for its new Vega architecture. Indeed, even though AMD bundled its hottest RX Vega 64 GPU with two games, a motherboard, and a CPU in an attempt to make the package more appealing to gamers than miners, AMD’s Radeon RX Vega 56 sold out in five minutes, with Ethereum miners taking the blame.

Besides creating ripples in the market for high-end gaming computers, cryptocurrencies are probably going to be relevant in the broader economy, and Ethereum is number two for now. In a world where even banks are starting to take out patents on blockchain technology in an attempt to get in on the action, cryptocurrencies aren’t as much of a fringe pursuit as they were a few years ago. Ethereum’s ASIC resistance is perhaps its killer feature, preventing centralization of control and keeping the little hacker in the mining game. Only time will tell if it’s going to be a Bitcoin contender, but it’s certainly worth keeping your eye on.

A Graphics Card For A Homebrew Computer

One of [aepharta]’s ‘before I die’ projects is a homebrew computer. Not just any computer, mind you, but a fabulous Z80 machine, complete with video out. HDMI and DisplayPort would require far too much of this tiny, 80s-era computer, and it’s getting hard to buy a composite monitor. This meant it was time to build a VGA video card from some parts salvaged from old equipment.

When it comes to ancient computers, VGA has fairly demanding requirements: the slowest standard pixel clock is 25.175 MHz, an order of magnitude faster than the CPU clock in early-80s computers. Memory is also an issue, with a 640×480 image at 4 bits per pixel (16 colors) requiring 153,600 bytes, or about a quarter of the 640k ‘that should be enough for anybody.’
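That 25.175 MHz figure isn’t arbitrary, either: standard 640×480 timing pads every line with a front porch, sync pulse, and back porch, and every frame with extra blanking lines, so each frame really takes 800×525 clocks. A quick back-of-envelope check:

```python
# Standard 640x480 "60 Hz" VGA timing: visible pixels plus blanking.
H_TOTAL = 640 + 16 + 96 + 48   # 800 pixel clocks per line
V_TOTAL = 480 + 10 + 2 + 33    # 525 lines per frame

refresh = 25_175_000 / (H_TOTAL * V_TOTAL)
print(f"{H_TOTAL} x {V_TOTAL} clocks per frame gives {refresh:.2f} Hz")
# 800 x 525 clocks per frame gives 59.94 Hz
```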

To cut down on the memory requirements and make everything nice, round, base-2 numbers, [aepharta] decided on a resolution of 512×384. This means about 96 kB of memory is required when using 16 colors, and only about 24 kB for monochrome.
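The arithmetic, for anyone playing along at home, is just width × height × bits per pixel:

```python
def fb_bytes(width, height, colors):
    bits_per_pixel = (colors - 1).bit_length()  # 16 colors -> 4 bpp, 2 -> 1 bpp
    return width * height * bits_per_pixel // 8

print(fb_bytes(640, 480, 16))  # 153600 bytes, the full-VGA figure above
print(fb_bytes(512, 384, 16))  # 98304 bytes, about 96 kB
print(fb_bytes(512, 384, 2))   # 24576 bytes, about 24 kB for monochrome
```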

The circuit was built from some old programmable logic ICs pulled from a Cisco router. It could have been built from discrete logic chips instead, but this was much, much simpler. Wiring everything up, [aepharta] got the timing right and was eventually able to put an image on a screen.

After a few minutes, though, the image started wobbling. [aepharta] put his finger on one of the GALs and noticed it was exceptionally hot. A heatsink stopped the wobbling for a few minutes, and a fan stopped it completely. Yes, it’s a 1980s-era graphics card that requires a fan. The card draws about 3 W, or about two percent of what a modern, high-end graphics card pulls.

Open Source GPU Released


Nearly a year ago, an extremely interesting project hit Kickstarter: an open source GPU, written for an FPGA. For reasons that are obvious in retrospect, the GPL-GPU Kickstarter was not funded, but that doesn’t mean these developers don’t believe in what they’re doing. The first version of this open source graphics processor has now been released, giving anyone with an interest a look at what a late-90s-era GPU looks like on the inside. If you’re cool enough, there’s also enough supporting documentation to build your own.

A quick note for the PC Master Race: this thing might run Quake eventually, but it’s not a powerhouse. That said, [Bunnie] had a hard time finding an open source GPU for the Novena laptop, and the drivers for the VideoCore IV in the Raspi have only recently been open sourced. A completely open GPU simply doesn’t exist, and short of a few very, very limited thesis projects, there hasn’t been anything like this before.

Right now, the GPL-GPU has 3D graphics acceleration working with VGA on a PCI bus. The plan is to update this late-90s setup to interfaces that make a little more sense, and add DVI and HDMI output. Not bad for a failed Kickstarter, right?