Modder Puts Computer Inside A Power Supply

When building a custom computer rig, most people put the SMPS power supply inside the computer case. [James] a.k.a [Aibohphobia] a.k.a [fearofpalindromes] turned it inside out, and built the STX160.0 – a full-fledged gaming computer stuffed inside an ATX power supply enclosure. While Small Form Factor (SFF) computers are nothing new, his build packs a powerful punch into a small enclosure and is a great example of computer modding, hacker ingenuity, and engineering. The finished computer uses a Mini-ITX form factor motherboard with an Intel Core i5-6500T quad-core 2.5GHz processor, an EVGA GTX 1060 SC graphics card, 16GB of DDR4 RAM, a 250GB SSD, a WiFi card, and two USB ports — all powered from a 160 W AC-DC converter. Its external dimensions are the same as an ATX-EPS power supply at 150 L x 86 H x 230 D mm. The STX160.0 runs straight off mains power rather than an external brick, which [James] feels would have been cheating.

For those who would like a quick, TL;DR pictorial review, head over to his photo album on Imgur first to feast on pictures of the completed computer and its innards. But the devil is in the details, so check out the forum thread for a ton of interesting build information, component sources, tricks, and trivia. For example, to connect the graphics card to the motherboard, he used an “M.2 to powered PCIe x4 adapter” coupled with a flexible cable extender from a quaint company called Adex Electronics, who still prefer to do business the old-fashioned way and whose website might remind you of the days when Netscape Navigator was the dominant browser.

As a benchmark, [James] posts that “with the cover panel on, at full load (Prime95 Blend @ 2 threads and FurMark 1080p 4x AA) the CPU is around 65°C with the CPU fan going at 1700RPM, and the GPU is at 64°C at 48% fan speed.” Fairly impressive for what could be passed off at first glance as a power supply.

There are two really interesting takeaways for us in this project. The first is his meticulous research to find, from among the vast number of available choices, the specific parts that met his requirements. The second is his extremely detailed notes on designing the custom enclosure for this project and making it DFM (design for manufacturing) friendly so it could be mass-produced – just take a look at his “Table of Contents” for a taste of the amount of ground he covers. If you are interested in custom builds and computer modding, there is a huge amount of useful information embedded in there for you.

Thanks to [Arsenio Dev] who posted a link to this hilarious thread on Reddit discussing the STX160.0. Check out a full teardown and review of the STX160.0 by [Not for Concentrate] in the video after the break.

Continue reading “Modder Puts Computer Inside A Power Supply”

10 Year Old Bug Crushed By Hacker On A Mission

PCI passthrough is the ability of a virtualized guest system to directly access PCI hardware. Passthrough for dedicated GPUs has only recently been added to KVM, the Linux kernel-based virtual machine. Soon afterward, users began to find that switching on nested page tables (NPT), a technology intended to provide hardware acceleration for virtual machines, had the opposite effect on AMD platforms and slowed frame rates to a crawl.

Annoyed by this, [gnif] set out to fix the problem. His first step was to run graphics benchmarks to isolate the source of the problem. Having identified the culprit in the GPU, [gnif] began to read up on the technology stack involved. Three days of wrapping his head around technical docs allowed [gnif] to find the single line of code that resulted in a faulty memory setup, and to implement a basic fix. He then passed the work on to [Paolo Bonzini] at patchwork.kernel.org, who released a more refined patch.

The bug affecting PCI passthrough had been around for ten years and had received little attention from the manufacturer; it gained prominence only when graphics cards were affected. In the end it took one very dedicated user three days to fix it, and then another day to roll out a patch for open source operating systems. In his notes, [gnif] points out how helpful AMD's documentation was. With right to repair under debate, technical docs locked down with DRM, and standards hidden behind paywalls, [gnif]'s story is a reminder of the importance of accessible, quality documentation.

Ethereum: GPU Mining Is Back But For How Long?

By now, everyone and their dog has at least heard of Bitcoin. While no government will accept tax payments in Bitcoin just yet, it’s ridiculously close to being real money. We’ve even paid for pizza delivery in Bitcoin. But it’s not the only cryptocurrency in town.

Ethereum, initially launched in 2015, is an open-source cryptocurrency that has been making headway among the 900 or so Bitcoin clones, and it is the number two cryptocurrency in the world, with only Bitcoin beating it in value. This year alone, Ether has risen in value by around 4000%, and at the time of writing is worth $375 per coin. And while the Bitcoin world is dominated by professional, purpose-built mining rigs, there is still room in the Ethereum ecosystem for the little guy or gal.

Ethereum is for Hackers

There may be many factors behind Ethereum's popularity, but one reason is that the algorithm is designed to be resistant to ASIC mining. Unlike Bitcoin, anyone with a half-decent graphics card or a decent gaming rig can mine Ether, giving them the chance to make some digital currency. This is largely because mining Ether requires lots of high-speed memory, which ASICs lack. The algorithm also has built-in ASIC detection and will refuse to mine properly on them.
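
The idea behind memory hardness is easy to sketch: make every hash attempt depend on many unpredictable reads from a dataset too large to fit in an ASIC's fast on-die memory. Here's a toy Python sketch of that idea — emphatically not the real Ethash, and every name, size, and constant below is invented for illustration:

```python
import hashlib
import struct

# Toy memory-hard proof-of-work, loosely inspired by Ethereum's Ethash.
# The real algorithm uses a multi-gigabyte dataset (the DAG) that grows
# over time; the sizes here are shrunk so this runs in seconds.

DATASET_ENTRIES = 1 << 20   # ~32 MB of 32-byte entries (real DAG: gigabytes)
MIX_ROUNDS = 64             # unpredictable dataset reads per hash attempt

def build_dataset(seed: bytes) -> list:
    """Expand a small seed into a large pseudo-random dataset."""
    entries, h = [], seed
    for _ in range(DATASET_ENTRIES):
        h = hashlib.sha256(h).digest()
        entries.append(h)
    return entries

def pow_hash(dataset: list, header: bytes, nonce: int) -> bytes:
    """One attempt: cost is dominated by random reads across the dataset,
    so memory bandwidth matters more than raw hashing speed."""
    mix = hashlib.sha256(header + struct.pack("<Q", nonce)).digest()
    for _ in range(MIX_ROUNDS):
        idx = int.from_bytes(mix[:4], "little") % DATASET_ENTRIES
        mix = hashlib.sha256(mix + dataset[idx]).digest()
    return mix

def mine(dataset: list, header: bytes, difficulty: int) -> int:
    """Search nonces until the hash clears the difficulty target."""
    target = 2 ** 256 // difficulty
    nonce = 0
    while int.from_bytes(pow_hash(dataset, header, nonce), "big") >= target:
        nonce += 1
    return nonce
```

Because each attempt stalls on memory accesses rather than arithmetic, a chip packed with hashing cores gains little over a graphics card with fast RAM — which is the property that keeps commodity GPUs competitive.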

Small-scale Bitcoin miners were stung when the mining technology jumped from GPUs to ASICs. ASIC-based miners simply outperformed the home gamer, and individuals suddenly discovered that their rigs were not worth much, since there was a stampede of people trying to sell off their high-end GPUs all at once. Some would go on to buy or build an ASIC, but the vast majority just stopped mining. They were out of the game; they couldn't compete with ASICs and stay profitable, since mining in itself uses huge amounts of electricity.

Economies of scale like those in Bitcoin mining tend to favor a small number of very large players, which is in tension with the distributed nature of cryptocurrencies, which rely on consensus to validate transactions. It's much easier to imagine a small number of large players colluding to manipulate the currency, for instance. Ethereum, on the other hand, hopes to keep its miners GPU-based to avoid huge mining farms and give the average Joe a chance at scoring big and discovering a coin on their own computer.

Ethereum Matters

Ethereum’s rise to popularity has basically undone Bitcoin’s move to ASICs, at least in the gamer and graphics card markets. Suddenly, used high-end graphics cards are worth something again. And there are effects in the new-equipment market too. For instance, AMD cards seem to outperform other cards at the moment, and AMD is taking advantage of this with its release of mining-specific GPU drivers for its new Vega architecture. Indeed, even though AMD bundled its hottest RX Vega 64 GPU with two games, a motherboard, and a CPU in an attempt to make the package more appealing to gamers than miners, AMD’s Radeon RX Vega 56 sold out in five minutes, with Ethereum miners being blamed.

Besides creating ripples in the market for high-end gaming computers, cryptocurrencies are probably going to be relevant in the broader economy, and Ethereum is number two for now. In a world where even banks are starting to take out patents on blockchain technology in an attempt to get in on the action, cryptocurrencies aren’t as much of a fringe pursuit as they were a few years ago. Ethereum’s ASIC resistance is perhaps its killer feature, preventing centralization of control and keeping the little hacker in the mining game. Only time will tell if it’s going to be a Bitcoin contender, but it’s certainly worth keeping your eye on.

A Graphics Card For A Homebrew Computer

One of [aepharta]’s ‘before I die’ projects is a homebrew computer. Not just any computer, mind you, but a fabulous Z80 machine, complete with video out. HDMI and DisplayPort would require far too much of this tiny, 80s-era computer, and it’s getting hard to buy a composite monitor. This meant it was time to build a VGA video card from some parts salvaged from old equipment.

When it comes to ancient computers, VGA has fairly demanding requirements; the slowest standard pixel clock is 25.175 MHz, an order of magnitude faster than the CPU clock in early-80s computers. Memory is also an issue, with a 640×480, 16-color image requiring 153,600 bytes, or about a quarter of the 640k ‘that should be enough for anybody.’
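
That 25.175 MHz figure falls straight out of the standard 640×480@60 Hz timing, where the blanking intervals pad each frame out to 800×525 pixel-clock periods. A quick sanity check in Python (the timing totals are standard VGA; the 2.5 MHz Z80 clock is a typical period figure, not [aepharta]'s exact part):

```python
# Standard VGA 640x480@60 timing: blanking pads each line and frame out
# to 800 x 525 pixel-clock periods in total.
H_TOTAL = 800                 # 640 visible + front porch + sync + back porch
V_TOTAL = 525                 # 480 visible + front porch + sync + back porch
PIXEL_CLOCK_HZ = 25_175_000

refresh = PIXEL_CLOCK_HZ / (H_TOTAL * V_TOTAL)
print(f"refresh rate: {refresh:.2f} Hz")        # ~59.94 Hz

# Compare against a typical early-80s home computer CPU clock:
CPU_CLOCK_HZ = 2_500_000      # e.g. a 2.5 MHz Z80
print(f"pixel clock is {PIXEL_CLOCK_HZ / CPU_CLOCK_HZ:.0f}x the CPU clock")
```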

To cut down on the memory requirements and make everything nice, round, base-2 numbers, [aepharta] decided on a resolution of 512×384. This means about 96 kB of memory is required for 16 colors, and only about 24 kB for monochrome.
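
The framebuffer arithmetic is easy to verify; here is a quick sketch:

```python
def framebuffer_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Video RAM needed for a packed framebuffer."""
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(640, 480, 4))   # 153600 bytes: 16 colors at full VGA
print(framebuffer_bytes(512, 384, 4))   # 98304 bytes:  ~96 kB for 16 colors
print(framebuffer_bytes(512, 384, 1))   # 24576 bytes:  24 kB monochrome
```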

The circuit was built from some old programmable logic ICs pulled from a Cisco router. The circuit could have been built from discrete logic chips, but this was much, much simpler. Wiring everything up, [aepharta] got the timing right and was eventually able to put an image on a screen.

After a few minutes, though, the image started wobbling. [aepharta] put his finger on one of the GALs and noticed it was exceptionally hot. A heatsink stopped the wobbling for a few minutes, and a fan stopped it completely. Yes, it’s a 1980s-era graphics card that requires a fan. The card draws about 3 W, or about two percent of what a modern, high-end graphics card draws.

Open Source GPU Released

Nearly a year ago, an extremely interesting project hit Kickstarter: an open source GPU, written for an FPGA. For reasons that are obvious in retrospect, the GPL-GPU Kickstarter was not funded, but that doesn’t mean these developers don’t believe in what they’re doing. The first version of this open source graphics processor has now been released, giving anyone with an interest a look at what a late-90s era GPU looks like on the inside. If you’re cool enough, there’s also enough supporting documentation to build your own.

A quick note for the PC Master Race: this thing might run Quake eventually. It’s not a powerhouse. That said, [Bunnie] had a hard time finding an open source GPU for the Novena laptop, and the drivers for the VideoCore IV in the Raspi have only recently been open sourced. A completely open GPU simply doesn’t exist, and short of a few very, very limited thesis projects there hasn’t been anything like this before.

Right now, the GPL-GPU has 3D graphics acceleration working with VGA on a PCI bus. The plan is to update this late-90s setup to interfaces that make a little more sense, and add DVI and HDMI output. Not bad for a failed Kickstarter, right?

Behold! The Most Insane Crowdfunding Campaign Ever

Hold on to your hats, because this is a good one. It’s a tale of disregarding the laws of physics, cancelled crowdfunding campaigns, and a menagerie of blogs who take press releases at face value.

Meet Silent Power (Google translation). It’s a remarkably small and fairly powerful miniature gaming computer being put together by a team in Germany. The specs are pretty good for a completely custom computer: an i7-4785T, a GTX 760, 8GB of RAM, and a 500GB SSD. Not a terrible machine for something that will eventually sell for about $930 USD, but what really puts this project in the limelight is the innovative cooling system and small size. The entire machine is only 16x10x7 cm, accented with a very interesting “copper foam” heat sink on top. Sounds pretty cool, huh? It does, until you start to think about the implementation a bit. Then it’s a descent into madness and a dark pit of despair.

There are a lot of things that are completely wrong with this project, and in true Hackaday fashion, we’re going to tear this one apart, figuring out why this project will never exist.
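
A back-of-envelope estimate already hints at the problem. The TDPs below are the published figures for those parts; the ambient and heatsink temperatures are our assumptions, not the campaign's numbers:

```python
# Rough sanity check on passively cooling Silent Power's claimed hardware.
cpu_tdp_w = 35        # Intel i7-4785T, published TDP
gpu_tdp_w = 170       # NVIDIA GTX 760, published TDP
total_w = cpu_tdp_w + gpu_tdp_w

t_ambient_c = 25      # assumed room temperature
t_sink_max_c = 90     # assumed maximum tolerable heatsink temperature

# Sink-to-ambient thermal resistance needed to shed the full load passively:
r_needed = (t_sink_max_c - t_ambient_c) / total_w
print(f"{total_w} W total, need <= {r_needed:.2f} K/W")   # ~0.32 K/W
```

Small passive heatsinks typically sit well above 1 K/W; getting anywhere near 0.3 K/W normally means a large fin stack plus forced airflow, copper foam or not.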

Continue reading “Behold! The Most Insane Crowdfunding Campaign Ever”

STM32 Driving A PCIe Video Card

[Gpuhackr] chose his username to explain exactly how he spends his time. For instance, here he’s using an STM32 Discovery board to drive an AMD Radeon HD 2400 graphics card. The ARM microcontroller isn’t actually using the PCIe interface on the card. Instead, [Gpuhackr] has patched into the debugging interface built into the card itself. This isn’t quite as straightforward as it sounds, but if you do the wiring carefully it’s a pretty interesting way to connect an ARM to an LCD monitor.

This project would be almost impossible if it weren’t for the open source code which AMD has released. This lets him implement the card’s 3D rendering features. The demo directly programs the UVD Xtensa CPU which is on the video card. It draws a cube with color gradients on each side. The cube spins while the debug information is overlaid on the screen. In this case the ARM chip/board is really being used as a programmer to upload some custom firmware. But we think a real code-ninja could implement a communications protocol to open up a simple way to drive the card in real time.
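
For flavor, here is what the host side of such a protocol could look like — a minimal, entirely hypothetical sketch; the framing, command set, and register address are all invented for illustration and nothing here comes from [Gpuhackr]'s code:

```python
import struct
import serial  # pyserial; assumes the STM32 shows up as a USB CDC serial port

SYNC = 0xA5

def frame(cmd: int, payload: bytes) -> bytes:
    """Pack one frame: sync byte, command, 16-bit length, payload, XOR checksum."""
    body = struct.pack("<BBH", SYNC, cmd, len(payload)) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

# Hypothetical command set for the STM32 to translate onto the debug interface.
CMD_WRITE_REG = 0x01   # write a 32-bit GPU register
CMD_UPLOAD    = 0x02   # stream a chunk of command/firmware data

def write_reg(link, addr: int, value: int) -> None:
    """Ask the STM32 to poke one GPU register over the debug port."""
    link.write(frame(CMD_WRITE_REG, struct.pack("<II", addr, value)))

link = serial.Serial("/dev/ttyACM0", 115200)   # hypothetical port name
write_reg(link, 0x6110, 0xDEADBEEF)            # made-up register and value
```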

[Thanks uMinded]