Ethereum: GPU Mining Is Back But For How Long?

By now, everyone and their dog has at least heard of Bitcoin. While no government will accept tax payments in Bitcoin just yet, it’s ridiculously close to being real money. We’ve even paid for pizza delivery in Bitcoin. But it’s not the only cryptocurrency in town.

Ethereum, initially launched in 2015, is open source, and it has been making headway among the 900 or so Bitcoin clones: it is the number two cryptocurrency in the world, with only Bitcoin beating it in value. This year alone, Ether has risen in value by around 4,000%, and at the time of writing is worth $375 per coin. And while the Bitcoin world is dominated by professional, purpose-built mining rigs, there is still room in the Ethereum ecosystem for the little guy or gal.

Ethereum is for Hackers

There may be many factors behind Ethereum's popularity; one of them is that the algorithm is designed to be resistant to ASIC mining. Unlike with Bitcoin, anyone with a half-decent graphics card or decent gaming rig can mine Ether, giving them the chance to make some digital currency. This is largely because mining Ethereum requires lots of high-speed memory, which ASICs lack: an ASIC that wanted to compete would need the same expensive, high-bandwidth memory a GPU already has, which erases most of its advantage.
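To get a feel for why memory-hardness matters, here's a toy sketch of the idea in Python. This is emphatically not real Ethash, and the dataset size and round count are made-up numbers, but it shows the trick: each candidate hash depends on a chain of pseudo-random reads from a large dataset, so mining speed is bounded by memory bandwidth rather than raw logic speed.

```python
import hashlib

# Toy memory-hard proof-of-work (illustration only, NOT Ethash).
# Real Ethash uses a multi-gigabyte DAG that grows over time.
DATASET_WORDS = 1 << 20

dataset = [hashlib.sha256(i.to_bytes(8, "little")).digest()
           for i in range(DATASET_WORDS)]

def toy_pow_hash(header: bytes, nonce: int, rounds: int = 64) -> bytes:
    mix = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
    for _ in range(rounds):
        # each read address depends on the running mix, so the memory
        # accesses are unpredictable and can't be precomputed away
        idx = int.from_bytes(mix[:4], "little") % DATASET_WORDS
        mix = hashlib.sha256(mix + dataset[idx]).digest()
    return mix
```

Dedicated silicon can make the hashing itself faster, but it still has to pay for the same fast memory that a commodity graphics card already ships with.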

Small-scale Bitcoin miners were stung when the mining technology jumped from GPUs to ASICs. ASIC-based miners simply outperformed the home gamer, and individuals suddenly discovered that their rigs were not worth much, since there was a stampede of people trying to sell off their high-end GPUs all at once. Some would go on to buy or build an ASIC, but the vast majority just stopped mining. They were out of the game: they couldn't compete with ASICs and still turn a profit, since mining itself uses huge amounts of electricity.

Economies of scale like those in Bitcoin mining tend to favor a small number of very large players, which is in tension with the distributed nature of cryptocurrencies, where consensus is what validates transactions. It's much easier to imagine a small number of large players colluding to manipulate the currency, for instance. Ethereum, on the other hand, hopes to keep its miners GPU-based to avoid huge mining farms and give the average Joe a chance at scoring big and discovering a coin on his or her own computer.

Ethereum Matters

Ethereum’s rise to popularity has basically undone Bitcoin’s move to ASICs, at least in the gamer and graphics card markets. Suddenly, used high-end graphics cards are worth something again. And there are effects in the new-equipment market too. For instance, AMD cards seem to outperform the competition at the moment, and AMD is taking advantage of this with the release of mining-specific GPU drivers for its new Vega architecture. Indeed, even though AMD bundled its hottest RX Vega 64 GPU with two games, a motherboard, and a CPU in an attempt to make the package more appealing to gamers than miners, AMD’s Radeon RX Vega 56 sold out in five minutes, with Ethereum miners being blamed.

Besides creating ripples in the market for high-end gaming computers, cryptocurrencies are probably going to be relevant in the broader economy, and Ethereum is number two for now. In a world where even banks are starting to take out patents on blockchain technology in an attempt to get in on the action, cryptocurrencies aren’t as much of a fringe pursuit as they were a few years ago. Ethereum’s ASIC resistance is perhaps its killer feature, preventing centralization of control and keeping the little hacker in the mining game. Only time will tell if it’s going to be a Bitcoin contender, but it’s certainly worth keeping your eye on.

Under the Hood of AMD’s Threadripper

Although AMD has been losing market share to Intel over the past decade, they’ve recently started to pick up steam again in the great battle for desktop processor superiority. A large part of this surge comes in the high-end, multi-core processor arena, where AMD’s Threadripper currently looks clearly superior to Intel’s competition. Thanks to overclocking expert [der8auer], we can finally see what’s going on inside of this huge chunk of silicon.

The elephant in the room is the number of dies on this chip. It has a massive footprint to accommodate all four dies, each with eight cores. However, it seems as though two of the dies are deactivated due to a combination of manufacturing processes and thermal issues. This isn’t necessarily a bad thing, nor a reason to avoid this processor if you need a huge number of cores; it seems as though AMD found it could use existing manufacturing techniques to save on the cost of production while still making a competitive product.

Additionally, a larger package than strictly required opens the door to potentially activating the two currently disabled dies in the future. This could be the thing that brings AMD back into competition with Intel, although both companies still maintain the horrible practice of crippling their chips’ security from the start.

Supercomputing with a Mini-ITX Cluster

[Colin Alston] was able to snag a handful of Mini-ITX motherboards for cheap and built a mini supercomputer he calls TinyJaguar. Named partly after the Jaguar cores inside the AMD Sempron 2650 APU, the TinyJaguar boasts four, yes that’s four, MSI AM1I Mini-ITX motherboards, each with 4GB of DDR3 memory.

A Raspberry Pi with custom software manages the cluster and, along with some TTL logic and relays, controls the power to the four nodes. The mini supercomputer resides in a custom acrylic case held together by an array of 3D-printed parts and fasteners. There’s even a rack-like faceplate near the bottom to host the RPi, an Ethernet switch, an array of status LEDs, and the two buttons.
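The power sequencing is the kind of thing that fits in a few lines of Python on the Pi. Here's a minimal sketch of the idea; the pin numbers are hypothetical, not [Colin]'s actual mapping, and his real code is linked from the project page.

```python
# Hypothetical node power control from the managing Raspberry Pi.
# One relay per node; BCM pin numbers below are assumptions.
import time
import RPi.GPIO as GPIO

NODE_PINS = [17, 27, 22, 23]   # pins driving the four relays

GPIO.setmode(GPIO.BCM)
for pin in NODE_PINS:
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def power_on(node):
    GPIO.output(NODE_PINS[node], GPIO.HIGH)  # energize relay, node boots

# stagger power-up so all four supplies don't pull inrush current at once
for n in range(len(NODE_PINS)):
    power_on(n)
    time.sleep(2)
```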

With 16 total cores of computing power (including GPU), the TinyJaguar is quite capable of doing some pretty cool stuff, such as running a Jupyter notebook with IPyParallel. [Colin] ran into some issues getting the GPUs to behave with PyOpenCL. It took a bit of pain and time, but in the end he was able to get the GPUs up, and he wrote a small message-passing program to show two of the cores were up and working together.
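If you want to try the same trick on a cluster of your own, a minimal IPyParallel session looks something like this, assuming an ipcluster controller and engines are already running on the nodes:

```python
# Minimal IPyParallel sketch: fan work out across every engine.
import ipyparallel as ipp

rc = ipp.Client()   # connect to the running controller
view = rc[:]        # a direct view over all engines

def square(x):
    return x * x

# one task per input, distributed across the cluster's cores
print(view.map_sync(square, range(16)))
```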

Be sure to check out [Colin]’s supercomputer project page, specifically the ten project logs that walk through everything that went into this build. He also posted his code if you want to take a look under the hood.

How Not To Build A CPU Hand Warmer

Winter is coming, along with mittens, cold hands, snow, and jackets. Now that we’re all carrying around lithium batteries in our pocket, wouldn’t it be a great idea to build an electronic hand warmer? That’s what [GreatScott!] thought. To build his electronic hand warmer, he turned to the most effective and efficient way to turn electricity into heat: a ten-year-old AMD CPU.

Building an electronic hand warmer is exceptionally simple. All you need is a resistive heating element (like a resistor), a means to limit current (like a resistor), and a power supply (like a USB power bank). Connect these things together and you have a hand warmer that is either zero percent or one hundred percent efficient. We haven’t figured that last part out yet.
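The back-of-the-envelope math is equally simple. With illustrative numbers (not measured from the video), the classic 5 V, 500 mA USB budget works out like this:

```python
# Ohm's-law sizing for a resistive USB hand warmer (illustrative values).
V = 5.0        # USB supply voltage, volts
I_max = 0.5    # classic USB current budget, amps

P = V * I_max  # total heat dissipated: 2.5 W
R = V / I_max  # a 10 ohm element draws exactly the current budget
print(f"{R:.0f} ohm element dissipates {P:.1f} W")
```

Every one of those 2.5 watts ends up as heat no matter what, hence the zero-or-one-hundred-percent efficiency joke above.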

Because more power and more retro is more betterer, [GreatScott] pulled an AMD Sempron out of an old computer. Finding and reading data sheets is for wimps, apparently, so [GreatScott] just poked some pins with a variable power supply until the CPU was drawing about 500mA at 5V.

The video continues with some Arduino-based temperature measurement, finding some new pins to plug the power leads into, and securing all the wires on this heating element with hot glue. For anyone in the comments ready to say, ‘not a hack’, we assure you, this qualifies.

With the naive method of building a CPU hand warmer out of the way, here are the pros and cons of this project, and how it can be made better. First off, using an old AMD processor was a great idea. These things are firestarters, and even though this processor preceded the 100+ W TDP AMD CPUs, it should work well enough.

That said, this is not how you waste power in a CPU. Ideally, the processor should do some work, with more active gates resulting in higher power consumption. If this were an exceptionally old processor, a good, simple option would be freerunning the chip, or having the CPU count up through its address space. This can be done by tying address lines low or high, depending on the chip. That’ll waste a significant amount of power. Randomly poking pins hoping for the right power consumption is not the way to get the most heat out of this CPU.
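On anything modern enough to boot an OS, the software equivalent is a crude stress loop that keeps every core's gates toggling. Here's a sketch of the idea only; real stress tools like Prime95 or stress-ng are far more thorough about exercising the chip:

```python
# Crude "space heater" loop: saturate every core with floating-point
# math so more gates switch and the CPU draws closer to its TDP.
# Runs until interrupted with Ctrl-C.
from multiprocessing import Pool, cpu_count

def burn(_):
    x = 1.0000001
    while True:
        x *= 1.0000001   # endless multiplies keep the FPU busy

if __name__ == "__main__":
    with Pool(cpu_count()) as pool:
        pool.map(burn, range(cpu_count()))  # one burner per core
```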

Of course, the above paragraph is just theory. The eating is in the pudding, or some other disfigured colloquialism, so here’s a quad-core 386 coffee warmer. This project from [magnustron] uses four 80386 CPUs powered via USB to make a nice desktop hotplate for your cuppa. Of course, being powered by USB means there’s only 500 mA to go around, and the ΔT is comparable to [GreatScott]’s AMD-and-hot-glue hand warmer. Thus we get to the crux of the issue: 5 V at 500 mA isn’t very hot. Until cheap USB-C power banks with ten or twelve watts of output flood in from China, the idea of a USB-powered heater is a fool’s errand. It does make for some great AMD firestarter jokes, though, so we have to give [GreatScott] credit for that.


New Part Day: A Truly Secure Workstation

There is a chain of trust in every modern computing device that starts with the code you write yourself, and extends backwards through whatever frameworks you’re using, whatever OS you’re using, whatever drivers you’re using, and ultimately whatever BIOS, UEFI, Secure Boot, or firmware you’re running. With an Intel processor, this chain of trust extends to the Intel Management Engine, a system running independently of the CPU that has access to the network, USB ports, and everything else in the computer.

Needless to say, this chain of trust is untenable. Any attempt to audit every line of code running in a computer will only be met with frustration. There is no modern Intel-based computer that is completely open source, and no computer that can be verified as secure. AMD is just as bad, and recent attempts to create an open computing platform have fallen short. [Bunnie]’s Novena laptop gets close, but like any engineering task, designing the Novena was an exercise in compromise. You can work around modern BIOSes, but coreboot still uses binary blobs, and Libreboot will not be discussed on Hackaday for the time being. There is no modern, completely open, completely secure computing platform. They’re all untrustworthy.

The Talos Secure Workstation, from Raptor Engineering, an upcoming Crowd Supply campaign, is the answer to the untrustworthiness of modern computing. The Talos is an effort to create the world’s first libre workstation. It’s an ATX-compatible motherboard that is fully auditable, from schematics to firmware, without any binary blobs.


Echo of the Bunnymen: How AMD Won, Then Lost

In 2003, nothing could stop AMD. This was a company that moved from a semiconductor company based around second-sourcing Intel designs in the 1980s to a Fortune 500 company a mere fifteen years later. AMD was on fire, and with almost a 50% market share of desktop CPUs, it was a true challenger to Intel’s throne.

An AMD 8080A. source.

AMD began its corporate history like dozens of other semiconductor companies: second-sourcing designs from dozens of other companies. The first AMD chip, sold in 1970, was just a four-bit shift register. From there, AMD began producing 1024-bit static RAMs, ever more complex integrated circuits, and in 1974 released the Am9080, a reverse-engineered version of the Intel 8080.

AMD had the beginnings of something great. The company was founded by [Jerry Sanders], an electrical engineer at Fairchild Semiconductor. By the time [Sanders] left Fairchild in 1969, [Gordon Moore] and [Robert Noyce], also former Fairchild employees, had already founded Intel a year earlier.

While AMD and Intel share a common heritage, history shows that only one company would become the king of semiconductors. Twenty years after these companies were founded, they would find themselves in a bitter rivalry, and thirty years after their beginnings, they would each see their fortunes change. For a short time, AMD would overtake Intel as the king of CPUs, only to stumble again and again down to a market share of ten to twenty percent. If excellent engineering were all it took to succeed, AMD should have won, so how did it fail? The answer is Intel. Through illegal practices and ethically questionable engineering decisions, Intel would go on to become the current leader of the semiconductor world.


Gizmo Board, A Tiny x86 Dev Board


With the Raspberry Pi and several other ARM dev boards seeing their time in the limelight, it’s no surprise other chip manufacturers would want to get in on the action. AMD is releasing a very tiny x86 dev board called the Gizmo, a four-inch-square board that shrinks a desktop computer down to the palm of your hand.

The Gizmo is powered by a dual-core x86 Brazos CPU running at 1 GHz with an included Radeon HD 6250 graphics engine. Also on the board are 1GB of DDR3 RAM; SATA, Ethernet, USB, VGA, and audio connectors; PCI and PCIe; and a ton of GPIO pins that include ADCs and DACs. All this in a four-inch-square package that boasts about twice the performance of a Raspberry Pi.

While the price of the Gizmo – $200 for an explorer kit – will probably preclude it from being as popular as a Raspberry Pi or other ARM board, sometimes you just need an x86 platform to do the job. With the powerful graphics potential of the Gizmo, we could easily see this board being used in a few computer vision or autonomous robot builds.