An Arduino From The Distant Past

Arduinos are handy tools to have around. They’re versatile, cheap, easy to program, and have a ton of software libraries to build on. They’ve only been around for about a decade and a half, though, so if you were living in 1989 and wanted to program a microcontroller, you’d probably be stuck with an 8-bit microprocessor with no built-in peripherals to help, a physical book about registers and timing, and a flaky ribbon cable to coax into actually powering up. If you want a less frustrating alternate history to live in, check out the latest project from [Marek].

He discovered some 6502 chips (Polish language, Google Translate link) that a Chinese manufacturer was selling, but didn’t really trust that they were legitimate. On a lark he ordered some, and upon testing them he found out that they were real 6502s. Building an 8-bit computer is something he’d like to do eventually, but in the meantime he decided to use one of these chips as a general-purpose microcontroller, similar to a modern Arduino. The specs are Arduino-like too, including 8 kB of RAM, 8 kB of I/O address space, and EPROM for program storage. [Marek] went on to build a shield board for it as well, for easy access to some switches and LEDs. It’s a great build that anyone interested in microcontrollers should check out.
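
If you’ve never programmed a memory-mapped 8-bit machine, the Arduino comparison is easier to see with a quick sketch. On a board like this, the peripherals simply live at fixed addresses inside that I/O window, so reading the switches and driving the LEDs is just ordinary memory access. The C below (the sort of thing a 6502 cross compiler like cc65 can build) is only an illustration: the addresses and port layout are invented for the example, not taken from [Marek]’s actual memory map.

```c
/* Illustrative only: hypothetical addresses, not [Marek]'s real memory map. */
#include <stdint.h>

#define IO_BASE     0xC000u                                   /* assumed start of the I/O window */
#define LED_PORT    (*(volatile uint8_t *)(IO_BASE + 0x00u))  /* assumed output latch (LEDs)     */
#define SWITCH_PORT (*(volatile uint8_t *)(IO_BASE + 0x01u))  /* assumed input buffer (switches) */

static void delay(volatile uint16_t n)
{
    while (n--) { }              /* crude busy-wait; no hardware timer required */
}

int main(void)
{
    for (;;) {
        LED_PORT = SWITCH_PORT;  /* mirror the shield's switches onto its LEDs */
        delay(10000u);
    }
}
```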

Keep in mind that an ATtiny45 is an 8-bit chip like the 6502 but costs only around $1 USD, whereas a 6502 would have cost around $200 in today’s dollars. For that reason alone, it’s really only in modern times that we can treat the 6502 as a cheap 8-bit microcontroller, but we can also appreciate how it ushered in a computing revolution: competing Intel and Motorola chips cost around six times more before it showed up. The 6502 became so popular, in fact, that people still regularly use it to build retrocomputers of all kinds.

Bespoke Processors Might Soon Power Your Artisanal Devices

Modern microprocessors are a marvel of technological progress and engineering. At less than a dollar per unit, even the cheapest microprocessors on the market are orders of magnitude more powerful than their ancestors. The first commercially available single-chip processor, the Intel 4004, cost roughly $25 (in today’s dollars) when it was introduced in 1971.

The 4-bit 4004 clocked in at 740 kHz — paltry by today’s standards, but quite impressive at the time. What was really remarkable about the 4004, however, was the way it shifted computer architecture practically overnight. Previously, processing was handled by multiple chips, chosen to meet just the needs of the application at hand. Considering the cost of components at the time, it would have been impractical to use more than was needed.

That all changed with the new era ushered in by general-purpose processors like the 4004. Suddenly it was more cost-effective to just grab a processor off the shelf than to design and manufacture a custom one – even if that processor was overpowered for the task. That trend has continued (and been amplified) to this day. Your microwave probably only uses a fraction of its processing power, because using a $0.50 processor is cheaper than designing (and manufacturing) one tailored to the microwave’s actual needs.

Anyone who has ever worked in manufacturing, or dealt with manufacturers, knows this comes down to unit cost. Because companies like Texas Instruments make millions of processors, each one is very inexpensive. Mass production is the primary driver of affordability. But what if it didn’t have to be?

Professors [Rakesh Kumar] and [John Sartori], along with their students, are experimenting with bespoke processor designs that aim to cut out the unused portions of modern processors. They’ve found that in many applications, less than half the logic gates of the processor are actually being used. Removing these reduces the size and power consumption of the processor, and therefore the final size and power requirements of the device itself.

Of course, that question of cost comes back into play. Is a smaller and more efficient processor worth it if it ends up costing more? For most device manufacturers today, the answer is almost certainly no. There aren’t many cases where those factors matter more than cost. But with modern techniques for printing electronics, the researchers think it might become feasible in the near future. Soon, we might be looking at custom processors reminiscent of the early days of computer design.

Under the Hood of AMD’s Threadripper

Although AMD has been losing market share to Intel over the past decade, they’ve recently started to pick up steam again in the great battle for desktop processor superiority. A large part of this surge comes in the high-end, multi-core processor arena, where AMD’s Threadripper looks clearly superior to Intel’s competition. Thanks to overclocking expert [der8auer] we can finally see what’s going on inside this huge chunk of silicon.

The elephant in the room is the number of dies on this chip. It has a massive footprint to accommodate all four dies, each with eight cores. However, it seems as though two of the dies are deactivated, due to a combination of manufacturing processes and thermal issues. That isn’t necessarily a bad thing, nor a reason to avoid this processor if you need a huge number of cores; it seems as though AMD found it could use existing manufacturing techniques to save on production costs while still making a competitive product.

Additionally, a larger package than strictly required opens the door to potentially activating the two currently disabled dies in the future. This could be the thing that brings AMD back into competition with Intel, although both companies still maintain the horrible practice of crippling their chips’ security from the start.

The Gray-1, A Computer Composed Entirely Of ROM And RAM

When we learn about the internals of a microprocessor, we are shown a diagram that resembles the 8-bit devices of the 1970s. There will be an ALU, a program counter, a set of registers, and address and data line decoders. Most of us never go significantly further into the nuances of more modern processors because there is no need. All a processor needs to be is a black box, unless it has particularly sparked your interest or you are working in bare-metal assembly language.

We imagine our simple microprocessor as built from logic gates, and indeed there have been many projects on these pages that create working processors from piles of 74 series chips. But just occasionally a project comes along that reminds us there is more than one way to build a computer, and our subject today is just such a moment. [Olivier Bailleux] has created his “Gray-1”, a processor whose only active components are memory chips, both ROM and RAM.

The clever part comes with the descriptions of how the ROMs are used to recreate the different functions of the processor through careful programming. Some functions, registers for example, use feedback loops in which some of the address lines are driven from the data lines to hold the ROM at a set location. The name of the computer comes from its program counter, which counts in Gray code.
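
If Gray code is new to you, its appeal here is that consecutive values differ in exactly one bit, so a counter built from ROM lookups never has to flip several address lines at once and glitch through an unintended state. The little sketch below shows the standard binary-to-Gray conversion; it illustrates the counting scheme in general, not the Gray-1’s actual ROM contents.

```c
/* Standard reflected-binary Gray code: consecutive values differ in one bit.
   This only illustrates the counting scheme, not the Gray-1's ROM tables. */
#include <stdio.h>
#include <stdint.h>

static uint8_t binary_to_gray(uint8_t n)
{
    return n ^ (n >> 1);
}

int main(void)
{
    for (unsigned n = 0; n < 8; n++) {
        unsigned g = binary_to_gray((uint8_t)n);
        printf("%u -> %u%u%u\n", n, (g >> 2) & 1u, (g >> 1) & 1u, g & 1u);
    }
    return 0;
}
```

Walk the output and you can see each step flips a single bit: 000, 001, 011, 010, 110, 111, 101, 100.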

The full processor implements a RISC architecture, and there is a simulator to allow code development without a physical unit. The write-up is both comprehensive and accessible, and makes a fascinating read.

It’s safe to say this is the only processor we’ve seen with this novel approach to architecture. We have featured more conventional designs before, though, including an effort to create a processor entirely from NAND gates, and another made from 74 logic.

42,300 Transistor Megaprocessor Is Complete

As it turns out, the answer is not 42, it’s 42.3 — thousand. That’s how many discrete transistors are spread across the 30 m² room housing this massive computation machine. [James Newman’s] Megaprocessor, a seriously enlarged version of a microprocessor, is a project we’ve been following with awe as it took shape over the last couple of years.

[James] documented his work in great detail, and by doing so, took us on a journey through the inner workings of microprocessors. His monumental machine is now finished, and it’s the ultimate answer to how a processor – and pretty much everything that contains a processor – works.

Continue reading “42,300 Transistor Megaprocessor Is Complete”

How The Dis-integrated 6502 Came To Be

I made a beeline for one booth in particular at this year’s Bay Area Maker Faire; our friend [Eric Schlaepfer] had his MOnSter 6502 on display. If you missed it last week, the unveiling of a 6502 built from discrete transistors lit the Internet afire. At that point the board was not fully operational, but [Eric’s] perseverance paid off: it had no problem whatsoever blinking out verification code at his booth.

I interviewed [Eric] in the video below about the design process. It’s not surprising to hear that he was initially trying to prove that this couldn’t be done. Unable to do so, he had nothing left to do but devote almost six months of his free time to completing the design, layout, and assembly.

What I’m most impressed by (besides just pulling it off in the first place) is the level of perfection [Eric] achieved in his design. There are virtually no errors. In the video you’ll hear him discuss an issue with pull-up/pull-down components which smoked some of the transistors. The solution is an in-line resistor on each of the replacement transistors. This was difficult to photograph, but you can make out the soldering trick above where the 3-pin MOSFET is propped up with its pair of legs on the board and the single leg in the air. The added resistor connects that airborne leg to its PCB pad. Other than this, there was no routing to correct. Incredible.

The huge schematic binder includes a centerfold — literally. One of the most difficult pieces of the puzzle was working out the decode ROM. What folds out of this binder doesn’t even look like a schematic at first glance, but take a closer look (warning, 8 MB image). Every component in that grid was placed manually.

I had been expecting to see some tube-based goodness from [Eric] this year. That’s because I loved his work on Flappy Bird on a green CRT in 2014, and Battlezone on a tube with a hand-wound yoke last year. But I’m glad he stepped away from the tubes and created this marvelous specimen of engineering.

Exponential Growth In Linear Time: The End Of Moore’s Law

Moore’s Law states that the number of transistors on an integrated circuit will double about every two years. This law, coined by Intel and Fairchild founder [Gordon Moore], has been a truism since its introduction in 1965. From the introduction of the Intel 4004 in 1971, through the Pentiums of 1993, to the Skylake processors introduced last month, the law has mostly held true.

The law, however, promises exponential growth in linear time, and that is a promise that is ultimately unsustainable. This is not an article about the future roadblocks that will end [Moore]’s observation, but an article arguing that the expectations set by Moore’s Law have already come to an end. It ended quietly, sometime around 2005, and we will never again see a time when transistor density, processor speed, graphics card capability, or memory density doubles every two years.
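
To get a feel for how aggressive that promise is, a few lines of arithmetic are enough: start from the roughly 2,300 transistors of the 4004 in 1971 and double every two years. The snippet below is nothing more than that back-of-the-envelope doubling rule.

```c
/* Back-of-the-envelope Moore's Law: ~2,300 transistors (Intel 4004, 1971),
   doubled every two years. This is only the doubling rule, not real part data. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double start_year  = 1971.0;
    const double start_count = 2300.0;   /* commonly quoted 4004 transistor count */

    for (int year = 1971; year <= 2015; year += 11) {
        double predicted = start_count * pow(2.0, (year - start_year) / 2.0);
        printf("%d: ~%.0f transistors\n", year, predicted);
    }
    return 0;
}
```

The 1993 prediction lands around 4.7 million, not far from the Pentium’s roughly 3.1 million transistors; push the same rule out to 2015 and it demands nearly ten billion transistors on a desktop part, which is exactly where the trouble lies.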

Continue reading “Exponential Growth In Linear Time: The End Of Moore’s Law”