The World’s First Microprocessor: F-14 Central Air Data Computer

When the Grumman F-14 Tomcat first flew in 1970, it was a marvel. With its variable-sweep wing, twin tail, and sleek lines, it quickly became one of the most iconic jet fighters of the era — and that was before a little movie called Top Gun hit theaters.

A recent video by [Alexander the ok] details something far less well documented about the plane: its avionics. The Tomcat was the first aircraft to use a microprocessor-driven flight system, and its Central Air Data Computer (CADC) contained the first microprocessor unit (MPU) ever demonstrated, beating the Intel 4004 by a year. In 1971, one of the CADC’s designers, [Ray Holt], wrote an article about it for Computer Design magazine, which was naturally classified by the Navy straight away and only released to the public in 1998.

The MPU in the CADC is called the Garrett AiResearch MP944, and consists of a number of ICs that together form a full computer. These were combined in the CADC with additional electronics to control many elements of the airplane automatically, including the weapons system and the variable-sweep wing configuration. This was considered to be essential based on experiences with the F-111 and its very complex electromechanical flight computer, which was an evolution of the 1950s-era Bendix CADC.

The video goes through the differences between the 4-bit Intel 4004 and the 20-bit MP944, the capabilities of the MP944 and its system architecture, and the question of whether the 4004 even really qualifies as an MPU. Ultimately ‘first’ and ‘what is an MPU’ will always be somewhat fuzzy questions depending on your definitions, but there is no denying that the MP944 was a marvel of large-scale integration.

Continue reading “The World’s First Microprocessor: F-14 Central Air Data Computer”

Reverse-Engineering The Conditional Jump Circuitry In The 8086 Processor

The condition PLA evaluates microcode conditionals.

As simple as a processor’s instruction set may seem, even in a 1978-era design like the Intel 8086 there is quite a bit going on to get from something like a conditional jump instruction to a set of operations the processor can actually perform. For the CISC 8086 this is detailed in a recent article by [Ken Shirriff], which covers exactly how instructions and their parameters are broken down into micro-instructions by the microcode, allowing the appropriate registers and flags to be updated.

Where the 8086 is interesting compared to modern x86 CPUs is how its microcode is implemented: gate logic is used to keep the microcode small, for example by handling all of the conditional jumps with a single generic condition test rather than a separate routine for each one. Considering the limitations of 1970s VLSI manufacturing, this was very much a necessary step, and an acceptable trade-off.
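To get a feel for how that generic test works: the sixteen conditional-jump opcodes (0x70–0x7F) encode their condition directly in the instruction bits, with bits 3–1 selecting one of eight flag combinations and bit 0 inverting the result, so one small PLA can serve every Jcc instruction. Here’s a minimal C++ sketch of that decode-and-invert scheme; it’s purely a software illustration of the encoding, not Intel’s microcode or circuitry.

```cpp
#include <cstdint>
#include <cstdio>

// A few of the 8086 status flags, just enough for conditional jumps.
struct Flags {
    bool cf, pf, zf, sf, of;
};

// Evaluate a conditional-jump opcode (0x70..0x7F) against the flags.
// Bits 3..1 of the opcode pick one of eight base conditions; bit 0
// inverts the result, which is why sixteen Jcc instructions can share
// one micro-routine and one small condition PLA.
bool jump_taken(uint8_t opcode, const Flags& f) {
    bool cond = false;
    switch ((opcode >> 1) & 0x7) {
        case 0: cond = f.of;                   break; // JO  / JNO
        case 1: cond = f.cf;                   break; // JB  / JNB
        case 2: cond = f.zf;                   break; // JE  / JNE
        case 3: cond = f.cf || f.zf;           break; // JBE / JA
        case 4: cond = f.sf;                   break; // JS  / JNS
        case 5: cond = f.pf;                   break; // JP  / JNP
        case 6: cond = f.sf != f.of;           break; // JL  / JGE
        case 7: cond = (f.sf != f.of) || f.zf; break; // JLE / JG
    }
    return (opcode & 1) ? !cond : cond;  // the low bit flips the sense
}

int main() {
    Flags f{false, false, true, false, false};      // only ZF set
    printf("JE  taken: %d\n", jump_taken(0x74, f)); // prints 1
    printf("JNE taken: %d\n", jump_taken(0x75, f)); // prints 0
}
```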

Each jump instruction is broken down into a handful of micro-instructions that test the relevant flags and update temporary registers as well as the program counter as needed. All in all it’s a fascinating look at the work Intel engineers put in over forty years ago on what would become one of the cornerstones of modern-day computing.

Ask Hackaday: When It Comes To Processors, How Far Back Can You Go?

When it was recently announced that the Linux kernel might drop support for the Intel 486 line of chips, we took a look at the state of the 486 world. You can’t buy them from Intel anymore, but you can buy clones, which are apparently still used in embedded devices. But that made us think: if you can’t buy a genuine 486, what other old CPUs are still in production, and which is the oldest?

Defining A Few Rules

An Intel 4004 microprocessor in ceramic packaging
The daddy of them all, 1971’s Intel 4004 went out of production in 1981. Thomas Nguyen, CC BY-SA 4.0

There are a few obvious contenders that immediately come to mind: both the 6502 from 1975 and the Z80 from 1976 are still readily available. Some other old silicon survives in the form of cores incorporated into other chips; the venerable Intel 8051 microcontroller may have shuffled off this mortal coil as a 40-pin DIP years ago, but it is happily housekeeping the activities of many far more modern devices today. Further still, there’s the fascinating world of specialist obsolete-parts manufacturing, in which a production run of otherwise unobtainable silicon can be created specially for an extremely well-heeled customer. Should Uncle Sam ever need a crate of 1974’s Intel 8080, for example, Rochester Electronics can oblige.

Continue reading “Ask Hackaday: When It Comes To Processors, How Far Back Can You Go?”

Riding The Nostalgia Train With A 6502 From The Ground Up

In the very early days of the PC revolution the only way to have a computer was to build one, sometimes from a kit but often from scratch. For the young, impoverished hobbyist leafing through the pages of Popular Electronics, it was hard to shake the feeling that the revolution was passing them by. And just like that, the days of homebrewing drew to a close, forced into irrelevance by commodity beige boxes. Computing for normies had arrived.

Many of the homebrewers-that-never-were are now looking back at this time with the powerful combination of nostalgia and disposable income, and projects such as [Ben Eater]’s scratch-built 6502 computer are set to scratch the old itch. The video below introduces not only the how-to part of building a computer from scratch, but the whys and wherefores as well. Instead of just showing us how to wire up a microprocessor and its supporting chips, [Ben] starts with the two most basic things: a 6502 and its datasheet. He shows which pins do what, which ones to tie high, and which ones get forced low. With the CPU clocked by a custom 555 circuit that lets him single-step, and monitored by an Arduino Mega-based logic analyzer, we get a complete look at the fetch and execute cycle of a simple, hard-wired program at the pin level.
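The Arduino side of that setup is refreshingly simple: watch the 6502’s clock, and on every rising edge read the sixteen address pins, the eight data pins, and the R/W line, then print what you saw. Here’s a rough sketch of that idea for a Mega; the pin assignments are placeholders rather than [Ben]’s actual wiring, and printing from an interrupt handler like this only gets away with it because the 555 clock is running at single-step speeds.

```cpp
// Rough Arduino Mega sketch of a 6502 bus monitor. Pin numbers below are
// placeholder assignments -- match them to however you wire your breadboard.
const byte ADDR_PINS[] = {22, 24, 26, 28, 30, 32, 34, 36,
                          38, 40, 42, 44, 46, 48, 50, 52}; // A0..A15
const byte DATA_PINS[] = {39, 41, 43, 45, 47, 49, 51, 53}; // D0..D7
const byte CLOCK_PIN = 2;  // 6502 PHI2, driven by the 555 single-step clock
const byte RW_PIN    = 3;  // 6502 R/W line

// On each rising clock edge, sample the buses and print one trace line.
void onClock() {
  unsigned int address = 0;
  byte data = 0;
  for (int i = 15; i >= 0; i--) address = (address << 1) | digitalRead(ADDR_PINS[i]);
  for (int i = 7;  i >= 0; i--) data    = (data    << 1) | digitalRead(DATA_PINS[i]);

  char line[16];
  sprintf(line, "%04x  %c  %02x", address, digitalRead(RW_PIN) ? 'r' : 'W', data);
  Serial.println(line);  // e.g. "fffc  r  ea" -- a read during the reset sequence
}

void setup() {
  for (byte p : ADDR_PINS) pinMode(p, INPUT);
  for (byte p : DATA_PINS) pinMode(p, INPUT);
  pinMode(CLOCK_PIN, INPUT);
  pinMode(RW_PIN, INPUT);
  Serial.begin(57600);
  attachInterrupt(digitalPinToInterrupt(CLOCK_PIN), onClock, RISING);
}

void loop() {}  // everything happens in the interrupt handler
```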

This is one of those rare videos that was over too soon and left us looking for more. [Ben] promises a follow-up to add a ROM chip and a more complex program, and we can’t wait to see that. He’s selling kits so you can build along if you don’t already have the parts. There seems to be a lot of interest in 6502 builds lately, some more practical than others. Seems like a good time to hop on the bandwagon.

Continue reading “Riding The Nostalgia Train With A 6502 From The Ground Up”

A Symbiotic Partnership Between FPGA And 6502

[Kenneth Wilke] is undertaking a noble quest – to build a homebrew microcomputer, based around the venerable 6502. As a prelude to this, he set out to interface the hallowed CPU to an FPGA, and shared the process involved.

[Kenneth] is using an Arty A7 FPGA development board, which is a great fit for the purpose, having plenty of I/O pins and being relatively easy to work with for the home tinkerer. This is an important consideration, as the toolchains for many industrial-strength FPGAs require software licences that can easily stretch into the tens of thousands of dollars.

The 6502 is placed on a breadboard, and a nest of wires connects it to the PMOD interfaces of the Arty board. Then it’s a simple matter of mapping out the pins on the FPGA and you’re good to go. Thanks to the 6502’s design it’s possible to step through instructions one at a time, which is particularly useful on a basic homebrew build, so [Kenneth] made sure to implement this functionality.

It’s all capped off with the FPGA sending the 6502 a starting address and a series of NOPs, demonstrating that the setup can run the 6502 on instructions fed entirely from the FPGA. It’s a project that shows the fundamentals of interfacing two technologies at very different levels of sophistication, and it acts as a great base for further experimentation.
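The real logic lives in the FPGA fabric, written in an HDL, but the idea behind that smoke test is easy to model in a few lines of C++: answer reads of the 6502’s reset vector with a chosen start address, and answer every other read with $EA, the NOP opcode, so the processor simply free-runs through memory. The reset-vector location ($FFFC/$FFFD) is standard 6502 behaviour; the $8000 start address here is just an assumption for illustration, not necessarily what [Kenneth] used.

```cpp
#include <cstdint>
#include <cstdio>

// Toy model of the bus-response logic the FPGA implements: supply a reset
// vector, then nothing but NOPs. (The real implementation is HDL driving
// the PMOD pins; this only shows the decision made on each read cycle.)
uint8_t respond(uint16_t address) {
    const uint16_t START = 0x8000;                      // assumed start address
    if (address == 0xFFFC) return START & 0xFF;         // reset vector, low byte
    if (address == 0xFFFD) return (START >> 8) & 0xFF;  // reset vector, high byte
    return 0xEA;                                        // 6502 NOP opcode otherwise
}

int main() {
    // After reset the 6502 fetches $FFFC and $FFFD, then free-runs on NOPs.
    const uint16_t probe[] = {0xFFFC, 0xFFFD, 0x8000, 0x8001, 0x8002};
    for (uint16_t a : probe) {
        printf("$%04X -> $%02X\n", a, respond(a));
    }
}
```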

We can’t wait to see what [Kenneth] does next, as we’ve seen great things before.

Intel C4004

Inventing The Microprocessor: The Intel 4004

We recently looked at the origins of the integrated circuit (IC) and the calculator, which was the IC’s first killer app, but a surprise twist is that the calculator played a big part in the invention of the next world-changing marvel, the microprocessor.

There is some dispute as to which company invented the microprocessor, and we’ll talk about that further down. But who invented the first commercially available microprocessor? That honor goes to Intel for the 4004.

Path To The 4004

Busicom calculator motherboard based on 4004 (center) and the calculator (right)

We pick up the tale with Robert Noyce, who had co-invented the IC while at Fairchild Semiconductor. In July 1968 he left Fairchild to co-found Intel for the purpose of manufacturing semiconductor memory chips.

While Intel was still a new startup living off its initial $3 million in financing, and before it had a semiconductor memory product, it did what many startups do to survive: it took on custom work. In April 1969, the Japanese company Busicom hired Intel to do LSI (Large-Scale Integration) work for a family of calculators.

Busicom’s design, consisting of twelve interlinked chips, was considered a complicated one. For example, it included shift-register memory, a serial type of memory which complicates the control logic. It also used Binary Coded Decimal (BCD) arithmetic. Marcian Edward Hoff Jr, known as “Ted” and head of Intel’s Application Research Department, felt that the design was even more complicated than a general-purpose computer like the PDP-8, which had a fairly simple architecture. He worried they might not be able to meet the cost targets, so Noyce gave Hoff the go-ahead to look for ways to simplify it.
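As an aside on why BCD adds complexity: BCD stores one decimal digit per four-bit nibble, which makes driving a decimal display trivial but means every addition needs a correction step whenever a digit overflows past nine. The snippet below shows the general decimal-adjust technique in C++ purely as an illustration; it says nothing about how Busicom’s chips actually implemented it.

```cpp
#include <cstdint>
#include <cstdio>

// Add two packed-BCD bytes (two decimal digits each). Whenever a digit
// overflows past 9, add 6 to push it back into the 0-9 range -- the
// classic decimal-adjust step that BCD hardware has to perform.
uint8_t bcd_add(uint8_t a, uint8_t b) {
    uint16_t sum = a + b;
    if (((a & 0x0F) + (b & 0x0F)) > 0x09) sum += 0x06;  // fix the low digit
    if ((sum & 0x1F0) > 0x90)             sum += 0x60;  // fix the high digit
    return sum & 0xFF;                                  // carry past 99 is dropped here
}

int main() {
    // 27 + 38 = 65: plain binary addition gives 0x5F, the adjust step gives 0x65.
    printf("%02X\n", bcd_add(0x27, 0x38));
}
```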

Continue reading “Inventing The Microprocessor: The Intel 4004”

Bespoke Processors Might Soon Power Your Artisanal Devices

Modern microprocessors are a marvel of technological progress and engineering. At less than a dollar per unit, even the cheapest microprocessors on the market are orders of magnitude more powerful than their ancestors. The first commercially available single-chip processor, the Intel 4004, cost roughly $25 (in today’s dollars) when it was introduced in 1971.

The 4-bit 4004 clocked in at 740 kHz — paltry by today’s standards, but quite impressive at the time. What was really remarkable about the 4004, though, was the way it shifted computer architecture practically overnight. Previously, processing was done by multiple chips, selected to just meet the needs of the application. Considering the cost of components at the time, it would have been impractical to use more than was needed.

That all changed with the new era ushered in by general-purpose processors like the 4004. Suddenly it was more cost-effective to just grab a processor off the shelf than to design and manufacture a custom one – even if that processor was overpowered for the task. That trend has continued (and been amplified) to this day. Your microwave probably only uses a fraction of its processing power, because using a $0.50 processor is cheaper than designing (and manufacturing) one tailored to the microwave’s actual needs.

Anyone who has ever worked in manufacturing, or who has dealt with manufacturers, knows this comes down to unit cost. Because companies like Texas Instruments make millions of processors, each one ends up very inexpensive. Mass production is the primary driving force behind affordability. But what if it didn’t have to be?

Professors [Rakesh Kumar] and [John Sartori], along with their students, are experimenting with bespoke processor designs that aim to cut out the unused portions of modern processors. They’ve found that in many applications, less than half the logic gates of the processor are actually being used. Removing these reduces the size and power consumption of the processor, and therefore the final size and power requirements of the device itself.

Of course, that question of cost comes back into play. Is a smaller and more efficient processor worth it if it ends up costing more? For most device manufacturers today, the answer is almost certainly no; there aren’t many cases where those factors matter more than cost. But with modern techniques for printing electronics, the researchers think it might become feasible in the near future. Soon we might be looking at custom processors reminiscent of the early days of computer design.