Finally, An Open-Source 8088 BIOS

The Intel 8088 is an interesting chip, being a variant of the more well-known 8086. Given the latter went on to lend its designation to one of the world’s favorite architectures, you can tell which of the two was higher status. Regardless, it was the 8088 that lived in the first IBM PC, and now, it even has its own open-source BIOS.

As with any BIOS, or Basic Input Output System, it’s charged with handling core low-level features for computers like the Micro 8088, Xi 8088, and NuXT. It handles chipset identification, keyboard and mouse communication, real-time clock, and display initialization, among other things.

Of course, BIOSes for 8088-based machines already exist. However, in many cases they are proprietary code that cannot be freely shared over the internet. For retrocomputing enthusiasts, it’s of great value to have an open-source BIOS that can be shared, modified, and tweaked as needed to suit a wide variety of end uses.

If you want to learn more about the 8088 CPU, we’ve looked in depth at that topic before. Feel free to drop us a line with your own retro Intel hacks if you’ve got them kicking around!

[Ken] Looks At The 386

The 80386 was, arguably, Intel’s first modern CPU. The 8086 was commercially successful, but its segmented memory model was stifling. The 80286 added a protected mode, but one that differed from the 386’s. [Ken Shirriff] takes the 386 apart for us in a recent blog post.

The 286’s protected mode was less successful than the 386’s because of several key limitations: the 286 was a 16-bit processor with a 24-bit address bus, it still required segment changes to access larger amounts of memory, and it had no good way to drop back into real mode for compatibility. The 386 fixed all that. You could still adopt a segment strategy if you wanted to, but you could also load the segment registers once to point to a 4 GB linear address space and then essentially forget them. You also had a virtual 8086 mode that could simulate real mode with some work.
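If you never wrote code for these parts, a back-of-the-envelope sketch makes the difference concrete. The snippet below is purely illustrative (it is not from [Ken]’s article): in real mode an address is formed as segment × 16 + offset, which is why reaching past 64 KB windows of a 1 MB space meant constant segment juggling, while a 386-style flat model reduces the whole business to a single 32-bit offset.

#include <cstdint>
#include <cstdio>

// Real-mode (8086-style) address: a 16-bit segment shifted left by 4, plus a
// 16-bit offset. The result fits in 20 bits, so each segment only covers a
// 64 KB window of a 1 MB space.
uint32_t real_mode_address(uint16_t segment, uint16_t offset) {
    return ((uint32_t)segment << 4) + offset;
}

// 386 flat model: with all segment bases set to 0 and limits at 4 GB,
// the "address" is simply the 32-bit offset itself.
uint32_t flat_address(uint32_t offset) {
    return offset;
}

int main() {
    // 0xB800:0x0000 is the classic CGA text-mode frame buffer location.
    std::printf("B800:0000 -> %05X\n", (unsigned)real_mode_address(0xB800, 0x0000));
    // In a flat 32-bit model the same memory is just linear address 0xB8000.
    std::printf("flat      -> %05X\n", (unsigned)flat_address(0xB8000));
    return 0;
}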

The CPU used a 1-micron process, compared to the 1.5-micron process used earlier. The chip had 285,000 transistors (although the 80386SL had many more). That was ten times the number of devices on the 8086. The cheaper 386SX did stay on the 1.5-micron process for a while, which was feasible thanks to its 16-bit external bus. While 285,000 sounds like a lot, a Core i9 has around 4.2 billion transistors. Times have changed.

A smaller design also allowed chips like the 386SL for laptops. The CPU core took up only about a fourth of that die; the rest held bus controllers and cache interfaces meant to cut the cost of laptop designs. That’s why it had so many more transistors.

[Ken] does his usual in-depth analysis of both the die and the history behind this historic device. We spent a lot of time writing protected mode 386 code, and it was nice to see the details of a very old friend. These days, you can get a pretty capable CPU system on a solderless breadboard, but designing a working 386 system took a few extra parts. The 80286 was a stepping stone between the 8086 and 80386, but even it had some secrets to give up.

Intel’s Chips Light The Way To Faster Processor Arrays

It’s very likely indeed that whatever you are reading this on will have a multi-core processor. They’re now the norm, but the path to the octa-or-more-core chip in your phone has run from individual processors with PCB interconnects through many generations of ever-faster on-chip ones.

But what if your power needs are so high-end that you need more cores than can be fitted on one chip, but without the slow PCB interconnect to another? If you’re Intel, you develop a multi-core processor with an on-chip photonic interconnect that talks to the neighboring ones in its cluster at full speed, via light.

The chip in question isn’t one you’ll see in a machine near you; instead, it’s inspired by the extremely demanding requirements of DARPA’s HIVE graph analytics program. So this is a machine for supercomputers in huge data centers rather than desktop computers, and it will be assembled into multi-die packages with that chip-to-chip optical networking built in. But your computer today is the equal of a supercomputer from not that many years ago, so never say you won’t one day be using its descendant technologies.

Gesture Sensor Teardown Reveals Intel Heritage

A few years ago, there was a rush of products on the market to detect motion, the idea being that you could interact with your computer like they do in science fiction movies, with giant expressive hand motions in the air. Most of these were aimed at desktop computer users, but one company, YouSpace, wanted to bring the technology to retail stores. [IMSAI Guy] got hold of one of their sensor devices and decided to see what was inside. You can see, too, in the video below.

The device appeared to have a laser inside, which motivated the teardown. We aren’t sure exactly what YouSpace had planned, but you can see their now-defunct website on the Wayback Machine. The use cases listed didn’t really help us get a clear picture, so maybe that was part of the problem.

Getting into the device was the first challenge. Like many modern smartphones, there didn’t appear to be any fasteners, so you simply had to pry the case apart. Inside the case: a tiny circuit board and a metal assembly containing the laser and cameras, both easy to remove. The main PCB appears to be an off-the-shelf Intel board used in many Intel RealSense products, and similar boards currently go for about $50 on eBay. The camera assembly looks a bit like an Intel D430, so it is possible the entire thing was off-the-shelf hardware. Even the little connector board is, technically, a D400 Interposer.
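If the guts really are standard RealSense parts, it’s plausible (we haven’t tried it) that the module enumerates like any other D400-series depth camera and answers to Intel’s open-source librealsense2 SDK. A minimal depth-readout sketch, under that assumption, would look something like this:

// Minimal depth readout, assuming the module enumerates as a standard
// RealSense D400-series depth camera under the librealsense2 SDK.
#include <librealsense2/rs.hpp>
#include <cstdio>

int main() try {
    rs2::pipeline pipe;     // default config: depth stream (plus color, if present)
    pipe.start();

    for (int i = 0; i < 30; ++i) {                 // grab a few frames
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::depth_frame depth = frames.get_depth_frame();

        int w = depth.get_width();
        int h = depth.get_height();
        // Distance, in meters, to whatever is in the middle of the frame.
        float d = depth.get_distance(w / 2, h / 2);
        std::printf("center pixel: %.3f m\n", d);
    }
    return 0;
} catch (const rs2::error &e) {
    std::fprintf(stderr, "RealSense error: %s\n", e.what());
    return 1;
}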

The peek at the structured light projector under the microscope was interesting. We expected it to look different, and [IMSAI Guy] clearly didn’t expect its appearance either. The chip is made to beam a known pattern onto the scene, and the cameras use that pattern to deduce the shape of the surfaces it hits.
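The depth math behind that trick is ordinary triangulation. The toy sketch below is our own illustration with made-up numbers, not YouSpace’s or Intel’s actual pipeline: once the camera spots where a known pattern feature landed, depth falls out of the disparity between where it was projected and where it was seen.

#include <cstdio>

// Toy structured-light/stereo triangulation: depth is inversely proportional
// to disparity.
//   Z = f * B / d
//     f : focal length in pixels
//     B : baseline between projector (or second camera) and camera, in meters
//     d : disparity in pixels between expected and observed pattern position
double depth_from_disparity(double focal_px, double baseline_m, double disparity_px) {
    if (disparity_px <= 0.0) return 0.0;   // feature unmatched or effectively at infinity
    return focal_px * baseline_m / disparity_px;
}

int main() {
    double f = 640.0;    // hypothetical focal length, pixels
    double B = 0.05;     // hypothetical 50 mm baseline
    for (double d = 5.0; d <= 40.0; d *= 2.0) {
        std::printf("disparity %5.1f px -> depth %.2f m\n", d, depth_from_disparity(f, B, d));
    }
    return 0;
}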

If you can find these on the surplus market, they would probably be a good deal, since this hardware is typically pretty expensive. Just beware, though: Intel announced in late 2021 that it was “winding down” RealSense. We don’t know if there will be third-party support in the future or if the whole product line will just be orphaned.

We’ve seen the occasional project that uses structured light. The technique can be very precise.


Intel Suggests Dropping Everything But 64-Bit From X86 With Its X86-S Proposal

In a move that has a significant part of the internet flashing back to the innocent days of 2001, when Intel launched its Itanium architecture as a replacement for the then 32-bit-only x86 architecture (before being bludgeoned by AMD’s competing x86_64 architecture), Intel has now released a whitepaper and an associated X86-S specification that seeks to probe the community’s thoughts on essentially removing all pre-x86_64 features from x86 CPUs.

While today you can still, with some caveats, install your copy of MS-DOS 6.11 on a brand-new Intel Core i7 system, it’s undeniable that most PC users would never notice the removal of the 16- and 32-bit modes, the suggested removal of rings 1 and 2, or the loss of a range of other low-level (I/O) features. Rather than the boot process stepping from 16-bit real mode to protected mode, and then from 32- to 64-bit mode, the system would boot straight into 64-bit mode, which Intel figures is what everyone uses anyway.
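That “everyone is in 64-bit mode anyway” argument is easy to test from software today. The sketch below is our own illustration, specific to GCC/Clang via <cpuid.h>: it queries the long-mode flag that 64-bit operating systems already check for, the sort of mode bookkeeping X86-S would make moot.

// Check whether the CPU supports x86_64 long mode (CPUID leaf 0x80000001,
// EDX bit 29). GCC/Clang-specific; purely an illustration.
#include <cpuid.h>
#include <cstdio>

int main() {
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;

    // Make sure the extended leaf exists before asking for it.
    if (__get_cpuid(0x80000000, &eax, &ebx, &ecx, &edx) && eax >= 0x80000001) {
        __get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx);
        bool long_mode = edx & (1u << 29);
        std::printf("64-bit long mode supported: %s\n", long_mode ? "yes" : "no");
    } else {
        std::printf("extended CPUID leaves not available\n");
    }
    return 0;
}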

Where things get a bit hazy is that on this theoretical X86-S you cannot just install and boot your current 64-bit operating systems, as they have no concept of this new boot procedure or of the other low-level features that got dropped. This is where the Itanium comparison seems most apt, as that was Intel’s attempt at a clean break with its x86 legacy, only for literally everything about the concept (VLIW) and ‘legacy software’ support to go horribly wrong.

Although X86-S seems much less ambitious than Itanium, it would nevertheless be interesting to hear AMD’s thoughts on the matter.


Hackaday Links: March 26, 2023

Sad news in the tech world this week as Intel co-founder Gordon Moore passed away in Hawaii at the age of 94. Along with Robert Noyce, Moore founded NM Electronics in 1968, the company that would later go on to become Intel Corporation and give the world the first commercially available microprocessor, the 4004, in 1971. The four-bit microprocessor would be joined a few years later by the 8008 and 8080, chips that paved the way for the PC revolution to come. Surprisingly, Moore was not an electrical engineer but a chemist, earning his Ph.D. from the California Institute of Technology in 1954 before doing postdoctoral research at the prestigious Applied Physics Lab at Johns Hopkins. He briefly worked alongside Nobel laureate and transistor co-inventor William Shockley before jumping ship with Noyce and others to found Fairchild Semiconductor, which is where he observed that integrated circuit component density doubled roughly every two years, a prediction that would go on to be known as “Moore’s Law.”


Exploring Texas Instruments’ Forgotten CPU

Texas Instruments isn’t the name you usually hear associated with the first microprocessor. But the TI TMX 1795 was a chip with the same architecture as Intel’s 8008, and it was produced months before the 8008. It was never available commercially, though, so it has been largely forgotten by most people. But not by [Ken Shirriff]. You can see a 2015 demo of the device in the video below, too.

The reason the two chips share an architecture is that they were both built to replace the same large processor board inside the Datapoint 2200 programmable terminal. These were big beasts that could be programmed in BASIC or PL/B.

Datapoint asked Intel to shrink the board down to a chip because of heating problems, but after delays, Datapoint instead replaced the power supply and lost interest in the device. TI heard about the affair and wanted in on the deal. However, Datapoint was unimpressed: the TI chip didn’t tolerate voltage fluctuations very well, the power supply had already been fixed, and Datapoint had a new CPU design in hand that would be faster than the chip. They were also unimpressed with how much extra circuitry you had to add to get a complete system.

So why did the Intel 8008 work out in the marketplace when the TI chip didn’t? After all, Datapoint decided not to use the 8008, either. But as [Ken] points out, the 8008 was much smaller than the TI chip and thus more cost-effective to produce.

As usual, [Ken]’s posts are always interesting and enlightening. He’s looked at a lot of old computers. He’s even dug into old space hardware. Great stuff!
