Debugging A 1950s Computer Sounds Like A Pain

Debugging computers in the 1950s was no easy task. That’s one of the interesting facts from this fascinating talk by [Guy Fedorkow] about the Whirlwind, one of the first digital computers ever built. Development of this remarkable machine started at MIT (funded by the US Navy) in 1949 as a flight simulator, but pivoted to plotting interceptions in the early 1950s, after the USSR set off its first boosted nuclear bomb, a weapon that could be mounted on a missile or carried by a bomber. Suddenly the threat of an atomic attack was real, and the need arose to intercept incoming bombers.

As a real-time computer, Whirlwind received data from radar stations around the US showing the locations of the interceptor and the incoming bogey, then calculated the vector for the two to meet up and, erm, have a frank exchange of views. So, how do you debug one of the first real-time computers? Carefully, it seems.
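The intercept geometry itself is simple enough to sketch. Here’s a back-of-the-envelope Python version of the kind of calculation described above, solving for the course that puts interceptor and bogey at the same place at the same time. It’s purely illustrative, with names and units of our own choosing, and bears no relation to Whirlwind’s actual programming.

```python
import math

def intercept_course(bogey_pos, bogey_vel, interceptor_speed):
    """Return (heading_deg, time) for a straight-line intercept from the
    origin, or None if the bogey can't be caught. Solves |P + V*t| = s*t,
    i.e. (V.V - s^2)*t^2 + 2*(P.V)*t + P.P = 0 for the earliest t > 0."""
    px, py = bogey_pos
    vx, vy = bogey_vel
    s = interceptor_speed
    a = vx * vx + vy * vy - s * s
    b = 2.0 * (px * vx + py * vy)
    c = px * px + py * py
    if abs(a) < 1e-12:                  # equal speeds: equation turns linear
        t = -c / b if b < 0 else None
    else:
        disc = b * b - 4.0 * a * c
        if disc < 0:
            return None                 # no real solution: bogey outruns us
        roots = [(-b - math.sqrt(disc)) / (2 * a),
                 (-b + math.sqrt(disc)) / (2 * a)]
        positive = [r for r in roots if r > 0]
        t = min(positive) if positive else None
    if t is None:
        return None
    mx, my = px + vx * t, py + vy * t   # predicted meeting point
    heading = math.degrees(math.atan2(mx, my))  # degrees east of north
    return heading, t

# Bogey 100 km due north heading east at 800 km/h; interceptor does 1100 km/h.
result = intercept_course((0.0, 100.0), (800.0, 0.0), 1100.0)
if result:
    heading, t = result
    print(f"steer {heading:.1f} degrees east of north, "
          f"intercept in {60 * t:.1f} minutes")
```

A few lines of high-school algebra today, but doing it continuously, in real time, on 1950s vacuum-tube hardware was another matter entirely.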

Continue reading “Debugging A 1950s Computer Sounds Like A Pain”

History Of The SPARC CPU Architecture

[RetroBytes] nicely presents the curious history of the SPARC processor architecture. SPARC, short for Scalable Processor Architecture, defined some of the most commercially successful RISC processors during the 1980s and 1990s. SPARC was initially developed by Sun Microsystems, the company most of us associate with it, but while most computer architectures are controlled by a single company, SPARC was championed by dozens of players. The history of SPARC is not simply the history of Sun.

A Reduced Instruction Set Computer (RISC) design is based on an Instruction Set Architecture (ISA) with a limited number of simple instructions, whereas a Complex Instruction Set Computer (CISC) is based on an ISA comprising more, and more complex, instructions. Because RISC leverages simpler instructions, it generally needs a longer sequence of those instructions to complete the same task that a CISC machine accomplishes with fewer complex ones. The trade-off is that the simple (more efficient) RISC instructions usually run faster (at a higher clock rate) and in a highly pipelined fashion. Our overview of the modern ISA battles shows how the days of CISC are essentially over.

Continue reading “History Of The SPARC CPU Architecture”
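To make that instruction-count trade-off concrete, here’s a toy sketch in Python: two imaginary machines perform the same memory-to-memory addition, one as a single complex instruction and one as a sequence of simple load/store-style steps. The mnemonics are invented for illustration and don’t correspond to any real ISA.

```python
# Two invented machines performing a = a + b, where a and b live in memory.
mem = {"a": 5, "b": 7}

# CISC-style: one complex instruction reads both operands from memory,
# adds them, and writes the result back.
cisc_program = [("ADDM", "a", "b")]

# RISC-style: only LOAD and STORE touch memory; ADD is register-to-register.
risc_program = [
    ("LOAD",  "r1", "a"),   # r1 = mem[a]
    ("LOAD",  "r2", "b"),   # r2 = mem[b]
    ("ADD",   "r1", "r2"),  # r1 = r1 + r2
    ("STORE", "r1", "a"),   # mem[a] = r1
]

def run(program, memory):
    """Execute a program on a copy of memory; return (memory, instr count)."""
    memory = dict(memory)
    regs = {}
    for op, x, y in program:
        if op == "ADDM":
            memory[x] += memory[y]
        elif op == "LOAD":
            regs[x] = memory[y]
        elif op == "ADD":
            regs[x] += regs[y]
        elif op == "STORE":
            memory[y] = regs[x]
    return memory, len(program)

print("CISC:", run(cisc_program, mem))  # ({'a': 12, 'b': 7}, 1)
print("RISC:", run(risc_program, mem))  # ({'a': 12, 'b': 7}, 4)
```

Four simple steps instead of one complex instruction, but each of those steps is easy to decode, pipeline, and clock quickly, which is exactly the bet that RISC designs made.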

History Of Digital Equipment Corp And Bonus PDP-11 Replica Build

[RetroBytes] takes us on a whirlwind tour of the history of the Digital Equipment Corporation (DEC), its founder Ken Olsen, and during intermission builds up a working replica of the PDP-11 from a kit. DEC was a major player in the early computer industry, cranking out a number of models that were both industrial workhorses and used in computer laboratories to develop many of the operating systems and tools whose descendants we still use today. On top of that, DEC’s innovative, employee-friendly, and lightweight company structure was generally well-liked and a welcome departure from the typical behemoths of the day.

This video takes us from the beginnings of DEC and its roots in MIT up to the PDP-11 era, highlighting major architectures and events along the way such as the PDP-1, PDP-8, and PDP-11. [RetroBytes] says he has a DEC Alpha sitting on the sidelines, so there may be a few follow-up videos in the future — perhaps one on the VAX as well.

We covered this particular PDP-11 replica last year, and if these replica kits are your cup of tea, check out our coverage of kit designer [Oscar Vermeulen]’s presentation. Have you ever used real PDP or VAX computers? Let us know your war stories in the comments below.

Continue reading “History Of Digital Equipment Corp And Bonus PDP-11 Replica Build”

Classic Chat: Preserving Computer History

Among the many facets of modern technology, few have evolved faster or more radically than the computer.  In less than a century its very nature has changed significantly: today’s smartphones easily outperform desktop computers of the past, machines which themselves were thousands of times more powerful than the room-sized behemoths that ushered in the age of digital computing. The technology has developed so rapidly that an individual who’s now making their living developing iPhone applications could very well have started their career working with stacks of punch cards.

With things moving so quickly, it can be difficult to determine what’s worth holding onto from a historical perspective. Will last year’s Chromebook one day be a museum piece? What about those old Lotus 1-2-3 floppies you’ve got in the garage? Deciding what artifacts are worth preserving in such a fast-moving field is just one of the challenges faced by Dag Spicer, the Senior Curator at the Computer History Museum (CHM) in Mountain View, California. Dag stopped by the Hack Chat back in June of 2019 to talk about the role of the CHM and other institutions like it in storing and protecting computing history for future generations.

To answer that most pressing question, what’s worth saving from the landfill, Dag says the CHM often follows what they call the “Ten Year Rule”: at least a decade should go by before a decision is made about a particular artifact. They reason that’s long enough for hindsight to determine whether the piece in question made a lasting impression on the computing world or not. Note that the impression doesn’t always have to be positive; pieces the CHM deems “Interesting Failures” also find their way into the collection, as does hardware that became important due to patent litigation.

Of course, there are times when this rule is sidestepped. Dag points to the release of the iPod and iPhone as a prime example. It was clear that one way or another Apple’s bold gambit was going to get recorded in the annals of computing history, so these gadgets were fast-tracked into the collection. Looking back on this decision in 2022, it’s clear they made the right call. When asked in the Chat if he had any thoughts on contemporary hardware that could have a similar impact on the computing world, Dag pointed to Artificial Intelligence accelerators like Google’s Tensor Processing Unit.

In addition to the hardware itself, the CHM also maintains a collection of ephemera that serves to capture some of the institutional memory of the era. Notebooks from the R&D labs of Fairchild Semiconductor, or handwritten documents from Intel luminary Andrew Grove, bring a human touch to a collection of big iron and beige boxes. These primary sources are especially valuable for those looking to research early semiconductor or computer development, a task that several in the Chat said staff from the Computer History Museum had personally assisted them with.

Towards the end of the Chat, a user asks why organizations like the CHM go through the considerable expense of keeping all these relics in climate controlled storage when we have the ability to photograph them in high definition, produce schematics of their internals, and emulate their functionality on far more capable systems. While Dag admits that emulation is probably the way to go if you’re only worried about the software side of things, he believes that images and diagrams simply aren’t enough to capture the true essence of these machines.

The CHM’s PDP-1 Demo Lab, image by Alexey Komarov.

Quoting the words of early Digital Equipment Corporation engineer Gordon Bell, Dag says these computers are “beautiful sculptures” that “reflect the times of their creation” in a way that can’t easily be replicated. They represent not just the technological state-of-the-art but also the cultural milieu in which they were developed, with each and every design decision taking into account a wide array of variables ranging from contemporary aesthetics to material availability.

While 3D scans of a computer’s case and digital facsimiles of its internal components can serve to preserve some element of the engineering that went into these computers, they will never be able to capture the experience of seeing the real thing sitting in front of you. Any schoolchild can tell you what the Mona Lisa looks like, but that doesn’t stop millions of people from waiting in line each year to see it at the Louvre.


The Hack Chat is a weekly online chat session hosted by leading experts from all corners of the hardware hacking universe. It’s a great way for hackers to connect in a fun and informal setting, but if you can’t make it live, these overview posts as well as the transcripts posted to Hackaday.io make sure you don’t miss out.

Building MS-DOS From Scratch Like It’s 1983

Building a complete operating system by compiling its source code is not something for the faint-hearted; a modern Linux or BSD distribution contains thousands of packages with millions of lines of code, all of which need to be processed in the right order and the result stored in the proper place. For all but the most hardcore Gentoo devotees, it’s way easier to get pre-compiled binaries, but obviously someone must have run the entire compilation process at some point.

What’s true for modern OSes also holds for ancient software such as MS-DOS. When Microsoft released the source code for several DOS versions a couple of years ago, many people pored over the code to look for weird comments and undocumented features, but few actually tried to compile the whole package. But [Michal Necasek] over at the OS/2 Museum didn’t shy away from that challenge, and documented the entirely-not-straightforward process of compiling DOS 2.11 from source.

The first problem was figuring out which version had been made available: although the Computer History Museum labelled the package simply as “MS-DOS 2.0”, it actually contained a mix of OEM binaries from version 2.0, source code from version 2.11 and some other stuff left from the development process. The OEM binaries are mostly finished executables, but also contain basic source code for some system components, allowing computer manufacturers to tailor those components to their specific hardware platform.

Compiling the source code was not trivial either. [Michal] was determined to use period-correct tools and examined the behaviour of about a dozen versions of MASM, the assembler likely to have been used by Microsoft in the early 1980s. As it turned out, version 1.25 from 1983 produced code that most closely matched the object code found in existing binaries, and even then some pieces of source code required slight modifications to build correctly. [Michal]’s blog post also goes into extensive detail on the subtle differences between Microsoft-style and IBM-style DOS, which go deeper than just the names of system files (MSDOS.SYS versus IBMDOS.COM).
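Verifying a rebuild like that ultimately comes down to comparing the freshly assembled output byte-for-byte against the shipped binaries. Here’s the kind of trivial comparison helper one might use, sketched in Python; it’s our own illustration rather than [Michal]’s actual tooling, and the file names are hypothetical.

```python
# Report where a rebuilt binary diverges from the original, byte by byte.
from itertools import islice
from pathlib import Path

def diff_offsets(path_a, path_b):
    """Yield (offset, byte_a, byte_b) for every position where the files differ."""
    a = Path(path_a).read_bytes()
    b = Path(path_b).read_bytes()
    if len(a) != len(b):
        print(f"size mismatch: {len(a)} vs {len(b)} bytes")
    for off, (x, y) in enumerate(zip(a, b)):
        if x != y:
            yield off, x, y

# Compare a freshly assembled system file against the one shipped in the
# source package, showing only the first ten mismatches.
for off, x, y in islice(diff_offsets("IO_rebuilt.SYS", "IO_shipped.SYS"), 10):
    print(f"offset {off:#06x}: rebuilt={x:#04x} shipped={y:#04x}")
```

When the assembler version is right, the list of mismatches shrinks to nothing, which is how one homes in on the exact tools used four decades ago.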

The end result of this exercise is a modified DOS 2.11 source package that actually compiles to a working set of binaries, unlike the original. And although this does not generate any new code, since binaries of DOS 2.11 have long been available, it does provide a fascinating look into software development practices in an age when even the basic components of the PC platform were not fully standardized. And don’t forget that even today some people still like to develop new DOS software.

The Other First Computer: Konrad Zuse And The Z3

Bavarian Alps, Dec. 1945:

Since 1935, Berlin engineer Konrad Zuse has spent his entire career developing a series of automatic calculators, the first of their kind in the world: the Z1, Z2, Z3, S1, S2, and Z4. He accomplished this with a motley group of engineers, technicians, and mathematicians who were operating against all odds. With all the hardships and shortages of war and the indifference of their peers, the fact that they succeeded at all is a testament to their dedication and resourcefulness. And with the end of the war, more hardships have been piling on.

Two years ago, during the Battle of Berlin, bombers completely destroyed the Zuse family home and adjacent workshops on the Methfesselstraße, where they performed research and fabrication. All of the calculators, engineering drawings, and notes were lost in the rubble, save for the new Z4 nearing completion across the canal in another workshop on Oranienstraße. In the midst of all this, Zuse married in January of this year, but was immediately plunged into another crisis when the largest Allied air raid of the war destroyed the Oranienstraße workshop in February. They managed to rescue the Z4 from the basement, and miraculously arranged for it to be shipped out of Berlin. Zuse, his family, and colleagues followed soon thereafter. Here and there along the escape route, they managed to complete the final assembly and testing of the Z4 — even giving a demonstration to the Aerodynamics Research Institute in Göttingen.

On arrival here in the Bavarian Alps, Zuse found a ragtag collection of refugees, including Dr. Wernher von Braun and a team of 100 rocket scientists from Peenemünde. While everyone here is struggling just to stay alive and find food and shelter, Zuse is also worried about keeping his invention safe from prying eyes. Tensions have risen further upon circulation of a rumor that an SS leader, after three bottles of Cognac, let slip that his troops aren’t here to protect the scientists but to kill them all if the Americans or French approach.

In the midst of all this madness, Zuse and his wife Gisela welcomed a baby boy, and have taken up residence in a Hinterstein farmhouse. Zuse spends his time working on something called Plankalkül, explaining that it is a mathematical language to allow people to communicate with these new machines. His other hobby is making woodblocks of the local scenery, and he plans to start a company to sell his devices once the economy recovers. There is no doubt that Konrad Zuse will soon be famous and known around the world as the father of automatic computers.

Continue reading “The Other First Computer: Konrad Zuse And The Z3”

Hershey Fonts: Not Chocolate, The Origin Of Vector Lettering

Over the past few years, I kept bumping into something called Hershey fonts. After digging around, I found a 1967 government report by a fellow named Dr. Allen Vincent Hershey. Back in the 1960s, he worked as a physicist for the Naval Weapons Laboratory in Dahlgren, Virginia, studying the interaction between ship hulls and water. His research was aided by the Naval Ordnance Research Calculator (NORC), which was built by IBM and was one of the fastest computers in the world when it was first installed in 1954.

The NORC’s I/O facilities, such as punched cards, magnetic tape, and line printers, were typical of the era. But the NORC also had an ultra-high-speed optical printer. This device had originally been developed by the telecommunications firm Stromberg-Carlson for the Social Security Administration in order to quickly print massive amounts of data directly upon microfilm.

Continue reading “Hershey Fonts: Not Chocolate, The Origin Of Vector Lettering”