[RetroBytes] takes us on a whirlwind tour of the history of the Digital Equipment Corporation (DEC), its founder Ken Olsen, and during intermission builds up a working replica of the PDP-11 from a kit. DEC was a major player in the early computer industry, cranking out a number of models that served both as industrial workhorses and as laboratory machines on which many of the operating systems and tools whose descendants we still use today were developed. On top of that, DEC’s innovative, employee-friendly, and lightweight company structure was generally well-liked, and a welcome departure from the typical corporate behemoths of the day.
This video takes us from the beginnings of DEC and its roots in MIT up to the PDP-11 era, highlighting major architectures and events along the way such as the PDP-1, PDP-8, and PDP-11. [RetroBytes] says he has a DEC Alpha sitting on the sidelines, so there may be a few follow-up videos in the future — perhaps one on the VAX as well.
Among the many facets of modern technology, few have evolved faster or more radically than the computer. In less than a century its very nature has changed significantly: today’s smartphones easily outperform desktop computers of the past, machines which themselves were thousands of times more powerful than the room-sized behemoths that ushered in the age of digital computing. The technology has developed so rapidly that an individual who’s now making their living developing iPhone applications could very well have started their career working with stacks of punch cards.
With things moving so quickly, it can be difficult to determine what’s worth holding onto from a historical perspective. Will last year’s Chromebook one day be a museum piece? What about those old Lotus 1-2-3 floppies you’ve got in the garage? Deciding what artifacts are worth preserving in such a fast moving field is just one of the challenges faced by Dag Spicer, the Senior Curator at the Computer History Museum (CHM) in Mountain View, California. Dag stopped by the Hack Chat back in June of 2019 to talk about the role of the CHM and other institutions like it in storing and protecting computing history for future generations.
To answer that most pressing question of what’s worth saving from the landfill, Dag says the CHM often follows what they call the “Ten Year Rule” before making a decision. That is to say, at least a decade should have gone by before a decision is made about a particular artifact. They reason that’s long enough for hindsight to determine whether the piece in question made a lasting impression on the computing world. Note that such an impression doesn’t always have to be positive; pieces the CHM deems “Interesting Failures” also find their way into the collection, as does hardware that became important due to patent litigation.
Of course, there are times when this rule is sidestepped. Dag points to the release of the iPod and iPhone as a prime example. It was clear that one way or another Apple’s bold gambit was going to get recorded in the annals of computing history, so these gadgets were fast-tracked into the collection. Looking back on this decision in 2022, it’s clear they made the right call. When asked in the Chat if Dag had any thoughts on contemporary hardware that could have similar impact on the computing world, he pointed to Artificial Intelligence accelerators like Google’s Tensor Processing Unit.
In addition to the hardware itself, the CHM also maintains a collection of ephemera that serves to capture some of the institutional memory of the era. Notebooks from the R&D labs of Fairchild Semiconductor, or handwritten documents from Intel luminary Andrew Grove bring a human touch to a collection of big iron and beige boxes. These primary sources are especially valuable for those looking to research early semiconductor or computer development, a task that several in the Chat said staff from the Computer History Museum had personally assisted them with.
Towards the end of the Chat, a user asks why organizations like the CHM go through the considerable expense of keeping all these relics in climate controlled storage when we have the ability to photograph them in high definition, produce schematics of their internals, and emulate their functionality on far more capable systems. While Dag admits that emulation is probably the way to go if you’re only worried about the software side of things, he believes that images and diagrams simply aren’t enough to capture the true essence of these machines.
Quoting the words of early Digital Equipment Corporation engineer Gordon Bell, Dag says these computers are “beautiful sculptures” that “reflect the times of their creation” in a way that can’t easily be replicated. They represent not just the technological state-of-the-art but also the cultural milieu in which they were developed, with each and every design decision taking into account a wide array of variables ranging from contemporary aesthetics to material availability.
While 3D scans of a computer’s case and digital facsimiles of its internal components can serve to preserve some element of the engineering that went into these computers, they will never be able to capture the experience of seeing the real thing sitting in front of you. Any school child can tell you what the Mona Lisa looks like, but that doesn’t stop millions of people from waiting in line each year to see it at the Louvre.
The Hack Chat is a weekly online chat session hosted by leading experts from all corners of the hardware hacking universe. It’s a great way for hackers to connect in a fun and informal way, but if you can’t make it live, these overview posts as well as the transcripts posted to Hackaday.io make sure you don’t miss out.
Building a complete operating system by compiling its source code is not something for the faint-hearted; a modern Linux or BSD distribution contains thousands of packages with millions of lines of code, all of which need to be processed in the right order and the result stored in the proper place. For all but the most hardcore Gentoo devotees, it’s way easier to get pre-compiled binaries, but obviously someone must have run the entire compilation process at some point.
What’s true for modern OSes also holds for ancient software such as MS-DOS. When Microsoft released the source code for several DOS versions a couple of years ago, many people pored over the code to look for weird comments and undocumented features, but few actually tried to compile the whole package. But [Michal Necasek] over at the OS/2 Museum didn’t shy away from that challenge, and documented the entirely-not-straightforward process of compiling DOS 2.11 from source.
The first problem was figuring out which version had been made available: although the Computer History Museum labelled the package simply as “MS-DOS 2.0”, it actually contained a mix of OEM binaries from version 2.0, source code from version 2.11 and some other stuff left from the development process. The OEM binaries are mostly finished executables, but also contain basic source code for some system components, allowing computer manufacturers to tailor those components to their specific hardware platform.
Compiling the source code was not trivial either. [Michal] was determined to use period-correct tools and examined the behaviour of about a dozen versions of MASM, the assembler likely to have been used by Microsoft in the early 1980s. As it turned out, version 1.25 from 1983 produced code that most closely matched the object code found in existing binaries, and even then some pieces of source code required slight modifications to build correctly. [Michal]’s blog post also goes into extensive detail on the subtle differences between Microsoft-style and IBM-style DOS, which go deeper than just the names of system files (MSDOS.SYS versus IBMDOS.COM).
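Evaluating which assembler version “most closely matched” the existing binaries is, at its core, a byte-comparison exercise. As a rough illustration of the idea (this is a hypothetical helper, not a tool from [Michal]’s actual workflow), one could diff a freshly assembled binary against a known-good one and count where they diverge:

```python
# Hypothetical sketch: compare freshly assembled output against a
# known-good binary and report the offsets where the bytes differ.
def diff_binaries(built: bytes, known: bytes) -> list[int]:
    """Return a list of byte offsets at which the two binaries differ."""
    offsets = [i for i, (x, y) in enumerate(zip(built, known)) if x != y]
    if len(built) != len(known):
        # A length mismatch counts as a difference at the shorter end.
        offsets.append(min(len(built), len(known)))
    return offsets

if __name__ == "__main__":
    # Tiny example: two 5-byte DOS-style code fragments differing in one byte.
    built = bytes([0xB8, 0x00, 0x4C, 0xCD, 0x21])
    known = bytes([0xB8, 0x01, 0x4C, 0xCD, 0x21])
    print(diff_binaries(built, known))  # → [1]
```

Run over whole object files, the assembler version producing the fewest (ideally zero) differing offsets is the best candidate for the one Microsoft originally used.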
The end result of this exercise is a modified DOS 2.11 source package that actually compiles to a working set of binaries, unlike the original. And although this does not generate any new code, since binaries of DOS 2.11 have long been available, it does provide a fascinating look into software development practices in an age when even the basic components of the PC platform were not fully standardized. And don’t forget that even today some people still like to develop new DOS software.
Since 1935, Berlin engineer Konrad Zuse has spent his entire career developing a series of automatic calculators, the first of their kind in the world: the Z1, Z2, Z3, S1, S2, and Z4. He accomplished this with a motley group of engineers, technicians, and mathematicians who were operating against all odds. With all the hardships and shortages of war and the indifference of their peers, the fact that they succeeded at all is a testament to their dedication and resourcefulness. And with the end of the war, more hardships have been piling on.
Two years ago, during the Battle of Berlin, bombers completely destroyed the Zuse family home and adjacent workshops on the Methfesselstraße, where they performed research and fabrication. All of the calculators, engineering drawings, and notes were lost in the rubble, save for the new Z4 nearing completion across the canal in another workshop on Oranienstraße. In the midst of all this, Zuse married in January of this year, but was immediately plunged into another crisis when the largest Allied air raid of the war destroyed the Oranienstraße workshop in February. They managed to rescue the Z4 from the basement, and miraculously arranged for it to be shipped out of Berlin. Zuse, his family, and colleagues followed soon thereafter. Here and there along the escape route, they managed to complete the final assembly and testing of the Z4 — even giving a demonstration to the Aerodynamics Research Institute in Göttingen.
On arrival here in the Bavarian Alps, Zuse found a ragtag collection of refugees, including Dr. Wernher von Braun and a team of 100 rocket scientists from Peenemünde. While everyone here is struggling just to stay alive and find food and shelter, Zuse is further worried about keeping his invention safe from prying eyes. Tensions have risen upon circulation of a rumor that an SS leader, after three bottles of Cognac, let slip that his troops aren’t here to protect the scientists but to kill them all if the Americans or French approach.
In the midst of all this madness, Zuse and his wife Gisela welcomed a baby boy, and have taken up residence in a Hinterstein farmhouse. Zuse spends his time working on something called a Plankalkül, explaining that it is a mathematical language to allow people to communicate with these new machines. His other hobby is making woodblocks of the local scenery, and he plans to start a company to sell his devices once the economy recovers. There is no doubt that Konrad Zuse will soon be famous and known around the world as the father of automatic computers. Continue reading “The Other First Computer: Konrad Zuse And The Z3”→
Over the past few years, I kept bumping into something called Hershey fonts. After digging around, I found a 1967 government report by a fellow named Dr. Allen Vincent Hershey. Back in the 1960s, he worked as a physicist for the Naval Weapons Laboratory in Dahlgren, Virginia, studying the interaction between ship hulls and water. His research was aided by the Naval Ordnance Research Calculator (NORC), which was built by IBM and was one of the fastest computers in the world when it was first installed in 1954.
The NORC’s I/O facilities, such as punched cards, magnetic tape, and line printers, were typical of the era. But the NORC also had an ultra-high-speed optical printer. This device had originally been developed by the telecommunications firm Stromberg-Carlson for the Social Security Administration in order to quickly print massive amounts of data directly upon microfilm.
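The stroke encoding Hershey’s fonts are distributed in today is famously compact, though it isn’t described above, so the following decoder is an assumption based on the widely circulated “JHF” data files rather than anything from Dr. Hershey’s report. Each glyph is a string of printable ASCII character pairs, each coordinate stored as an offset from the letter `R`, with the special pair `" R"` meaning “lift the pen”:

```python
# Minimal decoder sketch for the classic Hershey stroke encoding
# (assumed JHF-style data, coordinates as ASCII offsets from 'R').
def decode_hershey(data: str) -> list[list[tuple[int, int]]]:
    """Turn a Hershey glyph string (after its header) into polylines."""
    strokes: list[list[tuple[int, int]]] = [[]]
    pairs = [data[i:i + 2] for i in range(0, len(data), 2)]
    for pair in pairs:
        if pair == " R":
            # Pen-up marker: start a new polyline.
            strokes.append([])
        else:
            x = ord(pair[0]) - ord("R")
            y = ord(pair[1]) - ord("R")
            strokes[-1].append((x, y))
    return [s for s in strokes if s]  # drop any empty strokes

if __name__ == "__main__":
    # Hypothetical two-stroke figure, not a real glyph from the set.
    print(decode_hershey("RRSS RTT"))  # → [[(0, 0), (1, 1)], [(2, 2)]]
```

The resulting polylines are pen strokes in a small integer coordinate grid, which is exactly what a vector device like a microfilm plotter wants to draw.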
Video blogger and display technology guru [Fran Blanche] has discovered a splendid retro-tech alphanumeric display from 1910. (Video, embedded below.)
We have always enjoyed her forays into old and unusual displays, including her project researching and reverse engineering an Apollo DSKY unit. This time [Fran] has dug up an amazing billboard from the early 20th century. It was built by the Rice Electric Display Company of Dayton, Ohio, and operated in Herald Square for about two years. Requiring $400,000 in 1910 US dollars to build, this was clearly a Herculean effort for its day and is no doubt the first example of selling advertising time on a computer-controlled billboard. It boasts characters about 1.3 m tall and 1 m wide which can display letters, numbers, and various punctuation and symbols. These are arrayed into a 3-line, 18-character matrix measuring about 27 x 4 meters, and that makes up only a third of the total billboard, itself an illuminated and dynamic work of art.
There are quite a few tantalizing details in the video, but a few that jumped out at us are the 20,000 light bulbs, the 40 Hz display update rate, the 150 km of wire used, and the three-month-long installation time. We would really like to learn more about the two 7.5 kW motorized switch controllers: how were they programmed, how were the character segments arranged, and what were their shapes?
In the video, you can see triangles arranged in some pattern not unlike more modern sixteen segment displays, although as [Fran] points out, Mr Rice’s characters are more pleasing. We hope [Fran] can tease out more details for a future video. If you have any ideas or knowledge about this display, please put them in the comments section below. Spoiler alert after the video…
September 30th, 1980 was the day Ethernet was first commercially introduced, which makes it exactly forty years old this year. It was first defined in a patent filed by Xerox as a 10 Mb/s networking protocol in 1975, brought to market in 1980, and subsequently standardized in 1983 by the IEEE as IEEE 802.3. Over the next thirty-seven years, this standard would see numerous updates and revisions.
Included in the present Ethernet standard are not just the different speed grades, from the original 10 Mb/s all the way up to today’s maximum of 400 Gb/s, but also the countless changes to the core protocol needed to enable these ever-higher data rates, not to mention new applications of Ethernet such as power delivery and backplane routing. The reliability and cost-effectiveness of Ethernet led to the 1990 10BASE-T standard (802.3i-1990), which gradually found its way onto desktop PCs.