When we last left this subject, I told you all about Transorma, the first letter-sorting machine in semi-wide use. But before and since Transorma, machines have come about to perform various tasks on jumbled messes of mail — things like distinguishing letters from packages, making sure letters are all facing the same way before cancelling the postage, and the gargantuan task of getting huge piles of mail into the machines in the first place. So let’s dive right in, shall we?
The ways in which we interact with computers have changed dramatically over the decades: from flipping switches on the control panels of room-sized computers, to punching holes into cards, to the most common input methods of today, namely keyboards, mice and touch screens. The latter two especially were developed as intuitive ways to interact with graphical user interfaces (GUIs), but keyboards remain the only reasonable way to quickly enter large amounts of text, which raises many ergonomic questions about how to interact with the rest of the user interface, whether that is a command line or a GUI.
For text editors, perhaps the most divisive feature is that of modal versus non-modal interaction. This one point alone underlies most of the Great Editor War that has raged since time immemorial. Practically, this is mostly about highly opinionated people arguing about whether they like Emacs or vi (or Vim) better. Since in August of 2023 we said our final farewell to the creator of Vim – Bram Moolenaar – this might be a good point to put down the torches and pitchforks and take a sober look at why Vim really is the logical choice for fast, ergonomic coding and editing.
The grid failure in 2003 which reverted much of the eastern US and Canada back to a pre-electrification era may be rather memorable, yet it was not the first time that a national, or even international power grid failed. Nor is it likely that it will be the last. In August of 2023 we mark the 20th anniversary of this blackout which left many people without electricity for up to three days, while costing dozens of people their lives. This raises the question of what lessons we have learned from the event in the years since.
Although damage to transmission lines and related infrastructure is a big cause of power outages – especially in countries where overhead wiring is the norm – the most serious blackouts involve the large-scale desynchronization of the grid, to the point where generators shut down to protect themselves. Bringing the grid back from such a complete blackout can take hours to days, as sections of the grid are reconnected after a cascade scenario like that of the 2003 blackout, or the rather similar 1965 blackout which affected nearly the same region.
With how much more modern society relies today on constant access to electrical power than it did twenty, let alone fifty-eight years ago, exactly how afraid should we be of another, possibly worse blackout?
Growing older as an engineer turns out to be a succession of moments in which technologies and devices which you somehow still imagine to be cool or exciting, reveal themselves in fact to be obsolete, indeed, old. Such a moment comes today, with the 25th anniversary of the most iconic of 1990s computers, Apple’s iMac. The translucent all-in-one machine was and remains more than simply yet another shiny Mac, it’s probably the single most influential home computer ever. A bold statement to be sure, but take a look at the computer you’re reading this on, indeed at all your electronic devices here in 2023, before you dismiss it.
Any colour you want, as long as it’s beige. Leon Brooks, Public domain.
Computers in the 1990s were beige and boring. Breathtakingly so, a festival of the generic. If you had a PC it came in the same beige box as every single other PC, the only thing breaking the monotony being one of those LED 7-segment fake-MHz displays. Apple computers took the beige and ran with it, their PowerMac range being merely a smoother-fronted version of all those beige-box PCs. This was the period following the departure of Steve Jobs during which the company famously lost its way, and the Bondi blue Jony Ive-designed iMac was the signature product of Jobs’ triumphant return.
That’s enough pretending to have drunk the Apple Kool-Aid for one article, so why are we marking this anniversary? The answer lies not in the iMac’s hardware, though its 233MHz PowerPC G3 and ATI graphics driving a 15″ CRT were no slouch for the day, nor even in its forsaking of all Apple’s previous proprietary interfaces for USB. Instead it’s the design influence of this machine, as it ushered in a new era of technological devices whose ethos lay around how they might be used rather than in simply showering the interface with features. At the time the iMac spawned a brief fashion for translucent blue in everything from peripherals to steam irons, but in the quarter century since, your devices have changed immeasurably in its wake. We still don’t like that weird round mouse though.
Harry Daghlian and Louis Slotin were two of many people who worked on the Manhattan Project. They might not be household names, but we believe they are the poster children for safety procedures. And not in a good way.
Slotin assembled the core of the “Gadget” — the plutonium test device at the Trinity test in 1945. He was no stranger to working in a lab with nuclear materials. It stands to reason that if you are making something as dangerous as a nuclear bomb, it is probably hazardous work. But you probably get used to it, like some of us get used to working around high voltage or deadly chemicals.
Making nuclear material is hard, and it was even more so back then. But the Project had made a third plutonium core — one was detonated at Trinity, another over Nagasaki, and the final core was meant to go into a proposed second bomb that was never produced.
The cores were two hemispheres of plutonium and gallium. The gallium allowed the material to be hot-pressed into spherical shapes. Unlike the first two cores, however, the third one — one that would later earn the nickname “the demon core” — had a ring around the flat surfaces to contain neutron flux during implosion. The spheres are not terribly dangerous unless they become supercritical, leading to a prompt critical event in which they release large amounts of neutrons. The bombs, for example, would force the two halves together violently. You could also add more nuclear material or reflect neutrons back into the material.
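The danger described above comes down to exponential growth. As a rough illustration (a toy model, not actual criticality physics), each neutron generation multiplies the population by an effective multiplication factor k; reflecting neutrons back into the core or adding material nudges k upward, and even a small excursion above 1 grows explosively fast:

```python
# Toy model of chain-reaction growth: each generation multiplies the
# neutron population by the effective multiplication factor k.
# k < 1: population dies out (subcritical)
# k = 1: population holds steady (critical)
# k > 1: population grows exponentially (supercritical)
def neutron_population(n0: float, k: float, generations: int) -> float:
    return n0 * k ** generations

# With generation times on the order of a microsecond, 900 generations
# pass in under a millisecond -- and at k = 1.02 the population has
# already grown by a factor of tens of millions.
print(neutron_population(1.0, 0.98, 900))  # fizzles out
print(neutron_population(1.0, 1.02, 900))  # runs away
```

This is why a reflector casually held over the core, or two hemispheres allowed to touch, was enough to turn a quiet lab bench into a lethal radiation burst.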
Historians may note that World War II was the last great “movie war.” In those days, you could do many things that are impossible today, yet make for great movie drama. You can’t sneak a fleet of ships across the oceans anymore. Nor can you dig tunnels right under your captor’s nose. Another defining factor is that we don’t seem to seek out superweapons anymore.
A Churchill Bullshorn plough for clearing minefields — one of Hobart’s “Funnies”
Sure, we develop better planes, tanks, submarines, and guns. But we aren’t working on anything — that we know of — as revolutionary as a rocket, an atomic bomb, or even radar was back in the 1940s. The Germans worked on Wunderwaffe, including guided missiles, jets, suborbital rocket bombers, and a solar-powered space mirror to burn terrestrial targets. Everyone was working on a nuclear bomb, of course. The British had Hobart’s Funnies as well as less successful entries like the Panjandrum — a ten-foot rocket-driven wheel of explosives.
Death Ray
Perhaps the holy grail of all the superweapons, both realized and dreamed of, was the “death ray.” Of course, Tesla claimed to have one that used particles rather than rays, but no one ever successfully built it, and there was debate over whether it would work at all. Tesla didn’t like the term death ray, partly because it wasn’t a ray at all, but also because it required a huge power plant and, therefore, wasn’t mobile. He envisioned it as a peacekeeping defensive weapon, rendering attacks so futile that no one would dare attempt them.
Weather can have a significant impact on transport and operations of all kinds, especially those at sea or in the air. This makes it a deeply important field of study, particularly in wartime. If you’re at all curious about how this kind of information was gathered and handled in the days before satellites and computer models, this write-up on WWII meteorology is sure to pique your interest.
Weather conditions were valuable data, and weather forecasts even more so. Both relied on human operators to read the instruments and transmit their readings.
The main method of learning weather conditions over the oceans is to persuade merchant ships to report their observations regularly. This is true even today, but these days we also have the benefit of things like satellite technology. Back in the mid-1900s there was no such thing, and the outbreak of WWII (including the classification of weather data as secret information due to its value) meant that new solutions were needed.
The aircraft of the Royal Air Force (RAF) were particularly in need of accurate data, and there was little to no understanding of the upper atmosphere at the time. Eventually, aircraft flew regular 10-hour sorties, logging detailed readings that served to provide data about weather conditions across the Atlantic. Readings were logged, encoded with one-time pad (OTP) encryption, then radioed back to base where charts would be created and updated every few hours.
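The one-time pad mentioned above is simple enough to sketch in a few lines. This is a generic mod-26 letter OTP for illustration, not the RAF’s actual cipher tables; the message and key here are invented examples. With a truly random key as long as the message, used only once, the ciphertext reveals nothing about the plaintext:

```python
# One-time pad over the letters A-Z: shift each plaintext letter by the
# corresponding key letter (mod 26); decryption shifts back.
def otp(text: str, key: str, decrypt: bool = False) -> str:
    if len(key) < len(text):
        raise ValueError("key must be at least as long as the message")
    sign = -1 if decrypt else 1
    out = []
    for p, k in zip(text, key):
        shift = sign * (ord(k) - ord("A"))
        out.append(chr((ord(p) - ord("A") + shift) % 26 + ord("A")))
    return "".join(out)

# Hypothetical reading and key, just to show the round trip:
reading = "WINDFORTYFIVE"
key = "XMCKLQRSTUVAB"
cipher = otp(reading, key)
assert otp(cipher, key, decrypt=True) == reading
```

The scheme’s security depends entirely on key handling, which made it a good fit for aircraft radioing readings back to a base that held the matching pad.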
The value of accurate data and precise understanding of conditions and how they could change was grimly illustrated in a disaster called the Night of the Big Wind (March 24-25, 1944). Forecasts predicted winds no stronger than 45 mph, but Allied bombers sent to Berlin were torn apart when they encountered winds in excess of 120 mph, leading to the loss of 72 aircraft.
The types of data recorded to monitor and model weather are nearly identical to those in modern weather stations. The main difference is that instruments used to be read and monitored by human beings, whereas today we can rely more on electronic readings and transmission that need no human intervention.