The 2003 Northeast Blackout And The Harsh Lessons Of Grid Failures

The grid failure in 2003 that plunged much of the eastern US and Canada back into a pre-electrification era may be the most memorable, but it was not the first time that a national, or even international, power grid failed. Nor is it likely to be the last. August of 2023 marks the 20th anniversary of this blackout, which left many people without electricity for up to three days and cost dozens of people their lives. This raises the question of what lessons we have learned from the event since then.

Although damage to transmission lines and related infrastructure is a big cause of power outages – especially in countries where overhead wiring is the norm – the most serious blackouts involve the large-scale desynchronization of the grid, to the point where generators shut down to protect themselves. Bringing the grid back from such a complete blackout can take hours to days, as sections of the grid are reconnected after a cascade scenario like the one seen in the 2003 blackout, or in the rather similar 1965 blackout that affected nearly the same region.
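The cascade dynamic is easy to illustrate with a toy model: when one overloaded line trips, its load is shifted onto the survivors, which can push them past their own limits in turn. This sketch is purely illustrative — the capacities and loads are invented numbers, not a model of the actual 2003 event:

```python
# Toy cascading-failure sketch: lines share a tripped line's load equally;
# any line pushed past capacity trips as well, and the process repeats.
def cascade(loads, capacity, first_trip):
    live = {i: load for i, load in enumerate(loads)}
    tripped = [first_trip]
    overload = live.pop(first_trip)
    while overload and live:
        share = overload / len(live)
        for i in live:
            live[i] += share
        overload = 0.0
        for i in [i for i, load in live.items() if load > capacity]:
            overload += live.pop(i)
            tripped.append(i)
    return tripped

# Five lines each at 85% of a 100-unit capacity: one trip takes down all five.
print(cascade([85, 85, 85, 85, 85], 100, 0))  # [0, 1, 2, 3, 4]
# With enough headroom, the same trip is absorbed and the cascade stops.
print(cascade([50, 50, 50], 100, 0))          # [0]
```

The point of the toy model is the headroom: the same disturbance is either absorbed or catastrophic depending on how close the surviving lines already are to their limits.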

With how much more modern society relies today on constant access to electrical power than it did twenty, let alone fifty-eight years ago, exactly how afraid should we be of another, possibly worse blackout?

Continue reading “The 2003 Northeast Blackout And The Harsh Lessons Of Grid Failures”

A Quarter Century Of The IMac

Growing older as an engineer turns out to be a succession of moments in which technologies and devices which you somehow still imagine to be cool or exciting, reveal themselves in fact to be obsolete, indeed, old. Such a moment comes today, with the 25th anniversary of the most iconic of 1990s computers, Apple’s iMac. The translucent all-in-one machine was and remains more than simply yet another shiny Mac, it’s probably the single most influential home computer ever. A bold statement to be sure, but take a look at the computer you’re reading this on, indeed at all your electronic devices here in 2023, before you dismiss it.

Any colour you want, as long as it’s beige. Leon Brooks, Public domain.

Computers in the 1990s were beige and boring. Breathtakingly so, a festival of the generic. If you had a PC it came in the same beige box as every single other PC, the only thing breaking the monotony being one of those LED 7-segment fake-MHz displays. Apple computers took the beige and ran with it, their PowerMac range being merely a smoother-fronted version of all those beige-box PCs. This was the period following the departure of Steve Jobs during which the company famously lost its way, and the Bondi blue Jony Ive-designed iMac was the signature product of his triumphant return.

That’s enough pretending to have drunk the Apple Kool-Aid for one article, so why are we marking this anniversary? The answer lies not in the iMac’s hardware, though its 233MHz PowerPC G3 and ATI graphics driving a 15″ CRT were no slouch for the day, nor even in its forsaking of all of Apple’s previous proprietary interfaces for USB. Instead it’s the design influence of this machine, as it ushered in a new era of technological devices whose ethos lay in how they might be used rather than in simply showering the interface with features. At the time the iMac spawned a brief fashion for translucent blue in everything from peripherals to steam irons, but in the quarter century since, your devices have changed immeasurably in its wake. We still don’t like that weird round mouse though.

Header image: Rama, CC BY-SA 4.0.

Screwdrivers And Nuclear Safety: The Demon Core

Harry Daghlian and Louis Slotin were two of many people who worked on the Manhattan Project. They might not be household names, but we believe they are the poster children for safety procedures. And not in a good way.

Harry Daghlian (CC-BY-SA 3.0, Arnold Dion)

Slotin assembled the core of the “Gadget” — the plutonium test device at the Trinity test in 1945. He was no stranger to working in a lab with nuclear materials. It stands to reason that if you are making something as dangerous as a nuclear bomb, it is probably hazardous work. But you probably get used to it, like some of us get used to working around high voltage or deadly chemicals.

Making nuclear material is hard, and it was even more so back then. But the Project had made a third plutonium core — one was detonated at Trinity, the other over Nagasaki, and the final core was meant to go into a proposed second bomb that was not produced.

The cores were two hemispheres of plutonium and gallium. The gallium allowed the material to be hot-pressed into spherical shapes. Unlike the first two cores, however, the third one — one that would later earn the nickname “the demon core” — had a ring around the flat surfaces to contain nuclear flux during implosion. The spheres are not terribly dangerous unless they become supercritical, which would lead to a prompt critical event. Then, they would release large amounts of neutrons. The bombs, for example, would force the two halves together violently. You could also add more nuclear material or reflect neutrons back into the material.
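What makes supercriticality so dangerous is the exponential arithmetic of the chain reaction: each fission generation produces some multiple k of the previous generation's neutrons, so anything with k above 1 runs away fast. A minimal sketch of that idea — the numbers are purely illustrative, not real reactor or weapon physics:

```python
def neutron_population(k, generations, n0=1.0):
    """Idealized chain reaction: each generation multiplies the
    neutron population by the multiplication factor k."""
    return n0 * k ** generations

# Subcritical (k < 1): the chain reaction dies away on its own.
print(neutron_population(0.9, 100))  # ~2.7e-05
# Supercritical (k > 1): a small excess compounds into a runaway.
print(neutron_population(1.1, 100))  # ~1.4e+04
```

Pushing the hemispheres together, adding material, or reflecting escaping neutrons back in all nudge k upward — which is why the demon core experiments that followed were played so close to the edge.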

Continue reading “Screwdrivers And Nuclear Safety: The Demon Core”

Ku-Go: The World War II Death Ray

Historians may note that World War II was the last great “movie war.” In those days, you could do many things that are impossible today, yet make for great movie drama. You can’t sneak a fleet of ships across the oceans anymore. Nor can you dig tunnels right under your captors’ noses. Another defining factor is that we no longer seem to seek out superweapons.

A Churchill Bullshorn plough for clearing minefields — one of Hobart’s “Funnies”

Sure, we develop better planes, tanks, submarines, and guns. But we aren’t working on anything — that we know of — as revolutionary as a rocket, an atomic bomb, or even radar was back in the 1940s. The Germans worked on Wunderwaffe, including guided missiles, jets, suborbital rocket bombers, and a solar-powered space mirror to burn terrestrial targets. Everyone was working on a nuclear bomb, of course. The British had Hobart’s Funnies as well as less successful entries like the Panjandrum — a ten-foot rocket-driven wheel of explosives.

Death Ray

Perhaps the holy grail of all the super weapons — both realized and dreamed of — was the “death ray.” Of course, Tesla claimed to have one that used not rays but particles, though no one ever successfully built one, and there was debate over whether it would work at all. Tesla didn’t like the term death ray, partly because it wasn’t a ray at all, but also because it required a huge power plant and, therefore, wasn’t mobile. He envisioned it as a peacekeeping defensive weapon, rendering attacks so futile that no one would dare attempt them.

Continue reading “Ku-Go: The World War II Death Ray”

Weather In Wartime: The Importance Of British Meteorology In WWII

Weather can have a significant impact on transport and operations of all kinds, especially those at sea or in the air. This makes it a deeply important field of study, particularly in wartime. If you’re at all curious about how this kind of information was gathered and handled in the days before satellites and computer models, this write-up on WWII meteorology is sure to pique your interest.

Weather conditions were valuable data, and weather forecasts even more so. Both required data, which in turn relied on human operators to read the instruments and transmit their readings.

The main method of learning weather conditions over the oceans was to persuade merchant ships to report their observations regularly. This is true even today, but these days we also have the benefit of things like satellite technology. Back in the mid-1900s there was no such thing, and the outbreak of WWII (including the classification of weather data as secret information due to its value) meant that new solutions were needed.

The aircraft of the Royal Air Force (RAF) were particularly in need of accurate data, and there was little to no understanding of the upper atmosphere at the time. Eventually, aircraft flew regular 10-hour sorties, logging detailed readings that served to provide data about weather conditions across the Atlantic. Readings were logged, encoded with one-time pad (OTP) encryption, then radioed back to base where charts would be created and updated every few hours.
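The one-time pad scheme mentioned above is simple to sketch: XOR each byte of the message with a byte of truly random key material that is used once and then destroyed. This is an illustration of the OTP principle in general, not the RAF's actual encoding procedure, and the sample reading is invented:

```python
import secrets

def otp_xor(message: bytes, key: bytes) -> bytes:
    # The key must be at least as long as the message and never reused;
    # the same XOR operation both encrypts and decrypts.
    assert len(key) >= len(message)
    return bytes(m ^ k for m, k in zip(message, key))

reading = b"WIND 270/45KT TEMP -12C"          # hypothetical observation
key = secrets.token_bytes(len(reading))       # fresh random pad
cipher = otp_xor(reading, key)
print(otp_xor(cipher, key) == reading)        # True: round trip recovers it
```

Used correctly — truly random key, used exactly once — the one-time pad is information-theoretically unbreakable, which is why it suited radio traffic whose interception had to be assumed.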

The value of accurate data and precise understanding of conditions and how they could change was grimly illustrated in a disaster called the Night of the Big Wind (March 24-25, 1944). Forecasts predicted winds no stronger than 45 mph, but Allied bombers sent to Berlin were torn apart when they encountered winds in excess of 120 mph, leading to the loss of 72 aircraft.

The types of data recorded to monitor and model weather are nearly identical to those in modern weather stations. The main difference is that instruments used to be read and monitored by human beings, whereas today we can rely more on electronic readings and transmission that need no human intervention.

TV Typewriter Remembered

With the recent passing of Don Lancaster, I took a minute to reflect on how far things have come in a pretty short period of time. If you somehow acquired a computer in the early 1970s, it was probably some discarded DEC, HP, or Data General machine. A few people built their own, but that was a stout project with no microprocessor chips readily available. When machines like the Mark-8 and, more famously, the Altair appeared, the number of people with a “home computer” swelled — relatively speaking — and it left a major problem: What kind of input/output device could you use?

An ad from Kilobaud offered you a ready-to-go, surely refurbished, ASR33 for $840

At work, you might have had a Teletype. Most of those were leased, and the price tag of a new one was somewhere around $1,000. Remember, too, that $1,000 in 1975 was a small fortune. Really lucky people had video terminals, but those were often well over $1,500, although Lear Siegler introduced one at the $1,000 price, and it became wildly successful. Snagging a used terminal was not very likely, and surplus Teletype equipment was likely of the 5-bit Baudot variety — not unusable, but not the terminal you really wanted.

A lot of the cost of a video terminal was the screen. Yet nearly everyone had a TV, and used TVs have always been fairly cheap, too. That’s where Don Lancaster came in. His TV Typewriter Cookbook was the bible for homebrew video displays. The design influenced the Apple 1 computer and spawned a successful kit for a company known as Southwest Technical Products. For around $300 or so, you could have a terminal that used your TV for output. Continue reading “TV Typewriter Remembered”

Lighting Up With Chemistry, 1823-Style

With our mass-produced butane lighters and matches made in the billions, fire is never more than a flick of the finger away these days. But starting a fire 200 years ago? That’s a different story.

One method we’d never heard of was Döbereiner’s lamp, an 1823 invention by German chemist Johann Wolfgang Döbereiner. At first glance, the device seems a little sketchy, what with a tank of sulfuric acid and a piece of zinc to create a stream of hydrogen gas ignited by a platinum catalyst. But as [Marb’s Lab] shows with the recreation in the video below, while it’s not exactly as pocket-friendly as a Zippo, the device actually has some inherent safety features.

[Marb]’s version is built mainly from laboratory glassware, with a beaker of dilute sulfuric acid — “Add acid to water, like you ought-er!” — bathing a chunk of zinc on a fixed support. An inverted glass funnel acts as a gas collector, which feeds the hydrogen gas to a nozzle through a pinch valve. The hydrogen gas never mixes with oxygen — that would be bad — and the production of gas stops once the gas displaces the sulfuric acid below the level of the zinc pellet. It’s a clever self-limiting feature that probably contributed to the commercial success of the invention back in the day.
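The gas generation here is the classic single-displacement reaction Zn + H2SO4 → ZnSO4 + H2: one mole of hydrogen gas per mole of zinc consumed. A quick back-of-the-envelope estimate of the yield, under an ideal-gas approximation at room conditions (the 10 g pellet size is our own assumption, not from the build):

```python
# Zn + H2SO4 -> ZnSO4 + H2 : 1 mol of H2 per mol of zinc dissolved.
M_ZN = 65.38      # g/mol, molar mass of zinc
R = 8.314         # J/(mol*K), ideal gas constant
T = 293.15        # K, about 20 degC
P = 101325        # Pa, 1 atm

def hydrogen_litres(zinc_grams: float) -> float:
    moles_h2 = zinc_grams / M_ZN            # 1:1 stoichiometry
    return moles_h2 * R * T / P * 1000      # ideal gas: V = nRT/P, in litres

# A hypothetical 10 g zinc pellet yields roughly 3.7 litres of hydrogen.
print(round(hydrogen_litres(10), 1))  # 3.7
```

That is a respectable amount of gas from a small pellet, which helps explain why the self-limiting acid-displacement design mattered: an unregulated generator of this sort would keep producing hydrogen whether or not anyone wanted a flame.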

To produce a flame, Döbereiner originally used a platinum sponge, which catalyzed the reaction between hydrogen and oxygen in the air; the heat produced by the reaction was enough to ignite the mixture and produce an open flame. [Marb] couldn’t come up with enough of the precious metal, so instead harvested the catalyst from a lighter fluid-fueled hand warmer. The catalyst wasn’t quite enough to generate an open flame, but it glowed pretty brightly, and would be more than enough to start a fire.

Hats off to [Marb] for the great lesson in chemical ingenuity and history. We’ve seen similar old-school catalytic lighters before, too.

Continue reading “Lighting Up With Chemistry, 1823-Style”