How Italians Got Their Power

We take for granted that electrical power standards are generally unified across countries and territories. Europe for instance has a standard at 230 volts AC, with a wide enough voltage acceptance band to accommodate places still running at 220 or 240 volts. Even the sockets maintain a level of compatibility across territories, with a few notable exceptions.

It was not always this way though, and to illustrate this we have [Sam], who’s provided us with a potted history of mains power in Italy. The complex twists and turns of power delivery in that country reflect the diversity of the power industry in the late 19th and early 20th century as the technology spread across the continent.

Starting with a table showing the impressive range of voltages supplied across the country by differing power companies, the piece delves into the taxation of electricity in Italy, which led to two entirely different plug standards and a dual 110/220 volt system. Nationalization may have ironed out some of the kinks and unified the country at 220 volts, but the two plugs remain.

Altogether it’s a fascinating read, and one which brings to mind that, where this is being written, you could until a few years ago still find houses with three sizes of the archaic British round-pin socket. Interested in the diversity of plugs? We have a link for that.

How We Got The Scanning Electron Microscope

According to [Asianometry], no one believed in the scanning electron microscope. No one, that is, except [Charles Oatley]. The video below tells the whole story.

The Cambridge graduate built radios during World War II and then returned to Cambridge as a lecturer once the conflict was over. [Hans Busch] had demonstrated using magnets to steer electron beams, which suggested the possibility of an electron lens, and from a lens it was a natural step to a microscope that uses electrons.

After all, electrons can have a smaller wavelength than light, so a microscope using electrons could — in theory — image at a higher resolution. [Max Knoll] and [Ernst Ruska], in fact, developed the transmission electron microscope, or TEM.

Continue reading “How We Got The Scanning Electron Microscope”

Spend An Hour In The Virtual Radio Museum

You have an hour to kill, and you like old communication technology. If you happen to be in Windsor, Connecticut, you could nip over to the Vintage Radio and Communication Museum. If you aren’t in Windsor, you could watch [WG7D’s] video tour, which you can see below.

The museum is a volunteer organization and is mostly about radio, although we did spy some old cameras if you like that sort of thing. There was also a beautiful player piano that — no kidding — now runs from a vacuum cleaner.

Continue reading “Spend An Hour In The Virtual Radio Museum”

The Computers Of Voyager

After more than four decades in space and having traveled a combined 44 billion kilometers, it’s no secret that the Voyager spacecraft are closing in on the end of their extended interstellar mission. Battered and worn, the twin spacecraft are speeding along through the void, far outside the Sun’s influence now, their radioactive fuel decaying, their signals becoming ever fainter as the time needed to cross the chasm of space gets longer by the day.

But still, they soldier on, humanity’s furthest-flung outposts and testaments to the power of good engineering. And no small measure of good luck, too, given the number of nearly mission-ending events which have accumulated in almost half a century of travel. The number of “glitches” and “anomalies” suffered by both Voyagers seems to be on the uptick, too, contributing to the sense that someday, soon perhaps, we’ll hear no more from them.

That day has thankfully not come yet, in no small part due to the computers that the Voyager spacecraft were, in a way, designed around. Voyager was to be a mission unlike any ever undertaken, a Grand Tour of the outer planets that offered a once-in-a-lifetime chance to push science far out into the solar system. Getting the computers right was absolutely essential to delivering on that promise, a task made all the more challenging by the conditions under which they’d be required to operate, the complexity of the spacecraft they’d be running, and the torrent of data streaming through them. Forty-six years later, it’s safe to say that the designers nailed it, and it’s worth taking a look at how they pulled it off.

Continue reading “The Computers Of Voyager”

A render of a BiC Cristal ballpoint pen showing the innards.

This Is How A Pen Changed The World

Look around you. Chances are, there’s a BiC Cristal ballpoint pen among your odds and ends. Since 1950, it has far outsold the Rubik’s Cube and even the iPhone, and yet, it’s one of the most unsung and overlooked pieces of technology ever invented. And weirdly, it hasn’t had the honor of trademark erosion like Xerox or Kleenex. When you ‘flick a Bic’, you’re using a lighter.

It’s probably hard to imagine writing with a feather and a bottle of ink, but that’s what writing was limited to for hundreds of years. When fountain pens first came along, they were revolutionary, albeit expensive and leaky. In 1900, the world literacy rate stood around 20%, and exorbitantly-priced, unreliable utensils weren’t helping.

Close-up, cutaway render of a leaking ballpoint pen.

In 1888, American inventor John Loud created the first ballpoint pen. It worked well on leather, wood, and the like, but absolutely shredded paper, making it almost useless.

One problem was that while the ball worked better than a nib, it had to be an absolutely perfect fit, or ink would either get stuck or leak out everywhere. Then along came László Bíró, who turned instead to the ink to solve the problems of the ballpoint.

Continue reading “This Is How A Pen Changed The World”

Going Canadian: The Rise And Fall Of Novell

During the 1980s and 1990s, Novell was one of those names you could not avoid if you came even somewhat close to computers. Starting out selling computers and printers, the company switched to producing networking hardware like the famous NE2000, along with the inevitability that was Novell NetWare, the software that would cement its fortunes. It wasn’t until the 1990s that Novell began to face headwinds from a new giant: Microsoft. This, along with the rest of Novell’s history, is the topic of a recent article by [Bradford Morgan White], covering the company’s rise, the competition from Microsoft’s Windows NT, and its ultimate demise as it found itself unable to compete in the rapidly changing market around 2000, despite flirting with Linux.

Novell was founded by two experienced executives in 1980, its name reportedly a misspelling of the French word for ‘new’ (nouveau or nouvelle). Although NetWare had cornered the network operating system market, there was still a dearth of affordable networking hardware like Ethernet expansion cards. This led Novell to introduce the 8-bit ISA NE1000 card in 1987, later followed by the 16-bit NE2000. Priced lower than competing products, they became a market favorite. Then Windows NT rolled in during the 1990s and began to erode NetWare’s market share, leaving Novell to flounder until it was snapped up by Attachmate in 2011, which was in turn acquired by Micro Focus International in 2014, which was gobbled up by Canada-based OpenText in 2023. There Novell’s technologies were distributed across OpenText’s divisions, finally ending Novell’s story.

How DEC’s LANBridge 100 Gave Ethernet A Fighting Chance

Alan Kirby (left) and Mark Kempf with the LANBridge 100, serial number 0001. (Credit: Alan Kirby)

When Ethernet was originally envisioned, it would use a common, shared medium (the ‘Ether’ part), with access to the medium and collision resolution handled by the carrier sense multiple access with collision detection (CSMA/CD) method. While effective and cheap, this limited Ethernet to a 1.5 km cable run and a 10 Mb/s transfer rate. While working at Digital Equipment Corporation (DEC) in the 1980s and 1990s, [Alan Kirby] saw how competing network technologies, including the Fiber Distributed Data Interface (FDDI) that DEC also worked on, threatened to extinguish Ethernet despite being more expensive. The solution, [Alan] figured, would be store-and-forward switching.
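To give a flavor of the collision handling mentioned above: when two stations on the shared medium transmit at once, classic CSMA/CD has each back off for a random number of slot times, with the range doubling after each successive collision (truncated binary exponential backoff). A minimal sketch, with the function name our own invention:

```python
import random

def backoff_slots(collisions: int, max_exponent: int = 10) -> int:
    """Pick a random backoff delay, in 512-bit slot times.

    After the n-th successive collision, classic 10 Mb/s Ethernet waits
    a number of slots chosen uniformly from 0 .. 2^min(n, 10) - 1, so
    contention windows grow as the medium gets busier.
    """
    k = min(collisions, max_exponent)
    return random.randrange(2 ** k)
```

After the first collision a station waits 0 or 1 slots; by the tenth, anywhere from 0 to 1023, which is exactly the kind of shared-medium contention that store-and-forward switching was meant to sidestep.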

After teaming up with Mark Kempf, the two engineers convinced DEC management to give them a chance to develop such a switch for Ethernet, which became the LANBridge 100. As a so-called ‘learning bridge’, it operated at Layer 2 of the network stack, learning the MAC addresses of the connected systems and forwarding only those packets that were relevant to the other network. This instantly prevented collisions between the bridged networks and allowed for long (fiber) runs between bridges. It was also the beginning of Ethernet’s transformation from a shared medium (like WiFi today) into a star topology network, with each connected system getting its very own Ethernet cable to a dedicated switch port.
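The learn-and-filter behavior described above can be sketched in a few lines. This is an illustration of the general learning-bridge algorithm, not DEC’s actual firmware; the class and frame names are our own:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    src: str  # source MAC address
    dst: str  # destination MAC address

class LearningBridge:
    """Toy two-port learning bridge: learn source MACs, filter or forward."""

    def __init__(self, num_ports: int = 2):
        self.table: dict[str, int] = {}  # MAC address -> port last seen on
        self.num_ports = num_ports

    def receive(self, frame: Frame, in_port: int) -> list[int]:
        """Return the list of ports the frame should be forwarded to."""
        # Learn: the source address is reachable via the ingress port.
        self.table[frame.src] = in_port
        out = self.table.get(frame.dst)
        if out is None:
            # Unknown destination: flood to every other port.
            return [p for p in range(self.num_ports) if p != in_port]
        if out == in_port:
            # Destination is on the same segment: filter the frame.
            # This is what keeps local traffic (and its collisions)
            # from spilling onto the other network.
            return []
        return [out]

bridge = LearningBridge()
print(bridge.receive(Frame("aa", "bb"), 0))  # unknown dst: flood -> [1]
print(bridge.receive(Frame("bb", "aa"), 1))  # learned: forward -> [0]
print(bridge.receive(Frame("cc", "aa"), 0))  # same segment: filter -> []
```

The filtering case is the key insight: once the bridge knows both hosts sit on the same segment, their traffic never crosses the bridge at all.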