Rediscovering The Nile: The Ancient River That Was Once Overlooked By The Egyptian Pyramids

Although we usually imagine the conditions in Ancient Egypt to be much like the Egypt of today, during the early to middle Holocene there was significantly more rain as a result of the African Humid Period (AHP). This translated into the river Nile stretching far beyond its current range, with many more branches. This knowledge led a team of researchers to test the hypothesis that the largest cluster of pyramids in the Nile Valley was sited along one of these long since vanished branches. Their findings are described in an article published in Communications Earth & Environment by [Eman Ghoneim] and colleagues.

The Ahramat Branch and pyramids along its trajectory. (Credit: Eman Ghoneim et al., 2024)

The CliffsNotes version can be found in the accompanying press release from the University of North Carolina Wilmington. Effectively, the researchers postulated that a branch of the Nile once ran along this grouping of pyramids, with their accompanying temples originally positioned alongside this branch. The trick was to prove that a river branch existed in that area many thousands of years ago.

What complicates this is that the main course of the Nile has shifted over the centuries, and anthropogenic activity has obscured much of what remained, making life for researchers exceedingly difficult. Ultimately a combination of soil core samples, geophysical evidence, and remote sensing (e.g. satellite imagery) helped to cement the evidence for the existence of what they termed the Ahramat Nile Branch, with ‘ahramat’ meaning ‘pyramids’ in Arabic.

Synthetic Aperture Radar (SAR) and high-resolution radar elevation data provided evidence that the Nile once ran right past this string of pyramids, and also identified the modern Bahr el-Libeini canal as one of the last remnants of the Ahramat Branch before the river’s course across the floodplain shifted towards the east, probably due to tectonic activity. Further research using Ground Penetrating Radar (GPR) and Electromagnetic Tomography (EMT) along a 1.2 km section of the suspected former riverbed gave clear indications of a well-preserved river channel, with the expected silt and sediments.

Soil cores to depths of 20 and 13 meters further confirmed this, showing not only the expected sediment, but also freshwater mussel shells at a depth of 6 meters. Shallow groundwater was found at these core sites, meaning that even today subsurface water still flows through this part of the floodplain.

These findings not only align with the string of pyramids and their causeways, which would have provided direct access to the water’s edge, but also provided hints for a further discovery regarding the Bent Pyramid — as it’s commonly known — which today lies deep inside the desert. Although located about a kilometer from the floodplain, its approximately 700 meter long causeway terminates at what would have been another now-extinct channel: the Dahshur Inlet, which might also have served the Red Pyramid and others, although the evidence for this is shakier.

Altogether, these findings further illustrate an Ancient Egypt where the Old Kingdom was followed by a period of severe change: increasing drought caused by the end of the AHP, an eastward-migrating floodplain, and decreased flow in the Nile from its tributaries. By the time European explorers laid eyes on the ancient wonders of the Egyptian pyramids, the civilization that had birthed them was no more, and neither was the green and relatively lush environment that had once surrounded it.

How Italians Got Their Power

We take for granted that electrical power standards are generally unified across countries and territories. Europe, for instance, has standardized on 230 volts AC, with a wide enough voltage tolerance band to accommodate places still running at 220 or 240 volts. Even the sockets maintain a level of compatibility across territories, with a few notable exceptions.

It was not always this way though, and to illustrate this we have [Sam], who’s provided us with a potted history of mains power in Italy. The complex twists and turns of power delivery in that country reflect the diversity of the power industry in the late 19th and early 20th century as the technology spread across the continent.

Starting with a table showing the impressive range of voltages found across the country from differing power companies, it delves into the taxation of electricity in Italy, which led to two entirely different plug standards, and the country’s dual 110/220 volt system. Nationalization may have ironed out some of the kinks and unified the country on 220 volts, but the two plugs remain.

Altogether it’s a fascinating read, and one which brings to mind that, where this is being written, you could until a few years ago still find houses with three sizes of the archaic British round-pin socket. Interested in the diversity of plugs? We have a link for that.

The Computers Of Voyager

After more than four decades in space and having traveled a combined 44 billion kilometers, it’s no secret that the Voyager spacecraft are closing in on the end of their extended interstellar mission. Battered and worn, the twin spacecraft are speeding along through the void, far outside the Sun’s influence now, their radioactive fuel decaying, their signals becoming ever fainter as the time needed to cross the chasm of space gets longer by the day.

But still, they soldier on, humanity’s furthest-flung outposts and testaments to the power of good engineering. And no small measure of good luck, too, given the number of nearly mission-ending events which have accumulated in almost half a century of travel. The number of “glitches” and “anomalies” suffered by both Voyagers seems to be on the uptick, too, contributing to the sense that someday, soon perhaps, we’ll hear no more from them.

That day has thankfully not come yet, in no small part due to the computers that the Voyager spacecraft were, in a way, designed around. Voyager was to be a mission unlike any ever undertaken, a Grand Tour of the outer planets that offered a once-in-a-lifetime chance to push science far out into the solar system. Getting the computers right was absolutely essential to delivering on that promise, a task made all the more challenging by the conditions under which they’d be required to operate, the complexity of the spacecraft they’d be running, and the torrent of data streaming through them. Forty-six years later, it’s safe to say that the designers nailed it, and it’s worth taking a look at how they pulled it off.

Continue reading “The Computers Of Voyager”

A render of a BiC Cristal ballpoint pen showing the innards.

This Is How A Pen Changed The World

Look around you. Chances are, there’s a BiC Cristal ballpoint pen among your odds and ends. Since 1950, it has far outsold the Rubik’s Cube and even the iPhone, and yet, it’s one of the most unsung and overlooked pieces of technology ever invented. And weirdly, it hasn’t had the honor of trademark erosion like Xerox or Kleenex. When you ‘flick a Bic’, you’re using a lighter.

It’s probably hard to imagine writing with a feather and a bottle of ink, but that’s what writing was limited to for hundreds of years. When fountain pens first came along, they were revolutionary, albeit expensive and leaky. In 1900, the world literacy rate stood at around 20%, and exorbitantly priced, unreliable writing utensils weren’t helping.

Close-up, cutaway render of a leaking ballpoint pen.

In 1888, American inventor John Loud created the first ballpoint pen. It worked well on leather and wood and the like, but absolutely shredded paper, making it almost useless.

One problem was that while the ball worked better than a nib, it had to be an absolutely perfect fit, or ink would either get stuck or leak out everywhere. Then along came László Bíró, who turned instead to the ink to solve the problems of the ballpoint.

Continue reading “This Is How A Pen Changed The World”

Going Canadian: The Rise And Fall Of Novell

During the 1980s and 1990s, Novell was one of those names that you could not avoid if you came even somewhat close to computers. Starting out selling computers and printers, the company switched to producing networking hardware like the famous NE2000, along with the inevitability that was Novell NetWare, the software that would cement its fortunes. It wasn’t until the 1990s that Novell began to face headwinds from a new giant: Microsoft. This, along with the rest of Novell’s history, is the topic of a recent article by [Bradford Morgan White], covering the company’s rise, the competition from Microsoft’s Windows NT, and its ultimate demise as it found itself unable to compete in the rapidly changing market around 2000, despite flirting with Linux.

Novell was founded by two experienced executives in 1980, with the name reportedly being a misspelling of the French word for ‘new’ (nouveau or nouvelle). Although NetWare had cornered the network software market, there was still a dearth of networking hardware like Ethernet expansion cards. This led Novell to introduce the 8-bit ISA NE1000 card in 1987, later followed by the 16-bit NE2000. Priced lower than competing products, they became a market favorite. Then Windows NT rolled in during the 1990s and began to erode NetWare’s market share, leaving Novell to flounder until it was snapped up by Attachmate in 2011, which was in turn acquired by Micro Focus International in 2014, which got gobbled up by Canada-based OpenText in 2023. There Novell’s technologies were distributed across its divisions, finally ending Novell’s story.

How DEC’s LANBridge 100 Gave Ethernet A Fighting Chance

Alan Kirby (left) and Mark Kempf with the LANBridge 100, serial number 0001. (Credit: Alan Kirby)

When Ethernet was originally envisioned, it would use a common, shared medium (the ‘Ether’ part), with transmission and collision resolution handled by the carrier sense multiple access with collision detection (CSMA/CD) method. While effective and cheap, this limited Ethernet to a 1.5 km cable run and a 10 Mb/s transfer rate. As [Alan Kirby] worked at Digital Equipment Corp. (DEC) in the 1980s and 1990s, he saw how competing network technologies, including Fiber Distributed Data Interface (FDDI) – which DEC also worked on – threatened to extinguish Ethernet, despite these alternatives being more expensive. The solution, [Alan] figured, would be store-and-forward switching.
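
The collision-resolution half of CSMA/CD boils down to truncated binary exponential backoff: after each collision a station waits a random number of slot times before retrying, with the window doubling each time. Below is a minimal, illustrative Python sketch of that rule, using the conventional 10 Mb/s Ethernet constants (51.2 µs slot, 10-collision cap, 16 attempts); it is a paraphrase of the textbook algorithm, not anything taken from [Alan]'s account.

```python
import random

SLOT_TIME_US = 51.2    # one contention slot on 10 Mb/s Ethernet (512 bit times)
MAX_BACKOFF_EXP = 10   # the backoff window stops growing after 10 collisions
MAX_ATTEMPTS = 16      # after 16 collisions the frame is given up on

def backoff_delay_us(collision_count: int) -> float:
    """Random wait after the n-th collision on the same frame.

    The station waits a whole number of slot times drawn uniformly from
    [0, 2**k - 1], where k grows with each collision but is capped.
    """
    if collision_count >= MAX_ATTEMPTS:
        raise RuntimeError("excessive collisions, frame dropped")
    k = min(collision_count, MAX_BACKOFF_EXP)
    slots = random.randint(0, (1 << k) - 1)
    return slots * SLOT_TIME_US

# Example: possible delays after the 1st, 3rd and 10th collision on a frame
for n in (1, 3, 10):
    print(f"collision {n}: wait {backoff_delay_us(n):.1f} us")
```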

After teaming up with Mark Kempf, both engineers managed to convince DEC management to give them a chance to develop such a switch for Ethernet, which turned into the LANBridge 100. As a so-called ‘learning bridge’, it operated on Layer 2 of the network stack, learning the MAC addresses of the connected systems and forwarding only those packets that were relevant for the other network. This instantly prevented collisions between the networks thus connected, allowed for long (fiber) runs between bridges, and marked the beginning of Ethernet’s transformation from a shared medium (like WiFi today) into a star topology network, with each connected system getting its very own Ethernet cable to a dedicated switch port.
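
The forwarding rule of such a learning bridge is simple enough to sketch in a few lines: learn the source address, filter or forward based on the destination, and flood anything unknown. The Python below is a simplified illustration of that Layer 2 logic, not DEC's actual implementation; a real bridge also ages out table entries and participates in spanning tree.

```python
class LearningBridge:
    """Simplified two-port learning bridge: no ageing, no spanning tree."""

    def __init__(self, ports=("A", "B")):
        self.ports = ports
        self.mac_table = {}  # source MAC address -> port it was last seen on

    def handle_frame(self, in_port, src_mac, dst_mac):
        # Learn: remember which port the sending station lives on.
        self.mac_table[src_mac] = in_port

        # Decide where the frame goes.
        out_port = self.mac_table.get(dst_mac)
        if out_port == in_port:
            return []                        # filter: destination is on the same segment
        if out_port is not None:
            return [out_port]                # forward to the port learned earlier
        return [p for p in self.ports if p != in_port]  # flood: destination unknown

# Example: two stations on opposite sides of the bridge
bridge = LearningBridge()
print(bridge.handle_frame("A", "aa:aa", "bb:bb"))  # ['B'] (flooded, bb:bb unknown)
print(bridge.handle_frame("B", "bb:bb", "aa:aa"))  # ['A'] (aa:aa already learned)
print(bridge.handle_frame("A", "cc:cc", "aa:aa"))  # []    (aa:aa is on port A, filtered)
```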

The Rise And Fall Of Silicon Graphics

Maybe best known as the company which brought a splash of color to corporate and scientific computing with its Indigo range of computer systems, Silicon Graphics Inc. (later SGI) burst onto the market in 1981 with the Geometry Engine, effectively one of the first commercial graphics accelerators. SGI’s founder, James Henry Clark, was quite possibly as colorful a character as the company’s products, and [Bradford Morgan White] covers the years leading up to SGI’s founding, its highlights, and its eventual demise in 2009.

The story of SGI is typical of a start-up that sees itself become the market leader for years, even as this market gradually changes. For SGI, it was the surge in commodity 3D graphics cards in the 1990s, alongside affordable (and cluster-capable; insert Beowulf cluster jokes here) server hardware, that posed a major problem. Eventually it’d start offering Windows NT workstations, drop its MIPS-based systems in a shift to Intel’s disastrous Itanium range of CPUs, and fall back on the last-ditch effort of any struggling company: a logo change.

None of this was effective, naturally, and ultimately SGI filed (again) for Chapter 11 bankruptcy in 2009, with Rackable Systems snapping up its assets and renaming itself to SGI, before itself being bought out by HPE, which eventually sunset SGI as a brand name.