Skylab Under The Ocean

A crew lives on a station in a hostile environment. Leaving that environment requires oxygen tanks and specialized gear to deal with pressure differentials. A space station? Nah. A base built on the ocean floor. The US Navy was interested in such a base in the 1960s, and bases like this are a staple of science fiction. But today, we see more space stations than underwater bases. Have you ever wondered why?

Diving deep underwater is a tricky business. At a certain depth, the pressure forces gases like nitrogen to dissolve into your body. By itself, this isn’t a problem, but ascending is another matter. If the dissolved gas comes out of solution too quickly, it forms bubbles, which can cause decompression sickness, commonly called the bends. The exact symptoms vary, but the bends often cause extreme joint pain, fatigue, or a rash. Sometimes people die.

While you might think of the bends as a deep-sea diver’s problem, it can also happen in airplanes and outer space. Any time you go from high pressure to low pressure quickly, you are subject to decompression sickness. Depending on what you are doing, there are different ways to mitigate the problem. For diving, traditionally, you simply don’t surface too quickly.

You dive, do your work, and then head towards the surface, stopping at preset stops to let the pressure equalize gradually. Physics is a bear, though. The longer you stay at a given depth, the longer you have to decompress.

That means you rapidly reach a point of diminishing returns. Suppose you dive to the ocean floor. You spend an hour working. Then you have to spend, say, eight hours gradually rising to the surface. That makes extended operations at significant depth impractical.

George Bond was thinking about all this and had an interesting idea. It is true that, in general, the longer you stay down, the more gas your body absorbs. But it is also true that, eventually, your tissues saturate, and then you don’t absorb any more.
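Bond’s insight maps neatly onto the standard Haldane-style model of decompression, in which each tissue compartment absorbs gas exponentially toward the ambient pressure with a characteristic half-time. Here is a minimal sketch in Python; the half-time, pressures, and durations below are hypothetical, chosen only to illustrate the shape of the curve:

```python
import math

def tissue_pressure(p0, p_ambient, half_time_min, minutes):
    """Haldane-style exponential gas loading for one tissue compartment."""
    k = math.log(2) / half_time_min
    return p_ambient + (p0 - p_ambient) * math.exp(-k * minutes)

# Hypothetical numbers: a "40-minute" compartment starting at surface
# nitrogen pressure (0.79 atm), exposed to 4 atm of ambient nitrogen.
p0, p_amb, half_time = 0.79, 4.0, 40.0
for t in (40, 80, 160, 240):
    p = tissue_pressure(p0, p_amb, half_time, t)
    frac = (p - p0) / (p_amb - p0)
    print(f"after {t:3d} min: {p:.2f} atm ({100 * frac:.0f}% saturated)")
```

After one half-time the compartment is 50% loaded; after six half-times, better than 98%. Past that point, staying longer adds essentially no further gas load, which is exactly the observation saturation diving exploits.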

Continue reading “Skylab Under The Ocean”

CCA Ethernet Cables: Not Up To Scratch, But Are They Dangerous?

If you’ve ever bought a suspiciously cheap Ethernet cable from an online listing, there’s a decent chance you’ve encountered Copper Clad Aluminum. Better known as CCA, it’s exactly what it sounds like—an aluminium conductor with a thin skin of copper deposited on the outside. Externally, cables made with this material look largely like any other, with perhaps the only obvious tell being that they feel somewhat lighter in the hand.

CCA is cheaper than proper copper cabling, and it conducts signals well enough to function in an Ethernet cable. And yet, it’s a prime example of corner-cutting that keeps standards bodies and professional installers up at night. But just how dangerous is this silent scourge, found lurking in so many network cabinets around the world?

Not Up To Scratch

CCA wire is typically made by wrapping an aluminium core with copper strip and then extruding it through a die. Credit: USPTO

Everything you need to know about CCA is in the name—it refers to an aluminium wire with a thin copper cladding, typically applied through a die extrusion process. The reasoning behind this exploits a real physical phenomenon called the skin effect, wherein higher-frequency AC signals tend to travel along the outer surface of a conductor. The idea goes that since most of the current moves through the outer copper skin layer anyway, the less-conductive aluminium core doesn’t unduly impact the wire’s performance. Using copper-clad aluminium wiring is, in theory, desirable because aluminium is much cheaper than copper, which can really add up over long cable runs. Imagine you’re wiring a building with hundreds of miles of Ethernet cabling, all with eight conductors each—the savings add up pretty quickly.
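The skin effect the cladding banks on is easy to put numbers to: the 1/e penetration depth of AC current into a conductor is δ = sqrt(ρ / (π f μ)). A quick back-of-the-envelope sketch, using copper’s resistivity and a few frequencies chosen for illustration:

```python
import math

MU0 = 4 * math.pi * 1e-7          # vacuum permeability, H/m
RHO_CU = 1.68e-8                  # copper resistivity, ohm-metres

def skin_depth(freq_hz, resistivity=RHO_CU, mu_r=1.0):
    """1/e penetration depth of AC current into a conductor."""
    return math.sqrt(resistivity / (math.pi * freq_hz * mu_r * MU0))

for f in (1e6, 10e6, 100e6):
    print(f"{f / 1e6:5.0f} MHz: skin depth = {skin_depth(f) * 1e6:.1f} um")
```

At 100 MHz the current does hug the outer few microns of the conductor, but at lower frequencies, and certainly for the DC used by Power over Ethernet, current flows through the whole cross-section, where the aluminium core’s higher resistivity starts to matter.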

There’s a problem with CCA cabling in these contexts, though. Due to prevailing cabling standards, any cable made with CCA is technically not even a real Ethernet cable at all. The relevant documents are unambiguous.

ANSI/TIA-568.2-D requires conductors in Category-rated cable to be solid or stranded copper. No other materials are acceptable, and thus CCA is explicitly excluded from use in Category cable applications. A cable with CCA conductors cannot legitimately carry a Cat5e, Cat6, or any related designation under any circumstances. Similarly, ISO/IEC 11801 has the same requirement. The U.S. National Electrical Code also states that conductors in communications cables, other than coaxial cable, shall be copper. This isn’t a suggestion or a best practice; it’s the letter of the code. Anything lesser is simply not allowed.

Continue reading “CCA Ethernet Cables: Not Up To Scratch, But Are They Dangerous?”

With Affordable Storage Options Dwindling, Where To Store Our Data?

These days our appetite for more data storage is larger than ever, with video files larger, photo resolutions higher, and project files easily zipping past a few hundred MB. At the same time our options for data storage are becoming more and more limited. For the longest time we could count on there always being a newer, roomier, faster, and cheaper form of storage to come along, but those days would seem to be over.

We can look back and laugh at low capacity USB Flash drives of the early 2000s, yet the first storage drive to hit 1 TB capacity did so in 2007, with the Hitachi Deskstar 7K1000, only for typical PC storage capacity to have barely moved past that level nineteen years later.

We also had Blu-ray discs (BD) promise to cram the equivalent of dozens of DVDs onto a single BD, with two- and even four-layer BDs storing up to 128 GB. Yet today optical media, long the cheapest cold-storage option, is dying a slow death. NAND Flash storage has only increased in price, and the options for those of us who have large cold storage requirements seem to be shrinking every day.

So what is the economical solution here? Invest in LTO tapes using commercial leftovers, or give up and sign up for Cloud Storage™ for the low-low price of a monthly recurring fee?
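Whether the recurring fee ever beats owned hardware comes down to a break-even calculation. A trivial sketch, with made-up placeholder prices you would swap for real quotes:

```python
# Illustrative break-even between a one-time drive purchase and a monthly
# cloud fee for the same capacity. Both prices are hypothetical placeholders.
drive_cost = 250.0        # assumed one-time cost of a large HDD, dollars
cloud_monthly = 10.0      # assumed monthly cloud fee, dollars

months_to_break_even = drive_cost / cloud_monthly
print(f"local drive pays for itself after {months_to_break_even:.0f} months")
```

With these particular numbers the drive pays for itself in about two years, ignoring electricity, drive failures, and redundancy, which is exactly why the comparison deserves real figures rather than gut feeling.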

Continue reading “With Affordable Storage Options Dwindling, Where To Store Our Data?”

Ask Hackaday: How Much Compute Is Enough?

Over the history of this business, a lot of people have foreseen limits that look rather silly in hindsight. In 1943, IBM President Thomas Watson reportedly declared that “I think there is a world market for maybe five computers.” That was more than a little wrong. Depending on the definition of a computer, and particularly if you include microcontrollers, there are probably trillions of the things.

We might as well include microcontrollers, considering how often we see projects replicating retrocomputers on them. The RP2350 can do a Mac 128k, and the ESP32-P4 gets you into the Quadra era. Which, honestly, covers the majority of daily tasks most people use computers for.

The RP2350 and ESP32-P4 both have more than 640kB of RAM, so that famous Bill Gates quote obviously didn’t age any better than Thomas Watson’s prediction. As Yogi Berra once said: predictions are hard, especially about the future.

Continue reading “Ask Hackaday: How Much Compute Is Enough?”

From Zip To Nought: The Rise And Fall Of Iomega

If you were anywhere near a computer in the mid-to-late 1990s, you almost certainly encountered a Zip drive. That distinctive purple peripheral, with its satisfying clunk as you slotted in a cartridge, was as much a fixture of the era as beige tower cases and CRT monitors. Iomega, the company behind it, went from an obscure Utah outfit to a multi-billion-dollar darling of Wall Street in the span of about two years. And then, almost as quickly, it all fell apart.

The story of Iomega is one of genuine engineering innovation and the fickle nature of consumer technology. As with so many other juggernauts of its era, Iomega was eventually brought down by a new technology that simply wasn’t practical to counter.

The House That Bernoulli Built

Iomega was founded in Utah in 1980 by Jerome Paul Johnson, David Bailey, and David Norton. The company soon developed a novel approach to removable magnetic storage based on the Bernoulli effect. The Bernoulli Box, which arrived in 1982, was a drive relying on PET film disks spun at 1500 RPM inside a rigid, removable cartridge. The airflow generated by the spinning disk pulled the media down toward the read/write head thanks to the eponymous Bernoulli effect. While spinning, the disk would float a mere micron above the head surface on a cushion of air. If the power cut out or the drive otherwise failed, the disk simply floated away from the head rather than crashing into it—a boon over contemporary hard drives, for which head crashes were a real risk. The Bernoulli Box made them essentially impossible.

Continue reading “From Zip To Nought: The Rise And Fall Of Iomega”
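For a feel of the forces that held the Bernoulli Box media against the head, the pressure drop over a fast-moving surface can be roughly estimated from Bernoulli’s equation, ΔP = ½ρv². A rough sketch, where the 1500 RPM figure comes from the story above but the disk radius is an assumption based on 8-inch media:

```python
import math

rpm = 1500          # spindle speed, from the Bernoulli Box spec above
radius_m = 0.1      # ~8-inch disk radius, an assumed figure
rho_air = 1.2       # air density at room temperature, kg/m^3

v_rim = radius_m * rpm * 2 * math.pi / 60   # tangential speed at the rim
delta_p = 0.5 * rho_air * v_rim**2          # Bernoulli pressure drop
print(f"rim speed = {v_rim:.1f} m/s, pressure drop = {delta_p:.0f} Pa")
```

Even this crude estimate shows a pressure differential of a hundred-odd pascals at the rim, plenty to hold a flexible PET disk a micron off the head while letting it fall safely away the instant the airflow stops.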

Artemis II Agenda Keeps Moon-Bound Crew Busy

With the launch of Artemis II from Cape Canaveral potentially just weeks away, NASA has been releasing a steady stream of information about the mission through their official site and social media channels to get the public excited about the agency’s long-awaited return to the Moon. While the slickly produced videos and artist renderings might get the most attention, even the most mundane details about a flight that will put humans on the far side of our nearest celestial neighbor for the first time since 1972 can be fascinating.

The Artemis II Moon Mission Daily Agenda is a perfect example. Released earlier this week via the NASA blog, the document seems to have been all but ignored by the mainstream media. But the day-by-day breakdown of the Artemis II mission contains several interesting entries about what the four crew members will be working on during the ten-day flight.

Of course, the exact details of the agenda are subject to change once the mission is underway. Some tasks could run longer than anticipated, experiments may not go as planned, and there’s no way to predict technical issues that may arise.

Conversely, the crew could end up breezing through some of the planned activities, freeing up time in the schedule. There’s simply no way of telling until it’s actually happening.

With the understanding that it’s all somewhat tentative, a look through the plan as it stands right now can give us an idea of the sort of highlights we can expect as we follow this historic mission down here on Earth.

Continue reading “Artemis II Agenda Keeps Moon-Bound Crew Busy”

The Rise And Fall Of Free Dial Up Internet

In the early days of the Internet, having a high-speed IP connection in your home or even a small business was, if not impossible, certainly a rarity. Connecting to a computer in those days required you to use your phone. Early modems used acoustic couplers, but by the time most people started trying to connect, modems that plugged into your phone jack were the norm.

The problem was: whose computer did you call? There were commercial dial-up services like DIALOG that offered database searches via modem, but using them wasn’t cheap. You paid for the phone line, you might pay per-minute charges for the call itself, especially if the computer was in another city, and then you had to pay the service provider, which could be very expensive.

Even before the consumer Internet, this wasn’t workable. Tymnet and Telenet were two services that had the answer. They maintained banks of modems practically everywhere. You dialed a local number, which was probably a “free” call included in your monthly bill, and then used a simple command to connect to a remote computer of your choice. There were other competitors, including CompuServe, which would become a major force in the fledgling consumer market.

While some local internet service providers (ISPs) had their own modem banks, the national ISPs that rose later were riding on one of several nationwide modem systems and paying by the minute for the privilege. Eventually, some ISPs reached the scale that made dedicated modem banks worthwhile. This made it easier to offer flat-rate pricing, and the presumed unlikelihood of everyone dialing in at once made it possible to oversubscribe any given number of modems.
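Sizing an oversubscribed modem pool was a classic teletraffic problem; the Erlang B formula gives the probability a caller hits a busy signal, given the pool size and the offered load. A small sketch with hypothetical subscriber numbers:

```python
def erlang_b(servers, offered_erlangs):
    """Blocking probability for a pool of lines (Erlang B, iterative form)."""
    b = 1.0
    for n in range(1, servers + 1):
        b = (offered_erlangs * b) / (n + offered_erlangs * b)
    return b

# Hypothetical ISP: 1000 subscribers, each online 5% of the time on
# average, which works out to 50 erlangs of offered traffic.
offered = 1000 * 0.05
for modems in (50, 60, 70):
    print(f"{modems} modems: {erlang_b(modems, offered) * 100:.1f}% busy-signal rate")
```

The point of the curve is how fast blocking falls once the pool size exceeds the offered load, which is what let an ISP serve a thousand subscribers with far fewer than a thousand modems.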

The Cost

Once consumer services like CompuServe, The Source, and AOL started operations, the cost came down, but it still wasn’t cheap. Some early services charged higher rates during business hours, for example. There was also the cost of a phone line, and if you didn’t want to tie up your home phone, you needed a second line dedicated to the modem. It all added up.

By the late 1990s, a dial-up provider might cost you $25 a month or less, not counting your phone line. That’s about $60 in today’s money, just for reference. But the Internet was also booming as a place to sell advertising.

Continue reading “The Rise And Fall Of Free Dial Up Internet”