Retail Fail: The :CueCat Disaster

Digital Convergence Corporation is hardly a household name, and there’s a good reason for that. However, it raised about $185 million in investments around the year 2000 from companies such as Coca-Cola, Radio Shack, GE, E. W. Scripps, and the media giant Belo Corporation. So what did all these companies want, and why didn’t it catch on? If you are old enough, you might remember the :CueCat, but you probably thought it was Radio Shack’s disaster. In reality, Radio Shack was simply an investor.

The Big Idea

The :CueCat was a barcode scanner that usually plugged into a PC’s keyboard port (in those days, normally a PS/2 port). A special cable, often called a wedge, acted like a Y-cable, allowing you to use your keyboard and the scanner on the same port. The scanner was shaped like a cat, of course.

However, the :CueCat was not just a generic barcode scanner. It was made to scan only “cues,” special barcodes that were to appear in catalogs, newspapers, and other publications. The idea was that you’d see something in an ad or a catalog, rush to your computer to scan the barcode, and be transported to the retailer’s website to learn more and complete the purchase.

The software could also use your sound card to listen for special audio codes played during radio or TV commercials and then automatically pop up the associated webpage. So: a piece of software that read your keyboard, listened to your room audio at all times, and could inject keystrokes into your computer. What could go wrong?


The Most Secure, Modern Computer Might Be A Mac

The Linux world is currently seeing an explosion in new users, thanks in large part to Microsoft turning its Windows operating system into the most intrusive piece of spyware in modern computing. For those who value privacy and security, Linux has long been the safe haven where there’s reasonable certainty that the operating system itself isn’t harvesting user data or otherwise snooping where it shouldn’t be. Yet even after solving the OS problem, a deeper issue remains: the hardware itself. Since around 2008, virtually every Intel and AMD processor has included coprocessors running closed-source code known as the Intel Management Engine (IME) or AMD Platform Security Processor (PSP).

M1 MacBook Air, now with more freedom

These components operate entirely outside the user’s and operating system’s control. They are given privileged access to memory, storage, and networking and can retain that access even when the CPU is not running, creating systemic vulnerabilities that cannot be fully mitigated by software alone. One practical approach to minimizing exposure to opaque management subsystems like the IME or PSP is to use platforms that do not use x86 hardware in the first place. Perhaps surprisingly, the ARM-based Apple M1 and M2 computers offer a compelling option, providing a more constrained and clearly defined trust model for Linux users who prioritize privacy and security.

Before getting into why Apple Silicon can be appealing for those with this concern, we first need to address the elephant in the room: Apple’s proprietary, closed-source operating system. Luckily, the Asahi Linux project has done most of the heavy lifting for those with certain Apple Silicon machines who want to go more open-source. In fact, Asahi is one of the easiest Linux installs to perform today, even compared to beginner-friendly distributions like Mint or Fedora, provided you are using fully supported M1 or M2 machines rather than attempting an install on newer, less-supported models. The installer runs as a script within macOS, eliminating the need to image a USB stick. Once the script is executed, the user simply follows the prompts, restarts the computer, and boots into the new Linux environment. Privacy-conscious users may also want to take a few optional steps, such as verifying the Asahi checksum and encrypting the installation with LUKS, but these steps are not too challenging for experienced users.
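The optional checksum step generalizes to any downloaded installer script. A minimal sketch, using placeholder file names rather than Asahi’s actual artifact names:

```shell
# Simulate a downloaded installer script and its published checksum.
# In practice you would download both from the project's site.
printf 'echo hello\n' > installer.sh
sha256sum installer.sh > installer.sh.sha256

# Verify before running: sha256sum -c exits non-zero (and prints
# FAILED) if the file does not match its recorded hash.
sha256sum -c installer.sh.sha256
```

Only after the check passes would you actually execute the script.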

From Zip To Nought: The Rise And Fall Of Iomega

If you were anywhere near a computer in the mid-to-late 1990s, you almost certainly encountered a Zip drive. That distinctive purple peripheral, with its satisfying clunk as you slotted in a cartridge, was as much a fixture of the era as beige tower cases and CRT monitors. Iomega, the company behind it, went from an obscure Utah outfit to a multi-billion-dollar darling of Wall Street in the span of about two years. And then, almost as quickly, it all fell apart.

The story of Iomega is one of genuine engineering innovation and the fickle nature of consumer technology. As with so many other juggernauts of its era, Iomega was eventually brought down by a new technology that simply wasn’t practical to counter.

The House That Bernoulli Built

Iomega was founded in Utah in 1980 by Jerome Paul Johnson, David Bailey, and David Norton. The company soon developed a novel approach to removable magnetic storage based on the Bernoulli effect. The Bernoulli Box, which arrived in 1982, was a drive relying on PET film disks spun at 1,500 RPM inside a rigid, removable cartridge. The airflow generated by the spinning disk pulled the media down toward the read/write head thanks to the eponymous Bernoulli effect. While spinning, the disk would float a mere micron above the head surface on a cushion of air. If the power cut out or the drive otherwise failed, the disk simply floated away from the head rather than crashing into it, a boon over contemporary hard drives for which head crashes were a real risk. The Bernoulli Box made them essentially impossible.
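The physics at work is the textbook streamline relation: where air moves faster, its static pressure is lower.

```latex
% Bernoulli's relation for incompressible flow along a streamline
p + \tfrac{1}{2}\rho v^{2} = \text{const}
```

In the thin gap between the spinning disk and the head, the air moves fast, so its pressure drops, and ambient pressure on the far side pushes the flexible disk toward the head. When the disk stops spinning, the flow and the pressure difference vanish, and the disk relaxes safely away from the head.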

The Zero-Power Flight Computer

In the early days of aviation, pilots or their navigators used a plethora of tools to solve common navigation and piloting problems. There was definitely a need for some kind of computing aid that could replace slide rules, tables, and tedious dead-reckoning computations. This would become even more important during World War II, when there was a massive push to quickly train young men to be pilots.

The same, but different. A Pickett slide rule (top) and an E6B slide rule (bottom). (Own Work).

Today, we’d whip up some sort of computing device, but in the 1930s, computers weren’t anything you could cram into a plane, even if any had existed. For example, the Mark 1 Fire Control Computer of World War II was 3,000 pounds of gears and motors.

The flight computer was made to answer questions like “How many pounds of fuel do I need for another hour of flying time?” or “How do I adjust my course for a particular crosswind?”
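The underlying math is simple enough to sketch in a few lines. These are the standard flight-planning relations (a burn-rate multiplication and the wind-triangle solution), with invented numbers for illustration; the slide-rule computer solves the same problems graphically.

```python
import math

def fuel_needed(burn_rate_pph: float, hours: float) -> float:
    """Pounds of fuel for a given flying time at a known burn rate."""
    return burn_rate_pph * hours

def wind_correction_angle(tas: float, wind_speed: float,
                          wind_angle_deg: float) -> float:
    """Wind-triangle solution: degrees of crab into the wind.

    wind_angle_deg is the angle between the desired course and the
    direction the wind is blowing from; tas is true airspeed, with
    both speeds in the same units.
    """
    ratio = wind_speed * math.sin(math.radians(wind_angle_deg)) / tas
    return math.degrees(math.asin(ratio))

print(fuel_needed(90.0, 1.0))                        # 90.0 lb for one more hour
print(round(wind_correction_angle(100, 20, 90), 1))  # 11.5 deg crab
```

A direct 20-knot crosswind at 100 knots true airspeed works out to roughly an 11.5-degree correction, which is the kind of answer a pilot would read straight off the wind face of the device.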

History

There was a rash of flight computers starting in the 1920s that were essentially specialized slide rules. The most popular, Philip Dalton’s circular slide rule, appeared in the late 1930s; it was cheap to produce and easy to use. As you’ll see, it is more than just an ordinary slide rule. Keep in mind, these were not computers in the sense we think of today. They were simple slide rules that easily did specialized math useful to pilots.

Dalton actually developed a number of computers. The popular Model B appeared in 1933, and there were refinements leading to additional models. The Mark VII was especially popular; even Fred Noonan, Amelia Earhart’s navigator, used one.

Artemis II Agenda Keeps Moon-Bound Crew Busy

With the launch of Artemis II from Cape Canaveral potentially just weeks away, NASA has been releasing a steady stream of information about the mission through its official site and social media channels to get the public excited about the agency’s long-awaited return to the Moon. While the slickly produced videos and artist renderings might get the most attention, even the most mundane details about a flight that will put humans on the far side of our nearest celestial neighbor for the first time since 1972 can be fascinating.

The Artemis II Moon Mission Daily Agenda is a perfect example. Released earlier this week via the NASA blog, the document seems to have been all but ignored by the mainstream media. But the day-by-day breakdown of the Artemis II mission contains several interesting entries about what the four crew members will be working on during the ten-day flight.

Of course, the exact details of the agenda are subject to change once the mission is underway. Some tasks could run longer than anticipated, experiments may not go as planned, and there’s no way to predict technical issues that may arise.

Conversely, the crew could end up breezing through some of the planned activities, freeing up time in the schedule. There’s simply no way of telling until it’s actually happening.

With the understanding that it’s all somewhat tentative, a look through the plan as it stands right now can give us an idea of the sort of highlights we can expect as we follow this historic mission down here on Earth.


The Rise And Fall Of Free Dial Up Internet

In the early days of the Internet, having a high-speed IP connection in your home or even a small business was, if not impossible, certainly a rarity. Connecting to a computer in those days required you to use your phone. Early modems used acoustic couplers, but by the time most people started trying to connect, modems that plugged into your phone jack were the norm.

The problem was: whose computer did you call? There were commercial dial-up services like DIALOG that offered premium features, such as database searches via modem. The costs added up quickly. You paid a fee for the phone line, you might pay a per-minute charge for the call (especially if the computer was in another city), and then you had to pay the service provider, which could be very expensive.

Even before the consumer Internet, this wasn’t workable. Tymnet and Telenet were two services that had the answer. They maintained banks of modems practically everywhere. You dialed a local number, which was probably a “free” call included in your monthly bill, and then used a simple command to connect to a remote computer of your choice. There were other competitors, including CompuServe, which would become a major force in the fledgling consumer market.

While some local internet service providers (ISPs) had their own modem banks, the national ISPs that rose later were riding on one of several nationwide modem systems and paying by the minute for the privilege. Eventually, some ISPs reached the scale that made dedicated modem banks worthwhile. This made it easier to offer flat-rate pricing, and the presumed unlikelihood of everyone dialing in at once made it possible to oversubscribe any given number of modems.
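The math behind that bet is classic teletraffic engineering. A rough sketch using the Erlang B formula, which gives the probability that a caller finds every modem busy for a given offered load; the subscriber numbers below are invented for illustration.

```python
def erlang_b(traffic_erlangs: float, modems: int) -> float:
    """Blocking probability via the numerically stable recurrence:
    B(E, 0) = 1;  B(E, m) = E*B(E, m-1) / (m + E*B(E, m-1))."""
    b = 1.0
    for m in range(1, modems + 1):
        b = traffic_erlangs * b / (m + traffic_erlangs * b)
    return b

# Say 1,000 subscribers each average 3 minutes online during the busy
# hour: 1000 * (3/60) = 50 erlangs of offered traffic. An ISP could
# serve them with far fewer than 1,000 modems.
print(round(erlang_b(50, 60), 3))  # blocking probability with 60 modems
```

With only 60 modems, the fraction of blocked calls stays small, which is why oversubscription made flat-rate pricing viable; of course, once everyone really did dial in at once, busy signals followed.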

The Cost

Once consumer services like CompuServe, The Source, and AOL started operations, the cost was less, but still not inexpensive. Some early services charged higher rates during business hours, for example. There was also the cost of a phone line, and if you didn’t want to tie up your home phone, you needed a second line dedicated to the modem. It all added up.

By the late 1990s, a dial-up provider might cost you $25 a month or less, not counting your phone line. That’s about $60 in today’s money, just for reference. But the Internet was also booming as a place to sell advertising.


Preparing To Fire Up A 90-Year-Old Boiler After Half A Century

Continuing the restoration of the #1 Lancashire boiler at the Claymills Pumping Station in the UK, the volunteers are putting on the final touches after previously passing the boiler inspection. Although it may seem that things are basically ready for laying down a fire now that the boiler has proven to hold 120 PSI with all safeties fully operating, the volunteers first had to reassemble the surrounding brickwork, free up a seized damper shaft, and give a lot of TLC to mechanisms that were brand new in the 1930s and last operated in 1971.

Removing the ashes from a Lancashire boiler. (Credit: Claymills pumping station, YouTube)

The damper shaft is part of the damper mechanism which controls doors that affect the burn rate, acting as a kind of throttle for the boilers. Unfortunately the shaft’s bearings had seized up completely, and no amount of heat and kinetic maintenance could loosen it up again. This forced them to pull it out and manufacture a replacement, but did provide a good look at how it’s put together. The original dial indicator was salvaged, along with some other bits that were still good.

Next came fitting the cast-iron ash boxes that sit below the boiler, from which ash can be scraped out and deposited into wheelbarrows. The automatic sprinkler stokers are fitted above these, giving a good look at their mechanism. The operator has a lot of control over how much coal is fed into the boiler, as part of this early 20th-century automation.

The missing furnace doors on the #1 boiler were replaced with replicas based on the ones from the other boilers, and some piping around the boiler was refurbished. Even after all that work, it’ll still take a few weeks and a lot more work to fully reassemble the boiler, showing just how complex these systems are. With some luck it’ll fire right back up after fifty years of slumbering and decades of suffering the elements.
