Supersonic Flight May Finally Return To US Skies

After World War II, as early supersonic military aircraft were pushing the boundaries of flight, it seemed like a foregone conclusion that commercial aircraft would eventually fly faster than sound as the technology became better understood and more affordable. Indeed, by the 1960s the United States, Britain, France, and the Soviet Union all had commercial transport aircraft capable of flight beyond Mach 1 in various stages of development.

Concorde on its final flight

Yet today, the few examples of supersonic transport (SST) planes that actually ended up being built are in museums, and flight above Mach 1 is essentially the sole domain of the military. There’s an argument to be made that it’s one of the few areas of technological advancement where the state-of-the-art not only stopped moving forward, but actually slid backwards.

But that might finally be changing, at least in the United States. Both NASA and the private sector have been working towards a new generation of supersonic aircraft that address the key issues that plagued their predecessors, and a recent push by the White House aims to undo the regulatory roadblocks that have been on the books for more than fifty years.

Continue reading “Supersonic Flight May Finally Return To US Skies”

The Death Of Industrial Design And The Era Of Dull Electronics

It’s often said that what’s inside matters more than looks, but it’s hard to deny that a product’s appearance and physical user experience are what make it instantly recognizable. When you think of something like a Walkman, an iPod music player, a desktop computer, a car, or a TV, the first thing that comes to mind is the way that it looks, along with its user interface. This is the domain of industrial design, where circuit boards, mechanisms, displays, and buttons are put into a shell that ultimately defines what users see and experience.

Thus industrial design is perhaps the most important aspect of product development as far as the user is concerned, right alongside the feature list. It’s also no secret that marketing departments love to lean into the styling and ergonomics of a product. In light of this, it is very disconcerting that over the past several years industrial design for consumer electronics in particular seems to have wilted, to the point where it is now practically on the verge of death.

Devices like cellphones and TVs are now mostly flat plastic-and-glass rectangles with no distinguishing features. Laptops and PCs are distinguished mainly by being flat, being small, having RGB lighting, or some combination of the three. At the same time, buttons and other physical user interface elements are vanishing along with prominent styling, leaving us in a world of basic geometric shapes and flat, evenly colored surfaces. Exactly how did we get to this point, and what does this mean for our own hardware projects?

Continue reading “The Death Of Industrial Design And The Era Of Dull Electronics”

Power Grid Stability: From Generators To Reactive Power

It hasn’t been that long since humans figured out how to create power grids that integrate multiple generators and consumers. Ever since AC won the battle of the currents, grid operators have had to deal with the issues that come with using AC instead of the far less complex DC. Instead of simply targeting a constant voltage, generators have to synchronize with the frequency of the alternating current as it swings between positive and negative many times per second.

Complicating matters further, the transmission lines between generators and consumers, along with any transmission equipment on those lines, add their own inductive, capacitive, and resistive properties to the system before the effects of consumers are even tallied up. The result is a phase shift between voltage and current that has to be managed by controlling reactive power, lest frequency oscillations and voltage swings lead to a complete grid blackout.
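As a rough illustration of what that means, the standard sinusoidal steady-state relations split apparent power into a real part, which does useful work, and a reactive part, which just sloshes back and forth between source and load, according to the phase angle between voltage and current. The short Python sketch below uses made-up example values (a 230 V feeder, 10 A, current lagging by 30 degrees) purely to show the arithmetic; none of these numbers come from the article.

```python
import math

# Illustrative numbers only, not from the article: a 230 V RMS feeder
# delivering 10 A RMS with the current lagging the voltage by 30 degrees.
v_rms = 230.0            # volts (RMS)
i_rms = 10.0             # amperes (RMS)
phi = math.radians(30)   # phase angle between voltage and current

s = v_rms * i_rms        # apparent power in volt-amperes (VA)
p = s * math.cos(phi)    # real power actually doing work, in watts (W)
q = s * math.sin(phi)    # reactive power, in volt-amperes reactive (VAR)

print(f"Apparent power: {s:.0f} VA")
print(f"Real power:     {p:.0f} W (power factor {math.cos(phi):.2f})")
print(f"Reactive power: {q:.0f} VAR")
```

The larger the phase angle, the more reactive power circulates for the same useful power delivered, which is why operators work to pull it back toward zero, for example with capacitor banks or by adjusting generator excitation.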

Continue reading “Power Grid Stability: From Generators To Reactive Power”

Why Apple Dumped 2,700 Computers In A Landfill In 1989

In 1983, the Lisa was supposed to be a barnburner. Apple’s brand-new computer had a cutting-edge GUI, a mouse, and power far beyond the 8-bit machines that came before. It looked like nothing else on the market, and had a price tag to match—retailing at $9,995, or the equivalent of over $30,000 today.

It held so much promise. And yet, come 1989, Apple was burying almost 3,000 examples in a landfill. What went wrong?

Continue reading “Why Apple Dumped 2,700 Computers In A Landfill In 1989”

A Field Guide To The North American Cold Chain

So far in the “Field Guide” series, we’ve mainly looked at critical infrastructure systems that, while often blending into the scenery, are easily observable once you know where to look. From the substations, transmission lines, and local distribution systems that make up the electrical grid to cell towers and even weigh stations, most of the systems we’ve covered so far are mega-scale engineering projects that are critical to modern life, each of which you can get a good look at while you’re tooling down the road in a car.

This time around, though, we’re going to switch things up a bit and discuss a less-obvious but vitally important infrastructure system: the cold chain. While you might never have heard the term, you’ve certainly seen most of the major components at one time or another, and if you’ve ever enjoyed fresh fruit in the dead of winter or microwaved a frozen burrito for dinner, you’ve taken advantage of a globe-spanning system that makes sure environmentally sensitive products can be safely stored and transported.

Continue reading “A Field Guide To The North American Cold Chain”

The DEW Line Remembered

The DEW line was one of three radar early warning systems of the time.

If you grew up in the middle of the Cold War, you probably remember hearing about the Distant Early Warning line between duck-and-cover drills. The United States and Canada built the DEW line radar stations throughout the Arctic to detect potential attacks from the other side of the globe.

MIT’s Lincoln Lab proposed the DEW Line in 1952, and the plan was ambitious. In order to spot bombers crossing over the Arctic Circle in time, it required radar twice as powerful as the best radar of the day. It also needed communications systems that were 99 percent reliable, even in the face of terrestrial and solar weather.

In the end, there were 33 stations built from Alaska to Greenland in an astonishing 32 months. Keep in mind that these stations were located in a very inhospitable environment, where temperatures reached down to -60 °F (-51 °C). Operators kept the stations running 24/7 for 36 years, from 1957 to 1993.

System of Systems

The DEW line wasn’t the only radar early-warning system that the US and Canada had in place, only the most ambitious. The Pinetree Line was first activated in 1951. However, its simple radar was prone to jamming and couldn’t pick up things close to the ground. It was also too close to main cities along the border to offer them much protection. Even so, the 33 major stations, along with six smaller stations, did better than expected.

Continue reading “The DEW Line Remembered”

The Fight To Save Lunar Trailblazer

After the fire and fury of liftoff, when a spacecraft is sailing silently through space, you could be forgiven for thinking the hard part of the mission is over. After all, riding what’s essentially a domesticated explosion up and out of Earth’s gravity well very nearly pushes physics and current material science to the breaking point.

But in reality, getting into space is just the first on a long list of nearly impossible things that need to go right for a successful mission. While scientific experiments performed aboard the International Space Station and other crewed vehicles have the benefit of human supervision, the vast majority of satellites, probes, and rovers must be able to operate in total isolation. With nobody nearby to flick the power switch off and on again, such craft need to be designed with multiple layers of redundant systems and safe modes if they’re to have any hope of surviving even the most mundane system failure.
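To make the idea of a safe mode a little more concrete, the toy sketch below shows the basic pattern: run periodic health checks, and if any of them fail, drop into a minimal fallback state and wait for instructions from the ground. This is only a loose illustration of the concept; the names and checks are hypothetical and have nothing to do with Lunar Trailblazer’s actual flight software.

```python
# Toy sketch of the safe-mode pattern -- hypothetical names and thresholds,
# not real flight software.

def health_checks_pass(telemetry: dict) -> bool:
    """Simplified go/no-go check on a couple of made-up telemetry channels."""
    return (telemetry.get("battery_volts", 0.0) > 24.0
            and telemetry.get("attitude_valid", False))

def run_cycle(telemetry: dict) -> str:
    if health_checks_pass(telemetry):
        return "NOMINAL"    # keep executing the stored command timeline
    # Fallback: shed non-essential loads, point the panels at the Sun,
    # and listen for commands from the ground.
    return "SAFE_MODE"

print(run_cycle({"battery_volts": 28.1, "attitude_valid": True}))   # NOMINAL
print(run_cycle({"battery_volts": 22.4, "attitude_valid": True}))   # SAFE_MODE
```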

That said, nobody can predict the future. Despite the best efforts of everyone involved, there will always be edge cases or abnormal scenarios that don’t get accounted for. With proper planning and a pinch of luck, most spacecraft are able to skirt these scenarios and complete their missions without serious incident.

Unfortunately, Lunar Trailblazer isn’t one of those missions. Things started well enough — the February 26th launch of the SpaceX Falcon 9 went perfectly, and the rocket’s second stage gave the vehicle the push it needed to reach the Moon. The small 210 kg (460 lb) lunar probe then separated from the booster and transmitted an initial status message, received by Caltech mission controllers in Pasadena, California, indicating that it was free-flying and powering up its systems.

But since then, nothing has gone to plan.

Continue reading “The Fight To Save Lunar Trailblazer”