A History Of Pong

Today, creating a ground-breaking video game is akin to making a movie. You need a story, graphic artists, music, and more. But until the middle of the 20th century, there were no video games. While several games can claim to be the “first” electronic or video game, one is cemented in our collective memory as the first one most of us heard of: Pong.

The truth is, Pong wasn’t the first video game. We suspect that many people might have had the idea, but Ralph Baer is most associated with inventing a practical video game. As a young engineer in 1951, he tried to convince his company to invest in games that you could play on your TV set. They didn’t like the idea, but Ralph would remember the concept and act on it over a decade later.

But was it really the first time anyone had thought of it? Perhaps not. Thomas Goldsmith Jr. and Estle Ray Mann filed a patent in 1947 for a game that simulated launching missiles at targets with an oscilloscope display. The box took eight tubes and, being an oscilloscope, was a vector graphic device. The targets were physical dots on a screen overlay. These “amusement devices” were very expensive, and they only produced handmade prototypes.

Continue reading “A History Of Pong”

Supersonic Flight May Finally Return To US Skies

After World War II, as early supersonic military aircraft were pushing the boundaries of flight, it seemed like a foregone conclusion that commercial aircraft would eventually fly faster than sound as the technology became better understood and more affordable. Indeed, by the 1960s the United States, Britain, France, and the Soviet Union all had commercial transport aircraft capable of flight beyond Mach 1 in various stages of development.

Concorde on its final flight

Yet today, the few examples of supersonic transport (SST) planes that actually ended up being built are in museums, and flight above Mach 1 is essentially the sole domain of the military. There’s an argument to be made that it’s one of the few areas of technological advancement where the state of the art not only stopped moving forward, but actually slid backwards.

But that might finally be changing, at least in the United States. Both NASA and the private sector have been working towards a new generation of supersonic aircraft that address the key issues that plagued their predecessors, and a recent push by the White House aims to undo the regulatory roadblocks that have been on the books for more than fifty years.

Continue reading “Supersonic Flight May Finally Return To US Skies”

The Death Of Industrial Design And The Era Of Dull Electronics

It’s often said that what’s inside matters more than one’s looks, but it’s hard to deny that a product’s looks and its physical user experience are what make it instantly recognizable. When you think of something like a Walkman, an iPod music player, a desktop computer, a car, or a TV, the first thing that comes to mind is the way it looks, along with its user interface. This is the domain of industrial design, where circuit boards, mechanisms, displays, and buttons are put into a shell that ultimately defines what users see and experience.

Thus industrial design is perhaps the most important aspect of product development as far as the user is concerned, right alongside the feature list. It’s also no secret that marketing departments love to lean into the styling and ergonomics of a product. In light of this, it is very disconcerting that over the past several years, industrial design for consumer electronics in particular seems to have wilted and is now practically on the verge of death.

Devices like cellphones and TVs are now mostly flat plastic-and-glass rectangles with no distinguishing features. Laptops and PCs are distinguished mainly by being flat, being small, having RGB lighting, or some combination of these. At the same time, buttons and other physical user interface elements are vanishing along with prominent styling, leaving us in a world of basic geometric shapes and flat, evenly colored surfaces. Exactly how did we get to this point, and what does this mean for our own hardware projects?

Continue reading “The Death Of Industrial Design And The Era Of Dull Electronics”

Power Grid Stability: From Generators To Reactive Power

It hasn’t been that long since humans figured out how to create power grids that integrated multiple generators and consumers. Ever since AC won the battle of the currents, grid operators have had to deal with the issues that come with using AC instead of the far less complex DC. Instead of simply targeting a constant voltage, generators have to synchronize with the frequency of the alternating current as it cycles between positive and negative many times per second.

Complicating matters further, the transmission lines between generators and consumers, along with any kind of transmission equipment on the lines, add their own inductive, capacitive, and resistive properties to the system before the effects of consumers are even tallied up. The result is phase shifts between voltage and current that have to be managed by controlling the reactive power, lest frequency oscillations and voltage swings lead to a complete grid blackout.
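
To make the quantities concrete: for a sinusoidal system, the real, reactive, and apparent power follow directly from the RMS voltage, RMS current, and the phase angle between them. The Python sketch below is purely illustrative (the 230 V and 10 A figures are arbitrary example values, not anything from the article):

```python
import math

def ac_power(v_rms: float, i_rms: float, phase_deg: float):
    """Real, reactive, and apparent power for a sinusoidal AC load.

    phase_deg is the angle by which the current lags the voltage:
    positive for an inductive load, negative for a capacitive one.
    """
    phi = math.radians(phase_deg)
    s = v_rms * i_rms      # apparent power, volt-amperes (VA)
    p = s * math.cos(phi)  # real power, watts (W)
    q = s * math.sin(phi)  # reactive power, volt-amperes reactive (var)
    return p, q, s

# Arbitrary example: a 230 V feeder carrying 10 A, current lagging by 30 degrees
p, q, s = ac_power(230, 10, 30)
print(f"P = {p:.0f} W, Q = {q:.0f} var, S = {s:.0f} VA, power factor = {p / s:.2f}")
```

The larger the phase angle, the more of the apparent power merely sloshes back and forth in the lines as reactive power instead of doing useful work, which is why operators have to compensate for it.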

Continue reading “Power Grid Stability: From Generators To Reactive Power”

Why Apple Dumped 2,700 Computers In A Landfill In 1989

In 1983, the Lisa was supposed to be a barnburner. Apple’s brand-new computer had a cutting-edge GUI, a mouse, and power far beyond the 8-bit machines that came before. It looked like nothing else on the market, and had a price tag to match—retailing at $9,995, or the equivalent of over $30,000 today.

It held so much promise. And yet, come 1989, Apple was burying almost 3,000 examples in a landfill. What went wrong?

Continue reading “Why Apple Dumped 2,700 Computers In A Landfill In 1989”

Crunching The News For Fun And Little Profit

Do you ever look at the news, and wonder about the process behind the news cycle? I did, and for the last couple of decades it’s been the subject of one of my projects. The Raspberry Pi on my shelf runs my word trend analysis tool for news content, and since my journey from curious geek to having my own large corpus analysis system has taken twenty years, it’s worth a second look.

How Career Turmoil Led To A Two-Decade Project

A hanging sign surrounded by ornate metalwork, with the legend "Cyder house".
This is very much a minority spelling. Colin Smith, CC BY-SA 2.0.

In the middle of the 2000s I had come out of the dotcom crash mostly intact, and was working for a small web shop. When they went bust I was casting around as one does, and spent a while as a Google quality rater while I looked for a new permie job. Quality raters are employed by the search giant through temporary employment agencies, and in loose terms their job is to be the trained monkeys against whom the algorithm is tested. The algorithm chose X, and if the humans also chose X, the algorithm is probably getting it right. Being a quality rater is not in any way a high-profile job, but with the big shiny G on my CV I soon found myself in demand from web companies seeking some white-hat search engine marketing expertise. What I learned mirrored my lesson from a decade earlier in the CD-ROM business: on the web, as in any other electronic publishing medium, good content well presented takes priority over any black-hat tricks.

But what makes good content? Forget an obsession with stuffing bogus keywords into the text, and instead talk about the right things, and do it authoritatively. What are the right things in this context? If you are covering a subject, you need to do so using the right language: that which the majority uses, rather than language only you use. I can think of a bunch of examples which I probably shouldn’t talk about, but an example close to home for me comes in cider. In the UK, cider is a fermented alcoholic drink made from apples, and as a craft cidermaker of many years’ standing I have a good grasp of its vocabulary. The accepted spelling is “Cider”, but there’s an alternative spelling of “Cyder” used by some commercial producers of the drink. It doesn’t take long to realise that online, hardly anyone uses cyder with a Y, and thus pages concentrating on that word will do less well than those talking about cider.
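
To make that concrete, here is a minimal sketch of how one might measure the relative share of two competing spellings in a corpus. It is my own illustration rather than the author’s tool, and the tokeniser and example sentence are assumptions:

```python
import re
from collections import Counter

def spelling_share(text, variants=("cider", "cyder")):
    """Relative frequency of competing spellings within a corpus."""
    counts = Counter(re.findall(r"[a-z']+", text.lower()))
    total = sum(counts[v] for v in variants) or 1  # avoid division by zero
    return {v: counts[v] / total for v in variants}

corpus = "Cider presses turn apples into cider, though one old cyder house remains."
print(spelling_share(corpus))  # cider dominates: roughly 0.67 to 0.33
```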

A graph of the word football versus the word soccer in British news.
We Brits rarely use the word “soccer” unless there’s a story about the Club World Cup in America.

I started to build software to analyse language around a given topic, with the aim of discerning the metaphorical cider from the cyder. It was a great surprise a few years later to discover that I had invented for myself the already-existing field of computational linguistics, something that would have saved me a lot of time had I known about it when I began. I was taking a corpus of text and computing the frequencies and collocates (words that appear alongside each other) of the words within it, and from that I could quickly see which wording mattered around a subject, and which didn’t. This led seamlessly to an interest in what the same process would look like for news data with a time axis added, so I created a version which harvested its corpus from RSS feeds. Thus began my decades-long project.
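
As a rough illustration of that frequency-and-collocate counting, the sketch below tallies the words appearing within a few tokens of a chosen node word. It is a toy version under my own assumptions about tokenisation and window size, not the author’s actual code, and the sample headlines are invented:

```python
import re
from collections import Counter

def collocates(text, node, window=4):
    """Count the words appearing within `window` tokens of `node`."""
    tokens = re.findall(r"[a-z']+", text.lower())
    hits = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            lo, hi = max(0, i - window), i + window + 1
            hits.update(t for t in tokens[lo:hi] if t != node)
    return hits

# Invented headlines standing in for an RSS-harvested corpus
headlines = [
    "apple orchards expand as cider sales climb",
    "craft cider makers press record apple harvest",
]
print(collocates(" ".join(headlines), "cider").most_common(5))
```

Harvest real headlines instead (a library such as feedparser can pull entry titles from RSS feeds) and keep one counter per day, and you have the makings of the time axis described above.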

Continue reading “Crunching The News For Fun And Little Profit”

The End Of The Hackintosh Is Upon Us

From the very dawn of the personal computing era, the PC and Apple platforms have gone very different ways. IBM compatibles surged in popularity, while Apple was able to more closely guard the Macintosh from imitators wanting to duplicate its hardware and run its software.

Things changed when Apple announced it would hop aboard the x86 bandwagon in 2005. Soon enough was born the Hackintosh. It was difficult, yet possible, to run MacOS on your own computer built with the PC parts your heart desired.

Only, the Hackintosh era is now coming to an end. With the transition to Apple Silicon all but complete, MacOS will abandon the Intel world once more.

Continue reading “The End Of The Hackintosh Is Upon Us”