It’s often said that what’s inside matters more than one’s looks, but it’s hard to deny that a product’s looks and its physical user experience are what make it instantly recognizable. When you think of something like a Walkman, an iPod music player, a desktop computer, a car or a TV, the first thing that comes to mind is the way that it looks, along with its user interface. This is the domain of industrial design, where circuit boards, mechanisms, displays and buttons are put into a shell that ultimately defines what users see and experience.
Thus industrial design is perhaps the most important aspect of product development as far as the user is concerned, right along with the feature list. It’s also no secret that marketing departments love to lean into the styling and ergonomics of a product. In light of this it is very disconcerting that over the past few years, industrial design for consumer electronics in particular seems to have wilted, to the point where it is now practically on the verge of death.
Devices like cellphones and TVs are now mostly flat plastic-and-glass rectangles with no distinguishing features. Laptops and PCs are identified by being flat, by being small, by having RGB lighting, or by some combination of these. At the same time, buttons and other physical user interface elements are vanishing along with prominent styling, leaving us in a world of basic geometric shapes and flat, evenly colored surfaces. Exactly how did we get to this point, and what does this mean for our own hardware projects?
It hasn’t been that long since humans figured out how to create power grids that integrate multiple generators and consumers. Ever since AC won the battle of the currents, grid operators have had to deal with the issues that come with using AC instead of the far less complex DC. Instead of simply targeting a constant voltage, generators have to synchronize with the frequency of the alternating current as it swings between positive and negative many times per second.
Complicating matters further, the transmission lines between generators and consumers, along with any kind of transmission equipment on the lines, add their own inductive, capacitive, and resistive properties to the system before the effects of consumers are even tallied up. The result is a phase shift between voltage and current that has to be managed by controlling the reactive power, lest frequency oscillations and voltage swings result in a complete grid blackout.
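To put the standard textbook numbers on that: for sinusoidal voltage and current separated by a phase angle φ, the real power P (which does useful work at the load), the reactive power Q (which merely sloshes back and forth in the line reactances), and the apparent power S are related by

$$P = V_{\mathrm{rms}} I_{\mathrm{rms}} \cos\varphi, \qquad Q = V_{\mathrm{rms}} I_{\mathrm{rms}} \sin\varphi, \qquad S = V_{\mathrm{rms}} I_{\mathrm{rms}} = \sqrt{P^2 + Q^2}.$$

Keeping φ small, and with it Q, is what grid operators mean when they talk about managing reactive power.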
In 1983, the Lisa was supposed to be a barnburner. Apple’s brand-new computer had a cutting-edge GUI, a mouse, and power far beyond the 8-bit machines that came before. It looked like nothing else on the market, and had a price tag to match: it retailed at $9,995, or the equivalent of over $30,000 today.
It held so much promise. And yet, come 1989, Apple was burying almost 3,000 examples in a landfill. What went wrong?
Do you ever look at the news and wonder about the process behind the news cycle? I did, and for the last couple of decades it’s been the subject of one of my projects. The Raspberry Pi on my shelf runs my word trend analysis tool for news content, and since my journey from curious geek to having my own large corpus analysis system has taken twenty years, it’s worth a second look.
How Career Turmoil Led To A Two-Decade Project
[Image caption: This is very much a minority spelling. Colin Smith, CC BY-SA 2.0.]
In the middle of the 2000s I had come out of the dotcom crash mostly intact, and was working for a small web shop. When they went bust I was casting around as one does, and spent a while as a Google quality rater while I looked for a new permie job. Quality raters are employed by the search giant through temporary employment agencies, and in loose terms their job is to be the trained monkeys against whom the algorithm is tested: the algorithm chose X, and if the humans also chose X, the algorithm is probably getting it right. Being a quality rater is not in any way a high-profile job, but with the big shiny G on my CV I soon found myself in demand from web companies seeking some white-hat search engine marketing expertise. What I learned mirrored my lesson from a decade earlier in the CD-ROM business: that on the web, as in any other electronic publishing medium, good content well presented has priority over any black-hat tricks.
But what makes good content? Forget an obsession with stuffing bogus keywords into the text, and instead talk about the right things, and do it authoritatively. What are the right things in this context? If you are covering a subject, you need to do so using the right language; that which the majority uses, rather than language only you use. I can think of a bunch of examples which I probably shouldn’t talk about, but one close to home for me comes in cider. In the UK, cider is a fermented alcoholic drink made from apples, and as a craft cidermaker of many years’ standing I have a good grasp of its vocabulary. The accepted spelling is “Cider”, but there’s an alternative spelling of “Cyder” used by some commercial producers of the drink. It doesn’t take long to realise that online, hardly anyone uses cyder with a Y, and thus pages concentrating on that word will do less well than those talking about cider.
[Image caption: We Brits rarely use the word “soccer” unless there’s a story about the Club World Cup in America.]
I started to build software to analyse language around a given topic, with the aim of discerning the metaphorical cider from the cyder. It was a great surprise a few years later to discover that I had invented for myself the already-existing field of computational linguistics, something that would have saved me a lot of time had I known about it when I began. I was taking a corpus of text and computing the frequencies and collocates (words that appear alongside each other) of the words within it, and from that I could quickly see which wording mattered around a subject, and which didn’t. This led seamlessly to an interest in what the same process would look like for news data with a time axis added, so I created a version which harvested its corpus from RSS feeds. Thus began my decades-long project.
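In modern Python, the core of that first tool fits in a screenful. This is a minimal sketch rather than my actual code, with a toy two-document corpus standing in for the real one:

```python
from collections import Counter
import re

def tokenize(text):
    # Lowercase, and keep runs of letters/apostrophes as words
    return re.findall(r"[a-z']+", text.lower())

def frequencies_and_collocates(corpus, window=4):
    """Count word frequencies, and collocate pairs within a
    +/- `window` word span, across an iterable of documents."""
    freqs = Counter()
    collocates = Counter()
    for doc in corpus:
        words = tokenize(doc)
        freqs.update(words)
        for i, w in enumerate(words):
            lo = max(0, i - window)
            hi = min(len(words), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    collocates[(w, words[j])] += 1
    return freqs, collocates

docs = [
    "craft cider is made from fermented apples",
    "this cyder is a commercial cider in all but spelling",
]
freqs, colls = frequencies_and_collocates(docs)
print(freqs["cider"], freqs["cyder"])   # 2 1
print(colls[("cider", "fermented")])    # 1
```

Attach a timestamp to every document as it comes off an RSS feed, bucket the counts by day, and you have the bones of a word trend tool.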
From the very dawn of the personal computing era, the PC and Apple platforms have gone very different ways. IBM compatibles surged in popularity, while Apple was able to more closely guard the Macintosh from imitators wanting to duplicate its hardware and run its software.
Things changed when Apple announced it would hop aboard the x86 bandwagon in 2005. Soon enough was born the Hackintosh. It was difficult, yet possible, to run MacOS on your own computer built with the PC parts your heart desired.
Some time ago, Linus Torvalds made a throwaway comment that sent ripples through the Linux world. Was it perhaps time to abandon support for the now-ancient Intel 486? Developers had already abandoned the 386 in 2012, and Torvalds openly mused whether the time was right to make further cuts for the benefit of modernity.
It would take three long years, but that eventuality finally came to pass. As of version 6.15, the Linux kernel no longer supports chips of the 80486 architecture, along with a gaggle of early “586” chips as well. It’s all down to some housekeeping: the kernel now assumes the presence of CPU features such as the time stamp counter (TSC) and the CMPXCHG8B (CX8) instruction, which those machines of the past simply don’t have.
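Curious whether a given x86 box clears the new bar? Assuming the change does hinge on the TSC and CX8 feature flags (those are the names Linux itself reports), a quick and unofficial check is to look at /proc/cpuinfo:

```python
# Rough check: does this CPU advertise the features the 6.15 kernel
# now assumes? "tsc" and "cx8" are the flag names in /proc/cpuinfo.
REQUIRED = ("tsc", "cx8")

def kernel_615_ok(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = line.split(":", 1)[1].split()
                return all(feature in flags for feature in REQUIRED)
    return False  # no flags line found: very old or non-x86 CPU

if __name__ == "__main__":
    print("Fine for 6.15+" if kernel_615_ok() else "Stuck on older kernels")
```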
There are all manner of musical myths, covering tones and melodies that have effects ranging from the profound to the supernatural. The Pied Piper, for example, or the infamous “brown note.”
But what about a song that could crash your laptop just by playing it? Even better, a song that could also crash other laptops in the vicinity? It’s not magic, and it’s not a trick; it was just a punchy pop song that Janet Jackson released back in 1989.