The End Of The Electromechanical Era

When viewed from the far future, the early years of the 21st century will probably be seen as the end of a short era in human technological development. At the beginning of the 20th century, almost everything was mechanical. There were certainly some electric devices, but consumer products like gramophone players and “movie” cameras were purely mechanical affairs. You cranked them up, and they ran on springs. Nowadays, almost every bit of consumer gear you buy is entirely electronic. In between, there was a roughly 50-year period that I’m going to call the Electromechanical Era.

Jenny List’s teardown this week of an old Fuji film movie camera from 1972 captures the middle of this era perfectly. There’s a small PCB and an electric motor, but most of the heavy lifting in the controls was actually put on the shoulders of levers, bearings, and ridiculously clever mechanisms. The electrical and mechanical systems were loosely coupled, with the electrical controlled by the mechanical.

I’m willing to argue the specifics, but I’d preliminarily date the peak of the Electromechanical Era somewhere around 1990. Last year, I had to replace all of the rotted rubber drive belts in a Sony Walkman WM-D6C, a professional portable tape player and recorder produced from 1984 to 2002.

It’s not a simple tape recorder — the motors are electronically regulated to keep ridiculously constant speed for such a small device, and mine has Dolby B and C noise reduction circuitry packed inside along with some decent mic preamps. But still, when you press the fast-forward button, it physically shoves rubber-coated drive wheels out of the way, and sliding pieces of metal make it change modes of operation by making and breaking electrical contacts. Its precision lies as much in the mechanical assemblies and motors as in the electronics. It’s truly half electronic and half mechanical.

But that era is long over. The coming of the CD player signaled the end, although we didn’t see it at the time. Sure, there is a motor, but all the buttons are electronic, and the “mechanism” is implemented almost entirely in silicon. The digital camera was possibly the last nail in the Electromechanical Era’s coffin: with no need to handle physical film, the last demand for anything mechanical evaporated. Open up a GoPro if you don’t know what I mean.

While I’ll be happy to never have to replace the drive rubber in a cassette recorder again, it’s with a little sadness that I think on the early iPods with their spinning metal hard drives, and how they gave way to the entirely silicon Zoom H5 recorder that I use now. Its signal-to-noise ratio, quiet preamps, and complete lack of wow and flutter add up to a quality that would have been literally unbelievable when I bought the WM-D6C.

Still, if you find yourself in a thrift store and you’ve never taken one apart before, buy one of these marvels from a bygone era and open it up. A cassette recorder, even a cheap one, hides a wealth of electromechanical design.

Error Codes And The Law Of Least Astonishment

Do you know the law of least astonishment? I am not sure of its origin, but I first learned it from the excellent “Tao of Programming.” Simply put, it is the principle that software should always respond to the users in a way that least astonishes them. In other words, printing a document shouldn’t erase it from your file system.

Following the law of least astonishment, what should a program do when it hits a hard error? You might say that it should let the user know. Unfortunately, many systems just brush it under the rug these days.

I think it started with Windows. Or maybe the Mac. The thinking goes that end users are too stupid or too frightened of error codes and detailed messages, so the messages just get left out. Case in point: My wife’s iPhone wouldn’t upload pictures. I’m no expert since I carry an Android device, but I agreed to look at it. No matter what I tried, I got the same useless message: “Can’t upload photos right now. Please try again later.” Not only is this uninformative, it also implies the problem lies in something that might fix itself later, like the network.

The real culprit? The iCloud terms of service had changed, and she had not accepted the new contract. I have a feeling it might have popped up asking her to do that at some point, but for whatever reason she missed it. Until someone dug into the settings and checked the box to agree to those terms, “later” was never going to happen.
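
As a minimal sketch of the difference, here are the two behaviors side by side in Python. The function and exception names are hypothetical, not Apple’s actual API: one handler hides every failure behind the same “try again later” line, while the other reports the real cause and the fix.

```python
class TermsNotAcceptedError(Exception):
    """Hypothetical error: updated terms of service haven't been accepted."""


def upload_photos(photos):
    # Stand-in for the real upload call; here it always fails the same way
    # the phone in the story did.
    raise TermsNotAcceptedError("updated iCloud terms of service not accepted")


def upload_astonishing(photos):
    # The anti-pattern: every failure collapses into one vague,
    # "maybe it will fix itself" message.
    try:
        upload_photos(photos)
    except Exception:
        print("Can't upload photos right now. Please try again later.")


def upload_least_astonishing(photos):
    # Least astonishment: say what actually went wrong and what to do about it.
    try:
        upload_photos(photos)
    except TermsNotAcceptedError as err:
        print(f"Upload blocked: {err}. Open Settings and accept the new terms.")
    except OSError as err:
        print(f"Upload failed: {err}. This one might really fix itself later.")


upload_astonishing(["IMG_0001.jpg"])
upload_least_astonishing(["IMG_0001.jpg"])
```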

Continue reading “Error Codes And The Law Of Least Astonishment”

[Image: Ordering prototypes like they were fast food]

Has DIY Become Click And Buy?

We are living in great times for DIY, although ironically some of that is because of all the steps that we don’t have to do ourselves. PCBs can be ordered out easily and inexpensively, and the mechanical parts of our projects can be ordered conveniently online, fabricated in quantity one for not much more than a song, or 3D printed at home when plastic will do. Is this really DIY if everything is being farmed out? Yes, no, and maybe.

It all depends on where you think the real value of DIY lies. Is it in the idea, the concept, the design? Or in its realization, the manufacturing? I would claim that most of the value actually lies in the former, as much as I personally enjoy the many processes of physically constructing the individual parts of many projects.

For instance, I designed and built a hot-wire CNC foam cutter recently. Or better, I designed a series of improved versions, because I never get anything right on the first try. All along the way, I 3D printed new and improved versions of the plastic parts, ironing out as many of the little glitches as I had patience for. This probably took a good handful of weekends, spread out over a couple of months, but compared to the time spent testing, fixing, and redesigning, very little of it went into the physical building.

Moreover, I bought most of the parts at the hardware store. The motor controller shield and cheap Arduino clone came from eBay. And even the parts that I did manufacture myself, the 3D-printed bits, were kind of made by a machine — my experience of the whole process wouldn’t have been any different if I’d ordered them out.

Of course craftsmanship still exists, and we see that in Hackaday projects all the time. Heck, I’ll admit that I still enjoy a lot of the process of making things with my own hands for its own sake. It’s peaceful. But if there’s one thing that the rapid proliferation of ideas and projects facilitated by 3D printing and cheap short-run PCB services has shown us, it’s that the real value of many projects lies in the idea and the documentation. Which is to say, I gotta get around to writing up that foam cutter…

Microsoft’s Minimal Mouse May Maximize Masochism

So it seems that Microsoft has a patent application in the works for a folding mouse. It looks a whole lot like their Arc mouse, which is quite thin and already goes from curved to flat. But that’s apparently not good enough for Microsoft, who says mice in general are bulky and cumbersome to travel with. On the bright side, they do acknowledge the total lack of ergonomics in those tiny travel mice.

Microsoft filed this patent in March of 2021 and it was published in early November. The patent describes the use of an expandable shell on the top with these kerf cuts in the long sides like those used to bend wood — this is where the flexibility comes in. The patent also mentions a motion tracker, haptic feedback, and a wireless charging coil. Now remember, there’s no guarantee of this ever actually happening, and there was no comment from Microsoft about whether it will become a real rodent someday.

And now, the rant. Microsoft considers this mouse, which again is essentially an updated Arc that folds in half, to be ergonomic. Full disclosure: I’ve never used an Arc mouse. But I respectfully disagree with this assessment and believe that people should not prioritize portability when it comes to peripherals, especially those that are so small to begin with. Like, what’s the use? And by the way, isn’t anyone this concerned with portability just using the touch pad or steering stick on their laptop anyway?

Continue reading “Microsoft’s Minimal Mouse May Maximize Masochism”

[Image: Laptop keyboard with strange characters on the keys]

But Think Of The (World Wide) Users!

History is full of stories about technology that makes sense to the designer but doesn’t really fit the needs of the users. Take cake mixes. In 1929, a man named Duff realized that he could capitalize on surplus flour and molasses, and created a cake mix. You simply added water to the dry mix and baked it to create a delicious cake. After World War II, General Mills and Pillsbury also wanted to sell more flour, so they started selling cake mixes too. But sales leveled out. A psychologist named Dichter, a pioneer of the focus group, had the answer: bakers didn’t feel like they were contributing to the creation of the cake. To get more emotional investment, the cake mixes would need to have real eggs added in. Actually, Duff had noticed the same thing in his 1933 patent.

It is easy to imagine a bunch of food… scientists? Engineers? Designers?… whatever a person inventing flour mixes in the 1930s was called… sitting around thinking that making a mix that only requires water is a great thing. But the bakers didn’t like it. How often do we fail to account for users?

From Cake Mix to Tech

Apple has made a business of this. Most of us don’t mind things like arcane commands and control key combinations, but the wider pool of global computer users doesn’t like those things. As the world continues to shrink, at least virtually, we often find that our users are people from different lands and cultures who speak different languages. It is, after all, the world wide web. This requires us to think even harder about our users and their particular likes, dislikes, and customs.

Continue reading “But Think Of The (World Wide) Users!”

So. What’s Up With All These Crazy Event Networks Then?

As an itinerant Hackaday writer I am privileged to meet the people who make up our community as I travel the continent in search of the coolest gatherings. This weekend I’ve made the trek to the east of the Netherlands for the ETH0 hacker camp, in a camping hostel set in wooded countryside. Sit down, connect to the network, grab a Club-Mate, and I’m ready to go!

Forget the CTF, Connecting To WiFi Is The Real Challenge!

There no doubt comes a point in every traveling hacker’s life when a small annoyance becomes a major one and a rant boils up from within, and perhaps it’s ETH0’s misfortune that something has finally boiled over at their event. I’m speaking, of course, about wireless networks.

While on the road I connect to a lot of them: the usual commercial hotspots, hackerspaces, and of course hacker camps. Connecting to a wireless network is normally a simple experience, with security provided by WPA2 and a password as the access credential. Find the SSID, bang in the password, and you’re in. I’m as securely connected as I reasonably can be, and can get on with whatever I need to do. At hacker camps, though, for some reason it never seems to be so simple.
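
To put the gap in rough terms, here’s a small illustrative sketch in Python contrasting the couple of things a WPA2-PSK connection needs with the pile of fields a typical WPA2-Enterprise (802.1X) setup wants, which is the kind of dialog the next paragraph complains about. The field names are loosely modeled on common supplicant settings and are not taken from any particular event’s network.

```python
# What a plain WPA2-PSK hotspot needs from you: the network name and a password.
simple_network = {
    "ssid": "hackercamp",
    "psk": "club-mate",
}

# Roughly what a WPA2-Enterprise (802.1X) dialog tends to demand instead.
# Field names are illustrative; exact labels vary by OS and supplicant.
enterprise_network = {
    "ssid": "hackercamp-secure",
    "key_mgmt": "WPA-EAP",
    "eap": "TTLS",                        # or PEAP, TLS, ...
    "phase2": "auth=MSCHAPV2",            # inner authentication method
    "identity": "hacker",
    "anonymous_identity": "anonymous",
    "password": "hacker",
    "ca_cert": "/path/to/event-ca.pem",   # needed to verify the RADIUS server
}

for label, fields in (("PSK", simple_network), ("Enterprise", enterprise_network)):
    print(f"{label}: {len(fields)} fields -> {', '.join(fields)}")
```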

Instead of a simple password field, you are presented with a complex dialog with a load of fields that make little sense, and someone breezily saying “Just enter hacker and hacker!” doesn’t cut it when that simply doesn’t work. When you have to publish an app just so that attendees can hook their phones up to the network, perhaps it’s time to take another look.

Continue reading “So. What’s Up With All These Crazy Event Networks Then?”

[Image: Quantum computer]

Scientific Honesty And Quantum Computing’s Latest Theoretical Hurdle

Quantum computers are really in their infancy. If you had created a few logic gates with tubes back in the 1930s, it would have been difficult to predict all the ways we use computers today. However, you could probably have guessed where at least some of the problems would lie. One of the things we are pretty sure will limit quantum computer development is error correction.

As far as we know, every qubit (quantum bit) we’ve come up with so far is very fragile and prone to random errors. That’s why every practical design today incorporates some sort of QEC — quantum error correction. Of course, error correction itself isn’t news. We use it all the time on unreliable storage media, over noisy communication channels, and in high-reliability memory. The problem is that you can’t directly clone a qubit, so it is hard to apply traditional error correction techniques to qubits.
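
For contrast, here’s a minimal sketch of the kind of classical scheme qubits can’t simply borrow: a three-copy repetition code with majority voting, written in Python. Both steps it relies on, copying the bit and then reading every copy, are exactly what the next paragraph says a qubit won’t allow.

```python
import random


def encode(bit):
    # The classical trick: just make three copies of the bit.
    return [bit, bit, bit]


def noisy_channel(bits, flip_prob=0.1):
    # Each copy independently flips with some probability.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]


def decode(bits):
    # Majority vote: read all three copies and take the most common value.
    return 1 if sum(bits) >= 2 else 0


random.seed(0)
trials = 10_000
errors = sum(
    decode(noisy_channel(encode(bit))) != bit
    for bit in (random.randint(0, 1) for _ in range(trials))
)
print(f"Residual errors after majority vote: {errors} / {trials}")
# With a 10% flip rate per copy, roughly 2.8% of decoded bits still come out
# wrong, versus 10% with no redundancy at all.
```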

After all, the whole point of a qubit is that we don’t measure it until the end of the computation; the measurement, like opening the box on Schrödinger’s cat, seals its fate. So if you were to “read” a bunch of qubits to form a checksum or a CRC, you’d destroy their quantum nature in the process, making your computer not very useful. You can’t copy a qubit to use something like triple redundancy, either. There seems to be no way to practically duplicate a qubit.
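
That last point can actually be stated more strongly: it is the no-cloning theorem, and the standard linearity argument is short enough to sketch here, assuming a hypothetical cloning operation U and the usual ket notation.

```latex
% Suppose a single unitary U could copy an arbitrary, unknown state:
\[
U\bigl(\lvert\psi\rangle \otimes \lvert 0\rangle\bigr)
  = \lvert\psi\rangle \otimes \lvert\psi\rangle
  \qquad \text{for every } \lvert\psi\rangle .
\]
% For a superposition, linearity of U forces one answer...
\[
U\bigl((\alpha\lvert 0\rangle + \beta\lvert 1\rangle) \otimes \lvert 0\rangle\bigr)
  = \alpha\lvert 00\rangle + \beta\lvert 11\rangle ,
\]
% ...while a genuine copy would have to be another:
\[
(\alpha\lvert 0\rangle + \beta\lvert 1\rangle) \otimes
(\alpha\lvert 0\rangle + \beta\lvert 1\rangle)
  = \alpha^{2}\lvert 00\rangle + \alpha\beta\lvert 01\rangle
  + \alpha\beta\lvert 10\rangle + \beta^{2}\lvert 11\rangle .
\]
% The two agree only when alpha*beta = 0, i.e. for the basis states themselves,
% so no universal copier of unknown qubits can exist.
```

This is why practical QEC schemes spread one logical qubit across many physical qubits and extract only indirect syndrome information, rather than copying or reading the state outright.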

Continue reading “Scientific Honesty And Quantum Computing’s Latest Theoretical Hurdle”