Alexa, Remind Me Of The First Time Your Product Category Failed

For the last few years, the Last Great Hope™ of the consumer electronics industry has been voice assistants. Alexas and Echos and Google Homes and Facebook Portals are all the rage. Over one hundred million Alexa devices have been sold, an impressive feat given that there are only about 120 million households in the United States, and a similar number in Europe. Look to your left, look to your right: one of you lives in a house with an Internet-connected voice assistant.

2018 saw a huge explosion of Internet-connected voice assistants, some in bizarre form factors. There’s a voice-controlled microwave, which is great if you’ve ever wanted to defrost a chicken through the Internet. You can get hardware for developing your own voice assistant device. 2019 will be even bigger. Facebook is heavily advertising the Facebook Portal. If you haven’t yet deleted your Facebook account, you can put the Facebook Portal on your kitchen counter and make video calls with your family and friends through Facebook Messenger. With the Google Home Hub and a Nest doorbell camera, you too can be just like Stu Pickles from Rugrats.

This is not the first time the world has been enamored with Internet-connected assistants. This is not the first time the consumer electronics industry has put all its hope into one product category. This has happened before, and all those devices failed spectacularly. These were the Internet appliances released between 1999 and 2001: the last great hurrah of the dot-com boom. They were dumb then, and they’re dumb now.

Continue reading “Alexa, Remind Me Of The First Time Your Product Category Failed”

Why is Continuous Glucose Monitoring So Hard?

Everyone starts their day with a routine, and like most people these days, mine starts by checking my phone. But where most people look for the weather update, local traffic, or even check Twitter or Facebook, I use my phone to peer an inch inside my daughter’s abdomen. There, a tiny electrochemical sensor continuously samples the fluid between her cells, measuring the concentration of glucose so that we can control the amount of insulin she’s receiving through her insulin pump.
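To make that “telemetry” idea concrete, here is a minimal sketch of the kind of threshold check a CGM companion app performs. Everything in it is an illustrative assumption: the read_cgm_mgdl() helper is a stand-in for whatever a real sensor or cloud service provides, and the 70/180 mg/dL bounds are placeholder alert limits, not any vendor’s API and certainly not medical guidance.

```python
import random
import time

# Assumed alert thresholds in mg/dL -- placeholders for illustration only.
LOW_MGDL = 70
HIGH_MGDL = 180


def read_cgm_mgdl() -> float:
    """Stand-in for fetching the latest interstitial glucose reading."""
    return random.gauss(120, 35)  # fake data; a real app would query the sensor


def classify(reading: float) -> str:
    """Decide whether a reading should raise an alert."""
    if reading < LOW_MGDL:
        return "LOW - alert"
    if reading > HIGH_MGDL:
        return "HIGH - alert"
    return "in range"


if __name__ == "__main__":
    for _ in range(3):
        value = read_cgm_mgdl()
        print(f"{value:5.1f} mg/dL -> {classify(value)}")
        time.sleep(1)  # real sensors report roughly every five minutes
```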

Type 1 diabetes is a nasty disease, usually sprung on the victim early in life and making every day a series of medical procedures – calculating the correct amount of insulin for each morsel of food consumed, dealing with the inevitable high and low blood glucose readings, and enduring pinprick after pinprick to test the blood. Continuous glucose monitoring (CGM) has been a godsend to us and millions of diabetic families, as it gives us the freedom to let our kids be kids and go on sleepovers and have one more slice of pizza without turning it into a major project. Plus, good control of blood glucose means less chance of the dire consequences of diabetes later in life, like blindness, heart disease, and amputations. And I have to say I think it’s pretty neat that I have telemetry on my child; we like to call her our “cyborg kid.”

But for all the benefits of CGM, it’s not without its downsides. It’s wickedly expensive in terms of consumables and electronics, it requires an invasive procedure to place sensors, and even in this age of tiny electronics, it’s still comparatively bulky. It seems like we should be a lot further along with the technology than we are, but as it turns out, CGM is actually pretty hard to do, and there are some pretty solid reasons why the technology seems stuck.

Continue reading “Why is Continuous Glucose Monitoring So Hard?”

Planned Obsolescence Isn’t A Thing, But It Is Your Fault

The common belief is that big companies are out to get the little people by making products that break after a short period, or by introducing new features or accessories that make previous models obsolete, forcing the user to purchase a new model. This conspiracy theory isn’t true; there’s a perfectly good explanation for the phenomenon, and it was caused by consumers, not manufacturers.

When we buy the hottest, shiniest, smallest, and cheapest new thing, we join the wave of consumer demand that causes what often gets labelled “Planned Obsolescence”. In truth, we’re all to blame for the signals our buying habits send to manufacturers. Dig in and get your flamewar fingers fired up.

Continue reading “Planned Obsolescence Isn’t A Thing, But It Is Your Fault”

Ask Hackaday: Managing Inspiration

For most of us, hacking is a hobby, something to pass a few idle hours and satisfy our need to create. Precious few of us get to live the dream of being paid to tinker; most of us need some kind of day job to pay the bills and support our hacking habits. This necessarily creates an essential conflict, rooted in the fact that we all only have 24 hours to spread around every day: I need to spend my time working so I can afford to hack, but the time I spend working to earn money eats away at my hacking time. That’s some catch, that Catch-22.

From that primary conflict emerges another one. Hacking is a hugely creative process, and while the artist or the author might not see it that way, it’s true nonetheless. Unless we’re straight-up copying someone else’s work, either because they’ve already solved the same problem we’re working on and we just need to get it done, or perhaps we’re just learning a new skill and want to stick to the script, chances are pretty good that we’re hitting the creative juices hard when we build something new. And that requires something perhaps even more limiting than time: inspiration. How you manage inspiration in large part dictates how productive you are in your creative pursuits.

Continue reading “Ask Hackaday: Managing Inspiration”

Hackers Want Cambridge Dictionary to Change Their Definition

Maybe it’s the silly season of high summer, or maybe a PR bunny at a cybersecurity company has simply hit the jackpot with a story syndicated by the Press Association, but the non-tech media has been earnestly talking about a call upon the Cambridge Dictionary to remove the word “illegal” from their definition of “Hacker”. The weighty tome from the famous British university lists the word as either “a person who is skilled in the use of computer systems, often one who illegally obtains access to private computer systems” in its learner’s dictionary, or as “someone who illegally uses a computer to access information stored on another computer system or to spread a computer virus” in its academic dictionary. The cybersecurity company in question argues that hackers in fact do a lot of the work that improves cybersecurity and are thus all-round Good Eggs, and not those nasty computer crooks we hear so much about in the papers.

We’re right behind them on the point about illegality, because while some of those who adopt the hacker sobriquet wear hats of all colours, including black, for us being a hacker is about having the curiosity to tinker with anything presented to us, whatever it is. It’s a word that originated among railway modelers, hardly a community that’s known for its criminal tendencies!

Popular Usage Informs Definition

It is, however, futile to attempt to influence a dictionary in this way. There are two types of lexicography: prescriptive and descriptive. With prescriptive lexicography, the dictionary instructs what something must mean or how it should be spelled, while descriptive lexicography tells you how something is used in the real world, based on extensive usage research. Thus venerable lexicographers such as Samuel Johnson or Noah Webster told you a particular way to use your English, while their modern equivalents lead you towards current usage with plenty of examples.

It’s something that can cause significant discontent among some dictionary users, as we can see from our own consternation over the word “hacker”. The administration team at any dictionary will be familiar with the constant stream of letters of complaint from people outraged that their pet piece of language is not reflected in the volume they regard as an authority. But while modern lexicographers admit that they sometimes strike an uneasy balance between the two approaches, they are at heart scientists with a rigorous approach to evidence-based research, and are very proud of their efforts.

Big Data Makes for Big Dictionaries

Lexicographic research draws on huge corpora: databases of tens or hundreds of millions of words of written English, from which lexicographers can extract the subtlest of language trends and see where a word is going. These can be interesting and engrossing tools for anyone, not just linguists, so we’d urge you to have a go for yourself.
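As a toy illustration of that kind of corpus work, the sketch below tallies the words that appear closest to “hacker” in a plain-text corpus and prints the most frequent neighbours. The filename corpus.txt and the two-word window are placeholder assumptions, not any real dictionary’s pipeline, but the principle of letting usage counts speak for themselves is the same.

```python
import re
from collections import Counter


def collocates(path: str, keyword: str = "hacker", window: int = 2) -> Counter:
    """Count the words appearing within `window` positions of `keyword`."""
    with open(path, encoding="utf-8") as handle:
        words = re.findall(r"[a-z']+", handle.read().lower())
    counts: Counter = Counter()
    for i, word in enumerate(words):
        if word == keyword:
            neighbours = words[max(0, i - window):i] + words[i + 1:i + 1 + window]
            counts.update(neighbours)
    return counts


if __name__ == "__main__":
    # Point this at any large pile of text you have lying around.
    for neighbour, count in collocates("corpus.txt").most_common(20):
        print(f"{count:6d}  {neighbour}")
```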

Sadly for us, the corpus evidence shows the definition of “Hacker” has very firmly trended toward the tabloid newspaper meaning that associates it with cybercriminality. All we can do is subvert that trend by doing our best to own the word as we would prefer it to be used, re-appropriating it. At least the other weighty tome from a well-known British university has a secondary sense that we do agree with: “An enthusiastic and skilful computer programmer or user”.

Disclosure: Jenny List used to work in the dictionary business.

The Anxiety of Open Source: Why We Struggle With Putting It Out There

You’ve just finished your project. Well, not finished, but it works and you’ve solved all the problems worth solving, and you have a thing that works for you. Then you think about sharing your creation with the world. “This is cool” you think. “Other people might think it’s cool, too.” So you have to take pictures and video, and you wish you had documented some more of the assembly steps, and you have to do a writeup, and comment your code, and create a repository for it, maybe think about licensing. All of a sudden, the actual project was only the beginning, and now you’re stressing out about all the other things involved in telling other people about your project, because you know from past experience that there are a lot of haters out there who are going to tear it down unless it’s perfect, or even if it is, and even if people like it they are going to ask you for help or to make one for them, and now it’s 7 years later and people are STILL asking you for the source code for some quick little thing you did and threw up on YouTube when you were just out of college, and of course it won’t work anymore because that was on Windows XP when people still used Java.

Take a deep breath. We’ve all been there. This is an article about finding a good way to share your work without the hassle taking over. If you read the previous paragraph and finished with a heart rate twice what you started with, you know the problem. You just want to share something with the world, but you don’t want to support that project for the rest of your life; you want to move on to new and better and more interesting projects. Here are some tips.

Continue reading “The Anxiety of Open Source: Why We Struggle With Putting It Out There”

What Does ‘Crypto’ Actually Mean?

This article is about crypto. It’s in the title, and the first sentence, yet the topic still remains hidden.

At Hackaday, we are deeply concerned with language. Part of this is the fact that we are a purely text-based publication, yes, but a better reason is right there in the masthead. This is Hackaday, and for more than a decade we have countered the notion that ‘hackers’ are only bad actors. We have railed against co-opted language for our entire existence, and some of our most successful stories are entirely about the use and abuse of language.

Part of this is due to the nature of the Internet. Pedantry is an acceptable substitute for wisdom, it seems, and choosing the right word isn’t just a matter of semantics — it’s a compiler error. The wrong word shuts down all discussion. Use the phrase ‘fused deposition modeling’ when describing a filament-based 3D printer, and some will inevitably reach for their pitchforks and torches; the correct phrase is ‘fused filament fabrication’, the term preferred by the RepRap community because it is legally unencumbered by patents. That’s a neat tidbit, but it’s slightly off: the phrase itself is encumbered by a trademark, not by a patent.

The technical side of the Internet, or at least the subpopulation concerned about backdoors, 0-days, and commitments to hodl, is now at a semantic crossroads. ‘Crypto’ is starting to mean ‘cryptocurrency’. The netsec and technology-minded populations of the Internet are now deeply concerned over language. Cryptocurrency enthusiasts have usurped the word ‘crypto’, and the folks who were hacking around with DES thirty years ago aren’t happy. A DH key exchange has nothing to do with virtual cats bought with Ethereum, and there’s no way anyone losing money to ICO scams could come up with an encryption protocol as elegant as ROT-13.
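For anyone who wants the older sense of ‘crypto’ made concrete, here is the textbook ROT-13 letter shift as a minimal Python sketch. It is illustrative only: a simple substitution that predates any blockchain, and obviously not something to protect real secrets with.

```python
import string

# Build a translation table that maps each letter to the one 13 places along.
ROT13_TABLE = str.maketrans(
    string.ascii_lowercase + string.ascii_uppercase,
    string.ascii_lowercase[13:] + string.ascii_lowercase[:13]
    + string.ascii_uppercase[13:] + string.ascii_uppercase[:13],
)


def rot13(text: str) -> str:
    """Apply the ROT-13 substitution; applying it twice restores the input."""
    return text.translate(ROT13_TABLE)


if __name__ == "__main__":
    scrambled = rot13("Hackaday")
    print(scrambled)         # Unpxnqnl
    print(rot13(scrambled))  # Hackaday
```

Running it twice gets you back where you started, which is also a fair summary of its security value.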

But language changes. Now cryptographers are dealing with the same problem hackers had in the 90s, and this time there’s nothing as cool as rollerblading into the Gibson to fall back on. Does ‘crypto’ mean ‘cryptography’, or does it mean ‘cryptocurrency’? If frequency of usage determines the correct definition, a perusal of the press releases in my email quickly reveals a winner: it’s cryptocurrency by a mile. However, cryptography has been around much, much longer than cryptocurrency. So what is the right definition of ‘crypto’? Does it mean cryptography, or does it mean cryptocurrency?

Continue reading “What Does ‘Crypto’ Actually Mean?”