The first LED digital wristwatches hit the market in the 1970s. They required a button push to turn the display on, prompting one comedian to quip that giving one to a one-armed man would be in poor taste. While the UIs of watches and other wearables have improved since then, smartphones still present some usability challenges. Some of the touchscreen gestures needed to operate a phone, like pinching, are nigh impossible when using it one-handed, and woe unto those with stubby thumbs when trying to take a selfie.
You’d think that the fleet of sensors and the raw computing power on board would afford better ways to control phones. And you’d be right, if the modular mechanical input widgets described in a paper from Columbia University catch on. Dubbed “Vidgets” by [Chang Xiao] et al., the haptic devices are designed to produce characteristic acceleration signatures on the phone’s inertial measurement unit (IMU) when actuated. Vidgets take various forms, from push buttons to scroll wheels, each of a similar size and shape and designed to dock into one of eight positions on the back of a 3D-printed phone case. Once trained, the detection algorithm watches for the acceleration signature of each Vidget and translates it into the corresponding gesture on the phone. The video below demonstrates a couple of use cases, of which the virtual saxophone is our favorite.
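The paper’s actual signal pipeline is more involved, but a rough sketch in Python gives the flavor of signature matching: slide a short window of accelerometer samples past a set of per-Vidget templates recorded during training, and fire the mapped input when the normalized correlation clears a threshold. The template names, window length, and threshold here are placeholders, not values from the paper.

```python
import numpy as np

WINDOW = 32       # samples per detection window (placeholder, not from the paper)
THRESHOLD = 0.8   # minimum normalized correlation to count as an actuation

# Per-Vidget accelerometer templates (WINDOW x 3 axes), nominally captured during
# training. Random data here just so the sketch runs standalone.
rng = np.random.default_rng(0)
templates = {
    "button_slot_1": rng.normal(0.0, 0.2, (WINDOW, 3)),
    "wheel_slot_5":  rng.normal(0.0, 0.2, (WINDOW, 3)),
}

def classify(window: np.ndarray):
    """Return (vidget_name, score) for the best template match, or (None, score)."""
    w = window - window.mean(axis=0)                 # strip gravity / DC offset
    w_mag = np.linalg.norm(w)
    best, best_score = None, 0.0
    for name, tpl in templates.items():
        t = tpl - tpl.mean(axis=0)
        # Normalized correlation between the live window and the stored template
        score = float(np.sum(w * t) / (w_mag * np.linalg.norm(t) + 1e-9))
        if score > best_score:
            best, best_score = name, score
    return (best, best_score) if best_score >= THRESHOLD else (None, best_score)

# Typical use: keep the latest WINDOW accelerometer samples in a rolling buffer
# and classify each time a new batch of (x, y, z) readings arrives from the IMU.
buffer = []
def on_imu_samples(samples):
    buffer.extend(samples)
    del buffer[:-WINDOW]                             # keep only the newest window
    if len(buffer) == WINDOW:
        name, score = classify(np.asarray(buffer))
        if name:
            print(f"Vidget actuated: {name} (score {score:.2f})")
```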
This is really clever stuff, and ventures deep into “Why didn’t I think of that?” territory. Need to get ahead of the curve on IMUs to capitalize on what they can do? You could start with [Al Williams]’ primer on micro-electromechanical systems, or MEMS.