Sonar In Your Hand

Sonar measures distance by emitting a sound and clocking how long it takes the sound to travel. This works in any medium capable of transmitting sound, such as water, air, or, in the case of FingerPing, flesh and bone. FingerPing is a project at Georgia Tech headed by [Cheng Zhang] which measures hand position by sending sound waves through the thumb and measuring the arrival time at four different receivers. These readings reveal which bones the sound traveled through and let the device figure out where the thumb is touching. The recognized hand positions include the American Sign Language signs for one through ten.
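
How might that matching work? As a back-of-the-napkin illustration, and not the FingerPing code: each thumb pose routes sound through a different combination of bones, so the arrival times at the four receivers form a fingerprint that can be compared against calibrated templates. All names and timing values in this sketch are invented, and a real system would need per-user calibration and a more capable classifier.

```cpp
// Minimal nearest-neighbor pose classifier over four acoustic
// arrival times. Template values are placeholders for illustration.
#include <array>
#include <cstdio>

constexpr int kReceivers = 4;
constexpr int kPoses = 3;  // e.g. the ASL signs for "1", "2", "3"

using Profile = std::array<float, kReceivers>;  // arrival times in microseconds

// Hypothetical calibrated templates, one per hand pose.
const Profile kTemplates[kPoses] = {
    {120.0f, 180.0f, 240.0f, 300.0f},
    {130.0f, 160.0f, 255.0f, 290.0f},
    {110.0f, 195.0f, 230.0f, 310.0f},
};

// Return the index of the template closest to the measured profile.
int classify(const Profile& sample) {
  int best = 0;
  float bestDist = 1e30f;
  for (int p = 0; p < kPoses; ++p) {
    float d = 0.0f;
    for (int r = 0; r < kReceivers; ++r) {
      const float diff = sample[r] - kTemplates[p][r];
      d += diff * diff;
    }
    if (d < bestDist) { bestDist = d; best = p; }
  }
  return best;
}

int main() {
  const Profile sample = {128.0f, 162.0f, 250.0f, 292.0f};
  std::printf("closest pose: %d\n", classify(sample));  // prints 1
}
```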

From the perspective of discreetly entering one through ten on a mobile device, this opens up a lot of possibilities for computer input while remaining pretty unobtrusive. We have seen prototypes which are more capable of reading gestures, but which would also draw attention if you wore them on a bus. It is a classic trade-off between convenience and function, but this type of sensing is unique and could be combined with other bio-signals for finer results.

Continue reading “Sonar In Your Hand”

Playing Jedi Mind-Tricks On Your TV

Gesture-enabled controls mean you get to live out your fantasy of wielding Force powers. It does, however, take a bit of hacking to make that possible. Directly from the team at [circuito.io] comes a hand gesture controller for Jedi mind-trick manipulation of your devices!

The star of the show here is the APDS-9960 RGB and gesture sensor, with an Arduino Pro Mini 328 doing the thinking and an IR transmitter LED putting that to good use. The Arduino Sketch is a chimera of two code examples for IR LEDs and the gesture sensor — courtesy of the always estimable [Ken Shirriff] and SparkFun, respectively.
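
To picture how those two examples glue together, here is a minimal sketch in the same spirit, assuming the SparkFun APDS-9960 library and the classic IRremote v2 API. The NEC codes below are placeholders to be replaced with the ones captured from your own remote:

```cpp
// Map APDS-9960 swipe gestures to NEC IR codes. Not the circuito.io
// build's exact Sketch; the IR codes here are placeholders.
#include <Wire.h>
#include <SparkFun_APDS9960.h>  // SparkFun gesture sensor library
#include <IRremote.h>           // [Ken Shirriff]'s IR library (v2 API)

SparkFun_APDS9960 apds;
IRsend irsend;  // on an ATmega328 the send pin is fixed at D3

void setup() {
  Wire.begin();
  apds.init();
  apds.enableGestureSensor(true);  // enable gestures with interrupts
}

void loop() {
  if (apds.isGestureAvailable()) {
    switch (apds.readGesture()) {
      case DIR_UP:    irsend.sendNEC(0x20DF40BF, 32); break;  // e.g. volume up
      case DIR_DOWN:  irsend.sendNEC(0x20DFC03F, 32); break;  // e.g. volume down
      case DIR_LEFT:  irsend.sendNEC(0x20DF00FF, 32); break;  // e.g. channel down
      case DIR_RIGHT: irsend.sendNEC(0x20DF807F, 32); break;  // e.g. channel up
      default: break;  // ignore DIR_NEAR / DIR_FAR / DIR_NONE
    }
  }
}
```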

Of course, you can have the output trigger different devices, but since this particular build is meant to control a TV, the team had to use a separate Arduino and IR receiver to discover the codes for the commands they wanted to use. Once those were added to the Sketch, moving your hand above the sensor along the X, Y, or Z axis executes the corresponding command. Voilà! — Jedi powers.
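
The discovery step is essentially the stock receive demo from the same IR library; something like this, assuming a 38 kHz IR receiver module on pin 11, dumps each button's code to the serial monitor:

```cpp
// Log every code the original remote sends, for pasting into the
// main Sketch. This is the canonical IRremote v2 receive example.
#include <IRremote.h>

const int RECV_PIN = 11;
IRrecv irrecv(RECV_PIN);
decode_results results;

void setup() {
  Serial.begin(9600);
  irrecv.enableIRIn();  // start the receiver
}

void loop() {
  if (irrecv.decode(&results)) {
    Serial.println(results.value, HEX);  // the raw code for this button
    irrecv.resume();                     // get ready for the next one
  }
}
```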

Continue reading “Playing Jedi Mind-Tricks On Your TV”

Roll Up Your Sleeve, Watch A Video With This Smart Watch Forearm Projector

We’re all slowly getting used to the idea of wearable technology, fabulous flops like the creepy Google Glass notwithstanding. But the big problem with tiny tech is in finding the real estate for user interfaces. Sure, we can make it tiny, but human fingers aren’t getting any smaller, and eyeballs can only resolve so much fine detail.

So how do we make wearables more usable? According to Carnegie Mellon researcher [Chris Harrison], one way is to turn the wearer into both the display and the input device (PDF link). More specifically, his LumiWatch projects a touch-responsive display onto the forearm of the wearer. The video below is pretty slick, with some obvious CGI “artist’s rendition” displays up front, but even the somewhat limited displays shown later in the video are pretty impressive.

The watch can claim up to 40 cm² of the user’s forearm for display, even at the shallow projection angle offered by a watch bezel only slightly above the arm — quite a feat given the irregular surface of the skin. It accomplishes this with a “pico-projector” consisting of red, blue, and green lasers and a pair of MEMS mirrors. The projector can adjust the linearity and brightness of the display to provide a consistent image across the uneven surface, while an array of 10 time-of-flight sensors takes care of watching the display area for touch input gestures. It’s a fascinating project with a lot of potential, but we wonder how the variability of the human body might confound the display. Not to mention the need for short sleeves year round.
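
As a rough illustration of the touch side, not the LumiWatch firmware, here is one way a row of time-of-flight sensors aimed along the arm could become a touch coordinate: the sensor seeing the nearest return gives the across-arm position, and its measured range gives the along-arm position. The spacing and threshold values are assumptions:

```cpp
// Turn ten time-of-flight ranges into a 2-D touch point.
// Spacing and threshold values are invented for illustration.
#include <cstdio>

constexpr int kSensors = 10;
constexpr float kPitchMm = 2.5f;           // hypothetical sensor spacing
constexpr float kMaxTouchRangeMm = 80.0f;  // beyond this, nothing is touching

struct Touch { bool down; float xMm; float yMm; };

Touch locateTouch(const float range[kSensors]) {
  int best = 0;
  for (int i = 1; i < kSensors; ++i)
    if (range[i] < range[best]) best = i;
  if (range[best] > kMaxTouchRangeMm) return {false, 0.0f, 0.0f};
  return {true, range[best], best * kPitchMm};  // along-arm, across-arm
}

int main() {
  const float sample[kSensors] = {120, 118, 42, 39, 41, 115, 117, 119, 121, 118};
  const Touch t = locateTouch(sample);
  if (t.down) std::printf("touch at x=%.0f mm, y=%.1f mm\n", t.xMm, t.yMm);
}
```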

Need some basics on the micro-electro-mechanical systems (MEMS) behind the pico-projector in this watch? We’ve got a great primer on these microscopic machines.

Continue reading “Roll Up Your Sleeve, Watch A Video With This Smart Watch Forearm Projector”

Motion-Controlled KVM Switch

Once upon a time, [hardwarecoder] acquired a Gen8 HP microserver that he began to toy around with. It started with ‘trying out’ some virtualization before spiraling off the rails and fully setting up FreeBSD with ZFS as a QEMU-KVM virtual machine. While wondering what to do next, he happened to be lamenting how he couldn’t also fit his laptop on his desk, so he built himself a slick, motion-sensing KVM switch to solve his space problem.

At its heart, this device injects DDC commands via the I2C pins on his monitors’ VGA cables to swap inputs, while a relay ‘replugs’ the keyboard and mouse from the server to the laptop — and vice-versa — at the same time. On the completely custom PCB are a pair of infrared diodes and a receiver that detects Jedi-like hand waves which activate the swap. It’s a little more complex than some methods, but arguably much cooler.
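
For the curious, the monitor-swapping half of that trick can be sketched in a few lines of Arduino code. This is a generic example of the MCCS “set VCP feature” write for input select (VCP code 0x60) at the display’s 0x37 DDC/CI address, not [hardwarecoder]’s firmware, and the input value codes vary from monitor to monitor:

```cpp
// Ask a monitor to switch inputs over DDC/CI. Input values are
// monitor-specific (0x01 is often VGA-1); check your display's docs.
#include <Wire.h>

const uint8_t DDC_ADDR = 0x37;  // 7-bit DDC/CI address (0x6E on the wire)

void setInputSource(uint16_t value) {
  uint8_t msg[] = {
    0x51,  // source address
    0x84,  // 0x80 | payload length (4 bytes follow)
    0x03,  // opcode: Set VCP Feature
    0x60,  // VCP code: input select
    (uint8_t)(value >> 8), (uint8_t)(value & 0xFF),
    0      // checksum, filled in below
  };
  uint8_t sum = DDC_ADDR << 1;  // checksum also covers the destination byte
  for (size_t i = 0; i + 1 < sizeof(msg); ++i) sum ^= msg[i];
  msg[sizeof(msg) - 1] = sum;

  Wire.beginTransmission(DDC_ADDR);
  Wire.write(msg, sizeof(msg));
  Wire.endTransmission();
}

void setup() {
  Wire.begin();
  setInputSource(0x01);  // hypothetical: select the VGA input
}

void loop() {}
```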

Using an adapter, the PCB plugs into his keyboard, and the monitor data connections and keyboard/mouse outputs to the laptop and server stream out from there. There is a slight potential issue with cables torquing on the PCB, but since everything is so conveniently close, [hardwarecoder] doesn’t need to handle it much.

Continue reading “Motion-Controlled KVM Switch”

Wireless Protocol Reverse Engineered To Create Wrist Wearable Mouse

We’ve seen a few near-future sci-fi films recently where computers respond not just to touchscreen gestures but also to broad commands, like swiping a phone to throw its display onto a large flat-panel screen. It’s a nice metaphor, and if we’re going to see something like it soon, perhaps this wrist-mounted pointing device will be one way to get there.

The video below shows the finished product in action, with the cursor controlled by arm movements. Finger gestures that are very much like handling a real mouse’s buttons are interpreted as clicks. The wearable has an Arduino Nano, an MPU6050 IMU, and an nRF24L01 transceiver, all powered by some coin cells and tucked nicely into a 3D-printed case. To be honest, as cool as [Ronan Gaillard]’s wrist mouse is, the real story here is the reverse engineering he and his classmate did to pull this one off.

The road to the finished product was very interesting, and more detail is shared in their final presentation (in French and heavy with memes). Our French is sufficient only to decipher “Le dongle Logitech,” but there are enough packet diagrams supporting it to get the gist. They sniffed the packets going between a wireless keyboard and its dongle and figured out how to imitate mouse movements using an nRF24 module. Translating wrist and finger movements into cursor position via the 6-axis IMU involved some fairly fancy math, but it all seems to have worked in the end, and it makes for a very impressive project.
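
To give a flavor of the moving parts, here is a stripped-down transmitter that reads gyro rates from the MPU6050 and ships cursor deltas over a plain nRF24 pipe to a receiver of your own. It is not [Ronan Gaillard]’s code, which instead speaks the reverse-engineered Logitech dongle protocol, and the pins, pipe address, and scale factor are all guesses to be tuned:

```cpp
// Wrist-mouse transmitter sketch: gyro rates become cursor deltas.
// The pipe address, pins, and scaling are assumptions.
#include <SPI.h>
#include <Wire.h>
#include <RF24.h>  // TMRh20's RF24 library

RF24 radio(9, 10);  // CE, CSN
const byte pipe[6] = "wrist";

struct MouseDelta { int8_t dx; int8_t dy; uint8_t buttons; };

int16_t readGyroAxis(uint8_t reg) {
  Wire.beginTransmission(0x68);  // MPU6050 default I2C address
  Wire.write(reg);
  Wire.endTransmission(false);
  Wire.requestFrom(0x68, 2);
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

void setup() {
  Wire.begin();
  Wire.beginTransmission(0x68);
  Wire.write(0x6B);  // PWR_MGMT_1: wake the IMU from sleep
  Wire.write(0);
  Wire.endTransmission();
  radio.begin();
  radio.openWritingPipe(pipe);
}

void loop() {
  MouseDelta d;
  // Gyro Z (0x47) pans the cursor sideways, gyro X (0x43) vertically.
  d.dx = constrain(-readGyroAxis(0x47) / 256, -127, 127);
  d.dy = constrain(readGyroAxis(0x43) / 256, -127, 127);
  d.buttons = 0;  // finger-click sensing left out of this sketch
  radio.write(&d, sizeof(d));
  delay(10);  // roughly a 100 Hz report rate
}
```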

Is sniffing wireless packets in your future? Perhaps this guide to Wireshark and the nRF24L01 will prove useful.

Continue reading “Wireless Protocol Reverse Engineered To Create Wrist Wearable Mouse”

Hybrid Interface Brings Touchscreen To Rigol Scope

With pervasive smartphones and tablets, the touch interface is assumed for small LCD screens, and we’ve likely all poked and pinched at some screen, only to find it immune to our gestures. Manufacturers have noticed and begun adding touch interfaces to instruments like digital oscilloscopes, but they tend to be an upgrade feature. Thanks to this hybrid oscilloscope touchscreen interface, though, even low-end scopes can get in on the action.

It only makes sense that [Matt Heinz] started with one of the most hackable scopes for this build, which was his Master’s thesis project. Using an Android tablet as an auxiliary interface, [Matt] is able to control most of the main functions of the scope remotely. Pinching and expanding gestures are interpreted as horizontal and vertical scaling, while dragging the displayed waveform changes its position and controls triggering. While it’s not a true touchscreen scope, the code is all open source, so can a true aftermarket Rigol touchscreen be far away?
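
To get a taste of the underlying mechanism: assuming a DS1000Z-series Rigol, which accepts raw SCPI over TCP port 5555, a pinch or drag gesture ultimately boils down to strings like the ones below. This is a generic illustration rather than [Matt Heinz]’s code, and the IP address is a placeholder:

```cpp
// Send gesture-derived SCPI commands to a scope over the network.
// Assumes a DS1000Z-series Rigol listening on its raw-SCPI port.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

int main() {
  const int fd = socket(AF_INET, SOCK_STREAM, 0);
  sockaddr_in addr{};
  addr.sin_family = AF_INET;
  addr.sin_port = htons(5555);  // Rigol raw-SCPI port
  inet_pton(AF_INET, "192.168.1.50", &addr.sin_addr);  // placeholder IP
  if (connect(fd, (sockaddr*)&addr, sizeof(addr)) != 0) {
    perror("connect");
    return 1;
  }
  // A horizontal pinch might map to the timebase, a vertical pinch to
  // channel scale, and a vertical drag to the trigger level.
  const char* cmds[] = {
      ":TIMebase:MAIN:SCALe 0.0005\n",  // 500 us/div
      ":CHANnel1:SCALe 0.5\n",          // 500 mV/div
      ":TRIGger:EDGe:LEVel 1.0\n",      // 1 V trigger level
  };
  for (const char* c : cmds) write(fd, c, strlen(c));
  close(fd);
}
```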

Rigol hacks abound here — you can talk to them in Linux, increase the bandwidth, or just get a look at their guts.

Continue reading “Hybrid Interface Brings Touchscreen To Rigol Scope”

Listening For Hand Gestures

[B. Aswinth Raj] wanted to control a VLC player with hand gestures. He turned to two common ultrasonic sensors and Python to do the job. There is also, of course, an Arduino. You can see a video of the results below.

The Arduino code reads the distance from both sensors — one for the left hand and the other for the right. This allows the device to react to single-hand gestures, where one hand moves closer to or further from its sensor, as well as gestures involving both hands. For example, raising your left hand and moving it closer or further away will adjust the volume. The right hand controls rewind and fast-forward. Raising both hands will start or stop playback.
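
A stripped-down version of the Arduino side might look like the sketch below, with pin numbers and distance thresholds as assumptions. It prints one token per recognized gesture and leaves it to a host-side script (Python, in this project) to turn tokens into VLC keystrokes:

```cpp
// Two HC-SR04 sensors, one per hand; emit a gesture token over serial.
// Pins and thresholds are assumptions to adapt to your own build.
const int trigL = 2, echoL = 3;  // left-hand sensor
const int trigR = 4, echoR = 5;  // right-hand sensor

long readCm(int trig, int echo) {
  digitalWrite(trig, LOW);  delayMicroseconds(2);
  digitalWrite(trig, HIGH); delayMicroseconds(10);  // 10 us trigger pulse
  digitalWrite(trig, LOW);
  long us = pulseIn(echo, HIGH, 30000);  // 30 ms timeout = out of range
  return us ? us / 58 : 999;             // ~58 us of echo per cm
}

void setup() {
  Serial.begin(9600);
  pinMode(trigL, OUTPUT); pinMode(echoL, INPUT);
  pinMode(trigR, OUTPUT); pinMode(echoR, INPUT);
}

void loop() {
  long l = readCm(trigL, echoL);
  long r = readCm(trigR, echoR);
  if (l < 30 && r < 30)      Serial.println("playpause");  // both hands raised
  else if (l < 15)           Serial.println("volup");      // left hand, close
  else if (l < 30)           Serial.println("voldown");    // left hand, far
  else if (r < 15)           Serial.println("forward");    // right hand, close
  else if (r < 30)           Serial.println("rewind");     // right hand, far
  delay(200);  // crude debounce between samples
}
```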

Continue reading “Listening For Hand Gestures”