BitDrones Are Awesome, Ridiculous At Same Time

At first we thought it was awesome, then we thought it was ridiculous, and now we’re pretty much settled on “ridiculawesome”.

BitDrones is a human-computer interaction prototype that uses tiny quadcopters as the pixels in a 3D immersive display. That’s the super-cool part. “PixelDrones” have an LED on top. “ShapeDrones” have a gauzy cage illuminated by color LEDs, making them into life-size color voxels. (Cool!) Finally, a “DisplayDrone” has a touchscreen mounted to it. A computer tracks each drone’s location in the room, and they work together to create a walk-in 3D “display”. So far, so awesome.

It gets even better. Because the program that commands the drones knows where each drone is, it can tell when you’ve moved a drone around in space. That’s extremely cool, and opens up the platform to new interactions. And the DisplayDrone is like a tiny flying cellphone, so you can chat hands-free with your friends who hover around your room. Check out the video embedded below the break.
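If you’re wondering what that “the system knows you moved a drone” trick might look like in code, here’s a minimal sketch of the idea (ours, not the BitDrones source): compare each drone’s tracked position against its commanded hover setpoint, and treat a large mismatch as the user dragging the drone somewhere new. The names and the 15 cm threshold are assumptions.

```cpp
// Hypothetical sketch: flag a drone as "grabbed" when its tracked position
// drifts far from its commanded hover setpoint, then adopt the new spot.
// Not the BitDrones codebase; names and threshold are assumptions.
#include <cmath>

struct Vec3 { double x, y, z; };

static double dist(const Vec3& a, const Vec3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

struct Drone {
    Vec3 setpoint;   // where the display program told it to hover
    Vec3 tracked;    // where the motion-capture system says it actually is
};

// Returns true if the user appears to have dragged the drone; if so,
// the drone's new location becomes its voxel position.
bool userMovedDrone(Drone& d, double grabThreshold = 0.15 /* metres */) {
    if (dist(d.setpoint, d.tracked) > grabThreshold) {
        d.setpoint = d.tracked;   // accept the user's placement
        return true;
    }
    return false;
}
```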

Continue reading “BitDrones Are Awesome, Ridiculous At Same Time”


Controlling A Quadcopter With Gestures

[grassjelly] has been hard at work building a wearable device that uses gestures to control quadcopter motion. The goal of the project is a controller that lets the user steer the quadcopter intuitively, and based on the demonstration video below, we’d say they hit the nail on the head. The controller runs off an Arduino Pro Mini (5 V) powered by two small coin cell batteries, and it carries an accelerometer and an ultrasonic distance sensor.

The controller allows the quadcopter to mimic the orientation of the user’s hand. The user holds their hand out in front of them, parallel to the floor. When the hand is tilted in any direction, the quadcopter copies the motion and will tilt the same way. The amount of pitch and roll is limited by software, likely preventing the user from over-correcting and crashing the machine. The user can also raise or lower their hand to control the altitude of the copter.
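We don’t have [grassjelly]’s firmware in front of us, but the control mapping could look something like this Arduino-style sketch: hand tilt from the accelerometer becomes pitch and roll, clamped in software, and hand height from the ultrasonic sensor becomes the altitude command. The pin assignments, accelerometer scaling, and the 20 degree limit are all our assumptions.

```cpp
// A minimal Arduino-style sketch of the idea, not [grassjelly]'s actual code.
const int PIN_ACC_X = A0;          // analog accelerometer axes
const int PIN_ACC_Y = A1;
const int PIN_TRIG  = 7;           // HC-SR04-style ultrasonic sensor
const int PIN_ECHO  = 8;
const int MAX_ANGLE = 20;          // software limit on commanded pitch/roll, degrees

void setup() {
  pinMode(PIN_TRIG, OUTPUT);
  pinMode(PIN_ECHO, INPUT);
  Serial.begin(57600);             // commands go out over the radio link in the real build
}

long readDistanceCm() {
  digitalWrite(PIN_TRIG, LOW);  delayMicroseconds(2);
  digitalWrite(PIN_TRIG, HIGH); delayMicroseconds(10);
  digitalWrite(PIN_TRIG, LOW);
  long echo = pulseIn(PIN_ECHO, HIGH, 30000UL);   // microseconds, 0 on timeout
  return echo / 58;                               // rough conversion to centimetres
}

void loop() {
  // Map raw accelerometer readings (roughly +/-1 g) to tilt commands.
  int roll  = map(analogRead(PIN_ACC_X), 270, 410, -45, 45);
  int pitch = map(analogRead(PIN_ACC_Y), 270, 410, -45, 45);

  // Limit pitch and roll so an over-enthusiastic gesture can't flip the quad.
  roll  = constrain(roll,  -MAX_ANGLE, MAX_ANGLE);
  pitch = constrain(pitch, -MAX_ANGLE, MAX_ANGLE);

  // Hand height above the ground sets the altitude command.
  long altitudeCm = readDistanceCm();

  Serial.print(roll);  Serial.print(',');
  Serial.print(pitch); Serial.print(',');
  Serial.println(altitudeCm);
  delay(20);
}
```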

[grassjelly] has made all of the code and schematics available on GitHub.

Gesture Control Uses WiFi Doppler Shift


We’ve said it before: in the future, simple interfaces will use nothing but your body. At least at first glance that’s the case with this WiFi-based gesture control system. If you have Internet at home, you probably have a WiFi access point. That’s the first half of the equation. The rest is a way of measuring how the radio waves bounce off of your body. So far this is being done with Software-Defined Radio (SDR), but researchers at the University of Washington think it may be possible to build the technique into future WiFi devices.
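To get a feel for the scale of the problem, here’s a quick back-of-the-envelope sketch (our numbers, not the researchers’): the Doppler shift from a hand moving at gesture speeds is only tens of hertz, a tiny sliver of a 20 MHz WiFi channel, so spotting it takes very fine frequency resolution on the receive side.

```cpp
// Back-of-the-envelope: round-trip Doppler shift f_d = 2*v*f/c for a
// reflector (your hand) moving at speed v. Speeds and bands are illustrative.
#include <cstdio>

int main() {
    const double c = 3.0e8;                            // speed of light, m/s
    const double handSpeeds[] = {0.25, 0.5, 1.0};      // m/s, typical gesture speeds
    const double carriers[]   = {2.4e9, 5.0e9};        // WiFi bands, Hz

    for (double f : carriers) {
        for (double v : handSpeeds) {
            double fd = 2.0 * v * f / c;               // Doppler shift, Hz
            std::printf("carrier %.1f GHz, hand %.2f m/s -> shift %.1f Hz\n",
                        f / 1e9, v, fd);
        }
    }
    // Output lands in the tens of hertz, which is why the receiver needs very
    // fine frequency resolution compared to the 20 MHz-wide WiFi signal.
    return 0;
}
```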

The demo video shows a man waving his arm to adjust the volume of his home entertainment system. Intuition says this should only work if your arm is the only thing moving at the time, but that issue is quickly addressed: multiple antennas can track multiple people at the same time. False positives get some consideration too. The system requires a moderately complex wake-up gesture sequence to prevent you from, say, accidentally turning on the stereo when you roll over in bed.

If you’re having trouble wrapping your mind around this, consider this ultrasonic music player. The WiFi version does the same thing, but processing the changes in the returning radio waves is much more complex.

Continue reading “Gesture Control Uses WiFi Doppler Shift”

3D Gesture Tracking With LIDAR

[Reza] has been working on detecting hand gestures with LIDAR for about 10 years now, and we’ve got to say the end result is worth the wait.

The build uses three small LIDAR sensors to measure the distance to an object. These sensors work by sending out an infrared pulse and recording the time of flight: how long it takes the light to be emitted, bounce off the object, and return to a light sensor. Basically, it’s radar, but with infrared light. Three of these LIDAR sensors are mounted on a stand and plugged into an Arduino Uno. By measuring how far away an object is from each sensor, [Reza] can determine the object’s position in 3D space relative to the sensors.
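As a rough illustration of that last step (generic trilateration, not necessarily [Reza]’s exact math), three range readings from sensors at known positions pin down a point in space:

```cpp
// Generic trilateration sketch: sensor 1 at the origin, sensor 2 on the
// x-axis at distance d, sensor 3 at (i, j, 0). The intersection of the three
// range spheres gives the object's position; the object is assumed above the
// stand, so we take the positive z root.
#include <cmath>
#include <cstdio>

struct Point { double x, y, z; };

Point trilaterate(double d, double i, double j,
                  double r1, double r2, double r3) {
    double x  = (r1 * r1 - r2 * r2 + d * d) / (2.0 * d);
    double y  = (r1 * r1 - r3 * r3 + i * i + j * j) / (2.0 * j) - (i / j) * x;
    double z2 = r1 * r1 - x * x - y * y;
    double z  = z2 > 0.0 ? std::sqrt(z2) : 0.0;   // clamp measurement noise
    return {x, y, z};
}

int main() {
    // Sensors ~10 cm apart on a stand; the ranges here are made-up test values.
    Point p = trilaterate(0.10, 0.05, 0.10, 0.42, 0.40, 0.38);
    std::printf("hand at (%.3f, %.3f, %.3f) m\n", p.x, p.y, p.z);
    return 0;
}
```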

Unlike the Kinect-based gesture applications we’ve seen, [Reza]’s LIDAR can work outside in the sun. Because each LIDAR sensor measures the distance a million times a second, it’s also much more responsive than a Kinect. Not bad for 10 years’ worth of work.

You can check out [Reza]’s gesture control demo, as well as a few demos of his LIDAR hardware after the break.

Continue reading “3D Gesture Tracking With LIDAR”

Graffiti Analysis

Here’s a fascinating project that started with a great idea and piled on a remarkable amount of innovation. Graffiti Analysis is a project that captures the gestures used to create graffiti art and codifies them in a data format called Graffiti Markup Language (GML). After the break you can watch a video showing the data capture method used in version 2.0 of the project. A marker taped to a light source is used to draw a graffiti tag on a piece of paper. The paper rests on a plexiglass drawing surface with a webcam tracking the light from the underside in order to capture each motion.
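As a rough sketch of that capture step (our code, not the project’s own tools), the webcam watching the underside only has to find the bright blob from the marker’s light and log its position over time, ready to be written out as markup later. The OpenCV-based example below assumes a simple brightness threshold and camera index 0.

```cpp
// Track the bright spot from the light taped to the marker and print
// (x, y, t) stroke samples. Thresholds and camera index are assumptions.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
    cv::VideoCapture cam(0);                       // webcam under the plexiglass
    if (!cam.isOpened()) return 1;

    double t0 = static_cast<double>(cv::getTickCount());
    cv::Mat frame, gray, bright;

    while (cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, bright, 230, 255, cv::THRESH_BINARY);  // keep only the light

        cv::Moments m = cv::moments(bright, true);
        if (m.m00 > 0) {                           // light spot found this frame
            double x = m.m10 / m.m00;              // centroid of the bright blob
            double y = m.m01 / m.m00;
            double t = (cv::getTickCount() - t0) / cv::getTickFrequency();
            std::printf("%.1f %.1f %.3f\n", x, y, t);   // one stroke sample
        }
        if (cv::waitKey(1) == 27) break;           // Esc to stop
    }
    return 0;
}
```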

The newest iteration, version 3.0, has some unbelievable features. The addition of audio input means that the markup can be projected and animated based on sound, as in the example of graffiti interacting with a fireworks show. The 3D tools are quite amazing too, allowing not only stereoscopic video playback but also printing graffiti markup on a 3D printer. The collection of new features is so vast, and produces such amazing results, that it’s hard to put into words. So we’ve also embedded the demo of the freshly released version after the break.

Continue reading “Graffiti Analysis”

Mister Gloves, Gesture Input

This two-handed glove input setup, by [Sean Chen] and [Evan Levine], is one step closer to achieving that [Tony Stark]-like workstation, i.e. interacting with software in 3D with simple hand gestures. Dubbed the Mister Gloves, the system sends accelerometer, push-button, and flex-sensor data over RF to an MCU that converts it into a standard USB device, meaning no drivers are needed and a Windows PC recognizes it as a standard keyboard and mouse. Catch a video of Mister Gloves playing Portal after the jump.
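For flavor, here’s a minimal sketch of the “shows up as an ordinary mouse” trick using the stock Mouse library on an ATmega32U4 board (Leonardo/Micro). This isn’t the Mister Gloves firmware; the pins, scaling, and flex-sensor threshold are assumptions, and the sensor values are wired in directly rather than arriving over the radio link.

```cpp
// Present tilt as mouse movement and a bent finger as a left click over
// standard USB HID, so the host needs no drivers.
#include <Mouse.h>

const int PIN_TILT_X = A0;   // accelerometer axes
const int PIN_TILT_Y = A1;
const int PIN_FLEX   = A2;   // flex sensor on the index finger = left click

void setup() {
  Mouse.begin();
}

void loop() {
  // Centre the readings and scale them down to a few counts per report.
  int dx = (analogRead(PIN_TILT_X) - 512) / 64;
  int dy = (analogRead(PIN_TILT_Y) - 512) / 64;
  if (dx != 0 || dy != 0) Mouse.move(dx, dy, 0);

  // Bending the finger past a threshold presses the left button.
  if (analogRead(PIN_FLEX) > 700)       Mouse.press(MOUSE_LEFT);
  else if (Mouse.isPressed(MOUSE_LEFT)) Mouse.release(MOUSE_LEFT);

  delay(10);
}
```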

While amazing, we’re left wondering whether gesture setups are really viable options, considering one’s arms would surely get tired. Continue reading “Mister Gloves, Gesture Input”