The Leap controller is one of those gadgets that is probably better known for its cool factor than its practicality. The time of flight optical sensor reads gestures, but it is hardly a substitute for a mouse in many cases. The best uses we’ve seen for it are dedicated systems that need to know where your hands are. [Justin Platz] and [Kurt Clothier], for example, have an interesting demo that uses a Leap to control a Raspberry Pi. The Pi commands servo motors that move LED blocks to track your hand motion. Their code is available on GitHub.
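If you want to play along at home, grabbing palm coordinates from the Leap is straightforward. Here’s a minimal sketch assuming the Leap Motion v2 Python SDK; this is illustrative only, not the code from their GitHub repo:

```python
# Minimal hand-tracking sketch using the Leap Motion v2 Python SDK.
# Prints the palm position of the first detected hand on each frame.
import sys
import Leap

class HandListener(Leap.Listener):
    def on_frame(self, controller):
        frame = controller.frame()
        if not frame.hands.is_empty:
            # palm_position is a Leap.Vector, in millimeters above the sensor
            palm = frame.hands[0].palm_position
            print("x=%.1f  y=%.1f  z=%.1f" % (palm.x, palm.y, palm.z))

if __name__ == "__main__":
    listener = HandListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    print("Tracking hands; press Enter to quit...")
    sys.stdin.readline()
    controller.remove_listener(listener)
```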
The project is begging for a video, but they did provide an animated image:
The demo shows off the low latency possible with PubNub, which is yet another Internet of Things message broker (everyone wants to be the one thing that rules them all, apparently).
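For a sense of how that messaging layer works, here’s a hedged sketch of the publish/subscribe pattern using PubNub’s Python SDK. The channel name, keys, and move_servos() helper are hypothetical placeholders, not taken from the authors’ code:

```python
# Sketch of the PubNub pub/sub pattern behind a demo like this:
# the Leap machine publishes hand coordinates; the Pi subscribes
# and steers its servos. Keys, channel, and helper are placeholders.
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub
from pubnub.callbacks import SubscribeCallback

pnconfig = PNConfiguration()
pnconfig.publish_key = "pub-key-here"    # hypothetical keys
pnconfig.subscribe_key = "sub-key-here"
pnconfig.uuid = "leap-hand-demo"
pubnub = PubNub(pnconfig)

# --- sender side (the machine with the Leap attached) ---
def publish_hand_position(x, y, z):
    pubnub.publish().channel("hand-tracker").message(
        {"x": x, "y": y, "z": z}).sync()

# --- receiver side (the Raspberry Pi driving the servos) ---
class ServoListener(SubscribeCallback):
    def message(self, pubnub, event):
        pos = event.message
        move_servos(pos["x"], pos["y"])  # hypothetical servo helper

pubnub.add_listener(ServoListener())
pubnub.subscribe().channels("hand-tracker").execute()
```

In practice the publisher and subscriber run on different machines; the low latency the demo shows off comes from PubNub’s hosted network shuttling those small JSON messages between them.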
One thing that really caught our eye is the use of these 8×8 RGB modules as the “eyes” of the robot. You have to admit it freshens up the old paradigm of cameras as the eyes of the bot!
This isn’t the first practical use for the Leap we’ve covered. And if you have a 3D printer, you should check out the video below.
…”The time of flight sensor” … hmm … the Leap Motion is NOT a TOF sensor! It computes 3D information using stereo cameras :) Be sure about what you post, HaD!
Hmmm, there was early speculation that it was using TOF (http://www.extremetech.com/extreme/131159-leap-motion-will-it-make-you-a-magician-or-is-it-just-handwaving), but after reading http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3690061/ it looks like you are right. I’ll strike it out. Thanks.
Hi Al, thanks for the writeup! Here’s the video: https://vimeo.com/136779399
And here is a write-up about how to drive the RGB matrices.
http://www.instructables.com/id/AVR-Dual-RGB-Matrix-Driver/