Follow Me: Making Servos Track Hand Motion With Leap

The Leap controller is one of those gadgets that is probably better known for its cool factor than for its practicality. Its stereo infrared cameras read hand gestures, but in many cases it is hardly a substitute for a mouse. The best uses we’ve seen for it are dedicated systems that need to know where your hands are. [Justin Platz] and [Kurt Clothier], for example, have an interesting demo that uses a Leap to control a Raspberry Pi. The Pi commands servo motors that move LED blocks to track your hand motion. Their code is available on GitHub.
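Their repository holds the real implementation, but as a rough sketch of the servo end, assuming a hobby servo driven straight from a Pi GPIO pin with RPi.GPIO (the pin number, coordinate range, and pulse widths here are guesses, not values from their code), mapping a palm X coordinate to a servo position might look something like this:

```python
# Hedged sketch: map a Leap palm X coordinate (millimeters) to a hobby-servo
# pulse on a Raspberry Pi. Pin, ranges, and timing are assumptions, not taken
# from the project's repository.
import RPi.GPIO as GPIO

SERVO_PIN = 18                    # assumed hardware-PWM-capable BCM pin
LEAP_X_RANGE = (-200.0, 200.0)    # assumed usable palm X range in mm

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
pwm = GPIO.PWM(SERVO_PIN, 50)     # standard 50 Hz servo frame
pwm.start(7.5)                    # roughly a 1.5 ms pulse, servo centered

def hand_x_to_duty(x):
    """Clamp the palm X coordinate and map it to a 5%-10% duty cycle,
    i.e. roughly 1 ms to 2 ms pulses for full servo travel."""
    lo, hi = LEAP_X_RANGE
    x = max(lo, min(hi, x))
    fraction = (x - lo) / (hi - lo)
    return 5.0 + 5.0 * fraction

def on_hand_position(x):
    """Call this with each new palm X reading to move the servo."""
    pwm.ChangeDutyCycle(hand_x_to_duty(x))
```

Software PWM like this jitters a bit; a servo HAT or the Pi’s hardware PWM tracks more smoothly, but the mapping stays the same.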

The project is begging for a video, but they did provide an animated image:

[Animated image: Leap Motion hand tracking driving Raspberry Pi servos over PubNub]

The demo shows off the low latency possible with PubNub, which is yet another Internet of Things broker (everyone wants to be the one thing that rules them all, apparently).
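For the curious, the glue is a publish/subscribe channel: the machine reading the Leap publishes hand coordinates, and the Pi subscribes and drives the servos. A minimal sketch of the subscribing side, assuming the current v4-style PubNub Python SDK (the keys, channel name, and message format below are placeholders, not taken from the project), could look like this:

```python
# Hedged sketch of the Pi-side subscriber, assuming the v4-style PubNub Python
# SDK. Keys, channel name, and payload shape are placeholders.
import time

from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub
from pubnub.callbacks import SubscribeCallback

config = PNConfiguration()
config.subscribe_key = "demo"          # replace with your own keys
config.publish_key = "demo"
config.uuid = "pi-servo-subscriber"
pubnub = PubNub(config)

class HandListener(SubscribeCallback):
    def status(self, pubnub, status):
        pass                           # connection state changes, ignored here

    def presence(self, pubnub, presence):
        pass                           # join/leave events, ignored here

    def message(self, pubnub, event):
        # Expecting something like {"x": 12.3} published from the Leap machine;
        # in the full project this is where the servo update would happen.
        print("hand x:", event.message.get("x"))

pubnub.add_listener(HandListener())
pubnub.subscribe().channels("leap_hand").execute()

# On the machine with the Leap attached, the matching publish would be roughly:
#   pubnub.publish().channel("leap_hand").message({"x": palm_x}).sync()

while True:
    time.sleep(1)                      # keep the process alive for the listener
```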

One thing that really caught our eye is the use of these 8×8 RGB modules as the “eyes” of the robot. You have to admit this freshens up the paradigm of using cameras as the eyes of the bot!
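The matrices only have to draw a cartoon eye and slide the pupil toward whatever the Leap is tracking. As a toy illustration (the display driver, colors, and layout here are made up, not pulled from their code), an eye frame can be a plain 8×8 array of RGB tuples handed off to whatever library drives the matrix:

```python
# Toy sketch of an "eye" frame for an 8x8 RGB matrix: a white sclera with a
# pupil that shifts toward the tracked hand. The real project's display code
# is not reproduced here; push the returned frame to your matrix driver.
WHITE, BLUE, OFF = (255, 255, 255), (0, 0, 255), (0, 0, 0)

def eye_frame(pupil_col):
    """Return an 8x8 list of RGB tuples with a 2x2 pupil whose left edge
    sits at pupil_col (clamped to 1-5 so it stays on the sclera)."""
    pupil_col = max(1, min(5, pupil_col))
    frame = [[WHITE if 1 <= row <= 6 and 1 <= col <= 6 else OFF
              for col in range(8)]
             for row in range(8)]
    for row in (3, 4):
        for col in (pupil_col, pupil_col + 1):
            frame[row][col] = BLUE
    return frame

# Example: a hand far to the left puts the pupil near the left edge.
left_looking_eye = eye_frame(1)
```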

This isn’t the first practical use for the Leap we’ve covered. Not to mention, if you have a 3D printer, you should check out the video below.
