Crazyflie Control With Leap And Kinect

The gang at Bitcraze is at it again, this time developing Leap Motion control for their Crazyflie quadcopter, as well as releasing a Kinect-driven autopilot proof of concept. If you haven’t seen the Crazyflie before, you may not realize how compact it is: 90mm motor to motor and only 19 grams.

As far as we can tell, the Crazyflie still needs a PC to control it, so the Leap and Kinect are natural follow-ups. Hand control with the Leap Motion is what you’d expect: just imagine your open palm controlling it like a marionette, with the height of your hand dictating thrust. The Kinect setup looks the most promising. The guys strapped a red ball to the Crazyflie to give the camera a trackable object against a white backdrop. The Kinect then monitors the quadcopter while a user steers via mouse clicks. Separate PID controllers correct the roll, pitch, and thrust to move the Crazyflie from its current coordinates to a new setpoint chosen by a click or a drag. Videos of both Leap and Kinect piloting are below.
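We won’t pretend this is Bitcraze’s code, but the PID idea is simple enough to sketch. Here’s a rough single-axis version in JavaScript; the gains, pixel values, and the roughly 30 fps update rate are our own placeholders, not numbers from the project:

```javascript
// Minimal single-axis PID sketch with hypothetical gains and variable names.
// One instance each would handle roll, pitch, and thrust.
function makePid(kp, ki, kd) {
  let integral = 0;
  let lastError = 0;
  return function update(setpoint, measured, dt) {
    const error = setpoint - measured;            // distance from the clicked target
    integral += error * dt;                       // accumulated error (I term)
    const derivative = (error - lastError) / dt;  // rate of change (D term)
    lastError = error;
    return kp * error + ki * integral + kd * derivative;
  };
}

// Example: correct thrust from the Kinect's vertical reading at roughly 30 fps.
const thrustPid = makePid(0.8, 0.1, 0.05); // placeholder gains
const targetY = 240;  // setpoint from the mouse click (pixels)
const ballY = 200;    // red ball's current position from the Kinect (pixels)
const correction = thrustPid(targetY, ballY, 1 / 30);
console.log("thrust correction:", correction);
```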

Tight on cash but still want to take to the skies? We have two rubber-band-powered devices from earlier this week: the Ornithopter and the hilariously brilliant GoPro Slingshot.

Leap Motion Controls Hexapod With Hand Signals

Moving your hand makes this hexapod dance like a stringless marionette. Okay, so there’s obviously one string which is actually a wire but you know what we mean. The device on the floor is a Leap Motion sensor which is monitoring [Queron Williams’] hand gestures. This is done using a Processing library which leverages the Leap Motion API.

Right now the hand signals only affect the pitch, roll, and yaw of the hexapod’s body, but [Queron] does plan to add support for monitoring both hands for even more control. Watching the demo after the break, we think this is getting pretty close to the manipulations [Tom Cruise] shows off in Minority Report. Add Google Glass for a heads-up display and you could have auxiliary controls rendered on the periphery.
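[Queron]’s code is a Processing sketch we don’t have in front of us, but the core mapping is easy to illustrate. A rough Node.js equivalent could pull pitch, roll, and yaw straight out of the raw Leap frame like this (the clamp to 20 degrees is our guess, not his actual limit):

```javascript
// Sketch of turning a raw Leap frame into body pitch/roll/yaw commands.
// Not [Queron]'s Processing code; just the same idea in plain JavaScript.
// hands[0].direction and palmNormal follow the Leap frame's JSON layout.
function handAngles(frame) {
  const hand = frame.hands && frame.hands[0];
  if (!hand) return null;
  const [dx, dy, dz] = hand.direction;   // unit vector from palm toward fingers
  const [nx, ny] = hand.palmNormal;      // unit vector pointing out of the palm
  return {
    pitch: Math.atan2(dy, -dz),          // tilt forward/back
    yaw:   Math.atan2(dx, -dz),          // turn left/right
    roll:  Math.atan2(nx, -ny),          // tilt side to side
  };
}

// Scale radians to whatever range the hexapod's body code expects,
// e.g. plus/minus 20 degrees (an assumption, not the project's real limit).
function clampDegrees(rad, limit = 20) {
  const deg = (rad * 180) / Math.PI;
  return Math.max(-limit, Math.min(limit, deg));
}
```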

While you’re looking at [Queron’s] project post, click on his ‘hexapod’ tag to catch a glimpse of the build process for the robot.

Animating A Lamp With The Leap Motion

The Leap Motion is a very cool device, but so far we haven’t seen many applications of interacting with physical devices. [Xavier] wanted to control a cute servo animated desk lamp with his hands, and with the help of a Leap and an Arduino he was able to do just that.

The Leap Motion API has a handy feature that will output all its data over a websocket. It’s a very easy way to transfer hand positions with a minimum of overhead, and with just a little bit of Node.js, hooking into that websocket takes only two lines of code.
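For the curious, here’s roughly what that connection looks like from Node.js; the default local port 6437 and the npm ws package are our assumptions, not necessarily [Xavier]’s exact setup:

```javascript
// Rough sketch: subscribe to Leap frames over the local websocket.
// Assumes the Leap service's default port (6437) and the npm "ws" package.
const WebSocket = require("ws");
const ws = new WebSocket("ws://127.0.0.1:6437");

ws.on("message", (data) => {
  const frame = JSON.parse(data);
  if (frame.hands && frame.hands.length) {
    console.log("palm position:", frame.hands[0].palmPosition); // [x, y, z] in mm
  }
});
```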

With the Leap data coming in over the websocket, the only thing left to do is pull it down to an Arduino. Again, [Xavier] used Node.js, this time in the form of johnny-five, a JavaScript-based Arduino framework. After that, it was a simple matter of mapping the data from the Leap to servo movements in [Xavier]’s Pixar-inspired lamp.
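The johnny-five side is about as compact. Something along these lines would swing a servo to follow palm height; the pin number and the 100-400 mm working range are illustrative guesses rather than [Xavier]’s values:

```javascript
// Sketch of the Arduino side with johnny-five: one servo follows palm height.
// Pin number and the 100-400 mm working range are illustrative guesses.
const five = require("johnny-five");
const board = new five.Board();

board.on("ready", () => {
  const neck = new five.Servo(9); // servo on PWM pin 9

  // Hook this into the websocket handler above, passing palmPosition[1]
  // (palm height above the sensor, in mm) on every frame.
  function onPalmHeight(mm) {
    const clamped = Math.min(400, Math.max(100, mm));
    const angle = ((clamped - 100) / 300) * 180; // 100-400 mm maps to 0-180 degrees
    neck.to(Math.round(angle));
  }

  board.repl.inject({ onPalmHeight }); // exposed on the REPL for quick testing
});
```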

Video of the build below.

3D Display Controlled With The Leap Motion

Touch screens are nice; we still can’t live without a keyboard, but they suffice when on the go. Still, it’s becoming obvious that the end goal of user interface design is to remove the need to touch a piece of hardware at all in order to interact with it. One avenue toward that goal is voice commands via software like Siri; another is 3D-sensing hardware like the Kinect or Leap Motion. This project uses the latter to control the image shown on a 3D display.

[Robbie Tilton] generated a 3D image using Three.js, a JavaScript 3D library. The images are made to appear as if floating in air using a pyramid of acrylic which reflects the light toward the viewer’s eyes without blocking out ambient light in the room. In the past we’ve referred to this as a volumetric display. But [Robbie] points out that this actually uses the illusion called Pepper’s Ghost. It’s not really volumetric because the depth is merely an illusion. Moving your point of view won’t change your perspective unless you go around the corner to the next piece of acrylic. But it’s still a nice effect. See for yourself in the demo after the jump.
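We haven’t dug through [Robbie]’s source, but the usual way to drive one of these acrylic pyramids with Three.js is to render the same scene four times, once per face, with each camera stepped 90 degrees around the model. A bare-bones sketch of that idea, not his actual code:

```javascript
// Pepper's Ghost layout sketch: four cameras around one scene, each view
// rendered into its own square viewport for one face of the pyramid.
import * as THREE from "three";

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.setScissorTest(true);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
scene.add(new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
)); // placeholder model

const cameras = [0, 90, 180, 270].map((deg) => {
  const cam = new THREE.PerspectiveCamera(45, 1, 0.1, 100);
  const rad = (deg * Math.PI) / 180;
  cam.position.set(5 * Math.sin(rad), 0, 5 * Math.cos(rad));
  cam.lookAt(scene.position);
  return cam;
});

function render() {
  const s = Math.min(window.innerWidth, window.innerHeight) / 2;
  // Four square views arranged around the screen center, one per face.
  const spots = [
    [window.innerWidth / 2 - s / 2, 0],                      // bottom face
    [0, window.innerHeight / 2 - s / 2],                     // left face
    [window.innerWidth / 2 - s / 2, window.innerHeight - s], // top face
    [window.innerWidth - s, window.innerHeight / 2 - s / 2], // right face
  ];
  cameras.forEach((cam, i) => {
    renderer.setViewport(spots[i][0], spots[i][1], s, s);
    renderer.setScissor(spots[i][0], spots[i][1], s, s);
    renderer.render(scene, cam);
  });
  requestAnimationFrame(render);
}
render();
```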

Air Harp Using The Leap Motion

He’s just pointing in this image, but this Air Harp can be played using many fingers at once. It’s a demonstration which [Adam Somers] threw together in one weekend while working with the Leap Motion developer board. We first heard about this slick piece of hardware back in May, and from the looks of it, it is every bit as amazing as first reported.

Part of what made the project come together so quickly is that [Adam] had already developed a package called muskit, a C++ toolkit for making music applications. It puts in place the framework for what we hear in the video after the break. The weekend of hacking uses the positional data from the Leap Motion to handle how your digits interact with the virtual strings. You can watch as [Adam] adds more and more strings to the virtual instrument for his fingers to interact with. The distance from the screen is what decides whether a finger will pluck or not; a red circle appears when your fingertip is close enough to interact with the phantom string.
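The pluck detection itself boils down to watching each fingertip cross a virtual string while it is close enough to the string plane. Here’s a stripped-down sketch of that logic against raw Leap frame data; the string positions, the 50 mm threshold, and playNote() are stand-ins of ours, not [Adam]’s muskit code:

```javascript
// Pluck sketch: a note fires when a fingertip crosses a string's x position
// while it is within Z_PLUCK of the virtual string plane.
const strings = [-80, -40, 0, 40, 80]; // x positions (mm) of virtual strings
const Z_PLUCK = 50;                    // fingertip must be within 50 mm of the plane
const lastX = new Map();               // previous x position per finger id

function checkPlucks(frame) {
  for (const finger of frame.pointables || []) {
    const [x, , z] = finger.tipPosition;
    const prevX = lastX.get(finger.id);
    if (prevX !== undefined && Math.abs(z) < Z_PLUCK) {
      strings.forEach((sx, i) => {
        // Did the fingertip cross this string between the last frame and now?
        if ((prevX - sx) * (x - sx) < 0) playNote(i);
      });
    }
    lastX.set(finger.id, x);
  }
}

function playNote(i) {
  console.log("pluck string", i); // stand-in for the actual synth call
}
```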

Get your hands on the code from his repositories.

Controlling A Quadcopter With A Leap Motion

A few folks over at National Instruments, going by the name LabVIEW Hacker, have gotten their hands on a Leap Motion dev kit. The Leap is an interesting little input device designed to track fingertips in 3D space, much like a Kinect but at much higher resolution. Needing something to show off their LabVIEW prowess, these guys controlled their office AR Drone with the Leap, making a quadcopter controller that is completely touchless.

Building on their previous AR Drone hack, the LabVIEW team spent the better part of a day wrapping the Leap SDK and wiring it into the controls of their RC quadcopter. Now, simply by moving their fingertips over the Leap sensor, they can fly their office quadrotor with a very high-resolution 3D scanner.
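Since the original is LabVIEW, there’s no text code to quote, but the same touchless mapping is easy to picture in Node.js. This sketch leans on the community ar-drone npm package, which is our substitution and not what the LabVIEW team used; the neutral hover point and gain are made-up numbers:

```javascript
// Hypothetical stand-in for the LabVIEW wrappers: palm offsets over the Leap
// become AR Drone velocity commands via the community "ar-drone" npm package.
const arDrone = require("ar-drone");
const drone = arDrone.createClient();

const CENTER = { x: 0, y: 200, z: 0 }; // neutral palm position above the sensor (mm)
const GAIN = 0.003;                    // mm of offset mapped to drone speed (0..1)

// Drive one axis: positive offsets call pos(), negative offsets call neg().
function axis(value, pos, neg) {
  const speed = Math.min(1, Math.abs(value));
  if (value >= 0) pos(speed); else neg(speed);
}

// Call this with each frame from the Leap websocket stream.
function steer(frame) {
  const hand = frame.hands && frame.hands[0];
  if (!hand) return drone.stop();      // no hand in view: just hover
  const [x, y, z] = hand.palmPosition;
  axis((x - CENTER.x) * GAIN, (s) => drone.right(s), (s) => drone.left(s));
  axis((y - CENTER.y) * GAIN, (s) => drone.up(s),    (s) => drone.down(s));
  axis((CENTER.z - z) * GAIN, (s) => drone.front(s), (s) => drone.back(s));
}
```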

Video after the break.
