ANUBIS, A Natural User Bot Interface System

[Matt], [Andrew], [Noah], and [Tim] have a pretty interesting build for their capstone project at Ohio Northern University. They’re using a Microsoft Kinect and a Leap Motion to create a natural user interface for controlling humanoid robots.

The robot the team is using for this project is a tracked humanoid robot they’ve affectionately come to call Johnny Five. Johnny takes commands from a computer, Kinect, and Leap Motion to move the chassis, arm, and gripper around in a way that’s somewhat natural, and surely a lot easier than controlling a humanoid robot with a keyboard.

The team has also released all their software on GitHub under an open source license. You can grab that over on the Gits, or take a look at some of the pics and videos from the Columbus Mini Maker Faire.

Video Gaming to Fix Eye Ailments

Let’s face it, most of the time we’re hacking for no other reason than sheer enjoyment. So we love to see hacks come about that can really make a difference in people’s lives. This time around it’s a video game designed to exercise your eyes. [James Blaha] has an eye condition called strabismus, commonly known as crossed eyes. The issue is that the muscles for each eye don’t coordinate with each other in the way they need to in order to produce three-dimensional vision.

Recent research (linked in the reference section of [James’] post) suggests that special exercises may be able to train the eyes to work correctly. He’s been working on developing a video game to promote this type of training. As you can see above, the user (patient?) wears an Oculus Rift headset, which makes it possible to show each eye a slightly different image, while a Leap Motion controller handles the VR interaction. If designed correctly, and paired with the addictive qualities of games, this may be just what the doctor ordered. You know what they say, practice makes perfect!

Continue reading “Video Gaming to Fix Eye Ailments”

STL Fun: Converting Images To STL Geometry

There have been some good .STL manipulation tips coming in this week.

The first one is called stl_tools, and it’s a Python library to convert images or text to 3D-printable STL files. The examples shown are quite impressive, and it even does a top-notch job of turning a 2D company logo into 3D! We can see this being quite handy if you need some quick 3D text and either don’t use CAD or really just need a one-click solution. Now if only .STLs were easier to edit afterwards…
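If you’re curious what that looks like in practice, the flow with stl_tools’ numpy2stl helper is only a few lines. This is a minimal sketch assuming the library’s documented API; the file names, blur amount, and scale factor are our own placeholders:

```python
import numpy as np
from PIL import Image
from scipy.ndimage import gaussian_filter
from stl_tools import numpy2stl

# Load a logo as a grayscale height map (the file name is just an example)
img = np.asarray(Image.open("logo.png").convert("L"), dtype=np.float64)

# A light blur keeps the extruded surface from stair-stepping
img = gaussian_filter(img, sigma=1)

# Extrude brightness into height and write out a printable mesh
numpy2stl(img, "logo.stl", scale=0.1, solid=True)
```

Brighter pixels become taller geometry, so inverting the image first flips whether the logo ends up raised or recessed.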

The second one is a JavaScript-based Leap Motion STL manipulator, which lets you pick STL models and manipulate them individually with your fingers. If you happen to have a Leap, this could be a great way to show off 3D parts at a presentation or hackerspace talk, especially if you want to add a [Tony Stark] vibe! Stick around after the break to see it in action. Now all we need are some good hologram generators…

Continue reading “STL Fun: Converting Images To STL Geometry”

Finally, a practical use for the Leap

Robots used in laparoscopic surgery are fairly commonplace, but controlling them is far from simple. The usual setup is something akin to a Waldo-style manipulator, allowing a surgeon to cut, cauterise, and stitch from across a room. There is another way to go about this thanks to some new hardware, as [Sriranjan] shows us with his Leap-controlled surgery bot.

[Sriranjan] isn’t using a real laparoscopic surgery robot for his experiments. Instead, he’s using the Le-Sur simulator, which puts two virtual robot arms in front of a surgeon in training. Each of these robotic arms has seven degrees of freedom, and by using two Leap controllers (one per virtual machine), [Sriranjan] was able to control both of them using his hands.
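We haven’t dug through [Sriranjan]’s code, but the first step of any setup like this, turning a Leap frame into a usable hand pose, looks roughly like the sketch below against the old Leap Python SDK. How those six values get mapped onto seven joint angles is the simulator’s inverse kinematics problem, which we won’t guess at here:

```python
import Leap  # the (now legacy) Leap Motion Python SDK

controller = Leap.Controller()

def hand_pose():
    """Return the first tracked hand's palm position (mm) and
    orientation (radians), or None if no hand is in view."""
    frame = controller.frame()
    if frame.hands.is_empty:
        return None
    hand = frame.hands[0]
    pos = hand.palm_position
    return (pos.x, pos.y, pos.z,
            hand.direction.pitch,
            hand.direction.yaw,
            hand.palm_normal.roll)
```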

We’ve seen a lot of creative applications for the Leap sensor, like controlling quadcopters, controlling hexapod robots, and controlling more quadcopters, but this is the first time we’ve seen the Leap do something no other controller can: emulate the delicate touch of a surgeon’s hand.

Continue reading “Finally, a practical use for the Leap”

Crazyflie control with Leap and Kinect

The gang at Bitcraze is at it again, this time developing Leap Motion control for their Crazyflie quadcopter, as well as releasing a Kinect-driven autopilot proof of concept. If you haven’t seen the Crazyflie before, you may not realize how compact it is: 90mm motor to motor and only 19 grams.

As far as we can tell, the Crazyflie still needs a PC to control it, so the Leap and Kinect are natural follow-ups. Hand control with the Leap Motion is what you’d expect: just imagine your open palm controlling it like a marionette, with the height of your hand dictating thrust. The Kinect setup looks the most promising. The guys strapped a red ball to the Crazyflie to provide a trackable object against a white backdrop. The Kinect then monitors the quadcopter while a user steers via mouse clicks. Separate PID controllers correct the roll, pitch, and thrust to reposition the Crazyflie from its current coordinates to a new setpoint chosen by a click or a drag. Videos of both Leap and Kinect piloting are below.
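Bitcraze’s post doesn’t spell out the loop itself, but a single-axis version of what each of those PID controllers does fits in a few lines. Here’s a generic sketch; the gains are placeholders, not Bitcraze’s tuning:

```python
class PID:
    """Textbook PID controller; the Kinect rig runs one per axis."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per axis: the Kinect supplies `measured`, a mouse click
# sets `setpoint`, and the output becomes a thrust (or roll/pitch) command.
thrust_pid = PID(kp=0.5, ki=0.1, kd=0.2)  # placeholder gains
```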

Tight on cash but still want to take to the skies? We have two rubber-band-powered devices from earlier this week: the Ornithopter and the hilariously brilliant GoPro Slingshot.

Continue reading “Crazyflie control with Leap and Kinect”

Leap Motion controls hexapod with hand signals

Moving your hand makes this hexapod dance like a stringless marionette. Okay, so there’s obviously one string, which is actually a wire, but you know what we mean. The device on the floor is a Leap Motion sensor which is monitoring [Queron Williams’] hand gestures. This is done using a Processing library which leverages the Leap Motion API.

Right now the hand signals only affect the pitch, roll, and yaw of the hexapod’s body. But [Queron] does plan to add support for monitoring both hands to add more control. Check out the demo after the break; we think this is getting pretty close to the manipulations shown by [Tom Cruise] in Minority Report. Add Google Glass for a heads-up display and you could have auxiliary controls rendered on the periphery.
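[Queron]’s implementation is in Processing, but the core move, clamping the hand angles to something the body joints can actually reach and shipping them off to the robot, translates to a handful of lines in any language. Here’s a Python illustration, with an invented serial protocol and guessed-at travel limits:

```python
import math
import serial  # pip install pyserial; the port below is a placeholder

link = serial.Serial("/dev/ttyUSB0", 115200)

def clamp_deg(radians, limit=20.0):
    """Convert a Leap angle to degrees and clamp it to safe joint travel."""
    return max(-limit, min(limit, math.degrees(radians)))

def send_pose(pitch, roll, yaw):
    # Hypothetical one-line-per-pose wire format for the hexapod controller
    line = "P{:.1f} R{:.1f} Y{:.1f}\n".format(
        clamp_deg(pitch), clamp_deg(roll), clamp_deg(yaw))
    link.write(line.encode())
```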

While you’re looking at [Queron’s] project post, click on his ‘hexapod’ tag to catch a glimpse of the build process for the robot.

Continue reading “Leap motion controls hexapod with hand signals”

Animating a lamp with the Leap Motion

The Leap Motion is a very cool device, but so far we haven’t seen many applications that use it to interact with physical devices. [Xavier] wanted to control a cute servo-animated desk lamp with his hands, and with the help of a Leap and an Arduino he was able to do just that.

The Leap Motion API has a handy feature that will output all of its data over a websocket. It’s a very easy way to transfer hand positions with a minimum of overhead, and with a little bit of Node.js it takes only two lines of code to connect the Leap to a websocket server.
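The Leap service publishes those JSON frames on a local websocket (port 6437 in SDKs of this era), so the consumer doesn’t have to be Node at all. Here’s a rough Python equivalent using the websocket-client package; note that some SDK releases expect a versioned path appended to the URL:

```python
import json
from websocket import create_connection  # pip install websocket-client

# The Leap daemon serves tracking frames on localhost
ws = create_connection("ws://127.0.0.1:6437")

while True:
    frame = json.loads(ws.recv())
    # The first message is a version handshake with no hand data, so
    # .get() keeps the loop happy until real frames arrive
    for hand in frame.get("hands", []):
        x, y, z = hand["palmPosition"]  # millimetres above the sensor
        print("palm at ({:.0f}, {:.0f}, {:.0f})".format(x, y, z))
```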

With the Leap data on a web server, the only thing left to do is pull it down to an Arduino. Again, [Xavier] used Node.js, this time in the form of johnny-five, a JavaScript-based Arduino framework. After that, it was a simple matter of mapping the data from the Leap to servo movements in [Xavier]’s Pixar-inspired lamp.
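[Xavier]’s mapping lives in johnny-five, but the idea carries over to any servo stack: clamp the palm height to a useful window, then scale it onto the servo’s sweep. A Python stand-in using pyFirmata, where the port, pin, and height range are all assumptions:

```python
from pyfirmata import Arduino

board = Arduino("/dev/ttyACM0")  # placeholder serial port
servo = board.get_pin("d:9:s")   # hobby servo on digital pin 9

def palm_height_to_angle(y_mm, lo=100.0, hi=500.0):
    """Map palm height above the Leap (mm) onto the servo's 0-180 sweep."""
    y = max(lo, min(hi, y_mm))
    return (y - lo) * 180.0 / (hi - lo)

servo.write(palm_height_to_angle(320))  # a mid-height hand lands near 99 degrees
```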

Video of the build below.

Continue reading “Animating a lamp with the Leap Motion”