Robots used in laparoscopic surgery are fairly commonplace, but controlling them is far from simple. The usual setup is something akin to a Waldo-style manipulator, allowing a surgeon to cut, cauterise, and stitch from across a room. There is another way to go about this thanks to some new hardware, as [Sriranjan] shows us with his Leap-controlled surgery bot.
[Sriranjan] isn’t using a real laparoscopic surgery robot for his experiments. Instead, he’s using the Le-Sur simulator, which puts two virtual robot arms in front of a surgeon in training. Each of these robotic arms has seven degrees of freedom, and by using two Leap controllers (one each in a VM), [Sriranjan] was able to control both of them with his hands.
We’ve seen a lot of creative applications for the Leap sensor, like controlling quadcopters, controlling hexapod robots, and controlling more quadcopters, but this is the first time we’ve seen the Leap do something no other controller can – emulating the delicate touch of a surgeon’s hand.
Continue reading “Finally, a practical use for the Leap”
He’s just pointing in this image, but this Air Harp can be played using many fingers at once. It’s a demonstration that [Adam Somers] threw together in one weekend while working with the Leap Motion developer board. We first heard about this slick piece of hardware back in May, and from the looks of it, it’s every bit as amazing as first reported.
Part of what made the project come together so quickly is that [Adam] had already developed a package called muskit, a C++ toolkit for building music applications. It provides the framework for what we hear in the video after the break. The weekend of hacking makes use of the positional data from the Leap Motion and handles how your digits interact with the virtual strings. You can watch as [Adam] adds more and more strings to the virtual instrument for his fingers to interact with. The distance from the screen is what decides whether your finger will pluck or not; a red circle indicates when your fingertip is close enough to interact with the phantom string.
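The pluck logic described above can be sketched in a few lines. This is a plain-Python illustration with hypothetical coordinates and a made-up depth threshold, not [Adam]'s actual muskit/Leap SDK code: a finger only engages a string once its tip crosses a depth plane, and any strings it sweeps past while engaged get plucked.

```python
# Sketch of the Air Harp's pluck logic. The threshold value and the
# (x, z) coordinate convention are assumptions for illustration.

PLUCK_DEPTH_MM = 30.0  # hypothetical depth at which the red circle appears


def can_pluck(tip_z_mm):
    """True when the fingertip is close enough to engage a string."""
    return tip_z_mm <= PLUCK_DEPTH_MM


def strings_crossed(prev_x, cur_x, string_xs):
    """Return the x-positions of virtual strings swept across this frame."""
    lo, hi = sorted((prev_x, cur_x))
    return [x for x in string_xs if lo < x <= hi]


def update(prev_tip, cur_tip, string_xs):
    """Pluck any strings the fingertip crossed while engaged.

    Each tip is an (x, z) pair: x across the strings, z the depth
    toward the screen.
    """
    if can_pluck(cur_tip[1]):
        return strings_crossed(prev_tip[0], cur_tip[0], string_xs)
    return []
```

Run per frame for each tracked fingertip, this is why adding more strings in the video just means adding more entries to `string_xs`.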
Get your hands on the code from his repositories.
Continue reading “Air Harp using the Leap Motion”
A few folks over at National Instruments, working under the name LabVIEW Hacker, have gotten their hands on a Leap Motion dev kit. The Leap is an interesting little input device designed to track fingertips in 3D space, much like a Kinect but at much higher resolution. Needing something to show off their LabVIEW prowess, these guys controlled their office AR Drone with the Leap, making a quadcopter controller that is completely touchless.
Building on their previous AR Drone hack, the LabVIEW team spent the better part of a day writing wrappers around the Leap SDK and adding in control for their RC quadcopter. Now, simply by moving their fingertips over the Leap sensor, they can fly their office quadrotor with a very high-resolution 3D scanner.
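The idea of a touchless controller boils down to mapping a hand position over the sensor onto the drone's flight axes. The team's actual implementation is LabVIEW wrapping the Leap SDK; the Python sketch below only illustrates the mapping, and the axis conventions, hover height, and span values are all assumptions.

```python
# Sketch: map a palm position (millimetres above the sensor) to
# normalised roll/pitch/throttle commands in [-1, 1]. All constants
# are hypothetical tuning values, not from the LabVIEW project.


def clamp(v, lo=-1.0, hi=1.0):
    """Limit a command to the valid control range."""
    return max(lo, min(hi, v))


def palm_to_drone(palm_mm, hover_height_mm=200.0, span_mm=100.0):
    """Convert an (x, y, z) palm position into drone commands.

    x: left/right over the sensor -> roll
    z: toward/away from the user  -> pitch
    y: height above the sensor    -> throttle, centred on hover height
    """
    x, y, z = palm_mm
    return {
        "roll": clamp(x / span_mm),
        "pitch": clamp(-z / span_mm),
        "throttle": clamp((y - hover_height_mm) / span_mm),
    }
```

Holding a palm steady at the hover height yields all-zero commands, so the quadcopter holds position until the hand moves.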
Video after the break.
Continue reading “Controlling a quadcopter with a Leap Motion”
The big news circulating this morning is of the Leap Motion sensor that will be hitting the market soon. They claim that their sensor is ~100 times more accurate than anything else on the market right now. Check out the video to see what they mean (there’s another at the link). This is pretty impressive. You can see the accuracy as they use a stylus to write things. If you’ve played with the Kinect, you know that it is nowhere near this tight. Of course, the Kinect is scanning a massive space compared to the work area that this new sensor works in. The response time looks very impressive as well; motions seem to be perfectly in sync with the input. We’re excited to play with one when we get a chance.
Continue reading “Hackit: Leap Motion’s new motion sensor”