STL Fun: Converting Images To STL Geometry


There have been some good .STL manipulation tips sent in this week.

The first one is called stl_tools, and it’s a Python library that converts images or text to 3D-printable STL files. The examples shown are quite impressive, and it even does a top-notch job of turning a 2D company logo into 3D! We can see this being quite handy if you need some quick 3D text and either don’t use CAD or really just need a one-click solution. Now if only .STLs were easier to edit afterwards…
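The underlying idea is simple: treat each pixel’s brightness as a height, then tessellate the resulting height map into triangles. Here’s a minimal sketch of that approach in JavaScript (stl_tools itself is Python; the tiny test image and scale factor below are invented for illustration):

```javascript
// Sketch: turn a grayscale height map into an ASCII STL top surface.
// Each 2x2 block of pixels becomes two triangles. Slicers typically
// recompute normals, so they are left as 0 0 0 here.
function heightmapToStl(pixels, width, height, zScale) {
  const z = (x, y) => pixels[y * width + x] * zScale;
  const facet = (a, b, c) =>
    "facet normal 0 0 0\n outer loop\n" +
    [a, b, c].map((v) => `  vertex ${v.join(" ")}\n`).join("") +
    " endloop\nendfacet\n";

  let stl = "solid heightmap\n";
  for (let y = 0; y < height - 1; y++) {
    for (let x = 0; x < width - 1; x++) {
      const p00 = [x, y, z(x, y)];
      const p10 = [x + 1, y, z(x + 1, y)];
      const p01 = [x, y + 1, z(x, y + 1)];
      const p11 = [x + 1, y + 1, z(x + 1, y + 1)];
      stl += facet(p00, p10, p11) + facet(p00, p11, p01);
    }
  }
  return stl + "endsolid heightmap\n";
}

// Invented 3x3 test image: brightness 0-255, scaled to millimeters.
const img = [0, 128, 0, 128, 255, 128, 0, 128, 0];
console.log(heightmapToStl(img, 3, 3, 0.01));
```

A real converter also needs walls and a base to close the surface into a printable solid; this only shows the pixel-to-triangle step.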

The second one is a JavaScript-based Leap Motion Controller STL manipulator, which lets you pick STLs and manipulate them individually with your fingers. If you happen to have a Leap, this could be a great way to show off 3D parts at a presentation or hackerspace talk, especially if you want to add a [Tony Stark] vibe to the whole thing! Stick around after the break to see it in action — now all we need are some good hologram generators…
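If you’re wondering what finger-based picking looks like on the code side, here’s a rough sketch using the leapjs library; the pinch threshold and the stand-in mesh object are assumptions, not details from this project:

```javascript
// Sketch: grab a mesh while pinching, then drag it with the palm.
const Leap = require("leapjs");

// Stand-in for whatever scene object the viewer actually uses.
const selectedMesh = { x: 0, y: 0, z: 0 };
let grabOffset = null;

Leap.loop((frame) => {
  const hand = frame.hands[0];
  if (!hand) { grabOffset = null; return; }

  const [px, py, pz] = hand.palmPosition; // millimeters above the device
  if (hand.pinchStrength > 0.8) {         // pinch = grab; threshold is a guess
    if (!grabOffset) {
      grabOffset = [selectedMesh.x - px, selectedMesh.y - py, selectedMesh.z - pz];
    }
    selectedMesh.x = px + grabOffset[0];
    selectedMesh.y = py + grabOffset[1];
    selectedMesh.z = pz + grabOffset[2];
  } else {
    grabOffset = null;                    // open hand releases the part
  }
});
```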

Continue reading “STL Fun: Converting Images To STL Geometry”

Finally, a practical use for the Leap

Robots used in laparoscopic surgery are fairly commonplace, but controlling them is far from simple. The usual setup is something akin to a Waldo-style manipulator, allowing a surgeon to cut, cauterize, and stitch from across the room. There is another way to go about this thanks to some new hardware, as [Sriranjan] shows us with his Leap-controlled surgery bot.

[Sriranjan] isn’t using a real laparoscopic surgery robot for his experiments. Instead, he’s using the Le-Sur simulator, which puts two virtual robot arms in front of a surgeon in training. Each of these robotic arms has seven degrees of freedom, and by using two Leap controllers (one for each arm, each running in its own VM), [Sriranjan] was able to control both of them with his hands.
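The mapping is easy to picture: each Leap streams hand poses, which get scaled down and applied as one arm’s end-effector target. Here’s a hedged sketch of that structure; the two WebSocket endpoints and the 5:1 motion scaling are assumptions for illustration, not details from [Sriranjan]’s setup:

```javascript
// Sketch: drive two virtual arm targets from two Leap data streams,
// with motion scaling so big hand movements become small tool movements.
const WebSocket = require("ws");

const SCALE = 0.2; // 5:1 scaling; real teleoperation setups tune this

function driveArm(url, arm) {
  const ws = new WebSocket(url);
  ws.on("message", (data) => {
    const frame = JSON.parse(data);
    const hand = frame.hands && frame.hands[0];
    if (!hand) return; // skip service messages without hand data
    const [x, y, z] = hand.palmPosition;
    // The scaled palm position becomes the arm's end-effector setpoint.
    arm.target = { x: x * SCALE, y: y * SCALE, z: z * SCALE };
    console.log(arm.name, "target:", arm.target);
  });
}

// Hypothetical: each VM forwards its own Leap feed on a separate port.
driveArm("ws://127.0.0.1:6437/v6.json", { name: "left" });
driveArm("ws://127.0.0.1:6438/v6.json", { name: "right" });
```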

We’ve seen a lot of creative applications for the Leap sensor, like controlling quadcopters, controlling hexapod robots, and controlling more quadcopters, but this is the first time we’ve seen the Leap do something no other controller can: emulating the delicate touch of a surgeon’s hand.

Continue reading “Finally, a practical use for the Leap”

Crazyflie control with Leap and Kinect


The gang at Bitcraze is at it again, this time developing Leap Motion control for their Crazyflie quadcopter, as well as releasing a Kinect-driven autopilot proof of concept. If you haven’t seen the Crazyflie before, you may not realize how compact it is: 90 mm motor to motor and only 19 grams.

As far as we can tell, the Crazyflie still needs a PC to control it, so the Leap and Kinect are natural follow-ups. Hand control with the Leap Motion is what you’d expect: just imagine your open palm controlling it like a marionette, with the height of your hand dictating thrust. The Kinect setup looks the most promising. The guys strapped a red ball to the Crazyflie to provide a trackable object against a white backdrop. The Kinect then monitors the quadcopter while a user steers via mouse clicks. Separate PID controllers correct the roll, pitch, and thrust to move the Crazyflie from its current coordinates to a new setpoint chosen by a click or a drag. Videos of both Leap and Kinect piloting are below.
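That control loop is a textbook structure: one PID controller per axis, each driving the error between the ball’s tracked position and the clicked setpoint toward zero. A minimal sketch of the idea (the gains and the axis-to-output mapping are placeholders, not Bitcraze’s values):

```javascript
// Sketch: independent PID controllers for roll, pitch, and thrust.
class PID {
  constructor(kp, ki, kd) {
    Object.assign(this, { kp, ki, kd, integral: 0, prevError: 0 });
  }
  update(setpoint, measured, dt) {
    const error = setpoint - measured;
    this.integral += error * dt;
    const derivative = (error - this.prevError) / dt;
    this.prevError = error;
    return this.kp * error + this.ki * this.integral + this.kd * derivative;
  }
}

// Made-up gains; real tuning is per-vehicle.
const rollPid = new PID(0.5, 0.0, 0.1);
const pitchPid = new PID(0.5, 0.0, 0.1);
const thrustPid = new PID(1.2, 0.3, 0.2);

// Each Kinect frame: compare the tracked ball to the clicked setpoint.
function onKinectFrame(tracked, setpoint, dt) {
  return {
    roll: rollPid.update(setpoint.x, tracked.x, dt),
    pitch: pitchPid.update(setpoint.z, tracked.z, dt),
    thrust: thrustPid.update(setpoint.y, tracked.y, dt),
  };
}
```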

Tight on cash but still want to take to the skies? We have two rubber-band-powered devices from earlier this week: the Ornithopter and the hilariously brilliant GoPro Slingshot.

Continue reading “Crazyflie control with Leap and Kinect”

Leap motion controls hexapod with hand signals


Moving your hand makes this hexapod dance like a stringless marionette. Okay, so there’s obviously one string, which is actually a wire, but you know what we mean. The device on the floor is a Leap Motion sensor which is monitoring [Queron Williams’] hand gestures. This is done using a Processing library which leverages the Leap Motion API.

Right now the hand signals only affect pitch, roll, and yaw of the hexapod’s body, but [Queron] does plan to add support for monitoring both hands for more control. Watching the demo after the break, we think this is getting pretty close to the manipulations shown by [Tom Cruise] in Minority Report. Add Google Glass for a heads-up display and you could have auxiliary controls rendered on the periphery.
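For reference, the Leap API exposes exactly those three angles per hand, so the gesture-reading side of a project like this can stay tiny. A sketch using the leapjs library (the sendToHexapod() function is a stand-in for the project’s actual Processing/serial link):

```javascript
// Sketch: read hand pitch/roll/yaw from the Leap and forward them
// as body-pose commands for the hexapod.
const Leap = require("leapjs");

function sendToHexapod(pose) {
  console.log("pose:", pose); // placeholder: write to the serial port here
}

Leap.loop((frame) => {
  const hand = frame.hands[0];
  if (!hand) return;
  sendToHexapod({
    pitch: hand.pitch(), // radians, nose up/down
    roll: hand.roll(),   // radians, tilt left/right
    yaw: hand.yaw(),     // radians, turn left/right
  });
});
```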

While you’re looking at [Queron’s] project post, click on his ‘hexapod’ tag to catch a glimpse of the build process for the robot.

Continue reading “Leap motion controls hexapod with hand signals”

Animating a lamp with the Leap Motion


The Leap Motion is a very cool device, but so far we haven’t seen many applications where it interacts with physical devices. [Xavier] wanted to control a cute servo-animated desk lamp with his hands, and with the help of a Leap and an Arduino he was able to do just that.

The Leap Motion API has a handy feature that will output all of its data over a WebSocket. It’s a very easy way to transfer hand positions with a minimum of overhead, and with just a little bit of Node.js, it’s only two lines of code to connect the Leap to a WebSocket server.
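Something along these lines, using the ws package; the local port and JSON path below are the endpoint the Leap service conventionally serves, but treat them as assumptions:

```javascript
// Sketch: subscribe to the Leap service's local WebSocket feed.
const WebSocket = require("ws");
const leap = new WebSocket("ws://127.0.0.1:6437/v6.json"); // assumed endpoint

leap.on("message", (data) => {
  const frame = JSON.parse(data);
  if (frame.hands && frame.hands.length) {
    console.log("palm position:", frame.hands[0].palmPosition);
  }
});
```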

With the Leap data coming over a WebSocket, the only thing left to do is pull it down to an Arduino. Again, [Xavier] used Node.js, this time in the form of johnny-five, a JavaScript-based Arduino framework. After that, it was a simple matter of mapping the data from the Leap to servo movements in [Xavier]’s Pixar-inspired lamp.
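The johnny-five side might look roughly like this; the pin number and the height-to-angle mapping are invented for the sketch, not taken from [Xavier]’s code:

```javascript
// Sketch: map Leap palm height to a hobby servo angle via johnny-five.
const five = require("johnny-five");
const Leap = require("leapjs");

const board = new five.Board();

board.on("ready", () => {
  const servo = new five.Servo(9); // pin 9 is an arbitrary choice

  Leap.loop((frame) => {
    const hand = frame.hands[0];
    if (!hand) return;
    // Clamp palm height to roughly 100-400 mm, then map to 0-180 degrees.
    const y = Math.min(400, Math.max(100, hand.palmPosition[1]));
    servo.to(((y - 100) / 300) * 180);
  });
});
```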

Video of the build below.

Continue reading “Animating a lamp with the Leap Motion”

3D display controlled with the Leap Motion


Touch screens are nice; we still can’t live without a keyboard, but they suffice when on the go. It is becoming obvious, though, that the end goal of user interface design is to completely remove the need to touch a piece of hardware in order to interact with it. One avenue toward this goal is voice commands via software like Siri; another is 3D sensing hardware like the Kinect or Leap Motion. This project uses the latter to control the image shown on a 3D display.

[Robbie Tilton] generated a 3D image using Three.js, a JavaScript 3D library. The images are made to appear as if floating in air using a pyramid of acrylic which reflects the light toward the viewer’s eyes without blocking out ambient light in the room. In the past we’ve referred to this as a volumetric display. But [Robbie] points out that this actually uses the illusion called Pepper’s Ghost. It’s not really volumetric because the depth is merely an illusion. Moving your point of view won’t change your perspective unless you go around the corner to the next piece of acrylic. But it’s still a nice effect. See for yourself in the demo after the jump.
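For the curious, three.js makes the several-views-around-a-pyramid trick fairly painless (its examples even ship a ready-made PeppersGhostEffect). The sketch below approximates it by rendering one scene from four cameras into the quadrants of a single canvas; the geometry, camera distance, and 2x2 layout are simplifications for illustration:

```javascript
// Sketch: render one scene from four cameras, one per pyramid face.
import * as THREE from "three";

const SIZE = 400; // each view is 400x400 pixels
const renderer = new THREE.WebGLRenderer();
renderer.setSize(SIZE * 2, SIZE * 2);
renderer.setScissorTest(true);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

// Four cameras spaced 90 degrees apart, all looking at the object.
const cameras = [0, 90, 180, 270].map((deg) => {
  const rad = (deg * Math.PI) / 180;
  const cam = new THREE.PerspectiveCamera(50, 1, 0.1, 100);
  cam.position.set(4 * Math.sin(rad), 0, 4 * Math.cos(rad));
  cam.lookAt(scene.position);
  return cam;
});

function render() {
  requestAnimationFrame(render);
  cube.rotation.y += 0.01;
  // A real pyramid layout arranges the views in a plus shape; a 2x2
  // grid keeps the sketch short.
  cameras.forEach((cam, i) => {
    const x = (i % 2) * SIZE;
    const y = Math.floor(i / 2) * SIZE;
    renderer.setViewport(x, y, SIZE, SIZE);
    renderer.setScissor(x, y, SIZE, SIZE);
    renderer.render(scene, cam);
  });
}
render();
```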

Continue reading “3D display controlled with the Leap Motion”

Air Harp using the Leap Motion


He’s just pointing in this image, but this Air Harp can be played using many fingers at once. It’s a demonstration that [Adam Somers] threw together in one weekend while working with the Leap Motion developer board. We first heard about this slick piece of hardware back in May, and from the looks of it, this is every bit as amazing as first reported.

Part of what made the project come together so quickly is that [Adam] had already developed a package called muskit, a C++ toolkit for making music applications. It puts in place the framework for what we hear in the video after the break. The weekend of hacking makes use of the positional data from the Leap Motion and handles how your digits interact with the virtual strings. You can watch as [Adam] adds more and more strings to the virtual instrument for his fingers to interact with. The distance from the screen is what decides whether your finger will pluck or not; a red circle indicates when your fingertip is close enough to interact with the phantom string.
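Pluck detection like this boils down to noticing when a fingertip crosses a string’s position while inside the pluck zone. Here’s a rough browser-side sketch using leapjs and the Web Audio API; the string positions, zone threshold, and tuning are all invented (muskit itself is C++):

```javascript
// Sketch: pluck a virtual string when a fingertip crosses its x position
// while the tip is close to the screen (small z). Assumes the browser
// build of leapjs is loaded via a <script> tag.
const audio = new AudioContext();
const strings = [-80, -40, 0, 40, 80].map((x, i) => ({
  x,                              // string x positions in mm (invented)
  freq: 220 * Math.pow(2, i / 5), // arbitrary tuning
}));
const lastX = {};                 // previous x position per finger id

function pluck(freq) {
  const osc = audio.createOscillator();
  const gain = audio.createGain();
  osc.frequency.value = freq;
  osc.connect(gain).connect(audio.destination);
  gain.gain.setValueAtTime(0.3, audio.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audio.currentTime + 0.5);
  osc.start();
  osc.stop(audio.currentTime + 0.5);
}

Leap.loop((frame) => {
  for (const p of frame.pointables) {
    const [x, , z] = p.tipPosition;
    const prev = lastX[p.id];
    if (prev !== undefined && z < 50) {                  // 50 mm pluck zone (a guess)
      for (const s of strings) {
        if ((prev - s.x) * (x - s.x) < 0) pluck(s.freq); // crossed the string
      }
    }
    lastX[p.id] = x;
  }
});
```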

Get your hands on the code from his repositories.

Continue reading “Air Harp using the Leap Motion”