Did [TobiasWeis] build a mirror that’s better at reflecting his image? No, he did not. Did he build a mirror that’s better at reflecting himself? We think so. In addition to these philosophical enhancements, the build itself is really nice.
The display is a Samsung LCD panel with its inconvenient plastic husk torn away and replaced with a new frame made of wood. We like the use of quickly made 3D printed brackets to hold the wood at a perfect 90 degrees while drilling the holes for the butt joints. Some time with glue, band clamps, and a few layers of paint and the frame was ready. He tried the DIY route for the two-way mirror, but decided to just order a glass one after some difficulty with bubbles and scratches.
A smart mirror needs an interface, but unless you own stock in Windex (glass cleaner), it is nice to have a way to keep it from turning into an OCD sufferer’s worst nightmare. This is, oddly, the first justification for the Leap Motion controller we can really buy into. Now, using the mirror does not involve touching the screen. [Tobias] initially thought to use a Raspberry Pi, but instead opted for a mini-computer that had been banging around a closet for a year or two. It had way more go power, and wouldn’t require him to hack drivers for the Leap Motion on the ARM version of Linux.
After that it was coding and installing modules. He goes into a bit of detail about it as well as his future plans. Our favorite is programming the mirror to show a scary face if you say “bloody mary” three times in a row.
[DerVonDenBergen] and his friend are working on a pretty slick mirror LCD with motion control called Reflecty — it looks like something straight out of the Iron Man movies or Minority Report.
Like most mirror monitors, it starts with a two-way mirror and a de-bezelled LCD. They then added what looks like an art gallery light mounted on top, but instead of a light bulb, the arm holds a Leap Motion controller, allowing gesture commands to be given to the computer.
The effective range of the Leap Motion controller is about 8-10″ in front of the display allowing you to reach out and point at exactly what you want — and then squeeze your fist to click. A complete gallery of images is available over on Imgur, but stick around after the break to see a video of the display in action — we kind of want one.
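The point-and-squeeze interaction described above boils down to watching how closed the hand is and firing a click when it crosses a threshold. Here's a minimal sketch of that logic, assuming a tracker that reports a 0.0–1.0 grab value per frame (the Leap Motion SDK exposes a similar `grab_strength` property); the names and thresholds are illustrative, not from Reflecty's actual code:

```python
# Fist-to-click detection over a stream of per-frame grab values.
# Hysteresis (two thresholds) keeps a single squeeze from registering
# as a burst of clicks while the value hovers near the threshold.

GRAB_CLOSE = 0.8   # fist considered closed above this
GRAB_OPEN = 0.3    # fist must reopen below this before the next click

def detect_clicks(grab_strengths):
    """Return the frame indices where a 'click' (fist squeeze) begins."""
    clicks = []
    closed = False
    for i, g in enumerate(grab_strengths):
        if not closed and g >= GRAB_CLOSE:
            clicks.append(i)
            closed = True
        elif closed and g <= GRAB_OPEN:
            closed = False   # hand reopened; ready for the next click
    return clicks

print(detect_clicks([0.1, 0.5, 0.9, 0.95, 0.4, 0.2, 0.85]))  # → [2, 6]
```

Note that the frame at index 4 (0.4) doesn't reset the state: the hand has to open well past the close threshold before another click is accepted.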
Continue reading “Mirror Monitor Responds To Your Gestures”
The Leap Motion controller is a rather impressive little sensor bar that is capable of generating a massive 3D point cloud and recognizing hands and fingers to allow for gesture control based computing. It’s been out for a few years now but we haven’t seen many hackers playing with it. [Anwaarullah] has messed around with it before, but when it came time to submit something for India’s first Maker Faire, he decided to try doing an actual project with it.
Checking out the latest Leap Motion SDK, [Anwaarullah] realized many improvements had been made and he’d have to rewrite some of his original code to reflect the changes. This time around he’s opted to use the ESP8266 WiFi module instead of a Bluetooth one. He printed off a Raptor hand (from the wonderful folks at e-NABLE) and hooked it up with some RC servos to give him a nice robotic hand to control.
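On the PC side, a build like this mostly amounts to mapping each finger's curl from the Leap data onto a servo angle and shipping the result to the ESP8266. A hypothetical sketch of that mapping follows; the angle range and the comma-separated message format are assumptions for illustration, not [Anwaarullah]'s actual protocol:

```python
# Map per-finger curl values (0.0 = open, 1.0 = fully curled) onto
# hobby-servo angles, then build a simple text command an ESP8266
# sketch could parse and apply to the Raptor hand's servos.

SERVO_MIN = 0    # degrees, finger fully open
SERVO_MAX = 170  # degrees, finger fully curled

def fingers_to_servo_angles(curls):
    """Convert a list of curl values in [0.0, 1.0] to servo angles."""
    angles = []
    for c in curls:
        c = min(max(c, 0.0), 1.0)   # clamp noisy tracking data
        angles.append(round(SERVO_MIN + c * (SERVO_MAX - SERVO_MIN)))
    return angles

def make_packet(curls):
    """Build a newline-terminated, comma-separated command string."""
    return ",".join(str(a) for a in fingers_to_servo_angles(curls)) + "\n"

print(make_packet([0.0, 0.5, 1.0, 0.2]))  # → "0,85,170,34\n"
```

Clamping before scaling matters here: hand trackers occasionally report out-of-range values when fingers occlude each other, and an unclamped value would slam a servo past its mechanical stop.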
Continue reading “Leap Motion Wirelessly Controlling a Prosthetic Hand With an Arduino”
[Matt], [Andrew], [Noah], and [Tim] have a pretty interesting build for their capstone project at Ohio Northern University. They’re using a Microsoft Kinect and a Leap Motion to create a natural user interface for controlling humanoid robots.
The robot the team is using for this project is a tracked humanoid robot they’ve affectionately come to call Johnny Five. Johnny takes commands from a computer, Kinect, and Leap Motion to move the chassis, arm, and gripper around in a way that’s somewhat natural, and surely a lot easier than controlling a humanoid robot with a keyboard.
The team has also released all their software onto GitHub under an open source license. You can grab that over on the Gits, or take a look at some of the pics and videos from the Columbus Mini Maker Faire.
Let’s face it, most of the time we’re hacking for no other reason than sheer enjoyment. So we love to see hacks come about that can really make a difference in people’s lives. This time around it’s a video game designed to exercise your eyes. [James Blaha] has an eye condition called Strabismus which is commonly known as crossed-eye. The issue is that the muscles for each eye don’t coordinate with each other in the way they need to in order to produce three-dimensional vision.
Recent research (linked in the reference section of [James’] post) suggests that special exercises may be able to train the eyes to work correctly. He’s been working on developing a video game to promote this type of training. As you can see above, the user (patient?) wears an Oculus Rift headset which makes it possible to show each eye slightly different images, while using a Leap Motion controller for VR interaction. If designed correctly, and paired with the addictive qualities of games, this may be just what the doctor ordered. You know what they say, practice makes perfect!
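Showing each eye a slightly different image is the heart of this kind of training: the dominant eye's view is typically dimmed or reduced in contrast so the weaker eye is forced to contribute. A tiny illustrative sketch of that per-eye split follows; the gain value and function names are assumptions, not from [James’] actual game:

```python
# Produce two frames from one source image: the weak eye gets the full
# signal, while the dominant eye's frame is attenuated so the brain
# can't rely on it alone.

def per_eye_frames(pixels, dominant_gain=0.4):
    """Given grayscale pixel values 0-255, return (weak_eye, strong_eye)
    frames, dimming the dominant eye's copy by dominant_gain."""
    weak = list(pixels)                                  # full signal
    strong = [round(p * dominant_gain) for p in pixels]  # dimmed copy
    return weak, strong

print(per_eye_frames([0, 100, 255]))  # → ([0, 100, 255], [0, 40, 102])
```

In a real headset renderer the same idea would apply per eye buffer rather than per pixel list, and the gain would be tuned (and gradually raised) per patient.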
Continue reading “Video Gaming to Fix Eye Ailments”
There have been some good .STL manipulation tips coming in this week.
The first one is called stl_tools, and it’s a Python library to convert images or text to 3D-printable STL files. The examples shown are quite impressive, and it even does a top notch job of turning a 2D company logo into 3D! We can see this being quite handy if you need some quick 3D text, and either don’t use CAD, or really just need a one click solution. Now if only .STLs were easier to edit afterwards…
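The core idea behind this kind of conversion is simple: treat each pixel's intensity as a height, then tessellate the resulting surface into triangles. Here's a minimal, self-contained sketch of that heightmap-to-STL step (not stl_tools' actual implementation, which also handles scaling, smoothing, and solid bases):

```python
# Convert a 2D grid of heights into an ASCII STL string. Each grid cell
# becomes two triangles; normals are emitted as zero vectors, which most
# slicers will simply recompute from the vertex winding.

def heightmap_to_stl(grid, name="surface"):
    rows, cols = len(grid), len(grid[0])
    facets = []
    for y in range(rows - 1):
        for x in range(cols - 1):
            # Four corners of this cell, with Z taken from the heightmap
            a = (x,     y,     grid[y][x])
            b = (x + 1, y,     grid[y][x + 1])
            c = (x,     y + 1, grid[y + 1][x])
            d = (x + 1, y + 1, grid[y + 1][x + 1])
            for tri in ((a, b, c), (b, d, c)):
                lines = ["  facet normal 0 0 0", "    outer loop"]
                lines += ["      vertex %g %g %g" % v for v in tri]
                lines += ["    endloop", "  endfacet"]
                facets.append("\n".join(lines))
    return "solid %s\n%s\nendsolid %s\n" % (name, "\n".join(facets), name)

stl = heightmap_to_stl([[0, 1], [0, 2]])
print(stl.count("facet normal"))  # → 2 (one cell, two triangles)
```

A real image would first be loaded into such a grid (e.g. grayscale values scaled to millimeters), which is exactly the step where a logo's bright and dark regions turn into raised and recessed geometry.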
Continue reading “STL Fun: Converting Images To STL Geometry”
Robots used in laparoscopic surgery are fairly commonplace, but controlling them is far from simple. The usual setup is something akin to a Waldo-style manipulator, allowing a surgeon to cut, cauterise, and stitch from across a room. There is another way to go about this thanks to some new hardware, as [Sriranjan] shows us with his Leap-controlled surgery bot.
[Sriranjan] isn’t using a real laparoscopic surgery robot for his experiments. Instead, he’s using the Le-Sur simulator that puts two virtual robot arms in front of a surgeon in training. Each of these robotic arms has seven degrees of freedom, and by using two Leap controllers (one in each of two virtual machines), [Sriranjan] was able to control both of them using his hands.
We’ve seen a lot of creative applications for the Leap sensor, like controlling quadcopters, controlling hexapod robots, and controlling more quadcopters, but this is the first time we’ve seen the Leap do something no other controller can: emulating the delicate touch of a surgeon’s hand.
Continue reading “Finally, a practical use for the Leap”