Leap Motion just dropped what may be the biggest tease in Augmented and Virtual Reality since Google Cardboard. The North Star is an augmented reality head-mounted display that boasts some impressive specs:
- Dual 1600×1440 LCDs
- 120Hz refresh rate
- 100-degree FOV
- Cost under $100 (in volume)
- Open Source Hardware
- Built-in Leap Motion camera for precise hand tracking
Yes, you read that last line correctly. The North Star will be open source hardware. Leap Motion is planning to drop all the hardware information next week.
Now that we’ve got you excited, let’s mention what the North Star is not — it’s not a consumer device. Leap Motion’s idea here was to create a platform for developing Augmented Reality experiences — the user interface and interaction aspects. To that end, they built the best head-mounted display they could on a budget. The company started with standard 5.5″ cell phone displays, which made for an incredibly high-resolution but low-framerate (50 Hz) device. It was also large and completely impractical.
The current iteration of the North Star uses much smaller displays, which results in a higher frame rate and a better overall experience. The secret sauce seems to be Leap’s use of ellipsoidal mirrors to achieve a large FOV while maintaining focus.
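The optics lean on a textbook property of the ellipsoid: any ray leaving one focus reflects through the other, so a display placed near one focus can be relayed to an eye near the other while staying in focus across a wide field. Here's a quick numerical sanity check of that reflection property (the semi-axes are arbitrary illustrative values, not the North Star's actual mirror figure):

```python
import math

# Ellipse semi-axes (arbitrary illustrative values, not the real mirror figure)
a, b = 5.0, 3.0
c = math.sqrt(a*a - b*b)            # distance from center to each focus
F1, F2 = (-c, 0.0), (c, 0.0)        # the two foci

def reflect_through(t):
    """Fire a ray from focus F1 at the ellipse point P(t) and reflect it;
    return the miss distance of the reflected ray from focus F2."""
    px, py = a*math.cos(t), b*math.sin(t)
    # Surface normal is the gradient of x^2/a^2 + y^2/b^2, normalized
    nx, ny = px/(a*a), py/(b*b)
    nlen = math.hypot(nx, ny)
    nx, ny = nx/nlen, ny/nlen
    # Incident unit direction from F1 to P
    dx, dy = px - F1[0], py - F1[1]
    dlen = math.hypot(dx, dy)
    dx, dy = dx/dlen, dy/dlen
    # Mirror reflection: r = d - 2(d.n)n
    dot = dx*nx + dy*ny
    rx, ry = dx - 2*dot*nx, dy - 2*dot*ny
    # 2D cross product of (F2 - P) with r: zero means the ray hits F2
    vx, vy = F2[0] - px, F2[1] - py
    return abs(vx*ry - vy*rx)

# Every ray from one focus converges on the other, regardless of angle
for t in (0.3, 1.2, 2.5, 4.0):
    assert reflect_through(t) < 1e-9
print("all rays from F1 reflect through F2")
```

That focusing-for-free behavior is what lets a cheap flat panel near one focus fill a huge field of view without a stack of corrective lenses.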
We’re excited, but also a bit wary of the $100 price point — Leap Motion is quick to note that the price is “in volume”. They also mention using diamond-tipped tooling on a vibration-isolated lathe to grind the mirrors down. If Leap hasn’t invested in some injection molding, those parts are going to make the whole thing expensive. Keep your eyes on the blog here for more information as soon as we have it!
You could have said this at any time in the last couple of decades: the world of virtual reality peripherals does not yet feel as though it has fulfilled its potential. From the Amiga-powered Virtuality headsets and nausea-inducing Nintendo Virtual Boy of the 1990s to today’s crop of advanced headsets and peripherals, there has always been a sense that we’re not quite there yet. Moments at which the shortcomings of the hardware intrude into the virtual world may be less frequent with the latest products, but still the goal of virtual world immersion seems elusive at times.
One of the more interesting peripherals on the market today is the Leap Motion controller. This is a USB device containing infrared illuminators and cameras, which provide enough resolution for its software to accurately calculate the position of a user’s hands and fingers in three-dimensional space. This ability to track individual finger movement makes it a controller for really complex interactions with and manipulations of objects in virtual worlds.
Even the Leap Motion has its shortcomings though, moments at which it ceases to be able to track. Rotating your hand, as you might for instance when aiming a virtual in-game weapon, confuses it. This led [Florian Maurer] to seek his own solution, and he’s come up with a hand peripheral containing a rotation sensor.
Inspired by a movie prop from the film Ender’s Game, it is a 3D-printed device that clips onto the palm of his hand between thumb and index finger. It contains both an Arduino Pro Micro and a BNO055 rotation sensor, plus a couple of buttons for in-game actions such as triggers. It solves the problem with the Leap Motion’s rotation detection, and does not impede hand movement so much that he can’t also use his keyboard and mouse while wearing it. Sadly he does not yet seem to have posted any code, but he does treat us to a video demonstration which we’ve posted below the break.
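The BNO055 is a good fit for this job because it runs its sensor fusion onboard and reports absolute orientation directly as a quaternion, in fixed-point registers where one LSB is 2^-14. Host-side, all that's left is converting that quaternion into the roll, pitch, and yaw a game expects. A minimal sketch, assuming the usual ZYX Tait-Bryan convention:

```python
import math

BNO055_QUAT_SCALE = 1.0 / (1 << 14)   # BNO055 quaternion register LSB = 2^-14

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in degrees (ZYX Tait-Bryan)."""
    roll  = math.atan2(2*(w*x + y*z), 1 - 2*(x*x + y*y))
    sinp  = max(-1.0, min(1.0, 2*(w*y - z*x)))   # clamp against rounding error
    pitch = math.asin(sinp)
    yaw   = math.atan2(2*(w*z + x*y), 1 - 2*(y*y + z*z))
    return tuple(math.degrees(v) for v in (roll, pitch, yaw))

# Raw fixed-point values as the sensor might report them (identity rotation)
raw = (16384, 0, 0, 0)
w, x, y, z = (v * BNO055_QUAT_SCALE for v in raw)
print(quat_to_euler(w, x, y, z))      # → (0.0, 0.0, 0.0)
```

Fusing this orientation with the Leap's positional data is exactly the patch over the controller's rotation blind spot described above.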
Continue reading “VR Feels More Real with Leap Motion and This Rotation Sensor”
[Florian] wants to browse the web like an internet cowboy from a cyberpunk novel. Unfortunately, VR controllers are great for games but really incapacitate a hand for typing. A new input method was needed, one that would free his fingers for typing, but still give his hands detailed input into the virtual world.
Since VR goggles have… hopefully… already reached peak ridiculousness, his first idea was to glue a Leap Motion controller to the front of it. It couldn’t look any sillier after all. The Leap controller was designed to track hands, and when combined with the IMU built into the VR contraption, did a pretty good job of putting his hands into the world. Unfortunately, the primary gesture used for a “click” was only registering 80% of the time.
The gesture in question is a pinching motion, pushing the thumb and middle finger together. He couldn’t involve a big button without incapacitating his hands for typing. It took a few iterations, but he arrived at a compact ring design with a momentary switch on it. This is connected to an Arduino on his wrist, but was out of the way enough to allow him to type.
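For contrast, the software side of that pinch "click" usually comes down to thresholding the thumb-to-fingertip distance, with two thresholds (hysteresis) so jitter near the boundary doesn't fire repeated clicks. A minimal, self-contained sketch — the coordinates and thresholds are invented for illustration, not actual Leap SDK calls:

```python
import math

class PinchDetector:
    """Edge-triggered pinch 'click' with hysteresis:
    press below 25 mm, release only above 35 mm."""
    PRESS_MM, RELEASE_MM = 25.0, 35.0

    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, finger_tip):
        """Feed one frame of fingertip positions (x, y, z in mm);
        return True only on the frame a new pinch begins."""
        dist = math.dist(thumb_tip, finger_tip)
        if not self.pinching and dist < self.PRESS_MM:
            self.pinching = True
            return True
        if self.pinching and dist > self.RELEASE_MM:
            self.pinching = False
        return False

det = PinchDetector()
frames = [((0,0,0),(60,0,0)), ((0,0,0),(20,0,0)),   # approach, then pinch
          ((0,0,0),(30,0,0)),                       # jitter inside the band: no re-click
          ((0,0,0),(50,0,0)), ((0,0,0),(10,0,0))]   # release, pinch again
clicks = [det.update(t, f) for t, f in frames]
print(clicks)                                       # → [False, True, False, False, True]
```

Even with debouncing like this, occlusion means the sensor sometimes never sees the pinch at all, which is why a physical switch on a ring ended up being the more reliable answer.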
It’s yet another development marching us toward usable VR. We personally can’t wait until we can use some technology straight out of a Stephenson or Gibson novel.
Did [TobiasWeis] build a mirror that’s better at reflecting his image? No, he did not. Did he build a mirror that’s better at reflecting himself? We think so. In addition to these philosophical enhancements, the build itself is really nice.
The display is a Samsung LCD panel with its inconvenient plastic husk torn away and replaced with a new frame made of wood. We like the use of quickly made 3D printed brackets to hold the wood at a perfect 90 degrees while drilling the holes for the butt joints. Some time with glue, band clamps, and a few layers of paint and the frame was ready. He tried the DIY route for the two-way mirror, but decided to just order a glass one after some difficulty with bubbles and scratches.
A smart mirror needs an interface, but unless you own stock in Windex (glass cleaner), it is nice to have a way to keep it from turning into an OCD sufferer’s worst nightmare. This is, oddly, the first justification for the Leap Motion controller we can really buy into. Now, using the mirror does not involve touching the screen. [Tobias] initially thought to use a Raspberry Pi, but instead opted for a mini-computer that had been banging around a closet for a year or two. It had way more go power, and wouldn’t require him to hack drivers for the Leap Motion on the ARM version of Linux.
After that it was coding and installing modules. He goes into a bit of detail about it as well as his future plans. Our favorite is programming the mirror to show a scary face if you say “Bloody Mary” three times in a row.
[DerVonDenBergen] and his friend are working on a pretty slick mirror LCD with motion control called Reflecty — it looks like something straight out of the Iron Man movies or Minority Report.
Like most mirror monitors, it starts with a two-way mirror and a de-bezelled LCD, but then they added what looks like an art gallery light off the top. Instead of a light bulb, the arm holds a Leap Motion controller, allowing gesture commands to be given to the computer.
The effective range of the Leap Motion controller is about 8-10″ in front of the display allowing you to reach out and point at exactly what you want — and then squeeze your fist to click. A complete gallery of images is available over on Imgur, but stick around after the break to see a video of the display in action — we kind of want one.
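Turning "reach out and point" into a cursor position is conceptually simple: normalize the palm position inside a working volume in front of the display and scale it to screen pixels, much as the Leap SDK's InteractionBox does. A rough sketch, with the working-volume dimensions being our own guess rather than Reflecty's actual values:

```python
def palm_to_cursor(palm_mm, screen=(1920, 1080),
                   x_range=(-120.0, 120.0), y_range=(100.0, 340.0)):
    """Map a palm position (x, y, z in mm, Leap-style axes: y up, origin at
    the sensor) onto screen pixel coordinates. The ranges are illustrative,
    roughly a 24 cm wide working volume in front of the controller."""
    x, y, _z = palm_mm
    nx = (x - x_range[0]) / (x_range[1] - x_range[0])
    ny = (y - y_range[0]) / (y_range[1] - y_range[0])
    nx = min(1.0, max(0.0, nx))           # clamp to the working volume
    ny = min(1.0, max(0.0, ny))
    # Screen y grows downward while Leap y grows upward, so flip vertically
    return round(nx * (screen[0] - 1)), round((1.0 - ny) * (screen[1] - 1))

print(palm_to_cursor((0.0, 220.0, 50.0)))   # hand centered → (960, 540)
```

Pair that with a grab-strength threshold for the fist-squeeze click and you have the whole interaction loop the video shows.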
Continue reading “Mirror Monitor Responds To Your Gestures”
The Leap Motion controller is a rather impressive little sensor bar that is capable of generating a massive 3D point cloud and recognizing hands and fingers to allow for gesture control based computing. It’s been out for a few years now but we haven’t seen many hackers playing with it. [Anwaarullah] has messed around with it before, but when it came time to submit something for India’s first Maker Faire, he decided to try doing an actual project with it.
Checking out the latest Leap Motion SDK, [Anwaarullah] realized many improvements had been made and he’d have to rewrite some of his original code to reflect the changes. This time around he’s opted to use the ESP8266 WiFi module instead of a Bluetooth one. He printed off a Raptor hand (from the wonderful folks at e-NABLE) and hooked it up with some RC servos to give him a nice robotic hand to control.
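The host-side mapping for a rig like this is straightforward: the Leap SDK reports how extended each finger is, and each finger's value scales to a servo angle that gets shipped off to the ESP8266. A hedged sketch — the 0-to-1 extension input, the servo limits, and the one-line wire format are all our own assumptions, not [Anwaarullah]'s actual protocol:

```python
def finger_to_servo(extended_fraction, servo_min=10, servo_max=170):
    """Map a finger's extension (0.0 = fully curled, 1.0 = straight) onto a
    servo angle in degrees, leaving margin at both ends of travel."""
    f = min(1.0, max(0.0, extended_fraction))
    return round(servo_min + f * (servo_max - servo_min))

def frame_to_payload(fingers):
    """Pack five finger extensions into one line an ESP8266 firmware could
    parse, e.g. 'S 170 90 10 10 10' (this format is our own invention)."""
    angles = [finger_to_servo(f) for f in fingers]
    return "S " + " ".join(str(a) for a in angles) + "\n"

# Open hand vs. pointing gesture (finger order: thumb, index, middle, ring, pinky)
print(frame_to_payload([1.0, 1.0, 1.0, 1.0, 1.0]))   # all servos driven to 170
print(frame_to_payload([0.0, 1.0, 0.0, 0.0, 0.0]))   # only the index finger straight
```

Sending a compact line like this over WiFi each frame keeps the ESP8266 firmware down to a parser and five `Servo.write()` calls.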
Continue reading “Leap Motion Wirelessly Controlling a Prosthetic Hand With an Arduino”
[Matt], [Andrew], [Noah], and [Tim] have a pretty interesting build for their capstone project at Ohio Northern University. They’re using a Microsoft Kinect, and a Leap Motion to create a natural user interface for controlling humanoid robots.
The robot the team is using for this project is a tracked humanoid robot they’ve affectionately come to call Johnny Five. Johnny takes commands from a computer, Kinect, and Leap Motion to move the chassis, arm, and gripper around in a way that’s somewhat natural, and surely a lot easier than controlling a humanoid robot with a keyboard.
The team has also released all their software onto GitHub under an open source license. You can grab that over on the Gits, or take a look at some of the pics and videos from the Columbus Mini Maker Faire.