Leap Motion’s Project North Star Gets Hardware

It’s been more than a year since we first heard about Leap Motion’s new, Open Source augmented reality headset. The first time around, we were surprised: the headset featured dual 1600×1440 LCDs, a 120 Hz refresh rate, a 100 degree FOV, and the entire thing would cost under $100 (in volume), with everything, from firmware to mechanical design, released under open licenses. Needless to say, that’s easier said than done. Now it seems Leap Motion is releasing files for various components, and a full-scale release might be coming sooner than we think.

Leap Motion first made a name for themselves with the Leap Motion sensor, a sort of mini-Kinect that only worked with hands and arms. Yes, we’re perfectly aware that sounds dumb, but the results were impressive: everything turned into a touchscreen display, you could draw with your fingers, and you could control robots with your hands. If you mount one of these sensors to your forehead and reflect a few phone screens onto your retinas, you have the makings of a stereoscopic AR headset that tracks the movement of your hands. This is an over-simplified description, but conceptually, that’s what Project North Star is.
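For a sense of what the sensor exposes to developers, here’s a rough sketch of polling hand positions through the Leap Motion desktop SDK’s Python bindings. The class and attribute names follow the legacy V2 API and should be treated as assumptions rather than a definitive reference:

```python
# Minimal sketch: poll palm positions from a Leap Motion sensor.
# Assumes the legacy desktop SDK's Python bindings ("Leap" module); names
# below follow that older V2 API and may differ in newer releases.
import time

import Leap  # ships with the Leap Motion desktop SDK


def main():
    controller = Leap.Controller()
    while True:
        frame = controller.frame()      # most recent tracking frame
        for hand in frame.hands:
            pos = hand.palm_position    # millimetres, relative to the sensor
            side = "left" if hand.is_left else "right"
            print("%s palm at x=%.1f y=%.1f z=%.1f" % (side, pos.x, pos.y, pos.z))
        time.sleep(0.01)


if __name__ == "__main__":
    main()
```

From there, palm and fingertip coordinates can drive whatever interaction you like, whether that’s drawing, steering a robot, or anchoring virtual hands in an AR scene.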

The files released now include STLs of parts that can be 3D printed on any filament printer, files for the electronics that drive the backlight and receive video from a laptop, and even software for doing actual Augmented Reality stuff in Unity. It’s not a complete project ready for prime time, but it’s a far cry from the simple spec sheet full of promises we saw in the middle of last year.

A Low Cost VR Headset

Virtual reality systems have been in development for several decades. While there are commercial offerings now, it’s interesting to go back to a time when the systems were much more limited. [Colin Ord] recently completed his own VR system, modeled on the systems available 20-30 years ago, which gives us a look at what those systems would have been like, while also being built for a very low cost using today’s technology.

The core of this project is a head tracker built around two BBC micro:bits, which have both the accelerometer and compass needed to achieve the project’s goals. The system can also track a handheld item and its position in the virtual space. [Colin] built everything himself, including the electronics and the programming, and he makes use of Google Cardboard to hold the screen, lenses, and sensors in the headset. All of this keeps the costs down, unlike similar systems when they were first unveiled years ago.
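The write-up doesn’t walk through the firmware, but a bare-bones orientation readout on a micro:bit could look something like the MicroPython below: pitch and roll derived from the accelerometer’s gravity vector, yaw from the compass (with no tilt compensation), all streamed over USB serial for the PC side to consume. Consider it a sketch of the idea rather than [Colin]’s actual code.

```python
# Rough head-orientation sketch for a BBC micro:bit (MicroPython).
# Not [Colin]'s firmware: pitch/roll come from the accelerometer's gravity
# vector, yaw from the (non-tilt-compensated) compass, and the result is
# printed over USB serial for the PC-side renderer to read.
from microbit import accelerometer, compass, sleep
import math

compass.calibrate()  # runs the on-screen "tilt to fill the screen" calibration once

while True:
    x = accelerometer.get_x()
    y = accelerometer.get_y()
    z = accelerometer.get_z()

    # Tilt angles in degrees, taken from the direction of gravity
    pitch = math.degrees(math.atan2(y, math.sqrt(x * x + z * z)))
    roll = math.degrees(math.atan2(x, math.sqrt(y * y + z * z)))
    yaw = compass.heading()  # 0-359 degrees

    print("{:.1f},{:.1f},{}".format(pitch, roll, yaw))
    sleep(20)  # roughly 50 updates per second
```

A second board running similar code could just as easily report the pose of a handheld controller, which would cover the tracked-item side of the project.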

The ground-up approach this project takes is commendable. Hopefully the code will be released so that others can build upon this excellent work. You could even use it to take a virtual reality cycling tour of the UK.


Home Built Flight Sim Combines Virtual And Actual Reality

Virtual Reality (VR) and actual reality often don’t mix: watch someone play a VR game without seeing what they see, and all you see is a lot of pointless-looking flailing around. [Nerdaxic] may have found a balance that works in this flight sim setup that mixes the two, though. He did it by combining the virtual cockpit controls of his flight simulator with real buttons, knobs, and dials. He uses an HTC Vive headset and a beefy PC to create the virtual side, which is mirrored by a real-world version: the virtual yoke is matched with a real one, and the same goes for the rest of the controls, thanks to a home-made panel that reproduces the physical controls of a Cessna 172 Skyhawk.
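The write-up doesn’t dwell on the glue between the panel and the simulator, so purely as a hypothetical example: if the panel’s microcontroller streamed "name,value" lines over USB serial, the PC side could normalize them with something like the sketch below before handing them to a virtual joystick driver or the sim’s own interface (that last hop depends entirely on which simulator is in use).

```python
# Hypothetical PC-side reader for a DIY control panel that streams
# "NAME,RAW_VALUE" lines (e.g. "THROTTLE,512") over USB serial.
# The port name, line format, and 10-bit ADC range are assumptions, and
# forwarding the values into the simulator is left out because it depends
# entirely on the setup (vJoy, the sim's plugin API, and so on).
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # adjust for your system, e.g. "COM3" on Windows
ADC_MAX = 1023          # assuming a 10-bit ADC on the panel's microcontroller


def read_panel(port=PORT):
    with serial.Serial(port, 115200, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if not line or "," not in line:
                continue
            name, raw = line.split(",", 1)
            try:
                yield name, int(raw) / ADC_MAX  # normalize to 0.0-1.0
            except ValueError:
                continue


if __name__ == "__main__":
    for control, value in read_panel():
        print(control, round(value, 3))
```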

[Nerdaxic] has released the plans for the project, including his 3D-printable knobs for the throttle and fuel/air mixture, plus the design for the wooden panel and assembly that holds all of the controls in the same places they occupy in the real aircraft. He even put a fan in the system to produce a gentle breeze and enhance the feel of sticking your head out of the window. Just don’t try that in a real aircraft.


Homebrew Linear Actuators Put The Moves On This Motion Simulator

Breaking into the world of auto racing is easy. Step 1: Buy an expensive car. Step 2: Learn how to drive it without crashing. If you’re stuck at step 1, and things aren’t looking great for step 2 either, you might want to consider going with a virtual Porsche or Ferrari and spending your evenings driving virtual laps rather than real ones.

The trouble is, that can get a bit boring after a while, which is what this DIY motion simulator platform is meant to address. In a long series of posts packed with build details, [pmvcda] goes through what he’s come up with so far on this work in progress. He’s building a Stewart platform of the type we’ve seen before, but on a much grander scale. This one will be large enough to hold a race car cockpit mockup, which explains the welded aluminum frame. We were most interested in the six custom-made linear actuators, though. In each one, aluminum extrusions form a frame that holds a BLDC motor and guides the nut of a long ball screw, along with a bunch of 3D-printed parts, and each actuator is anchored to the main frame and to the platform by simple universal joints. The actuators are a little on the loud side, but they’re fast and powerful, and they’ve got a great industrial look.
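The build logs focus on the hardware, but the math that turns a desired platform pose into six actuator lengths is standard Stewart platform inverse kinematics: rotate and translate each platform anchor point, then measure the distance to its matching base anchor. Here’s a rough numpy sketch using placeholder anchor coordinates rather than [pmvcda]’s actual geometry:

```python
# Stewart platform inverse kinematics: for a desired pose of the top platform,
# compute the length each of the six linear actuators must reach.
# The anchor coordinates below are placeholders, not the real rig's geometry.
import numpy as np


def rot_zyx(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw in radians (yaw-pitch-roll order)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx


def actuator_lengths(base_pts, platform_pts, translation, rpy):
    """Distance from each base joint to its platform joint for the given pose."""
    r = rot_zyx(*rpy)
    platform_world = (r @ platform_pts.T).T + translation
    return np.linalg.norm(platform_world - base_pts, axis=1)


if __name__ == "__main__":
    # Six joints spaced around circles on the base and the platform (placeholder geometry)
    base_angles = np.radians([0, 60, 120, 180, 240, 300])
    plat_angles = np.radians([30, 90, 150, 210, 270, 330])
    base = np.c_[0.8 * np.cos(base_angles), 0.8 * np.sin(base_angles), np.zeros(6)]
    plat = np.c_[0.5 * np.cos(plat_angles), 0.5 * np.sin(plat_angles), np.zeros(6)]

    lengths = actuator_lengths(base, plat,
                               translation=np.array([0.0, 0.0, 0.6]),   # metres
                               rpy=np.radians([5.0, -3.0, 10.0]))       # small roll/pitch/yaw
    print(lengths)
```

Run that at the motion controller’s update rate and the difference between successive lengths becomes each actuator’s velocity target.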

If car racing is not your thing and you’d rather build a full-motion flight simulator, here’s one that also uses DIY actuators.


Redirected Walking In VR Done Via Exploit Of Eyeballs

[Anjul Patney] and [Qi Sun] demonstrated a fascinating new technique at NVIDIA’s GPU Technology Conference (GTC) for tricking a human into thinking a VR space is larger than it actually is. The way it works is this: when a person walks around in VR, they invariably make turns. During these turns, it’s possible to fool the person into thinking they have pivoted more or less than they have actually physically turned. With a way to manipulate perception of turns comes a way for software to gently manipulate a person’s perception of how large a virtual space is. Unlike other methods that rely on visual distortions, this method is undetectable by the viewer.

Saccadic movements

The software essentially exploits a quirk of how our eyes work. When a human’s eyes move around to look at different things, the eyeballs don’t physically glide smoothly from point to point. The eyes make frequent but unpredictable darting movements called saccades. There are a number of deeply interesting things about saccades, but the important one here is the fact that our eyes essentially go offline during saccadic movement. Our vision is perceived as a smooth and unbroken stream, but that’s a result of the brain stitching visual information into a cohesive whole, and filling in blanks without us being aware of it.

Part one of [Anjul] and [Qi]’s method is to manipulate the perception of a virtual area relative to the actual physical area by making a person’s pivots something other than a 1:1 match. In VR, it may appear that one has turned more or less than one actually has in the real world, and in this way the software can guide the physical motion while making it appear in VR as though nothing is amiss. But by itself, this isn’t enough. To make the mismatches imperceptible, the system watches the eyes for saccades and times its adjustments to occur only while they are underway. The brain ignores what happens during saccadic movement and stitches together the rest, and there you have it: a method to gently steer a human being so that a virtual space feels larger than the physical area actually available.
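Boiled down, the control loop is simple: watch the eye tracker’s gaze vector, flag a saccade when the angular velocity spikes past a threshold, and only then fold a sliver of extra rotation into the virtual camera. The sketch below illustrates that idea with made-up threshold and gain values; the system shown at GTC is considerably more sophisticated about when and how much to redirect.

```python
# Saccade-gated redirection, boiled down to its core loop.
# The threshold and per-saccade gain are illustrative guesses, not the values
# used in [Anjul] and [Qi]'s system.
import numpy as np

SACCADE_THRESHOLD_DEG_S = 180.0   # gaze angular velocity that flags a saccade
MAX_STEP_DEG = 0.5                # hidden yaw injected per detected saccade


def gaze_angular_velocity(prev_gaze, gaze, dt):
    """Angular speed in deg/s between two unit gaze vectors dt seconds apart."""
    cos_a = np.clip(np.dot(prev_gaze, gaze), -1.0, 1.0)
    return np.degrees(np.arccos(cos_a)) / dt


def redirect(prev_gaze, gaze, dt, camera_yaw_deg, remaining_offset_deg):
    """Nudge the virtual camera only while the eyes are mid-saccade."""
    if remaining_offset_deg != 0.0 and \
            gaze_angular_velocity(prev_gaze, gaze, dt) > SACCADE_THRESHOLD_DEG_S:
        step = float(np.clip(remaining_offset_deg, -MAX_STEP_DEG, MAX_STEP_DEG))
        camera_yaw_deg += step            # imperceptible while vision is suppressed
        remaining_offset_deg -= step
    return camera_yaw_deg, remaining_offset_deg


if __name__ == "__main__":
    # A 3-degree gaze jump in one 120 Hz frame (360 deg/s) counts as a saccade,
    # so 0.5 degrees of the 5-degree redirection goal slips through unnoticed.
    prev = np.array([0.0, 0.0, 1.0])
    curr = np.array([np.sin(np.radians(3.0)), 0.0, np.cos(np.radians(3.0))])
    print(redirect(prev, curr, dt=1 / 120, camera_yaw_deg=0.0, remaining_offset_deg=5.0))
```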

Embedded below is a video demonstration and overview, which also covers other approaches to manipulating the perception of space in VR and explains how this technique avoids their pitfalls.


Leap Motion Announces Open Source Augmented Reality Headset

Leap Motion just dropped what may be the biggest tease in Augmented and Virtual Reality since Google Cardboard. The North Star is an augmented reality head-mounted display that boasts some impressive specs:

  • Dual 1600×1440 LCDs
  • 120 Hz refresh rate
  • 100 degree FOV
  • Cost under $100 (in volume)
  • Open Source Hardware
  • Built-in Leap Motion camera for precise hand tracking

Yes, you read that correctly: the North Star will be open source hardware. Leap Motion is planning to drop all the hardware information next week.

Now that we’ve got you excited, let’s mention what the North Star is not: a consumer device. Leap Motion’s idea here was to create a platform for developing Augmented Reality experiences, specifically the user interface and interaction aspects. To that end, they built the best head-mounted display they could on a budget. The company started with standard 5.5″ cell phone displays, which made for an incredibly high-resolution but low-framerate (50 Hz) device. It was also large and completely impractical.

The current iteration of the North Star uses much smaller displays, which results in a higher frame rate and a better overall experience. The secret sauce seems to be Leap’s use of ellipsoidal mirrors to achieve a large FOV while maintaining focus.

We’re excited, but also a bit wary of the $100 price point; Leap Motion is quick to note that the price is “in volume”. They also mention using diamond-tipped tooling on a vibration-isolated lathe to grind the mirrors down. If Leap hasn’t invested in some injection molding, those parts are going to make the whole thing expensive. Keep your eyes on the blog here for more information as soon as we have it!