Leap Motion’s Project North Star Gets Hardware

It’s been more than a year since we first heard about Leap Motion’s new, Open Source augmented reality headset. The first time around, we were surprised: the headset featured dual 1600×1440 LCDs, a 120 Hz refresh rate, and a 100-degree FOV, and the entire thing would cost under $100 (in volume), with everything from firmware to mechanical design released under Open licenses. Needless to say, that’s easier said than done. Now it seems Leap Motion is releasing files for various components, and a full-scale release might be coming sooner than we think.

Leap Motion first made a name for themselves with the Leap Motion sensor, a sort of mini-Kinect that only tracked hands and arms. Yes, we’re perfectly aware that sounds dumb, but the results were impressive: any display could become a touchscreen, you could draw with your fingers, and you could control robots with your hands. Mount one of these sensors to your forehead, reflect a few phone screens onto your retinas, and you have the makings of a stereoscopic AR headset that tracks the movement of your hands. That’s an oversimplified description, but conceptually, that’s what Project North Star is.

The files released now include STLs of parts that can be 3D printed on any filament printer, files for the electronics that drive the backlight and receive video from a laptop, and even software for doing actual Augmented Reality stuff in Unity. It’s not a complete project ready for prime time, but it’s a far cry from the simple spec sheet full of promises we saw in the middle of last year.

Screen Shake In VR, Minus The Throwing Up

In first-person games, an effective way to heighten immersion is to give the player a sense of impact and force by virtually shaking the camera. That’s a tried-and-true practice for FPS games played on a monitor, but to [Zulubo]’s knowledge, no one has implemented traditional screen shake in a VR title, because it would be a sure way to trigger motion sickness. Unsatisfied with that limitation, [Zulubo] experimented until arriving at a method of doing screen shake in VR that doesn’t cause any of the usual problems.

Screen shake doesn’t translate well to VR because the traditional method is to shake the player’s entire view. This works fine when viewed on a monitor, but in VR the brain interprets the visual cue as evidence that one’s head and eyeballs are physically shaking while the vestibular system is reporting nothing of the sort. This kind of sensory mismatch leads to motion sickness in most people.

The key to getting the essence of a screen shake without any of the motion sickness baggage turned out to be a mix of two things. First, the shake is restricted to peripheral vision only. Second, it is restricted to an “in and out” motion, with no tilting or twisting. The result is a sense of concussion and impact that doesn’t rely on shaking the player’s view, at least not in a way that leads to motion sickness. It’s the product of some clever experimentation, and it’s freely downloadable for anyone who may be interested.
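For a feel of how such an effect might be driven, here’s a minimal sketch: a damped oscillation that pulses a peripheral-only overlay in and out while leaving the center of the view untouched. Every name and constant below is our own illustration, not anything from [Zulubo]’s Unity implementation.

```typescript
// A minimal, hypothetical sketch of a VR-safe screen shake: a damped
// oscillation scales a peripheral-only overlay in and out, leaving
// the center of the player's view untouched.

interface ShakeState {
  elapsed: number;   // seconds since the impact event
  magnitude: number; // strength of the impact, 0..1
}

const SHAKE_FREQUENCY = 12; // oscillations per second
const SHAKE_DAMPING = 6;    // exponential falloff rate

// Scale factor for the peripheral overlay: 1.0 is "at rest", and the
// value pulses slightly above and below that, reading as an in-and-out
// concussion with no tilting or twisting.
function peripheralShakeScale(state: ShakeState): number {
  const decay = Math.exp(-SHAKE_DAMPING * state.elapsed);
  const pulse = Math.sin(2 * Math.PI * SHAKE_FREQUENCY * state.elapsed);
  return 1.0 + 0.05 * state.magnitude * decay * pulse;
}

// Per-frame driver; in a real engine the scale would be applied only
// to geometry outside the central cone of vision.
function applyShake(state: ShakeState, dt: number): void {
  state.elapsed += dt;
  console.log(`peripheral scale: ${peripheralShakeScale(state).toFixed(4)}`);
}

applyShake({ elapsed: 0, magnitude: 1 }, 1 / 90); // one 90 Hz frame
```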

Speaking of fooling one’s senses in VR environments, here is a fascinating method of simulating zero gravity: waterproof the VR headset and go underwater.

[via Reddit]

Home Built Flight Sim Combines Virtual and Actual Reality

Virtual Reality (VR) and actual reality often don’t mix: watch someone play a VR game without seeing what they see, and you’ll see a lot of pointless-looking flailing around. [Nerdaxic] may have found a balance that works in this flight sim setup that mixes the virtual and the actual, though. He did it by combining the virtual cockpit controls of his flight simulator with real buttons, knobs, and dials. He uses an HTC Vive headset and a beefy PC to create the virtual side, which is mirrored with a real-world version. So the virtual yoke is matched with a real one, and the same is true of all of the controls, thanks to a home-made control panel that features all of the physical controls of a Cessna 172 Skyhawk.

[Nerdaxic] has released the plans for the project, including his 3D printable knobs for throttle and fuel/air mixture and the design for the wooden panel and assembly that holds all of the controls in the same place as they are in the real thing. He even put a fan in the system to produce a gentle breeze to enhance the feel of sticking your head out of the window — just don’t try that on a real aircraft.
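As a taste of the glue involved in a build like this, here’s a hypothetical sketch of the sort of calibration that maps a raw ADC reading from a panel knob onto the normalized 0..1 axis value a simulator typically expects. The names and numbers are ours, not [Nerdaxic]’s.

```typescript
// Hypothetical sketch: turning raw 10-bit ADC readings from physical
// panel knobs (throttle, mixture) into normalized simulator axis values.

interface KnobCalibration {
  rawMin: number; // ADC reading at the knob's physical minimum
  rawMax: number; // ADC reading at the knob's physical maximum
}

// Linearly rescale and clamp a raw reading into the 0..1 range.
function normalizeKnob(raw: number, cal: KnobCalibration): number {
  const t = (raw - cal.rawMin) / (cal.rawMax - cal.rawMin);
  return Math.min(1, Math.max(0, t));
}

// Example: a throttle pot wired so it never quite reaches the ADC rails.
const throttleCal: KnobCalibration = { rawMin: 12, rawMax: 1010 };
console.log(normalizeKnob(500, throttleCal)); // ≈ 0.489
```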

Continue reading “Home Built Flight Sim Combines Virtual and Actual Reality”

Underwater VR Offers Zero Gravity on a Budget

Someday Elon Musk might manage to pack enough of us lowly serfs into one of his super rockets that we can actually afford a ticket to space, but until then our options for experiencing weightlessness are pretty limited. Even if you’ll settle for a ride on one of the so-called “Vomit Comet” reduced-gravity planes, you’ll have to surrender a decent chunk of change, and as the name implies, potentially your lunch as well. Is there no recourse for the hacker that wants to get a taste of the astronaut experience without a NASA-sized budget?

Well, if you’re willing to get wet, [spiritplumber] might have the answer for you. Using a few 3D printed components he’s designed, you can run Google Cardboard-compatible virtual reality software from the comfort of your own pool. With Cardboard providing the visuals and the water keeping you buoyant, the end result is something not entirely unlike weightlessly flying around virtual environments.

To construct his underwater VR headset, [spiritplumber] uses a number of off-the-shelf products. The main “Cardboard” headset itself is the common plastic style that you can probably find in the clearance section of whatever big-box retailer is convenient for you, and the waterproof bag that holds the phone can be obtained cheaply online. You’ll also need a pair of swimmer’s goggles to keep water from rudely interrupting your wide-eyed wonderment. As for the custom printed parts, a frame keeps the waterproof bag from pressing against the screen while submerged, and a large spacer holds the phone at the appropriate distance from the operator’s eyes.

To put his creation to the test, [spiritplumber] loads up a VR rendition of NASA’s Neutral Buoyancy Laboratory, where astronauts experience a near-weightless environment underwater. All that’s left to complete the experience is a DIY scuba regulator so you can stay submerged. Though at that point, we wouldn’t be surprised if a passerby mistook your DIY space simulator for an elaborate suicide attempt.

Continue reading “Underwater VR Offers Zero Gravity on a Budget”

N64 Emulated in VR Makes Hyrule Go 3D

The Nintendo 64 had some groundbreaking and popular 3D games, and [Avaer Kazmer] felt it was only right to tamper with things just enough to trick an emulator into playing Ocarina of Time in VR, complete with stereoscopic 3D. The result is more than just running an emulator on a simulated screen in virtual reality; the software correctly renders a slightly different perspective of the world of Hyrule to each eye in order to really make the 3D pop in a way the original never could, and make it playable with VR controllers in the process. The VR emulator solution is called Emukit and works best with Exokit, a JavaScript web browser for AR and VR environments for which [Avaer] is a developer.
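The core trick is the one every VR renderer uses: draw the same scene twice, with the camera offset half the interpupillary distance to each side. A bare-bones sketch of the idea, not Emukit’s actual code, might look like this:

```typescript
// Compute per-eye camera positions from a head position and the head's
// local right vector; drawing the scene once from each position is what
// makes the emulated geometry pop in stereoscopic 3D.

type Vec3 = { x: number; y: number; z: number };

const IPD = 0.064; // average interpupillary distance, in meters

function eyePositions(head: Vec3, right: Vec3): { left: Vec3; right: Vec3 } {
  const h = IPD / 2;
  return {
    left:  { x: head.x - right.x * h, y: head.y - right.y * h, z: head.z - right.z * h },
    right: { x: head.x + right.x * h, y: head.y + right.y * h, z: head.z + right.z * h },
  };
}

// Example: a head at standing height, looking down the -Z axis.
const eyes = eyePositions({ x: 0, y: 1.6, z: 0 }, { x: 1, y: 0, z: 0 });
console.log(eyes.left, eyes.right); // eye x offsets of ∓0.032
```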

It turns out that there were a few challenges to work around and a few new problems to solve, not least of which was mapping VR controllers to control an N64 game in a sensible way. One thing that wasn’t avoidable is that the N64’s rendered world may now pop in 3D, but it still springs forth from a rectangular stage. The N64, after all, is still only rendering a world in a TV-screen-sized portion; anything outside that rectangular window doesn’t really exist, and there’s no way around that as long as an emulated N64 is running the show. Still, the result is impressive, and a video demo is embedded below where you can see the effect for yourself.
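To give a sense of what that controller mapping problem looks like, here’s a purely hypothetical translation layer; none of these names or choices come from Emukit, they just illustrate the shape of the problem:

```typescript
// Hypothetical mapping from modern VR controller inputs to the N64's
// pad: a thumbstick stands in for the analog stick (the N64 reports
// roughly -80..80 per axis) and an analog trigger stands in for Z.

interface VRControllerState {
  thumbstick: { x: number; y: number }; // -1..1 on each axis
  trigger: number;                      // 0..1 analog pull
  buttonA: boolean;
  buttonB: boolean;
}

interface N64PadState {
  stickX: number; // -80..80
  stickY: number; // -80..80
  a: boolean;
  b: boolean;
  z: boolean;
}

function mapToN64(vr: VRControllerState): N64PadState {
  return {
    stickX: Math.round(vr.thumbstick.x * 80),
    stickY: Math.round(vr.thumbstick.y * 80),
    a: vr.buttonA,
    b: vr.buttonB,
    z: vr.trigger > 0.5, // treat a half-pulled trigger as Z held
  };
}
```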

Continue reading “N64 Emulated in VR Makes Hyrule Go 3D”

Redirected Walking in VR Done via Exploit of Eyeballs

[Anjul Patney] and [Qi Sun] demonstrated a fascinating new technique at NVIDIA’s GPU Technology Conference (GTC) for tricking a human into thinking a VR space is larger than it actually is. The way it works is this: when a person walks around in VR, they invariably make turns. During these turns, it’s possible to fool the person into thinking they have pivoted more or less than they have actually physically turned. With a way to manipulate the perception of turns, software can gently reshape a person’s sense of how large a virtual space is. Unlike other approaches that rely on visual distortions, this one is undetectable by the viewer.

The software essentially exploits a quirk of how our eyes work. When a human’s eyes move around to look at different things, the eyeballs don’t physically glide smoothly from point to point. The eyes make frequent but unpredictable darting movements called saccades. There are a number of deeply interesting things about saccades, but the important one here is the fact that our eyes essentially go offline during saccadic movement. Our vision is perceived as a smooth and unbroken stream, but that’s a result of the brain stitching visual information into a cohesive whole, and filling in blanks without us being aware of it.
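Detecting those windows of blindness is simpler than it might sound: saccades reach angular velocities far beyond anything smooth pursuit can manage, so a velocity threshold on eye tracker samples is enough to flag them. A sketch of the idea follows; the threshold is an illustrative ballpark, not the value used in this work.

```typescript
// Toy velocity-threshold saccade detector over eye tracker samples.
// Saccades peak at hundreds of degrees per second, so a fixed
// angular-velocity threshold separates them from smooth pursuit.

const SACCADE_THRESHOLD_DEG_PER_SEC = 180; // illustrative ballpark

interface GazeSample {
  angleDeg: number;  // horizontal gaze angle from the eye tracker
  timestamp: number; // seconds
}

function isSaccade(prev: GazeSample, curr: GazeSample): boolean {
  const dt = curr.timestamp - prev.timestamp;
  if (dt <= 0) return false; // ignore duplicate/out-of-order samples
  const velocity = Math.abs(curr.angleDeg - prev.angleDeg) / dt;
  return velocity > SACCADE_THRESHOLD_DEG_PER_SEC;
}

// Example: 3° of travel in 8 ms is 375°/s, comfortably saccadic.
console.log(isSaccade({ angleDeg: 0, timestamp: 0 },
                      { angleDeg: 3, timestamp: 0.008 })); // true
```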

Part one of [Anjul] and [Qi]’s method is to manipulate the perception of a virtual area relative to the actual physical area by making a person’s pivots something other than a 1:1 match. In VR, it may appear one has turned more or less than one has in the real world, and in this way the software can guide the physical motion while making it appear in VR as though nothing is amiss. But by itself, this isn’t enough. To make the mismatches imperceptible, the system watches the eyes for saccades and times its adjustments to occur only while they are underway. The brain ignores what happens during saccadic movement and stitches together the rest, and there you have it: a method to gently steer a human being so that a virtual space feels larger than the physical area available.
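In code, the redirection step might reduce to something like the sketch below: a small extra world rotation injected only while a saccade is in flight. The per-step cap here is our own illustrative number; the real limits come from [Anjul] and [Qi]’s perceptual experiments.

```typescript
// Inject a small, bounded world rotation only during saccades; over
// many turns the accumulated offsets steer the player's physical path
// so the virtual space no longer has to fit inside the real room.

const MAX_STEP_DEG = 0.5; // hypothetical per-step injection limit

// remainingDeg is how much virtual/physical offset we still want to
// accumulate; the return value is the rotation to apply this frame.
function redirectionStep(remainingDeg: number, saccadeActive: boolean): number {
  if (!saccadeActive) return 0; // only adjust while vision is suppressed
  const step = Math.min(Math.abs(remainingDeg), MAX_STEP_DEG);
  return Math.sign(remainingDeg) * step;
}

// Example: working 10° of extra turn into the player, a sliver at a time.
let remaining = 10;
const injected = redirectionStep(remaining, true);
remaining -= injected;
console.log(injected, remaining); // 0.5 9.5
```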

Embedded below is a video demonstration and overview, which covers other methods of manipulating the perception of space in VR and explains how this approach avoids their pitfalls.

Continue reading “Redirected Walking in VR Done via Exploit of Eyeballs”

Firefox Reality, A Browser For VR Devices

The browser you are reading this page in will be an exceptionally powerful piece of software, with features and APIs undreamed of by the developers of its early-1990s ancestors such as NCSA Mosaic. For all that though, it will very probably be a visual descendant of those early browsers: a window for displaying two-dimensional web pages.

Some of this may be about to change, as in recognition of the place virtual reality devices are making for themselves, Mozilla have released Firefox Reality, in their words “a new web browser designed from the ground up for stand-alone virtual and augmented reality headsets”. For now it will run on Daydream and GearVR devices as a developer preview, but the intended target for the software is a future generation of hardware that has yet to be released.

Readers with long memories may remember some of the hype surrounding VR in browsers back in the 1990s, when crystal-ball-gazers who’d read about VRML would hail it as the Next Big Thing without pausing to think about whether the devices to back it up were on the market. It could be that this time the hardware will match the expectation, and maybe one day you’ll be walking around the Hackaday WrencherSpace rather than reading this in a browser. See you there!

They’ve released a video preview that disappointingly consists of a 2D browser window in a VR environment. But it’s a start.

Continue reading “Firefox Reality, A Browser For VR Devices”