N64 Emulated In VR Makes Hyrule Go 3D

The Nintendo 64 had some groundbreaking and popular 3D games, and [Avaer Kazmer] felt it was only right to tamper with things just enough to trick an emulator into playing Ocarina of Time in VR, complete with stereoscopic 3D. The result is more than just running an emulator on a simulated screen in virtual reality; the software correctly renders a slightly different perspective of the world of Hyrule to each eye in order to really make the 3D pop in a way the original never could, and make it playable with VR controllers in the process. The VR emulator solution is called Emukit and works best with Exokit, a JavaScript web browser for AR and VR environments for which [Avaer] is a developer.
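To make the "slightly different perspective per eye" idea concrete, here's a minimal sketch of the general stereoscopic-rendering pattern: draw the same scene twice from camera positions offset by half the interpupillary distance. This is an illustration of the concept only, not Emukit or Exokit code; the names and IPD value are assumptions.

```typescript
// Conceptual sketch of stereo rendering: render the scene once per eye from
// positions offset along the head's local right axis. renderScene() is a
// placeholder callback, not part of Emukit's actual API.

type Vec3 = { x: number; y: number; z: number };

const IPD = 0.064; // assumed average interpupillary distance, in meters

// With yaw about the Y axis and -Z as "forward" at yaw = 0, the head's
// right axis is (cos(yaw), 0, sin(yaw)).
function eyePosition(head: Vec3, yawRad: number, side: -1 | 1): Vec3 {
  const half = (IPD / 2) * side;
  return {
    x: head.x + Math.cos(yawRad) * half,
    y: head.y,
    z: head.z + Math.sin(yawRad) * half,
  };
}

// Hypothetical per-frame loop: the emulated scene is drawn twice, once into
// each eye's viewport, which is what makes the geometry "pop" in stereo.
function renderStereoFrame(
  head: Vec3,
  yawRad: number,
  renderScene: (eye: Vec3, viewport: "left" | "right") => void
): void {
  renderScene(eyePosition(head, yawRad, -1), "left");
  renderScene(eyePosition(head, yawRad, +1), "right");
}
```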

It turns out that there were a few challenges to work around and a few new problems to solve, not least of which was mapping VR controllers to an N64 game in a sensible way. One thing that wasn’t avoidable is that the N64’s rendered world may now pop in 3D, but it still springs forth from a rectangular stage. The N64, after all, only ever renders its world within a TV-screen-sized window; anything outside that rectangle doesn’t really exist, and there’s no way around that as long as an emulated N64 is running the show. Still, the result is impressive, and a video demo is embedded below where you can see the effect for yourself.

Continue reading “N64 Emulated In VR Makes Hyrule Go 3D”

360 Live VR Teleportation Uses Drones, Neural Networks, And Perseverance

This past semester I added research to my already full schedule of math and engineering classes, as any masochistic student eagerly would. Packed schedule aside, how do you pass up the chance to work on implementing 360° virtual teleportation to anywhere in the world, in real time? Yes, it is indeed the same concept as the cult-worshipped Star Trek transporter, minus the ability to physically be at the location. Perhaps we can add a “beam me up, Scotty” command when shutting down.

The research lab I was working with is the Laboratory for Immersive CommunicatiON (LION). It’s funded by NSF, Microsoft, and Adobe, and has been in pursuit of VR teleportation for some time now. There are a lot of cool technologies at work here, like drones used as location-collection devices: a network of drones surveys a landscape anywhere in the world and builds the assets needed to recreate it in VR. Okay, so a swarm of drones might seem a little intimidating at first, but when has emerging technology not?

Continue reading “360 Live VR Teleportation Uses Drones, Neural Networks, And Perseverance”

Redirected Walking In VR Done Via Exploit Of Eyeballs

[Anjul Patney] and [Qi Sun] demonstrated a fascinating new technique at NVIDIA’s GPU Technology Conference (GTC) for tricking a human into thinking a VR space is larger than it actually is. The way it works is this: when a person walks around in VR, they invariably make turns. During these turns, it’s possible to fool the person into thinking they have pivoted more or less than they have actually physically turned. With a way to manipulate perception of turns comes a way for software to gently manipulate a person’s perception of how large a virtual space is. Unlike other methods that rely on visual distortions, this method is undetectable by the viewer.

Saccadic movements

The software essentially exploits a quirk of how our eyes work. When a human’s eyes move around to look at different things, the eyeballs don’t physically glide smoothly from point to point. The eyes make frequent but unpredictable darting movements called saccades. There are a number of deeply interesting things about saccades, but the important one here is the fact that our eyes essentially go offline during saccadic movement. Our vision is perceived as a smooth and unbroken stream, but that’s a result of the brain stitching visual information into a cohesive whole, and filling in blanks without us being aware of it.
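One common way systems like this spot a saccade is by watching for the sudden spike in gaze angular velocity that the darting movement produces. The sketch below shows that idea in its simplest form; the sample format and the threshold value are assumptions for illustration, not details from the GTC demo.

```typescript
// Hypothetical saccade detector: flag a saccade when the gaze angular speed
// between two eye-tracker samples exceeds a threshold. The 200°/s value and
// the sample format are assumptions, not figures from the NVIDIA system.

interface GazeSample {
  yawDeg: number;   // horizontal gaze angle, degrees
  pitchDeg: number; // vertical gaze angle, degrees
  timeMs: number;   // timestamp, milliseconds
}

const SACCADE_THRESHOLD_DEG_PER_S = 200; // assumed threshold

function isSaccade(prev: GazeSample, curr: GazeSample): boolean {
  const dt = (curr.timeMs - prev.timeMs) / 1000;
  if (dt <= 0) return false;
  const dYaw = curr.yawDeg - prev.yawDeg;
  const dPitch = curr.pitchDeg - prev.pitchDeg;
  const speed = Math.hypot(dYaw, dPitch) / dt; // angular speed, deg/s
  return speed > SACCADE_THRESHOLD_DEG_PER_S;
}
```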

Part one of [Anjul] and [Qi]’s method is to manipulate perception of the virtual area relative to the actual physical area by making a person’s pivots something other than a 1:1 match. In VR, it may appear one has turned more or less than one actually has in the real world, and in this way the software can guide the physical motion while making it appear in VR as though nothing is amiss. But by itself, this isn’t enough. To make the mismatches imperceptible, the system watches the eyes for saccades and times its adjustments to occur only while they are underway. The brain ignores what happens during saccadic movement, stitches together the rest, and there you have it: a method to gently steer a human being so that a virtual space feels larger than the physical area actually available.
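The gating step might look something like the sketch below: only while a saccade is in progress does the system nudge the virtual heading a tiny bit away from the physical one, accumulating a redirection the user never consciously sees. The per-saccade budget here is an assumed number, not the one from the paper, and the function names are placeholders.

```typescript
// Sketch of saccade-gated redirection: while a saccade is underway, inject a
// small extra yaw so the virtual and physical headings slowly diverge. The
// 0.15° per-saccade cap is an assumed value for illustration only.

const MAX_INJECTED_YAW_DEG = 0.15; // assumed per-update budget

let virtualYawOffsetDeg = 0; // accumulated virtual-minus-physical heading

function updateRedirection(saccadeInProgress: boolean, desiredOffsetDeg: number): void {
  if (!saccadeInProgress) return; // only adjust while vision is suppressed
  const remaining = desiredOffsetDeg - virtualYawOffsetDeg;
  const step = Math.max(-MAX_INJECTED_YAW_DEG, Math.min(MAX_INJECTED_YAW_DEG, remaining));
  virtualYawOffsetDeg += step;
}

// Each frame, the virtual camera yaw = physical head yaw + virtualYawOffsetDeg.
```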

Embedded below is a video demonstration and overview, which covers other techniques for manipulating perception of space in VR and explains how this approach avoids their pitfalls.

Continue reading “Redirected Walking In VR Done Via Exploit Of Eyeballs”

Leap Motion Announces Open Source Augmented Reality Headset

Leap Motion just dropped what may be the biggest tease in Augmented and Virtual Reality since Google Cardboard. The North Star is an augmented reality head-mounted display that boasts some impressive specs:

  • Dual 1600×1440 LCDs
  • 120Hz refresh rate
  • 100 degree FOV
  • Cost under $100 (in volume)
  • Open Source Hardware
  • Built-in Leap Motion camera for precise hand tracking

Yes, you read that last line correctly. The North Star will be open source hardware. Leap Motion is planning to drop all the hardware information next week.

Now that we’ve got you excited, let’s mention what the North Star is not — it’s not a consumer device. Leap Motion’s idea here was to create a platform for developing Augmented Reality experiences — the user interface and interaction aspects. To that end, they built the best head-mounted display they could on a budget. The company started with standard 5.5″ cell phone displays, which made for an incredibly high resolution but low framerate (50 Hz) device. It was also large and completely impractical.

The current iteration of the North Star uses much smaller displays, which results in a higher frame rate and a better overall experience. The secret sauce seems to be Leap’s use of ellipsoidal mirrors to achieve a large FOV while maintaining focus.

We’re excited, but also a bit wary of the $100 price point — Leap Motion is quick to note that the price is “in volume”. They also mention using diamond-tipped tooling on a vibration-isolated lathe to grind the mirrors down. If Leap hasn’t invested in some injection molding, those parts are going to make the whole thing expensive. Keep your eyes on the blog here for more information as soon as we have it!

Google Light Fields Trying To Get The Jump On Magic Leap

Light Field technology is a fascinating area of Virtual Reality research that emulates the way light behaves in the real world to make a virtual scene look more realistic. By recreating light entering the eye from multiple angles, the rendered scenes pick up depth cues that a single flat image can’t provide, bringing them much closer to reality. It is rumored to be part of the technology included in the forthcoming Magic Leap headset, but it looks like Google is trying to steal some of their thunder. The VR research arm of the search giant has released a VR app called Welcome to Light Fields that uses a similar technique on existing VR headsets, such as those from Oculus and Microsoft.
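One very rough way to picture part of what a light-field renderer does is to blend a handful of captured viewpoints according to how close they are to where the eye currently sits. The sketch below only shows that basic “weight the nearby views” idea and is not how Google’s renderer actually works; all the names are placeholders.

```typescript
// Conceptual sketch of one ingredient of light-field rendering: rank captured
// viewpoints by distance to the viewer's eye and blend the nearest few with
// inverse-distance weights. A real light-field renderer is far more involved.

type Vec3 = { x: number; y: number; z: number };

interface CapturedView {
  position: Vec3;  // where the camera was when this view was captured
  imageId: string; // handle to the captured image
}

function blendWeights(eye: Vec3, views: CapturedView[], k = 4): Map<string, number> {
  // Keep the k captured views closest to the current eye position.
  const ranked = views
    .map(v => ({
      v,
      d: Math.hypot(v.position.x - eye.x, v.position.y - eye.y, v.position.z - eye.z),
    }))
    .sort((a, b) => a.d - b.d)
    .slice(0, k);

  // Inverse-distance weights, normalized to sum to 1.
  const raw = ranked.map(r => 1 / (r.d + 1e-6));
  const total = raw.reduce((s, w) => s + w, 0);

  const weights = new Map<string, number>();
  ranked.forEach((r, i) => weights.set(r.v.imageId, raw[i] / total));
  return weights;
}
```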

Continue reading “Google Light Fields Trying To Get The Jump On Magic Leap”

Reverse Engineering Opens Up The Samsung Gear VR Controller

We love a bit of reverse engineering here at Hackaday, figuring out how a device works from the way it communicates with the world. This project from [Jim Yang] is a great example: he reverse-engineered the Samsung Gear VR controller that accompanies the Gear VR add-on for Samsung’s phones. By digging into the APK that links the controller to the phone, he was able to work out the details of the Bluetooth connection the app uses. Specifically, he found the commands that prompt the controller to start sending data, and figured out how to read that data to determine the controller’s state. He then used this knowledge to build his own web app that talks to the controller directly.
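For a sense of what a browser-based approach like this involves, here is a rough sketch of the standard Web Bluetooth pattern: connect to the device, write a “start streaming” command to one characteristic, and subscribe to notifications on another. The UUIDs and command byte below are placeholders, not the real Gear VR values — those come from [Jim Yang]’s write-up.

```typescript
// Rough sketch of the Web Bluetooth pattern a web app like this relies on.
// All UUIDs and the command byte are PLACEHOLDERS, not the Gear VR's values.

const SERVICE_UUID = "0000aaaa-0000-1000-8000-00805f9b34fb";      // placeholder
const NOTIFY_CHAR_UUID = "0000bbbb-0000-1000-8000-00805f9b34fb";  // placeholder
const COMMAND_CHAR_UUID = "0000cccc-0000-1000-8000-00805f9b34fb"; // placeholder

async function connectController(): Promise<void> {
  // Ask the browser for a device advertising the (placeholder) service.
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: [SERVICE_UUID] }],
  });
  const gatt = await device.gatt!.connect();
  const service = await gatt.getPrimaryService(SERVICE_UUID);

  // Listen for state packets coming back from the controller.
  const notifyChar = await service.getCharacteristic(NOTIFY_CHAR_UUID);
  notifyChar.addEventListener("characteristicvaluechanged", (event) => {
    const value = (event.target as BluetoothRemoteGATTCharacteristic).value!;
    // Decoding of the controller's state bytes would go here.
    console.log("packet", new Uint8Array(value.buffer));
  });
  await notifyChar.startNotifications();

  // Tell the controller to start streaming data (placeholder command byte).
  const commandChar = await service.getCharacteristic(COMMAND_CHAR_UUID);
  await commandChar.writeValue(Uint8Array.of(0x01));
}
```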

Continue reading “Reverse Engineering Opens Up The Samsung Gear VR Controller”