Oculus Pi

[WayneKeenan] wrote a proof-of-concept virtual reality system that used a Raspberry Pi and an Oculus Rift. It was about a thousand lines of Python, and with a battery pack it was even portable. The problem was that the Pi struggled to render the 3D views.

[Wayne] recently revisited the demo and found that just about everything has gotten better: the Pi 3 is faster, and the Python libraries have improved. He spent some time building a library — VR Zero — and then recreated the original demo in 80 more lines of Python. You can see a video below.

Continue reading “Oculus Pi”

Absolute 3D Tracking With EM Fields

[Chris Gunawardena] is still holding his breath on Valve and Facebook surprising everyone by open sourcing their top secret VR prototypes. They have some really clever ways to track the exact location and orientation of the big black box they want people to strap to their faces. Until then, though, he decided to take his own stab at the 3D tracking problems they had to solve. 

While they used light to perform the localization, he wanted to experiment with using electromagnetic fields to perform the same function. Every phone these days has a magnetometer built in. It’s normally used to figure out which way is north, but it can also measure the local strength of magnetic fields.

Unfortunately, getting really good range out of a magnetic field runs into a pesky problem: field strength falls off steeply with distance (as the inverse cube, for a dipole-like source). A few 9 V batteries in series served as a high-current DC source and left him with a magnetic field powerful enough to be detected almost ten centimeters away by his iPhone’s magnetometer.

As small as this range seems, it ended up being enough for his purposes. Using the existing math and a small iOS app, he was able to perform rudimentary localization using EM fields. Pretty cool. He’s not done yet, and hopes that a more sensitive magnetometer and a higher-voltage power supply will let him achieve greater distance and accuracy in a future iteration.
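
For the curious, the inversion is simple enough to sketch. Assuming the coil behaves like a dipole, its field magnitude falls off as the cube of distance, so a single calibration reading at a known range lets you turn magnetometer samples into a rough distance estimate. The snippet below is purely illustrative (it is not [Chris]'s iOS code), and the calibration numbers are made up:

```python
# Illustrative sketch (not from the original project): estimate distance to a
# magnetic beacon from magnetometer readings, assuming a dipole-like source
# whose field magnitude falls off as 1/r^3.

import math

# Hypothetical calibration: field magnitude (in microtesla, above the Earth's
# ~50 uT background) measured at a known reference distance.
REF_DISTANCE_M = 0.05     # 5 cm calibration point
REF_FIELD_UT = 40.0       # beacon field measured at that distance
EARTH_FIELD_UT = 50.0     # approximate ambient field to subtract (crude;
                          # a real app would subtract the ambient vector)

def beacon_distance(bx, by, bz):
    """Estimate distance (m) to the beacon from one raw magnetometer sample."""
    magnitude = math.sqrt(bx * bx + by * by + bz * bz)
    beacon_field = max(magnitude - EARTH_FIELD_UT, 1e-6)
    # Invert B = k / r^3  =>  r = r_ref * (B_ref / B)^(1/3)
    return REF_DISTANCE_M * (REF_FIELD_UT / beacon_field) ** (1.0 / 3.0)

if __name__ == "__main__":
    # Fake sample: ~55 uT total, slightly above background -> about 10 cm away.
    print(f"{beacon_distance(30.0, 30.0, 35.0):.3f} m")
```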

Dirt Cheap VR Gun With Tracking For $15 Of Added Hardware

Virtual reality doesn’t feel very real if your head is the only thing receiving the virtual treatment. For truly immersive experiences you must be able to use your body, and even interact with virtual props, in an intuitive way. For instance, in a first-person shooter you want to be able to hold the gun and use it just as you would in real reality. That’s exactly what [matthewhallberg] managed to do for just a few bucks.

This project is an attempt to develop a VR shooting demo and the associated hardware on a budget, complete with tracking so that the gun can be aimed independent of the user’s view. [matthewhallberg] calls it The Oculus Cardboard Project, named for the combined approach of using a Google Cardboard headset for the VR part, and camera-based object tracking for the gun portion. The game was made in Unity 3D with the Vuforia augmented reality plugin. Not counting a smartphone and Google Cardboard headset, the added parts clocked in at only about $15.

Using corrugated cardboard and a printout, [matthewhallberg] created a handheld paddle-like device with buttons that acts as both controller and large fiducial marker for the smartphone camera. Inside the handle is a battery and an ESP8266 microcontroller. The buttons on the paddle provide “walk forward” and “shoot” triggers. The paddle represents the gun, and when you move it around, the smartphone’s camera tracks its orientation, so it’s possible to move and point the gun independently of your point of view. You can see it in action in the video below.
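
The write-up doesn’t hinge on any particular firmware, but the controller side boils down to “read two buttons, tell the phone”. Here’s a rough MicroPython sketch of that idea, assuming the ESP8266 reports button states to the phone as UDP packets over WiFi; the pins, network details, and packet format are our own guesses, and the actual build may use Arduino firmware and a different transport entirely:

```python
# Rough MicroPython sketch of the controller side (an assumption, not the
# project's actual firmware): read two buttons and send their state to the
# phone over WiFi as small UDP packets.

import socket
import time

import network
from machine import Pin

PHONE_ADDR = ("192.168.4.2", 5005)        # hypothetical phone IP and port

walk_btn = Pin(4, Pin.IN, Pin.PULL_UP)    # "walk forward" button (GPIO4)
shoot_btn = Pin(5, Pin.IN, Pin.PULL_UP)   # "shoot" trigger (GPIO5)

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("vr-gun", "password")        # hypothetical access point
while not wlan.isconnected():
    time.sleep_ms(100)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    # Buttons are pulled up, so a press reads as 0.
    walk = 0 if walk_btn.value() else 1
    shoot = 0 if shoot_btn.value() else 1
    sock.sendto(("W%dS%d" % (walk, shoot)).encode(), PHONE_ADDR)
    time.sleep_ms(50)
```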

Tracking a handheld paddle with a fiducial marker isn’t a brand new idea; we were able to find this project, for example, which very cleverly simulates a trigger input by making the trigger physically alter the paddle’s shape when you squeeze it. The fiducial is altered by the squeeze, and the camera sees the change and registers it as an input. However, [matthewhallberg]’s approach of using hardware buttons does allow for a wider variety of reliable inputs (move and shoot instead of just move, for example). If you’re interested in trying it out, the project page has all the required details and source code.

This isn’t [matthewhallberg]’s first attempt at getting the most out of an economical Google Cardboard setup. He used some of the ideas and parts from his earlier DIY Virtual Reality Snowboard project.

Continue reading “Dirt Cheap VR Gun With Tracking For $15 Of Added Hardware”

HoloLens NES Emulator For Augmented Retro Gaming

[Andrew Peterson] was looking for a way to indulge in his retro gaming passions in a more contemporary manner. His 3D NES emulator “N3S” for Windows brings Nintendo classics to the HoloLens, turning pixels into voxels, and Super Mario into an augmented reality gingerbread man.

To run NES games on the HoloLens, [Andrew’s] emulator uses the Nestopia libretro core. Since AR glasses cry for an augmentation of the game itself, the N3S re-emulates the NES’ picture processing unit (PPU), allowing it to interpret a Nintendo game’s graphics in a 3D space. [Andrew] also put together a comprehensive explanation of how the original Nintendo PPU works, and how he re-implemented it for the HoloLens.

The current version of the N3S PPU emulator automatically generates voxels by simply extruding the original pattern data from the game’s ROM, but [Andrew] is thinking about more features. Users could sculpt their own 3D versions of the original graphic elements in an inbuilt editor, and model sets could then be made available in an online database. From there, players would just download 3D mods for their favorite games and play them on the HoloLens.
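
To get a feel for what that extrusion involves: the NES stores each 8x8 tile as 16 bytes of pattern data, two bitplanes of 8 bytes each, and pushing every non-transparent pixel straight back produces a small slab of voxels. The sketch below is only an illustration of the idea, not N3S source code, and the output format and depth are arbitrary:

```python
# Illustrative sketch of "extruding" an NES tile into voxels, in the spirit of
# what the emulator does automatically. Each 8x8 tile is 16 bytes of CHR data:
# 8 bytes of bitplane 0 followed by 8 bytes of bitplane 1. This is not N3S
# source code; the voxel depth and output layout are assumptions.

def decode_tile(tile_bytes):
    """Return an 8x8 grid of 2-bit palette indices from 16 bytes of CHR data."""
    assert len(tile_bytes) == 16
    pixels = []
    for y in range(8):
        plane0 = tile_bytes[y]
        plane1 = tile_bytes[y + 8]
        row = []
        for x in range(8):
            bit0 = (plane0 >> (7 - x)) & 1
            bit1 = (plane1 >> (7 - x)) & 1
            row.append(bit0 | (bit1 << 1))
        pixels.append(row)
    return pixels

def extrude_tile(tile_bytes, depth=4):
    """Turn every non-transparent pixel into a column of voxels (x, y, z, color)."""
    voxels = []
    for y, row in enumerate(decode_tile(tile_bytes)):
        for x, color in enumerate(row):
            if color == 0:          # palette index 0 is transparent
                continue
            for z in range(depth):  # naive extrusion straight back
                voxels.append((x, y, z, color))
    return voxels

if __name__ == "__main__":
    # A made-up tile: a solid 8x8 square in palette color 3.
    tile = bytes([0xFF] * 16)
    print(len(extrude_tile(tile)))  # 8 * 8 * 4 = 256 voxels
```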

According to [Andrew], the emulator already pushes the limits of what the current pre-production HoloLens can render fluently, so the future of this project may depend on the next generations of hardware. Nevertheless, the HoloLens screen capture [Andrew] recorded makes us crave more augmented retro gaming. Enjoy the video!

HTC Vive Gives Autonomous Robots Direction

The HTC Vive is a virtual reality system designed to work with Steam VR. The system seeks to go beyond just a headset in order to make an entire room a virtual reality environment by using two base stations that track the headset and controller in space. The hardware is very exciting because of the potential to expand gaming and other VR experiences, but it’s already showing significant potential for hackers as well — in this case with robot localization and navigation.

Autonomous robots generally utilize one of two basic approaches to locate themselves: onboard sensors and mapping to see the world around them (like how you’d get your bearings while hiking), or sensors in the room that tell the robot where it is (similar to your GPS telling you where you are in the city). Each method has its strengths and weaknesses, of course. Onboard sensors are traditionally expensive if you need very accurate position data, and GPS location data is far too inaccurate to be of use on a smaller scale than city streets.

[Limor] immediately saw the potential in the HTC Vive to solve this problem, at least for indoor applications. Using the Vive Lighthouse base stations, he’s able to locate the system’s controller in 3D space to within 0.3 mm. He’s then able to use this data on a Linux system and integrate it into ROS (Robot Operating System). [Limor] hasn’t yet built a robot to utilize this approach, but the significant cost savings ($800 for a complete Vive, but only the Lighthouses and controller are needed) are sure to make this a desirable option for a lot of robot builders. And, as we’ve seen, integrating the Vive hardware with DIY electronics should be entirely possible.
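
On the ROS side, the job reduces to turning the controller’s tracked pose into a standard message other nodes can consume. Here’s an illustrative sketch (not [Limor]’s code) that assumes the position and 3x3 rotation matrix have already been pulled out of the Vive’s tracking API, converts the rotation to a quaternion, and publishes a geometry_msgs/PoseStamped; the topic and frame names are assumptions:

```python
#!/usr/bin/env python
# Illustrative ROS-side sketch (not [Limor]'s code): publish a Vive controller
# pose, already extracted elsewhere as a position and 3x3 rotation matrix, as
# a geometry_msgs/PoseStamped for other nodes to use in localization.

import math

import rospy
from geometry_msgs.msg import PoseStamped

def rotation_matrix_to_quaternion(m):
    """Convert a 3x3 rotation matrix (list of rows) to (x, y, z, w)."""
    trace = m[0][0] + m[1][1] + m[2][2]
    if trace > 0.0:
        s = 0.5 / math.sqrt(trace + 1.0)
        return ((m[2][1] - m[1][2]) * s,
                (m[0][2] - m[2][0]) * s,
                (m[1][0] - m[0][1]) * s,
                0.25 / s)
    # General case: pick the largest diagonal element.
    i = max(range(3), key=lambda k: m[k][k])
    j, k = (i + 1) % 3, (i + 2) % 3
    s = math.sqrt(max(m[i][i] - m[j][j] - m[k][k] + 1.0, 1e-12)) * 2.0
    q = [0.0, 0.0, 0.0, 0.0]
    q[i] = 0.25 * s
    q[j] = (m[j][i] + m[i][j]) / s
    q[k] = (m[k][i] + m[i][k]) / s
    q[3] = (m[k][j] - m[j][k]) / s
    return tuple(q)

def publish_pose(pub, position, rotation):
    msg = PoseStamped()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = "lighthouse"          # assumed fixed frame
    msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = position
    qx, qy, qz, qw = rotation_matrix_to_quaternion(rotation)
    msg.pose.orientation.x = qx
    msg.pose.orientation.y = qy
    msg.pose.orientation.z = qz
    msg.pose.orientation.w = qw
    pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("vive_controller_pose")
    pub = rospy.Publisher("vive/controller", PoseStamped, queue_size=10)
    rate = rospy.Rate(60)                       # Lighthouse updates are fast
    while not rospy.is_shutdown():
        # Placeholder pose; in practice this comes from the Vive tracking API.
        publish_pose(pub, (0.0, 0.0, 1.0), [[1, 0, 0], [0, 1, 0], [0, 0, 1]])
        rate.sleep()
```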

Continue reading “HTC Vive Gives Autonomous Robots Direction”

The Most Expensive VR Experiment You’ll See All Week

There isn’t a lot of detail to be found behind this short demo of robot-based physical feedback for VR, but the video (embedded below) demonstrates things well. It’s an experiment in generating force feedback for virtual objects using a Baxter robot and the HTC Vive. When the user presses against a wooden block in VR, the robot presses back, simulating the mass of the virtual object. Force feedback is one of those areas where research is ongoing, and in a variety of directions.

Like so many other things in life, nothing beats the real thing for actual physical feedback. Also, there’s something great about giving a $25,000 robot the job of impersonating a few simple wooden blocks in VR, just so you can strap on a VR rig and basically give a robot a realistic-feeling fist bump.

Continue reading “The Most Expensive VR Experiment You’ll See All Week”

Staying In And Playing Skyrim Has Rarely Been This Healthy

Looking to add some activity to your day but don’t want to go through a lot of effort? [D10D3] has the perfect solution that enables you to take a leisurely bike ride through Skyrim. A standing bicycle combines with an HTC Vive (using the add-on driver VorpX, which allows non-VR games to be played with a VR headset) and a Makey Makey board to make slack-xercise — that’s a word now — part of your daily gaming regimen.

The Makey Makey is the backbone of the rig; it lets the user define their own electrical contacts that correspond to keyboard and mouse inputs, making it possible to play a video game in some potentially unorthodox ways — in this case, by riding a bicycle.

Setting up a couple of buttons for controlling the Dragonborn proved to be a simple process. Buttons controlling some of the main inputs were plugged into a breadboard circuit, which was then connected to the Makey Makey along with the ground wires using jumpers. As a neat addition, some aluminium foil on the handlebars served as excellent contacts for the look-left and look-right inputs. That proved to be a disorienting addition, considering the Vive’s head tracking also moves the camera.

Continue reading “Staying In And Playing Skyrim Has Rarely Been This Healthy”