VR Feels More Real with Leap Motion and This Rotation Sensor

You could have said this at any time in the last couple of decades: the world of virtual reality peripherals does not yet feel as though it has fulfilled its potential. From the Amiga-powered Virtuality headsets and nausea-inducing Nintendo Virtual Boy of the 1990s to today’s crop of advanced headsets and peripherals, there has always been a sense that we’re not quite there yet. Moments at which the shortcomings of the hardware intrude into the virtual world may be less frequent with the latest products, but still the goal of virtual world immersion seems elusive at times.

One of the more interesting peripherals on the market today is the Leap Motion controller. This is a USB device containing infra-red illuminators and cameras with enough resolution for its software to accurately calculate the position of a user’s hands and fingers in three-dimensional space. That finger-level tracking makes it a controller capable of genuinely complex interaction with, and manipulation of, objects in virtual worlds.
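
For a sense of what that looks like from the software side, here is a minimal polling loop against the older Leap Motion C++ SDK (the v2-era desktop API; later releases differ). It simply prints the palm and fingertip positions the controller reports, in millimetres relative to the device, and is a sketch rather than anything from the projects below.

```cpp
// Minimal sketch using the Leap Motion v2-era C++ SDK: poll the latest frame
// and print palm and fingertip positions (millimetres, origin at the device).
#include <chrono>
#include <iostream>
#include <thread>
#include "Leap.h"

int main() {
    Leap::Controller controller;

    while (true) {
        Leap::Frame frame = controller.frame();          // most recent tracking frame
        for (Leap::Hand hand : frame.hands()) {
            Leap::Vector palm = hand.palmPosition();
            std::cout << (hand.isLeft() ? "Left" : "Right")
                      << " palm at (" << palm.x << ", " << palm.y << ", " << palm.z << ")\n";
            for (Leap::Finger finger : hand.fingers()) {
                Leap::Vector tip = finger.tipPosition();
                std::cout << "  fingertip at (" << tip.x << ", " << tip.y << ", " << tip.z << ")\n";
            }
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(20));
    }
}
```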

Even the Leap Motion has its shortcomings, though: moments at which it simply loses track. Rotating your hand, as you might for instance when aiming a virtual in-game weapon, confuses it. This led [Florian Maurer] to seek his own solution, and he’s come up with a hand peripheral containing a rotation sensor.

Inspired by a movie prop from the film Ender’s Game, it is a 3D-printed device that clips onto the palm of his hand between thumb and index finger. It contains both an Arduino Pro Micro and a BNO055 rotation sensor, plus a couple of buttons for in-game actions such as triggers. It solves the problem with the Leap Motion’s rotation detection, and does not impede hand movement so much that he can’t also use his keyboard and mouse while wearing it. Sadly he does not yet seem to have posted any code, but he does treat us to a video demonstration which we’ve posted below the break.
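
Since [Florian Maurer] hasn’t posted his code, the following is only a guess at the general shape of such firmware: a minimal Arduino sketch, assuming the Adafruit BNO055 and Unified Sensor libraries, that reads the sensor’s absolute orientation as a quaternion plus two buttons and streams everything over serial for a host-side driver to consume. The pin numbers and serial format are invented for illustration.

```cpp
// Hypothetical firmware for a palm-mounted rotation peripheral:
// Arduino Pro Micro + BNO055 + two buttons, streaming orientation over serial.
// Assumes the Adafruit BNO055 library; pins and protocol are made up.
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);   // default I2C address
const int TRIGGER_PIN = 2;
const int ACTION_PIN  = 3;

void setup() {
  Serial.begin(115200);
  pinMode(TRIGGER_PIN, INPUT_PULLUP);
  pinMode(ACTION_PIN, INPUT_PULLUP);
  if (!bno.begin()) {                              // halt if the sensor isn't on the bus
    while (true) { }
  }
  bno.setExtCrystalUse(true);
}

void loop() {
  imu::Quaternion q = bno.getQuat();               // absolute orientation as a quaternion
  Serial.print(q.w(), 4); Serial.print(',');
  Serial.print(q.x(), 4); Serial.print(',');
  Serial.print(q.y(), 4); Serial.print(',');
  Serial.print(q.z(), 4); Serial.print(',');
  Serial.print(!digitalRead(TRIGGER_PIN)); Serial.print(',');
  Serial.println(!digitalRead(ACTION_PIN));
  delay(10);                                       // roughly 100 Hz update rate
}
```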

Making VR A Little More Usable With A Pinch Gesture Ring

[Florian] wants to browse the web like an internet cowboy from a cyberpunk novel. Unfortunately, VR controllers are great for games but really incapacitate a hand for typing. A new input method was needed, one that would free his fingers for typing, but still give his hands detailed input into the virtual world.

Since VR goggles have… hopefully… already reached peak ridiculousness, his first idea was to glue a Leap Motion controller to the front of them. It couldn’t look any sillier, after all. The Leap controller was designed to track hands, and when combined with the IMU built into the VR contraption, did a pretty good job of putting his hands into the world. Unfortunately, the primary gesture used for a “click” was registering only 80% of the time.

The gesture in question is a pinching motion, pushing the thumb and middle finger together. He couldn’t add a big button without once again incapacitating his hands for typing. It took a few iterations, but he arrived at a compact ring design with a momentary switch on it, connected to an Arduino on his wrist yet out of the way enough to let him keep typing.
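
The write-up doesn’t detail the firmware, but the code side of such a ring can be very small. Here is one hedged way it might look, assuming an ATmega32u4-based board (such as a Pro Micro) so the Arduino can present itself as a USB mouse; the pin and debounce interval are arbitrary choices, not [Florian]’s.

```cpp
// Hypothetical sketch for a pinch ring: a momentary switch wired to an
// ATmega32u4 Arduino acting as a USB mouse. Pin and debounce values are guesses.
#include <Mouse.h>

const int RING_PIN = 2;                 // switch closes to ground, internal pull-up
const unsigned long DEBOUNCE_MS = 15;

int lastReading = HIGH;
int stableState = HIGH;
unsigned long lastChangeMs = 0;

void setup() {
  pinMode(RING_PIN, INPUT_PULLUP);
  Mouse.begin();
}

void loop() {
  int reading = digitalRead(RING_PIN);
  if (reading != lastReading) {         // raw level changed; restart the debounce timer
    lastChangeMs = millis();
    lastReading = reading;
  }
  if (millis() - lastChangeMs > DEBOUNCE_MS && reading != stableState) {
    stableState = reading;
    if (stableState == LOW) {
      Mouse.press(MOUSE_LEFT);          // pinch closed: button down
    } else {
      Mouse.release(MOUSE_LEFT);        // pinch released: button up
    }
  }
}
```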

It’s yet another development marching us toward usable VR. We personally can’t wait until we can use some technology straight out of a Stephenson or Gibson novel.

Hackaday Prize Entry: Aesthetic As Hell

Microsoft Bob was revolutionary. Normally you’d hear a phrase like that coming from an idiot blogger, but in this case a good argument could be made. Bob threw away the ‘files’ and ‘folders’ paradigm for the very beginnings of virtual reality. The word processor was just sitting down at a desk and writing a letter. Your Rolodex was a Rolodex. All abstractions are removed, and you’re closer than ever to living in your computer. If Microsoft Bob were released today, with multiple users interacting with each other in a virtual environment, it would be too far ahead of its time. It would be William Gibson’s most visible heir, instead of Melinda Gates’ only failure. Imagine a cyberpunk world that isn’t a dystopia, and your mind will turn to Microsoft Bob.

Metaverse Lab is aesthetic as hell.

Not everyone will laugh at the above paragraph. Indeed, some people are trying to make the idea of a gigantic, virtual, 3D space populated by real people a reality. For the last few years, [alusion] has been working on Metaverse Lab as an experiment in 3D scanning, virtual web browsers, and turning interconnected 3D spaces into habitats for technonauts. The name comes from Snow Crash, and over the past twenty years, a number of projects have popped up to replicate this convergence of the digital and physical. By integrating this idea with the latest VR headsets, Metaverse Lab is the closest thing I’ve ever seen to the dream of awesome 80s sci-fi.

I’ve actually had the experience of using and interacting with Metaverse Lab on a few occasions. The only way to describe it is as what someone would expect the Internet would be if their only exposure to technology was viewing the 1992 film Lawnmower Man. It works, though, as a completely virtual environment where potential is apparent, and the human mind is not limited by its physical embodiment.

Staying In and Playing Skyrim Has Rarely Been This Healthy

Looking to add some activity to your day but don’t want to go through a lot of effort? [D10D3] has the perfect solution that enables you to take a leisurely bike ride through Skyrim. A stationary bicycle combines with an HTC Vive (using the add-on driver VorpX, which allows non-VR games to be played with a VR headset) and a Makey Makey board to make slack-xercise (that’s a word now) part of your daily gaming regimen.

The Makey Makey is the backbone of the rig; it lets the user define their own inputs with electrical contacts that map to keyboard and mouse events, so a video game can be played in some potentially unorthodox ways: in this case, by riding a bicycle.

Setting up a couple of buttons for controlling the Dragonborn proved to be a simple process. Buttons for some of the main inputs were wired into a breadboard circuit, which was then connected to the Makey Makey and its ground wires with jumpers. As a neat addition, aluminium foil on the handlebars served as excellent contacts for the look-left and look-right inputs, though that proved disorienting given that the Vive’s head tracking also moves the camera.
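
The Makey Makey itself needs no programming for this: its stock firmware already shows up as a keyboard and mouse, and closing a contact to ground fires the mapped input. For the curious, though, the equivalent behaviour on any ATmega32u4 Arduino would look roughly like the sketch below; the pins and key bindings here are invented for illustration and are not [D10D3]’s actual mapping.

```cpp
// Illustrative equivalent of the Makey Makey's job, on an ATmega32u4 Arduino:
// contacts pulled to ground become held-down keys. Pins and bindings are assumptions.
#include <Keyboard.h>

struct Binding {
  uint8_t pin;     // input pin wired to a contact (pedal, foil pad, button)
  uint8_t key;     // key held while the contact is closed
  bool down;       // state currently reported to the host
};

Binding bindings[] = {
  { 2, 'w',             false },  // pedal contact: walk forward
  { 3, KEY_LEFT_ARROW,  false },  // foil on the left handlebar: look left
  { 4, KEY_RIGHT_ARROW, false },  // foil on the right handlebar: look right
};

void setup() {
  for (Binding &b : bindings) {
    pinMode(b.pin, INPUT_PULLUP);  // a contact closes the pin to ground
  }
  Keyboard.begin();
}

void loop() {
  for (Binding &b : bindings) {
    bool pressed = (digitalRead(b.pin) == LOW);
    if (pressed && !b.down)      { Keyboard.press(b.key);   b.down = true;  }
    else if (!pressed && b.down) { Keyboard.release(b.key); b.down = false; }
  }
  delay(10);                       // crude debounce
}
```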

Using The Vive’s Lighthouse With DIY Electronics

The HTC Vive is the clear winner of the oncoming VR war, and is ready to enter the hallowed halls of beloved consumer electronics behind the Apple Watch, Smart Home devices, the 3Com Audrey, and Microsoft’s MSN TV. This means there’s going to be a lot of Vives on the secondhand market very soon, opening the doors to some interesting repurposing of some very cool hardware.

[Trammell Hudson] has been messing around with the Vive’s Lighthouse – the IR-emitting cube that gives the Vive its sense of direction. There’s nothing really special about this simple box, and it can indeed be used to give any microcontroller project an orientation sensor.

The Vive’s Lighthouse is an exceptionally cool piece of tech that uses multiple scanning IR laser diodes and a bank of LEDs to let the Vive sense its own position and orientation. It does this by alternately flashing the LEDs as a sync signal and sweeping the lasers across the room, left to right and top to bottom; the time between the flash and the moment a sweep crosses a sensor translates directly into an angle. The relevant measurements from two Lighthouses are the horizontal angle from the first Lighthouse, the vertical angle from the first Lighthouse, and the horizontal angle from the second Lighthouse. That’s all you need to fix the Vive’s position in 3D space.

To get a simple microcontroller to do the same trick, [Trammell] is using a fast phototransistor with a 120° field of view. This setup only works out to about a meter away from the Lighthouses, but that’s enough for testing.

[Trammell] is working on a Lighthouse library for the Arduino and ESP8266, and so far, everything works. He’s able to get the angle of a breadboard to a Lighthouse with just a little bit of code. This is a great enabling build that is going to allow a lot of people to build some very cool stuff, and we can’t wait to see what happens next.
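
[Trammell]’s library handles the details, but the core idea fits in a few lines. The sketch below is a simplified single-axis illustration rather than his code: it assumes a phototransistor whose output goes high while lit, wired to an interrupt-capable pin, a single base station with a rotor spinning at 60 Hz (so one full revolution takes roughly 16,667 µs), and a crude pulse-width threshold to tell the wide sync flash from the brief laser sweep.

```cpp
// Simplified single-axis Lighthouse angle measurement (illustrative only, not
// [Trammell]'s library). A wide LED sync flash marks the start of a rotation;
// the delay until the narrow laser sweep hits the sensor maps linearly to angle.
const int SENSOR_PIN = 2;                       // phototransistor on an interrupt-capable pin
const float ROTATION_US = 1000000.0 / 60.0;     // one rotor revolution, ~16667 us
const unsigned long SYNC_WIDTH_US = 50;         // guessed threshold: sync pulses are wider

volatile unsigned long riseTime = 0;
volatile unsigned long syncTime = 0;
volatile unsigned long sweepDelta = 0;
volatile bool newSample = false;

void onEdge() {
  unsigned long now = micros();
  if (digitalRead(SENSOR_PIN)) {                // rising edge: a pulse starts
    riseTime = now;
  } else {                                      // falling edge: classify by pulse width
    unsigned long width = now - riseTime;
    if (width > SYNC_WIDTH_US) {
      syncTime = riseTime;                      // wide pulse: sync flash
    } else if (syncTime != 0) {
      sweepDelta = riseTime - syncTime;         // narrow pulse: laser sweep hit
      newSample = true;
    }
  }
}

void setup() {
  Serial.begin(115200);
  pinMode(SENSOR_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(SENSOR_PIN), onEdge, CHANGE);
}

void loop() {
  if (newSample) {
    noInterrupts();
    unsigned long dt = sweepDelta;              // copy safely while interrupts are off
    newSample = false;
    interrupts();
    float angle = 360.0 * dt / ROTATION_US;     // sweep angle relative to the sync flash
    Serial.println(angle);
  }
  delay(10);
}
```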

Behold: Valve’s VR and AR Prototypes

Just in case anyone secretly had the idea that Valve Software’s VR and other hardware somehow sprang fully formed from a lab, here are some great photos and video of early prototypes, and interviews with the people who made them. Some of the hardware is quite raw-looking, some of it is recognizable, and some of it comes from directions that were explored but went nowhere, but it’s all fascinating.

An early AR prototype that worked like looking through a tube into another world.

The accompanying video (embedded below) has some great background and stories about the research process, which began with a mandate to explore the concepts of AR and VR and determine what could be done and what was holding things back.

One good peek into this process is the early AR prototype pictured above. You look into the lens end like a little telescope. It has a projector that beams an image directly into your eye, and it has camera-based tracking that updates that image extremely quickly.

The result is a device that lets you look through a little window into a completely different world. In the video (2:16) one of the developers says “It really taught us just how important tracking was. No matter [how you moved] it was essentially perfect. It was really the first glimpse we had into what could be achieved if you had very low persistence displays, and very good tracking.” That set the direction for the research that followed.

Get Your Game On: Troy’s TVCoG Hosts VR and Gaming Hackathon

Troy, New York’s Tech Valley Center of Gravity is following up their January IoT Hackathon with another installment. The April 16-17 event promises to be a doozy, and anyone close to the area with even a passing interest in gaming and AR/VR should really make an effort to be there.

Not content to just be a caffeine-fueled creative burst, TVCoG is raising the bar in a couple ways. First, they’re teaming up with some corporate sponsors with a strong presence in the VR and AR fields. Daydream.io, a new company based in the same building as the CoG, is contributing a bunch of its Daydream.VR smartphone headsets to hackathon attendees, as well as mentors to get your project up and running. Other sponsors include 1st Playable Productions and Vicarious Visions, game studios both located in the Troy area. And to draw in the hardcore game programmers, a concurrent Ludum Dare game jam will be run by the Tech Valley Game Space, with interaction and collaboration between the AR/VR hackers and the programmers encouraged. Teams will compete for $1000 in prizes and other giveaways.

This sounds like it’s going to be an amazing chance to hack, to collaborate, and to make connections in the growing AR/VR field. And did we mention the food? There was a ton of it last time, so much they were begging us to take it home on Sunday night. Go, hack, create, mingle, and eat. TVCoG knows how to hackathon, and you won’t be disappointed.

Thanks to [Duncan Crary] for the heads up on this.