Interacting with Virtual Reality Brings Us Even Closer to a Real Holodeck

Leap Motion controller plus Oculus Rift

One of our readers has been playing around with virtual reality lately, and has come up with a pretty cool beta run of his research — virtual interaction using your hands.

Using an Oculus Rift, the Leap Motion controller and a beta run of Unity 4.6, [Tomáš Mariančík] put together a test environment for physical interaction. The Leap Motion controller is capable of tracking your fingers with extremely high detail, which allows him to create a pair of virtual hands inside the test environment that almost perfectly mimic his movements. The hack here is making it all work together.
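
The write-up doesn’t include any code, but the core problem of turning noisy per-frame fingertip samples into steady virtual hands can be sketched with a simple exponential filter. This is an illustrative sketch only — the frame data here is faked, where a real setup would poll positions from the Leap Motion SDK each frame:

```python
# Hypothetical sketch: exponential smoothing of raw fingertip samples
# before they drive a virtual hand. The (x, y, z) tuples are faked here;
# a real rig would pull them from the Leap Motion controller each frame.

def smooth(samples, alpha=0.5):
    """Exponentially smooth a stream of (x, y, z) fingertip positions."""
    out = []
    prev = None
    for p in samples:
        if prev is None:
            prev = p  # first sample passes through unchanged
        else:
            # blend the new sample with the previous smoothed value
            prev = tuple(alpha * c + (1 - alpha) * q for c, q in zip(p, prev))
        out.append(prev)
    return out

raw = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
print(smooth(raw))  # later samples are blended toward the new position
```

A smaller `alpha` gives steadier hands at the cost of a little lag — the usual trade-off in tracking filters.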

In the demo below, he shows off the system by interacting with holographic menus, grabbing body parts off of an anatomically correct human model (thanks to Unity3D), and manipulating his environment.


’Duinos and VR Environments

At the Atmel booth at Maker Faire, they were showing off a few very cool bits and baubles. We’ve got a post on the WiFi shield in the works, but the most impressive person at the booth was [Quin]. He has a company, he’s already shipping products, and he has a few projects already in the works. What were you doing at 13?

[Quin]’s Qduino Mini is your basic Arduino-compatible board with a LiPo charging circuit. There’s also a ‘fuel gauge’ of sorts for the battery. The project will be hitting Kickstarter sometime next month, and we’ll probably put that up in a links post.
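
We don’t know how the Qduino’s ‘fuel gauge’ is actually implemented, but a common approach is to interpolate the measured cell voltage along a typical single-cell LiPo discharge curve. The voltage/percent pairs below are generic illustration values, not Qduino specifics:

```python
# Hypothetical LiPo "fuel gauge": linear interpolation over a coarse
# discharge curve. These breakpoints are typical single-cell values,
# not measurements from the Qduino Mini.

CURVE = [(3.3, 0), (3.6, 20), (3.7, 50), (3.9, 80), (4.2, 100)]

def fuel_percent(volts):
    """Estimate remaining charge (0..100) from cell voltage."""
    if volts <= CURVE[0][0]:
        return 0
    if volts >= CURVE[-1][0]:
        return 100
    # find the surrounding breakpoints and interpolate between them
    for (v0, p0), (v1, p1) in zip(CURVE, CURVE[1:]):
        if v0 <= volts <= v1:
            return p0 + (p1 - p0) * (volts - v0) / (v1 - v0)

print(fuel_percent(3.7))   # a curve breakpoint: 50
print(fuel_percent(4.05))  # roughly 90, between the 3.9 and 4.2 points
```

In practice the reading would come from an ADC pin through a voltage divider, and the curve would be calibrated against the actual battery.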

Oh, [Quin] was also rocking some awesome kicks at the Faire. Atmel, I’m trying to give you money for these shoes, but you’re not taking it.

[Sophie] had a really cool installation at the Faire, and notably one that was first featured on hackaday.io. Basically, it’s a virtual reality Segway, built with an Oculus, a Leap Motion, a Wobbleboard, and an Android phone, that lets you cruise through a virtual landscape on everyone’s favorite barely-cool balancing scooter.

This project was a collaboration between [Sophie], [Takafumi Ide], [Adelle Lin], and [Martha Hipley]. The virtual landscape was built in Unity, displayed on the Oculus, controlled with an accelerometer on a phone, and has input with a Leap Motion. There are destructible and interactable things in the environment that can be pushed around with the Leap Motion, and with the helmet-mounted fans, you can feel the wind in your hair as you cruise over the landscape on your hovering Segway-like vehicle. This is really one of the best VR projects we’ve ever seen.
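
The project’s control code isn’t published here, but the heart of Segway-style control — mapping the phone’s pitch angle to a forward speed, with a dead zone so the rider can stand still — might look something like this. The angles and gains are made-up illustration values:

```python
# Hypothetical sketch of Segway-style tilt control: the phone's pitch
# angle (from its accelerometer) maps to a signed forward speed, with a
# dead zone around upright. All thresholds here are illustrative.

import math

def tilt_to_speed(pitch_deg, dead_zone=5.0, max_tilt=25.0, max_speed=3.0):
    """Map pitch in degrees to a signed speed in m/s."""
    if abs(pitch_deg) < dead_zone:
        return 0.0  # standing roughly upright: don't move
    sign = math.copysign(1.0, pitch_deg)
    # scale the lean beyond the dead zone, capped at max_tilt
    span = min(abs(pitch_deg), max_tilt) - dead_zone
    return sign * max_speed * span / (max_tilt - dead_zone)

print(tilt_to_speed(2.0))    # inside dead zone -> 0.0
print(tilt_to_speed(25.0))   # full forward lean -> max speed
print(tilt_to_speed(-15.0))  # leaning back -> reverse
```

The dead zone matters more in VR than on a real scooter — without it, sensor noise makes the world drift constantly, which is a fast route to motion sickness.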

Flying a Drone with an Oculus Rift 

image001

Controlling autonomous vehicles remotely with virtual reality headsets seems like an obvious next step. Already, a few UAV companies have begun experimenting with these ideas by integrating Oculus Rift developer kits into their hovering quadcopters and drones. Parrot released a video on their blog showing off a head-tracking system they developed for their Bebop Drone in an effort to bring FPV flights to fruition. It looks like a lot of fun, and we want to try one of these out ASAP!

As for technical specifications, they can be found in the YouTube description of the video embedded below. A quick glance shows that the operating system is based on Android and uses WiFi to connect the handheld tablet to the vehicle floating above. The range is a whopping 2 km, giving plenty of freedom to explore. Moving one’s head swivels the attached camera, giving a more immersive flying experience.
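
Parrot hasn’t published how the head tracking works internally, but the essential step — clamping headset yaw and pitch to the camera’s mechanical range before sending them over WiFi — can be sketched like this. The limits are illustrative, not Bebop specs:

```python
# Hypothetical sketch of head-tracked FPV: headset yaw/pitch angles are
# clamped to the camera gimbal's mechanical range before being sent to
# the drone. The range limits below are made-up illustration values.

def clamp(value, lo, hi):
    """Constrain value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

def head_to_gimbal(yaw_deg, pitch_deg,
                   yaw_range=(-90, 90), pitch_range=(-45, 45)):
    """Map headset orientation to gimbal angles within mechanical limits."""
    return (clamp(yaw_deg, *yaw_range), clamp(pitch_deg, *pitch_range))

print(head_to_gimbal(30, -10))   # within range, passed through
print(head_to_gimbal(150, -80))  # clamped to the gimbal limits
```

Without the clamp, turning your head past the camera’s limit would make the view stick in a way that disagrees with your neck — clamping at the controller end keeps the mismatch predictable.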

This isn’t the first example of FPV drone flying that we have seen. Previously, we covered an Oculus Rift + head tracking setup and a similar integration with a Black Armor Drone. We are bound to see virtual reality equipment used to control drones more and more as developers get their hands on cutting-edge hardware like the Oculus Development Kit 2, which is currently shipping.


VRcade’s The Nightmare Machine (Kickstarter Campaign)


Aiming to be the leader in virtual reality horror experiences, ‘The Nightmare Machine’ is an immersive VR haunted house in Seattle that promises to be one of the most terrifying events this Halloween. But the team needs some help raising the money to bring the project to the public at the scale they’re attempting. The goal is $70,000 within a 30-day period, which is quite the challenge, and the team will need to hustle every single day to accomplish it.

The focus of the project looks good, though: lowering the massive barriers to entry in VR that come with high hardware costs, and giving people a terrifying five minutes of nightmare-inducing experiences. This type of fidelity and range is usually only seen in military research facilities and university labs, like the MxR Lab at USC, and their custom-built head-mounted displays put this technology within reach of the public, ready to scare the pants off of anyone willing to put on the VR goggles.

The headsets are completely wireless, multi-player, and contain immersive binaural audio. A motion sensing system has also been integrated that can track users’ movements within hundreds of square feet. Their platform is a combination of custom in-house and 3rd-party hardware along with a slick software framework. The technology looks amazing, and the prizes given out through the Kickstarter are cool too! For example, anyone who puts in $175 or more gets to have their head 3D scanned and inserted into the Nightmare Machine. The rest of the prizes include tickets to the October showcase where demos of the VR experience will be shown.


Flex Sensing for a DIY Data Glove


[Cyber] has been testing out intuitive input methods for virtual reality experiences that immerse the user further into the virtual world than archaic devices like a keyboard or mouse would allow. One of his biggest interests so far has been the idea of a data glove that interacts with an Arduino Uno to interface with a PC. Since readily available commercial products don’t yet exist, [Cyber] decided to build his own.

He started out with a tiny inertial measurement unit called the Pololu MinIMU-9 v2, which tracks orientation with a 3-axis gyro and accelerometer. The wiring was soldered into place, connecting the sensor to an Arduino Uno. From there, he hooked up a flex sensor from Spectra Symbol (the kind supposedly used in the original Nintendo Power Glove) and demoed the project by tracking the movement of one of his fingers. As the finger bent, the output printed on the serial monitor changed.
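
[Cyber]’s exact code isn’t shown, but the usual way to turn a flex sensor’s ADC counts into a bend angle is a two-point linear calibration between the straight and fully bent readings. The counts below are made-up examples, not his measurements:

```python
# Hypothetical sketch of the flex-sensor readout: a 10-bit ADC reading
# (the flex sensor in a voltage divider) is mapped to an approximate
# bend angle using two calibration points. All counts are illustrative.

FLAT_COUNTS = 300   # ADC reading with the finger straight (calibrated)
BENT_COUNTS = 700   # ADC reading with the finger fully bent (calibrated)

def counts_to_angle(counts, max_angle=90.0):
    """Linearly interpolate ADC counts to a 0..max_angle bend angle."""
    frac = (counts - FLAT_COUNTS) / (BENT_COUNTS - FLAT_COUNTS)
    return max(0.0, min(max_angle, frac * max_angle))  # clamp the ends

print(counts_to_angle(300))  # straight finger -> 0.0
print(counts_to_angle(500))  # halfway -> 45.0
print(counts_to_angle(800))  # past calibration, clamped to 90.0
```

The clamp matters because real flex sensors drift and over-travel; re-running the two-point calibration per glove (and per finger) is typical.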

[Cyber] still needs to mount a glove on this system and construct a proper positional tracking method so that physical movement will be mirrored in a simulation.

[Cyber]’s day job has kept him busy these last few months, which has forced the project into a temporary hold. Recently though, [Cyber] has been an active member of and an influence in the local Orange County VR scene, helping to build a nice development culture, so we’re hoping to see more updates from him soon.

To view what he has done up to this point, click the link at the top of the page, and check out the video after the break:


3D Printed Virtual Reality Goggles


Oculus, as we know, was acquired by Facebook for $2 billion, and the VR community has been buzzing about what to do with all this newly accessible technology. Adding to the interest, the 2nd iteration of the development kits was released, causing a resurgence in virtual reality development as computer-generated experiences started pouring out of every corner of the world. But not everyone can afford the $350 USD price tag to purchase one of these devices, creating the need for Do-It-Yourself projects like these 3D printed wearable video goggles via Adafruit.

The design of this project is reminiscent of the VR2GO mobile viewer that came out of the MxR Lab (aka the research environment that spun out Palmer Luckey before he created Oculus). However, the hardware here is more robust and utilizes a 5.6″ display and 50mm aspheric lenses instead of a regular smartphone. The HD monitor is held within a 3D printed enclosure along with an Arduino Micro and a 9-DOF motion sensor. The outer hood of the case is a combination of PLA and NinjaFlex printing filament, keeping the frame rigid while the area around the eyes remains flexible and comfortable. The faceplate is secured with a mounting bracket, and a pair of aspheric lenses inside splits the screen for stereoscopic video. Head straps were added, allowing the device to fit snugly on one’s face.
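
The stereoscopic split that the lenses perform optically corresponds to a simple operation on the frame itself: each row divided into left- and right-eye halves. A toy sketch, with a tiny nested list standing in for the HDMI image:

```python
# Toy sketch of the stereoscopic split: one wide framebuffer is divided
# into left- and right-eye halves, one per lens. A small row-major list
# of pixel values stands in for the real HDMI image here.

def split_stereo(frame):
    """Split each row of a frame into (left_half, right_half)."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right

frame = [[1, 2, 3, 4],
         [5, 6, 7, 8]]
left, right = split_stereo(frame)
print(left)   # [[1, 2], [5, 6]]
print(right)  # [[3, 4], [7, 8]]
```

In a real HMD pipeline the video source renders the two half-frames with a slight camera offset (and lens-distortion correction); the goggles themselves just present each half to the matching eye.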

At the end of the tutorial, the instructions state that once everything is assembled, all that is required is to plug in a 9V power adapter and an HDMI cable sourcing video from somewhere else. This should get the goggles up and running, but it would be interesting to see if a future revision of this design could eliminate the wires and make it a portable unit. Regardless, this project does a fantastic job of showing what it takes to create a homemade virtual reality device. And as you can see from the parts list after the break, the price of the project comes in under the $350 DK2 price tag, saving some money while still providing a fun and educational experience.


Cutting Ribbons with Robots and an Oculus Rift


On June 26th, 2014, Clearpath Robotics opened up the doors to their brand new 12,000 square foot robot lair by bringing out a PR2 to cut the ceremonial ribbon and welcome everyone inside. And instead of just programming the ‘locate and destroy’ ribbon sequence, the co-founders opted to use an Oculus Rift to control the robot tearing through the material with flailing arms.

This was accomplished by having [Jake], the robot, utilize a Kinect 2.0 that fed skeleton-tracking data via rosserial_windows, a Windows-based set of extensions for the Robot Operating System which we heard about in January. The software gathers a stream of data points, each with an X, Y, Z component, allowing [Jake] to find himself within a 3D space. The data was then collected and published directly into the PR2’s brain. Inject a little Python code, and the creature was able to route directions in order to move its arms.
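
The post doesn’t show the mapping code, but one plausible version of the skeleton-to-arm step — turning tracked shoulder and wrist positions into a raise angle for the robot’s arm — looks like this. The joint names and coordinate frame are illustrative, not the actual PR2 interface:

```python
# Hypothetical sketch of the skeleton-to-arm mapping: shoulder and wrist
# positions from the tracker become a raise angle for the robot's arm.
# The coordinate frame (y up) and joint choice are illustrative only.

import math

def arm_raise_angle(shoulder, wrist):
    """Angle (degrees) of the shoulder->wrist vector above horizontal."""
    dx = wrist[0] - shoulder[0]
    dy = wrist[1] - shoulder[1]  # y is "up" in this toy frame
    dz = wrist[2] - shoulder[2]
    horizontal = math.hypot(dx, dz)  # distance in the ground plane
    return math.degrees(math.atan2(dy, horizontal))

print(arm_raise_angle((0, 0, 0), (0.5, 0.0, 0.0)))  # arm level -> 0.0
print(arm_raise_angle((0, 0, 0), (0.0, 0.5, 0.0)))  # straight up -> 90.0
```

On the real robot, angles like this would be published as joint targets over ROS topics, with smoothing on top — which matches the “un-smoothed” caveat on the early demo video below.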

Thus, by simply stepping in front of the Kinect 2.0 and putting on the Oculus Rift headset, anyone could teleoperate [Jake] to move around and wave his arms at oncoming ribbons. Once completed, [Jake] would leave the scene, journeying back into the newly created robot lair, leaving pieces of nylon and polyester everywhere.

An earlier (un-smoothed) version of the full system can be seen after the break:

