Most of us have heard of Second Life – that antiquated online virtual reality platform of yesteryear where users could explore, create, and even sell content. You might be surprised to learn that not only is it still around, but it’s also employing the Oculus Rift and completely redesigning its virtual world. With support for the DK2 Rift, a Second Life platform where users can share and explore each other’s creations opens up some interesting doors.
Envision a world where you could log on to a “virtual net”, put on your favorite VR headset, and let your imagination run wild. You and some friends could make a city, a planet, an entire universe that you and thousands of others could explore. With a little bit of dreaming and an Arduino, VR can bring dreams to life.
Continue reading “Ask Hackaday: What is The Future of Virtual Reality?”
The Oculus Rift and all the other 3D video goggle solutions out there are great if you want to explore virtual worlds with stereoscopic vision, but until now we haven’t seen anyone exploring real life with digital stereoscopic viewers. [pabr] combined the Kinect-like sensor in an ASUS Xtion with a smartphone in a Google Cardboard-like setup for 3D views the human eye can’t naturally experience, like a third-person view, a radar-like display, or seeing what the world would look like with your eyes 20 inches apart.
[pabr] is using an ASUS Xtion depth sensor connected to a Galaxy SIII via the USB OTG port. With a little bit of code, the output from the depth sensor can be pushed to the phone’s display. The hardware setup is built around a VR-Spective, a rather expensive bit of plastic, but with the right mechanical considerations, a piece of cardboard or some foam board and hot glue would do quite nicely.
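[pabr] hasn’t detailed his rendering code here, but the core trick – turning a single depth frame into something a Cardboard-style viewer can fuse in 3D – is easy to sketch. This is not his code, just a minimal illustration: nearer pixels get a larger horizontal disparity between the left and right half of a side-by-side frame, so they pop out of the screen.

```python
import numpy as np

def depth_to_stereo(depth, max_disparity=8):
    """Render a depth map as a side-by-side stereo pair.

    Nearer pixels (smaller depth values) get a larger horizontal
    disparity, so they appear closer when viewed through a
    Cardboard-style headset.
    """
    h, w = depth.shape
    # Normalize depth to [0, 1]; guard against a flat frame.
    span = depth.max() - depth.min()
    norm = (depth - depth.min()) / span if span else np.zeros(depth.shape)
    # Disparity in pixels: near (norm ~ 0) -> max_disparity, far -> 0.
    disp = ((1.0 - norm) * max_disparity).astype(int)

    left = np.zeros_like(depth)
    right = np.zeros_like(depth)
    cols = np.arange(w)
    for y in range(h):
        lx = np.clip(cols - disp[y], 0, w - 1)  # shift row for the left eye
        rx = np.clip(cols + disp[y], 0, w - 1)  # shift row for the right eye
        left[y, lx] = depth[y]
        right[y, rx] = depth[y]
    return np.hstack([left, right])  # one side-by-side frame for the phone

# Synthetic 4x4 depth frame: a near square (300 mm) on a far background.
frame = np.full((4, 4), 1000, dtype=np.uint16)
frame[1:3, 1:3] = 300
stereo = depth_to_stereo(frame)
print(stereo.shape)  # (4, 8)
```

A real pipeline would also have to handle occlusion holes and run at the Xtion’s frame rate, but the geometry is the same.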
[pabr] put together a video demo of his build, along with a few examples of what this project can do. It’s rather odd, and surprisingly not a superfluous way to see in 3D. You can check out that video below.
Continue reading “Seeing The World Through Depth Sensing Cameras”
One of our readers has been playing around with virtual reality lately, and has come up with a pretty cool beta run of his research — virtual interaction using your hands.
Using an Oculus Rift, the Leap Motion controller and a beta run of Unity 4.6, [Tomáš Mariančík] put together a test environment for physical interaction. The Leap Motion controller is capable of tracking your fingers with extremely high detail, which allows him to create a pair of virtual hands inside the test environment that almost perfectly mimic his movements. The hack here is making it all work together.
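The glue [Tomáš Mariančík] wrote lives in Unity, but the essential step – taking joint positions the Leap Motion reports in its own sensor frame and transforming them into the virtual scene so the virtual hands line up with the real ones – boils down to a rotation and a translation. A rough stand-in sketch (the yaw and offset values here are made up, not from his project):

```python
import numpy as np

def sensor_to_world(points, yaw_deg, offset):
    """Map Leap-style sensor-space joint positions (mm) into world space.

    A minimal stand-in for what a VR hand-tracking plugin does: rotate
    the tracked points by the headset's yaw, then translate them to the
    virtual hand's anchor so virtual fingers mirror the real ones.
    """
    yaw = np.radians(yaw_deg)
    rot = np.array([[ np.cos(yaw), 0, np.sin(yaw)],
                    [ 0,           1, 0          ],
                    [-np.sin(yaw), 0, np.cos(yaw)]])
    return points @ rot.T + offset

# Five hypothetical fingertip positions reported by the sensor (mm).
tips = np.array([[-40, 180, 30],
                 [-20, 200, 10],
                 [  0, 210,  0],
                 [ 20, 200, 10],
                 [ 40, 180, 30]], dtype=float)

world = sensor_to_world(tips, yaw_deg=90, offset=np.array([0, 1500, 0]))
print(world.round(1))
```

The hard part of the hack isn’t this math – it’s keeping the transform calibrated and low-latency enough that your brain accepts the virtual hands as your own.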
In the demo below he shows the system off by interacting with holographic menus, grabbing body parts off an anatomically correct human model (thanks to Unity3D), and manipulating his environment.
Continue reading “Interacting with Virtual Reality Brings us Even Closer to a Real Holodeck”
At the Atmel booth at Maker Faire, they were showing off a few very cool bits and baubles. We’ve got a post on the WiFi shield in the works, but the most impressive person at the booth was [Quin]. He has a company, he’s already shipping products, and he has a few projects already in the works. What were you doing at 13?
[Quin]’s Qduino Mini is your basic Arduino compatible board with a LiPo charging circuit. There’s also a ‘fuel gauge’ of sorts for the battery. The project will be hitting Kickstarter sometime next month, and we’ll probably put that up in a links post.
Oh, [Quin] was also rocking some awesome kicks at the Faire. Atmel, I’m trying to give you money for these shoes, but you’re not taking it.
[Sophie] had a really cool installation at the faire, and notably one that was first featured on hackaday.io. Basically, it’s a virtual reality Segway, built with an Oculus Rift, a Leap Motion, a Wobbleboard, and an Android phone, that lets you cruise through a virtual landscape on everyone’s favorite barely-cool balancing scooter.
This project was a collaboration between [Sophie], [Takafumi Ide], [Adelle Lin], and [Martha Hipley]. The virtual landscape was built in Unity, displayed on the Oculus, controlled with an accelerometer on a phone, and has input with a Leap Motion. There are destructible and interactable things in the environment that can be pushed around with the Leap Motion, and with the helmet-mounted fans, you can feel the wind in your hair as you cruise over the landscape on your hovering Segway-like vehicle. This is really one of the best VR projects we’ve ever seen.
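The team hasn’t published their control code, but the lean-to-drive mapping an accelerometer gives you is simple enough to sketch. This is an illustrative guess, not their implementation: the phone’s gravity vector yields a pitch angle, a small dead zone lets the rider stand still, and the remaining lean scales to forward speed.

```python
import math

def lean_to_velocity(ax, ay, az, max_speed=3.0, dead_zone_deg=3.0):
    """Turn a phone accelerometer reading (m/s^2) into a cruise speed.

    Standing level, gravity sits on the z axis; leaning forward tilts
    some of it onto y. The pitch angle, minus a small dead zone so the
    rider can stand still, maps linearly onto [0, max_speed].
    """
    pitch = math.degrees(math.atan2(ay, az))
    if abs(pitch) < dead_zone_deg:
        return 0.0
    sign = 1.0 if pitch > 0 else -1.0
    # Saturate at roughly 30 degrees of lean.
    return sign * min(1.0, (abs(pitch) - dead_zone_deg) / 30.0) * max_speed

print(lean_to_velocity(0.0, 0.0, 9.81))  # 0.0: standing level
```

The numbers (3 m/s top speed, 3-degree dead zone, 30-degree saturation) are placeholders; a real build would tune them until the Wobbleboard feels right.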
Controlling vehicles remotely with virtual reality headsets seems like an obvious next step. Already, a few UAV companies have begun experimenting with the idea by integrating Oculus Rift developer kits into their hovering quadcopters and drones. Parrot released a video on their blog showing a head-tracking system they developed for their Bebop Drone in an effort to bring FPV flights to fruition. It looks like a lot of fun and we want to try one of these out ASAP!
As for technical specifications, they can be found in the YouTube description of the video embedded below. A quick glance shows that the operating system is based on Android and uses WiFi to connect the handheld tablet to the drone floating above. The range is a whopping 2 km, giving plenty of freedom to explore. Moving your head swivels the attached camera, giving a more immersive flying experience.
This isn’t the first example of FPV drones that we have seen. Previously, we covered an Oculus Rift + head tracking setup and another similar integration with a Black Armor Drone. We are bound to see virtual reality equipment used to control drones more and more as developers get their hands on cutting-edge hardware like the Oculus Rift DK2, which is currently shipping.
Continue reading “Flying a Drone with an Oculus Rift”
Aiming to be the leader in virtual reality horror experiences, ‘The Nightmare Machine’ is an immersive VR haunted house in Seattle that promises to be one of the most terrifying events this Halloween. But the team needs some assistance raising money to achieve the large public scale the project is attempting. The goal is $70,000 within 30 days, which is quite the challenge; the team will need to hustle every single day to reach it.
The focus of the project looks good, though: lowering the massive barriers to entry in VR that come with high hardware costs, and giving people a terrifying five minutes of nightmare-inducing experiences. This level of fidelity and range is usually only seen in military research facilities and university labs, like the MxR Lab at USC. Their custom-built head-mounted displays bring the technology within reach of the public, ready to scare the pants off anyone willing to put on the VR goggles.
The headsets are completely wireless, multi-player, and contain immersive binaural audio. An integrated motion sensing system tracks users’ movements within hundreds of square feet. The platform combines custom in-house and third-party hardware with a slick software framework. The technology looks amazing, and the prizes given out through the Kickstarter are cool too! For example, anyone who pledges $175 or more gets their head 3D scanned and inserted into the Nightmare Machine. The rest of the prizes include tickets to the October showcase, where demos of the VR experience will be shown.
Continue reading “VRcade’s The Nightmare Machine (Kickstarter Campaign)”
[Cyber] has been testing out intuitive input methods for virtual reality experiences that immerse the user further into the virtual world than archaic devices like a keyboard or mouse would allow. One of his biggest interests so far is a data glove that interacts with an Arduino Uno to interface with a PC. Since no commercial product is yet readily available, [Cyber] decided to build his own.
He started out with a tiny inertial measurement unit, a Pololu MinIMU-9 v2, which combines a 3-axis gyro and accelerometer for orientation tracking, wired to an Arduino Uno that connects to the PC over USB. From there, he hooked up a flex sensor from Spectra Symbol (the same sensors supposedly used in the original Nintendo Power Glove) and demoed the project by tracking the movement of one of his fingers. As the finger bent, the output printed on the serial monitor changed.
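A flex sensor is just a variable resistor, so the readings that scroll past on the serial monitor are raw ADC values that still need calibrating into a bend angle. A minimal sketch of that step (the calibration numbers here are hypothetical; real values depend on the sensor and the divider resistor used):

```python
def flex_to_angle(raw, flat_raw=512, bent_raw=800, bent_angle=90.0):
    """Convert a flex sensor ADC reading to an approximate bend angle.

    flat_raw and bent_raw are calibration readings taken with the finger
    straight and fully curled; the mapping between them is assumed linear,
    which is close enough for a glove demo.
    """
    span = bent_raw - flat_raw
    angle = (raw - flat_raw) / span * bent_angle
    return max(0.0, min(bent_angle, angle))  # clamp to the calibrated range

print(flex_to_angle(656))  # 45.0: halfway between flat and fully bent
```

On the Arduino side this would run against `analogRead()`’s 10-bit output; doing the conversion host-side keeps the sketch on the Uno trivially simple.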
[Cyber] still needs to mount a glove on this system and construct a proper positional tracking method so that physical movement will be mirrored in a simulation.
[Cyber’s] day job has had him busy these last few months, which has forced the project into a temporary hold. Recently though, [Cyber] has been an active member and an influence in the local Orange County VR scene helping to build a nice development culture, so we’re hoping to see more updates from him soon.
To view what he has done up to this point, click the link at the top of the page, and check out the video after the break:
Continue reading “Flex Sensing for a DIY Data Glove”