The folks behind the Atmos Extended Reality (XR) headset want to provide improved accessibility with an open ecosystem, and they aim to do it with a WebVR-capable headset design that is self-contained, 3D-printable, and open-sourced. Their immediate goal is to release a development kit, then refine the design for a wider release.
The front of the headset has a camera-based tracking board to provide all the modern goodies like inside-out head and hand tracking as well as video pass-through. The design also provides for a variety of interface methods such as eye tracking and 6 DoF controllers.
With all that, the headset gives users maximum flexibility to experiment with and create different applications while working to keep development simple. A short video showing off the modular design of the HMD and optical assembly is embedded below.
Extended Reality (XR) has emerged as a catch-all term to cover broad combinations of real and virtual elements. On one end of the spectrum are completely virtual elements such as in virtual reality (VR), and towards the other end of the spectrum are things like augmented reality (AR) in which virtual elements are integrated with real ones in varying ratios. With the ability to sense the real world and pass through video from the cameras, developers can choose to integrate as much or as little as they wish.
[AlexPewPew] tipped us off on some interesting virtual reality work going on at the Swiss Federal Institute of Technology in Zurich. Mapping a user’s head movement to match the images shown in a head-mounted display is something the Oculus Rift is very good at. But walking and moving around freely in that virtual environment requires completely different hardware. We’ve seen some ingenious setups before, but nothing as efficient as this.
In the video above, they have put sheets of bar-coded paper on the ceiling in a grid pattern. A camera mounted on the user’s head looks up at the grid of papers and gets the user’s location. The neatest part, though, is how they are fitting a large virtual space into a small room. As the user walks down a straight virtual path, software slowly makes the actual path in the small room curve. The end result is the user walks in circles in the small room, thinking he or she is exploring a much larger space. Neat stuff!
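The bent-path trick above can be sketched in a few lines. This is a hypothetical illustration, not the researchers’ actual code: each step of the straight virtual path is bent by a small, constant curvature in the real room, so walking “straight” forever traces a circle that fits inside the tracked space. The radius value is an assumption for the example.

```python
import math

# Hypothetical sketch of redirected walking: a straight virtual path
# is mapped onto a circle in the real room, so the user walks in
# circles while believing they are walking straight.

REAL_RADIUS = 3.0  # meters; radius of the real-room circle (assumed value)

def real_position(virtual_distance):
    """Map distance walked along a straight virtual path
    to an (x, y) position in the real room."""
    theta = virtual_distance / REAL_RADIUS  # arc length -> angle walked
    return (REAL_RADIUS * math.sin(theta),
            REAL_RADIUS * (1.0 - math.cos(theta)))
```

Walking `2 * pi * REAL_RADIUS` meters of “straight” virtual path brings the user right back to their real-world starting point, which is the whole point of the technique.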
If you have a head-mounted display lying around and can’t think of anything to enter into The Hackaday Prize contest, this would be a great concept to work on. What are you waiting for? Get hacking!
[programming4fun] has been playing around with his Kinect-based 3D display and building a holographic WALL-E controllable with a Windows phone. It’s a ‘kid safe’ version of his Terminator personal assistant that has voice control and support for 3D anaglyph and shutter glasses.
When we saw [programming4fun]’s Kinect hologram setup last summer we were blown away. By tracking a user’s head with a Kinect, [programming] was able to display a 3D image using only a projector. This build was adapted into a 3D multitouch table and real life portals, so we’re glad to see [programming4fun] refining his code and coming up with some really neat builds.
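The projector hologram illusion rests on head-coupled perspective: re-render the scene from the viewer’s tracked eye position, and points “behind” the screen shift with parallax as the head moves. Here is a minimal, hypothetical sketch (not [programming4fun]’s actual code) of that projection math, treating the screen as the plane z = 0 with the eye at positive z:

```python
# Toy head-coupled perspective: project a 3D point onto the screen
# plane z = 0 as seen from a tracked eye position. Points with z < 0
# sit "behind" the screen; moving the eye shifts their projection,
# which is what sells the hologram effect.

def project(point, eye):
    """Return the (x, y) where the eye->point ray crosses the screen."""
    px, py, pz = point
    ex, ey, ez = eye
    t = ez / (ez - pz)  # fraction along the ray where z hits 0
    return (ex + t * (px - ex), ey + t * (py - ey))
```

With the eye centered at `(0, 0, 1)` a point at `(0, 0, -1)` projects to the screen center; slide the eye sideways and the projection shifts the opposite way, exactly the parallax cue the Kinect head tracking drives.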
In addition to robotic avatars catering to your every wish, [programming4fun] also put together a rudimentary helicopter flight simulator controlled by tilting a cell phone. It’s the same DirectX 9 heli from [programming]’s original build, with the addition of Desert Strike-esque top-down graphics. This might be the future of gaming here, so we’ll keep our eyes out for similar head-tracking 3D builds.
Although virtual reality was the wave of the future in the early ’90s, it hasn’t really taken off the way we would have liked. Sometimes a great idea just takes time for the technology to catch up to it (Aeolipile anyone?). Now that tiny projectors, realistic FPS games, and eye tracking systems have come down in price, this head-tracking projection system engineered by students at the University of Texas at Austin could be the start of something really neat.
[Epoch] sent in this simple head tracking project using Lego pieces. He’s made a custom mount to hold three Lego light sensors on a baseball cap. Then, after modifying his webcam for IR with some floppy disk scraps, he loads up the free FreeTrack software and can control his games. For convenience, he has programmed the Lego NXT to only turn on the lights while he’s holding a contact sensor. You can see it in action after the break. This appears to be very similar to [Johnny Lee’s] head tracking. Judging by the video, it’s not as smooth though.
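FreeTrack-style setups recover head pose from where the cap-mounted IR points land in the webcam image. As a toy illustration only (not FreeTrack’s actual solver, which solves a full 3-point pose problem), yaw can be roughed out from the horizontal offset of the blob centroid and roll from the tilt of the outer two points; the camera resolution and field of view here are assumed values:

```python
import math

# Toy 3-point head tracking estimate (illustration, not FreeTrack's math).
IMAGE_WIDTH = 640   # webcam horizontal resolution (assumed)
HFOV_DEG = 60.0     # webcam horizontal field of view (assumed)

def head_pose(points):
    """points: list of three (x, y) pixel positions of the IR blobs.
    Returns rough (yaw, roll) in degrees."""
    cx = sum(x for x, _ in points) / len(points)
    # Centroid offset from image center -> approximate yaw
    yaw = (cx - IMAGE_WIDTH / 2) / IMAGE_WIDTH * HFOV_DEG
    # Tilt of the line through the leftmost and rightmost blobs -> roll
    (x0, y0), (x2, y2) = min(points), max(points)
    roll = math.degrees(math.atan2(y2 - y0, x2 - x0))
    return yaw, roll
```

A real solver would also separate head rotation from sideways translation, which this centroid shortcut deliberately ignores for brevity.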
There is no doubt that [Johnny Lee] is the authority on Wiimote-based projects. So, when he compiles a list of his favorite Wiimote projects, we definitely pay attention. He’s organized the list as a progression of the unusual. By the time you get to ‘Chicken Head Tracking’ at the bottom, you’ll be adequately prepared. You’re bound to get some inspiration from the list, even if it’s building a pigeon-guided missile.