An Introduction To Valve’s Tracking Hardware

[Alan Yates] brought a demo of Valve’s new VR tech that’s the basis of the HTC Vive system to Maker Faire this year. It’s exceptionally clever, and compared to existing VR headsets it’s probably one of the best head-tracking solutions out there.

With VR headsets, the problem isn’t putting two displays in front of the user’s eyes. The problem is determining where the user is looking, quickly and accurately. IMUs and image processing techniques can be used with varying degrees of success, but to do it right, the tracking needs to be really fast and really cheap.

[Alan] and [Valve]’s ‘Lighthouse’ tracking unit does this by placing a dozen or so IR photodiodes on the headset itself. The base station sweeps IR lasers across the room in the X and Y axes, and by timing when each sweep crosses a photodiode, the angle of the headset relative to the base station can be computed in just a few cycles of a microcontroller. For a bunch of one-cent photodiodes, absolute angles and orientation relative to a base station can be determined very easily, something that has some pretty incredible applications for everything from VR to robotics.
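
The core of the math is simple enough to live in an interrupt handler: if the rotor spins at a known rate, the delay between the sync pulse and the laser crossing a photodiode is directly proportional to the angle. Here’s a rough sketch of that timing calculation, assuming a 60 Hz rotor and microsecond timestamps from a timer capture – real base stations interleave the X and Y sweeps and encode extra data in the sync pulses, so treat this as an illustration of the idea, not Valve’s firmware.

```c
/*
 * Minimal sketch of the Lighthouse-style timing math.
 * Assumptions: a 60 Hz rotor and microsecond timestamps from a
 * timer-capture peripheral. Real base stations are more involved.
 */
#include <stdio.h>

#define SWEEP_PERIOD_US 16667.0f   /* one full rotor revolution at 60 Hz */

/* Convert the delay between the sync pulse and the laser hitting a
 * photodiode into an angle from the base station, in degrees. */
static float sweep_angle_deg(float t_sync_us, float t_hit_us)
{
    float dt = t_hit_us - t_sync_us;          /* time elapsed into the sweep */
    return (dt / SWEEP_PERIOD_US) * 360.0f;   /* fraction of a full turn     */
}

int main(void)
{
    /* e.g. a photodiode sees the laser 4.2 ms after the sync pulse */
    printf("angle = %.2f degrees\n", sweep_angle_deg(0.0f, 4200.0f));
    return 0;
}
```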

Remember all of the position tracking hacks that came out as a result of the Nintendo Wii using IR beacons and a tracking camera? This seems like an evolutionary leap forward in the same realm, and we can’t wait to see people hacking on this tech!

VCF East X: Virtual Reality With PETSCII

What would happen if Oculus-quality virtual reality were created in the ’80s on the Commodore PET? [Michael Hill] knows, because he created a stereoscopic video headset using a PET.

This build is an extension of [Michael]’s exhibit last year at VCF East where he displayed a video feed with PETSCII. Yes, that means displaying video with characters, not pixels.

This year, he’s doubling the number of screens and sending everything to two iPhones in a Google Cardboard-like VR headset. Apart from the optics, the setup is pretty simple: cameras grab image data, it’s sent over to a PET, and a stream of characters is sent back.
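
The heart of a pipeline like that is deciding which character best stands in for each patch of pixels. Here’s a rough sketch of the idea, assuming an 8-bit grayscale frame already scaled down to the PET’s 40×25 text grid and a hypothetical brightness-ordered ramp of ASCII stand-ins – the actual character mapping [Michael] uses isn’t documented here.

```c
/*
 * Rough sketch of pixels-to-characters conversion.
 * Assumptions: an 8-bit grayscale frame already reduced to the PET's
 * 40x25 text grid, and a hypothetical dark-to-light character ramp
 * (ASCII stand-ins here; the real exhibit uses PETSCII).
 */
#include <stdio.h>

#define COLS 40
#define ROWS 25

/* characters roughly ordered by how much of the cell they fill */
static const char ramp[] = " .:-=+*#";

static char cell_to_char(unsigned char luma)
{
    /* map brightness 0..255 onto the ramp */
    return ramp[(luma * (sizeof(ramp) - 2)) / 255];
}

void frame_to_text(const unsigned char frame[ROWS][COLS])
{
    for (int y = 0; y < ROWS; y++) {
        for (int x = 0; x < COLS; x++)
            putchar(cell_to_char(frame[y][x]));
        putchar('\n');
    }
}
```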

It’s impossible to film, and using it is interesting, to say the least. Video below.

Continue reading “VCF East X: Virtual Reality With PETSCII”

Putting Oculus Rift On A Robot

Many of the early applications for the much-anticipated Oculus Rift VR rig have been in gaming. But it’s interesting to see more useful applications beyond gaming before its commercial release sometime this year. [JoLau] at the Institute i4Ds of FHNW School of Engineering wanted to go a step beyond rendering virtual worlds, so he built the Intuitive Rift Explorer, a.k.a. the IRE. The IRE is a moving-reality system consisting of a gimbaled stereo-vision camera rig that transmits video to the Rift and matches the head movements received from the headset. The vision platform is mounted on a remote-controlled robot, making the whole setup completely wireless.

One of the big challenges with VR headsets is lag, which causes motion sickness in some cases. He had to tackle the problem of latency – reducing the time from moving the head to getting a matching image on the headset – which the Oculus Rift team specifies should be less than 20 ms. The other important requirement is a high frame rate, in this case 60 frames per second; at that rate a single frame already takes about 16.7 ms, leaving very little slack for capture, transmission, and display. [JoLau] succeeded in overcoming most of the problems, although in conclusion he does mention a couple of enhancements he would like to add, given more time.

[JoLau] provides a detailed description of the various sub-systems that make up the IRE – the stereo camera, audio and video transmission, media processing, the servo-driven gimbal for the stereo camera, and the control system code. His reasoning behind the hardware choices for several components in the project makes for interesting reading. Watch a video of the IRE in action below.
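
The link between headset and gimbal boils down to reading the Rift’s orientation and translating it into servo commands. As a rough sketch of that mapping – assuming ordinary hobby servos that take 1000–2000 µs pulses over ±90°, which may well differ from the hardware [JoLau] actually used:

```c
/*
 * Minimal sketch of mapping headset orientation to gimbal servo pulses.
 * Assumptions: standard hobby servos, 1000-2000 us pulses over +/-90
 * degrees. The IRE's real servo hardware and control code may differ.
 */
#include <stdio.h>

static int angle_to_pulse_us(float angle_deg)
{
    if (angle_deg < -90.0f) angle_deg = -90.0f;   /* clamp to servo range */
    if (angle_deg >  90.0f) angle_deg =  90.0f;
    /* -90 deg -> 1000 us, 0 deg -> 1500 us, +90 deg -> 2000 us */
    return 1500 + (int)(angle_deg * (500.0f / 90.0f));
}

int main(void)
{
    float yaw = 30.0f, pitch = -10.0f;            /* from the Rift's tracker */
    printf("pan pulse:  %d us\n", angle_to_pulse_us(yaw));
    printf("tilt pulse: %d us\n", angle_to_pulse_us(pitch));
    return 0;
}
```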

Continue reading “Putting Oculus Rift On A Robot”

Build Your Own Gear VR

With Samsung’s new Gear VR announced, developers and VR enthusiasts are awaiting the release of the smartphone-connected VR headset. A few people couldn’t wait to get their hands on the platform, so they created OpenGear, a Gear VR-compatible headset.

The OpenGear starts off with a Samsung Galaxy Note 4, which is the target platform for the Gear VR headset. A cardboard enclosure, similar to the Google Cardboard headset, holds the lenses and straps the phone to your face.

The only missing part is the motion tracking electronics. Fortunately, ST’s STM32F3 Discovery development board has everything needed: a microcontroller with USB device support, an L3GD20 three-axis gyro, and an LSM303DLHC accelerometer/magnetometer. Together, these components provide a USB inertial measurement unit for tracking your head.
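
The usual way to turn those two sensors into a head orientation is a complementary filter: trust the gyro over short timescales and pull the estimate back toward the accelerometer’s gravity vector over long ones. Here’s a minimal sketch of that idea for a single axis – it’s only an illustration of the technique, not the OpenGear firmware’s actual sensor fusion.

```c
/*
 * Sketch of complementary-filter sensor fusion for head tracking.
 * The gyro is fast but drifts; the accelerometer's gravity vector is
 * noisy but drift-free. Blend them to get a stable pitch estimate.
 * Illustrative only; not the OpenGear firmware's actual code.
 */
#include <math.h>
#include <stdio.h>

#define ALPHA 0.98f   /* how much we trust the gyro each update */

/* pitch estimate in degrees, updated once per sensor sample */
static float pitch_deg = 0.0f;

void update_pitch(float gyro_rate_dps,  /* L3GD20 pitch rate, deg/s      */
                  float accel_x_g,      /* LSM303DLHC acceleration, in g */
                  float accel_z_g,
                  float dt_s)           /* time since last sample, s     */
{
    /* the accelerometer's long-term answer: where gravity says we point */
    float accel_pitch = atan2f(accel_x_g, accel_z_g) * (180.0f / 3.14159265f);

    /* integrate the gyro for the short term, then nudge toward gravity */
    pitch_deg = ALPHA * (pitch_deg + gyro_rate_dps * dt_s)
              + (1.0f - ALPHA) * accel_pitch;
}

int main(void)
{
    /* pretend the head tilts forward at 10 deg/s, sampled at 100 Hz */
    for (int i = 0; i < 100; i++)
        update_pitch(10.0f, 0.17f, 0.98f, 0.01f);
    printf("pitch estimate: %.1f deg\n", pitch_deg);
    return 0;
}
```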

With the Discovery board strapped to the cardboard headset, open-source firmware is flashed onto it. The firmware emulates the messages sent by a legitimate Oculus Rift motion tracker, so the Galaxy Note 4 sees the device as a VR headset and lets you run VR apps.

If you’re interested, the OpenGear team is offering a development kit. This is a great way for developers to get a head start on their apps before the Gear VR is actually released. The main downside is how you’ll look with this thing affixed to your face. There’s a head-to-head against the real Gear VR after the break.

[via Road To VR]

Continue reading “Build Your Own Gear VR”

CastAR Hands-On And Off-Record Look At Next Version

At long last I had the opportunity to try out the CastAR, a glasses-based augmented reality system developed by Technical Illusions. The hardware has been in the works for a couple of years now, but every time we came across a demo we were thwarted by the long lines that accompanied it. This time I was really lucky: [Jeri] gave us a private demo in a suite at the Palazzo during CES 2015. Reflecting on the experience, CastAR is exactly the type of virtual reality hardware I’ve been longing for.

Continue reading “CastAR Hands-On And Off-Record Look At Next Version”

“Superfan” Gaming Peripheral Lets You Feel Your Speed

Virtual reality has come a long way, but some senses are still neglected. Until Smell-O-Vision happens, the next step might be feeling the wind in your hair – think of dad racing a sportbike or the kids giggling on a roller coaster. It’s not as hard to build as you might think; you probably have the parts already.

Off-the-shelf devices serve up the seeing and hearing parts of your imaginary environment, but they stop there. [Jared] wanted to take the immersion further by being able to feel the speed, which meant building his own high-power wind generator and tying it into the VR system. The failed crowdfunding effort of the “Petal” meant that something new would have to be constructed. Obviously, moving air without actually riding a roller coaster requires a motor controller and some fans. Powerful fans.

A proponent of going big or going home, [Jared] picked up a pair of fans and modified them so heavily that they will launch themselves off the table if not anchored down. Who overdrives fans so hard they need custom heatsinks for the motors? He does. He admits he went overboard – and way over budget by most people’s standards – but he built it for himself and doesn’t care.
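
The interesting part of a build like this is closing the loop between the game and the fans: take whatever speed the simulation reports and turn it into motor power. Here’s a small sketch of one way to do that, assuming speed arrives in km/h from game telemetry and the fan controller takes an 8-bit PWM duty cycle – [Jared]’s actual telemetry hookup and motor driver aren’t detailed here.

```c
/*
 * Sketch of turning in-game speed into fan power.
 * Assumptions: speed arrives in km/h from game telemetry and the fan
 * controller takes an 8-bit PWM duty cycle. Illustrative only.
 */
#include <stdio.h>

#define TOP_SPEED_KMH 200.0f       /* speed that maps to full fan power */

static unsigned char speed_to_duty(float speed_kmh)
{
    if (speed_kmh < 0.0f) speed_kmh = 0.0f;
    if (speed_kmh > TOP_SPEED_KMH) speed_kmh = TOP_SPEED_KMH;
    /* square the normalized speed so low speeds feel gentle and the
     * top end still hits hard */
    float norm = speed_kmh / TOP_SPEED_KMH;
    return (unsigned char)(norm * norm * 255.0f);
}

int main(void)
{
    printf("duty at 100 km/h: %u\n", speed_to_duty(100.0f));
    return 0;
}
```

Squaring the normalized speed is a nod to the fact that wind force grows roughly with the square of airspeed, though a simple linear mapping would work too.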

Continue reading ““Superfan” Gaming Peripheral Lets You Feel Your Speed”

CES: Building Booths And Simulating Reality

My first day on the ground at CES started with a somewhat amusing wait at the taxi stand at McCarran International Airport. Actually, I’m getting ahead of myself… it started with a surprisingly efficient badge-pickup booth in the airport’s baggage claim. Wait in line for about three minutes, show them the QR code emailed to you from online registration, and you’re ready to move to the quarter-mile-long, six-switchback-deep line for cabs. Yeah, there are a lot of people here for this conference.

It’s striking just how huge this thing is. Every hotel on the Strip is crawling with badge-wearing CES attendees, and many of the conference halls in the hotels are filled with booths, meaning the show is spread out over a huge geographic area. We bought three-day monorail passes and headed to the convention center to get started.

Building the Booths

[Sophi] knows [Ben Unsworth], who put his heart and soul into this year’s IEEE booth. His company, Globacore, builds booths for conferences, and this one sounds like it was an exceptional amount of fun to work on. He was part of a tiny team that built a mind-controlled drag strip based on Emotiv Insight brainwave-measuring hardware shipped directly from the first factory production run. It ties in with the display screens above the track to form a leaderboard. We’ll have a keen eye out for hacks this week, but the story behind building these booths may be the best hack to be found.

Oculus

[Ben] told us that hands-down the thing to see is the new Oculus hardware called Crescent Bay. He emphatically mentioned the Holodeck, which is a comparison we don’t throw around lightly. It seems like a lot of people feel that way, because the line to try it out is wicked long. We downloaded their app, which allows you to schedule a demo, but all the appointments are already taken. Hopefully our Twitter plea will be seen by their crew.

In the meantime we tried out the Samsung Gear VR. It uses a Galaxy Note 4 as the screen, along with lenses and a variety of motion tracking and user controls. The demo was a Zelda-like game where you view the scene from overhead, using a handheld controller to command the in-game character while the headset’s motion tracking lets you look around the playing area. It was a neat demo. I’m not quite sold on long gaming sessions with the hardware, but maybe I just need to get used to full immersion first.

Window to another Dimension

The midways close at six o’clock, and we made our way to the Occipital booth just as things were winding down. I’ve been 3D scanned a few times before, but those systems used turntables and depth cameras on motorized tracks to do the work. This uses a depth-camera add-on for an iPad which they call the Structure Sensor.

It is striking how quickly the rig can capture a model. This high-speed performance is parlayed into other uses, like creating a virtual world inside the iPad that the user navigates by using the screen as if it were a magic window into another dimension. Their demo was something along the lines of the game Portal, and it has us thinking that the Wii U controller has the right idea for entertainment – it just needs the performance that Occipital offers. I liked this experience more than the Oculus demo because you are not shut off from the real world as you make your way through the virtual one.

We shot some video of the hardware and plan to post more about it as soon as we get the time to edit the footage.

Find Us or Follow Us

We’re wearing our Hackaday shirts, and that stopped [Josh] in his tracks. He’s here on business with his company Evermind, but like any good hacker he is carrying around one of his passion projects in his pocket. What he’s showing off are a couple of prototypes for a CANbus sniffer and interface device that he’s built.

We’ll be at CES all week. You can follow our progress through the following Twitter accounts: @Hackaday, @HackadayPrize, @Szczys, and @SophiKravitz. If you’re here in person you can Tweet us to find where we are. We’re also planning a 9 am breakfast meetup on Thursday at SambaLatte in the Monte Carlo. We hope you’ll stop by and say hi. Don’t forget to bring your own hardware!