Putting Oculus Rift On A Robot

Many of the early applications for the much-anticipated Oculus Rift VR rig have been in gaming, but it’s interesting to see more practical uses appear before its commercial release sometime this year. [JoLau] at the Institute i4Ds of the FHNW School of Engineering wanted to go a step beyond rendering virtual worlds, so he built the Intuitive Rift Explorer, a.k.a. IRE. The IRE is a moving-reality system consisting of a gimbaled stereo-vision camera rig that transmits video to the Rift and mirrors the head movements reported by the headset. The vision platform is mounted on a remote-controlled robot, and the entire link is wireless.

One of the big challenges with VR headsets is lag, which can cause motion sickness. [JoLau] had to tackle latency head-on, reducing the time from a head movement to the matching image appearing on the headset; the Oculus Rift team specifies that this should be under 20 ms. The other important requirement is a high frame rate, in this case 60 frames per second. [JoLau] succeeded in overcoming most of these problems, although in conclusion he does mention a couple of enhancements he would like to add, given more time.
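To get a feel for how tight that budget is, here is a minimal sketch of a motion-to-photon latency tally for a telepresence rig like the IRE. The stage names and the per-stage timings are illustrative assumptions, not measured values from [JoLau]’s build:

```python
# Minimal sketch of a motion-to-photon latency budget for a telepresence
# rig like the IRE. Stage names and timings are illustrative assumptions,
# not measured values from the project.

TARGET_MS = 20.0          # Oculus-recommended motion-to-photon latency
FRAME_RATE = 60           # target frame rate
FRAME_TIME_MS = 1000.0 / FRAME_RATE  # ~16.7 ms per frame at 60 fps

# Hypothetical per-stage latencies (milliseconds) in the head-to-image path.
stages = {
    "head tracking + radio uplink": 2.0,
    "servo gimbal settling":        8.0,
    "camera exposure + readout":    10.0,
    "video encode/transmit/decode": 15.0,
    "render + display scanout":     FRAME_TIME_MS,
}

total = sum(stages.values())
for name, ms in stages.items():
    print(f"{name:32s} {ms:5.1f} ms")
print(f"{'total':32s} {total:5.1f} ms "
      f"({'within' if total <= TARGET_MS else 'over'} the {TARGET_MS:.0f} ms target)")
```

Even with optimistic numbers, the sum easily blows past 20 ms, which is why every stage in a rig like this has to be shaved down individually.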

[JoLau] provides a detailed description of the various sub-systems that make up the IRE: the stereo camera, audio and video transmission, media processing, the servo-driven gimbal for the stereo camera, and the control-system code. His reasoning behind the hardware chosen for several of these components also makes for interesting reading. Watch a video of the IRE in action below.

[vimeo 110369914 w=500 h=281]
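As a rough illustration of the head-to-gimbal step described above, here is a minimal sketch that maps a headset yaw/pitch reading onto standard RC-servo pulse widths. The angle ranges, the 1000–2000 µs pulse convention, and the function names are assumptions for illustration, not [JoLau]’s actual code:

```python
# Hedged sketch of the head-to-gimbal mapping: convert a yaw/pitch reading
# from the headset into servo pulse widths for a pan/tilt gimbal. The angle
# limits and pulse range below are assumed, not taken from the IRE.

def angle_to_pulse_us(angle_deg: float, min_deg: float, max_deg: float,
                      min_us: int = 1000, max_us: int = 2000) -> int:
    """Linearly map an angle onto a standard RC servo pulse width."""
    angle = max(min_deg, min(max_deg, angle_deg))  # clamp to servo travel
    span = (angle - min_deg) / (max_deg - min_deg)
    return round(min_us + span * (max_us - min_us))

def head_pose_to_gimbal(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Turn a headset yaw/pitch pair into (pan, tilt) pulse widths."""
    pan  = angle_to_pulse_us(yaw_deg,   -90.0, 90.0)   # assumed pan travel
    tilt = angle_to_pulse_us(pitch_deg, -45.0, 45.0)   # assumed tilt travel
    return pan, tilt

# Example: looking 30 degrees left and 10 degrees up.
print(head_pose_to_gimbal(-30.0, 10.0))  # -> (1333, 1611)
```

The clamp matters in practice: a fast head turn past the servo’s mechanical travel should saturate the command rather than wrap or overdrive the gimbal.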

15 thoughts on “Putting Oculus Rift On A Robot”

  1. Have wanted to try this for a long time, but I imagined strapping it to a quadcopter or an RC plane. However, the latency of current tech is why I imagined it wouldn’t work well :/

  2. Pretty awesome way to use it, but those HDMI-type connections are an accident waiting to happen: just wiggling side to side stresses the pads and will gradually separate them from the PCB. Driving around with two giant cable loops on the side also seems like a bad idea. The connectors are currently flipped 180° from each other, so why not raise one up ever so slightly and stack/stagger them so the cables exit the same way (think of how two ‘L’ letters nest together)? Then the image won’t need to be inverted on one of them either. If that’s enough to mess up the stereo vision, that’s understandable, though.

  3. This thing would make me instantly sick; I seem to be overly susceptible to input lag. I think the ultimate option would be a stereoscopic 360-degree camera that handles movement as close to the HMD as possible. Lag between pushing a joystick and the robot moving is more acceptable than head-movement lag. In any case, keep up the good work. I really wish I didn’t get as sick as I do with these things. It’s not for lack of trying; I just don’t cope well with this particular flavor of simulator sickness (HMDs, that is).

    1. It’s not that only some people get motion sick: everybody has a degree of tolerance to this kind of lag, and in this case they unfortunately still haven’t made big improvements on what has been done before… We are looking at more than 20 ms of lag in the video, so if they are actually getting <1 ms wireless video, then there’s still some huge buffer somewhere in the control loop that they can’t get to.

  4. I could see something like this attached to those RC robots they send into confined spaces/collapsed buildings to look for people and whatnot. I think it would give better perception of those spaces than the current fixed cameras do.

  5. Cool gadget! A logical extension might be to put the user in a tilt chair so they don’t get motion sick from the moving camera. Also, you should probably cite the source for that music: it’s called Pamgaea by Kevin MacLeod and is available from Incompetech.com (I’ve personally used that tune before ^_^)

  6. I wonder if it’d be easier to damp the input instead of trying to reduce the response time? Something akin to one of those head braces with dampers on each axis, so you can’t move too fast.
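A software stand-in for that damper idea is a slew-rate limiter that caps how fast the commanded gimbal angle may change per update, so the camera is never asked to move faster than it physically can. This is a minimal sketch; the 120 °/s cap and the 60 Hz update rate are assumed figures, not anything from the build:

```python
# Sketch of a slew-rate limiter: the software equivalent of a mechanical
# damper on each axis. The rate cap below is an assumed value.

class SlewLimiter:
    def __init__(self, max_rate_deg_s: float = 120.0):
        self.max_rate = max_rate_deg_s  # maximum allowed angular rate
        self.current = 0.0              # last commanded angle (degrees)

    def step(self, target_deg: float, dt_s: float) -> float:
        """Move toward target, but no faster than max_rate allows."""
        max_step = self.max_rate * dt_s
        delta = max(-max_step, min(max_step, target_deg - self.current))
        self.current += delta
        return self.current

# At a 60 Hz update rate, a sudden 90-degree head snap is spread over
# ~0.75 s instead of being passed straight to the servos.
limiter = SlewLimiter()
for _ in range(5):
    print(round(limiter.step(90.0, 1 / 60), 1))  # 2.0, 4.0, 6.0, 8.0, 10.0
```

The trade-off is that the displayed view now deliberately trails fast head motion, so a limiter like this only helps if the cap stays above typical head-turn rates.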

  7. How come nobody has done this with brushless motors and drivers? All the modern gimbals respond really quickly and precisely, which is why servos got ditched for gimbal use.
    Plus, why do all these research projects seemingly use the really old-hat DK1 Rift?

    1. In the first place I also wanted to do it with a brushless gimbal, but my cameras have thick USB 3.0 cables, so it would have been almost impossible to balance. But there is actually a project that did it with brushless motors: http://www.narvaro3d.com/ I also described my reasons for using the DK1 on my blog.
      What other projects do you know of?
