Testing VR Limits With A Raspberry Pi


Virtual Reality, by its very function, pushes the boundaries of what we perceive as existence, tricking the mind into believing that a computer-generated environment is a real place. So, in the spirit of seeing what is possible in VR, a developer named [Jacques] hooked a Raspberry Pi up to an Oculus Rift. He used OpenGL ES, the graphics rendering API found on most mobile platforms these days, to render a floating, rotating cube.

All of his tests were done on a Release build using the official vertex and fragment shaders, with no attempt at optimization; not that there would be much to do anyway. Rendering the scene twice (once per eye) took 16 milliseconds per frame. Adding a texture pushed that to 27 ms per frame, and subsequent tests came in at 36 ms and then 45 ms.
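
For a sense of what “rendering the scene twice” means in practice, here is a minimal sketch of a side-by-side stereo loop in OpenGL ES 2.0 with a crude CPU-side frame timer. This is not [Jacques]’s actual code (that’s on his GitHub); set_eye_uniforms() and draw_cube() are hypothetical stand-ins for the real shader setup and scene.

    #include <GLES2/gl2.h>
    #include <stdio.h>
    #include <time.h>

    /* Hypothetical helpers standing in for the real scene code:
       set_eye_uniforms() would upload the per-eye view/projection,
       draw_cube() would issue the cube's draw calls. */
    void set_eye_uniforms(int eye);
    void draw_cube(void);

    /* One frame: the same scene drawn twice, once per eye,
       each into its own half of the framebuffer. */
    void render_stereo_frame(int width, int height)
    {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);

        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        for (int eye = 0; eye < 2; eye++) {
            /* Left eye draws into the left half, right eye into the right. */
            glViewport(eye * width / 2, 0, width / 2, height);
            set_eye_uniforms(eye);
            draw_cube();
        }

        /* Wait for the GPU to finish so the timer covers the whole
           frame, then report ms/frame and the equivalent fps. */
        glFinish();
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double ms = (t1.tv_sec - t0.tv_sec) * 1e3
                  + (t1.tv_nsec - t0.tv_nsec) / 1e6;
        printf("%.1f ms/frame (%.1f fps)\n", ms, 1000.0 / ms);
    }

On a 60 Hz display the whole loop has to finish in about 16 ms (1000/60), which is exactly where the untextured case already sits.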

The code used can be found on [Jacques]’s GitHub account. A simple improvement would be to swap in a Banana Pi for better processing speed, but don’t expect any spectacular results from this type of setup. Really, the project only proves that it’s possible to shrink a VR experience down into something that could become portable. In the same vein, the Pi + Oculus combination can produce an uncomfortable lagging effect if things are not lined up properly.

Once the energy and computing-power issues are addressed, though, VR devices could evolve into a more fashionable product like Google Glass, where a simple flip of a switch toggles the view between VR, AR, and something more mixed. From there, a motion-sensing camera like this Kinect-mapping space experiment could let people all over the world jump into the perspectives of other reality-pushing explorers. That’s all far down the line, but this project lays the foundation for what the future might hold.

To see [Jacques]’s full setup, view the video after the break.

15 thoughts on “Testing VR Limits With A Raspberry Pi”

  1. While trying a normal Oculus Rift demo on the Pi is likely to be VR-sickness-inducing, I’ll bet running any of the many VR demos from the first VR hype boom (about 1987-1993) would go wonderfully (even in emulation, probably). If all else fails, you could probably play Doom 2 in glorious stereoscopy with a couple of code changes deep in the guts.

    1. The Raspberry Pi is pretty much screwed in the real-time response department: all of its IO happens through a single USB port and the crappy USB hub built into its Ethernet chip. It’s the modern version of the Commodore 64 floppy drive, but worse.

      Even if you can get the CPU to sing, it’s going to suffer terrible input lag.

        1. I can’t see joystick manufacturers getting very excited about making Raspberry Pi versions of their hardware, so any GPIO solution is going to be some sort of MacGyver nightmare. Instead of pulling out the soldering iron, I’d swap out the Pi for another board with better IO.

  2. Nice attempt! However, there are quad-core and very powerful embedded Linux boards, such as the Chinese Radxa or the Korean ODROIDs (the XU3 is crazy), where this could be implemented much better.

    1. “Normally ODROID-XU3 consumes about 1~2Amp in most cases. But it can go up to 4A when the computing load is very high with a few USB devices.”

      “Typically, the model B uses between 700-1000mA depending on what peripherals are connected, and the model A can use as little as 500mA with no peripherals attached. The maximum power the Raspberry Pi can use is 1 Amp”

      These systems have very different power requirements. The ODROID is certainly much faster, but you are lugging twice the batteries for the same play time (rough math below).

      The whole point here is to eliminate any tether cable. If you need a cable then you might as well make it a video cable and put the computer hardware on the floor instead of carrying it.
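
      Back of the envelope, with pack capacity pulled out of the air for illustration: a 5 V, 10,000 mAh battery gives you roughly 10 hours on a Pi drawing ~1 A, but only about 5 hours on an ODROID averaging ~2 A under load. Hence twice the batteries for the same play time.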

      1. F, you’re my hero. Goals of efficiency and elegance? Thinking carefully and using available hardware and power resources as best as possible? Blasphemy – I can and will throw more hardware at the problem! I CAN CARRY CAR BATTERIES MAN… but please don’t make me think about optimization.

        I acknowledge the Pi’s not slated to play Crysis (maybe it can?), but I also have a hard time imagining it can’t keep up with the potential requirements of the OR if there were a will. I reserve the right to be mistaken, though.

  3. “All of his tests were done on a Release build using the official vertex and fragment shaders, with no attempt at optimization; not that there would be much to do anyway. Rendering the scene twice (once per eye) took 16 milliseconds per frame. Adding a texture pushed that to 27 ms per frame, and subsequent tests came in at 36 ms and then 45 ms.”

    So the Raspberry Pi manages only about 22 frames per second (1000/45) with two textured cubes and a pretty simple full-screen VS/FS technique. Impressive! Even more impressive when you consider that the Oculus devs suggest at least 100 fps in order to minimize motion sickness.
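
    For reference, the full progression works out to 1000/16 ≈ 62 fps, 1000/27 ≈ 37 fps, 1000/36 ≈ 28 fps, and 1000/45 ≈ 22 fps, so even the bare untextured case falls well short of that target.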

      1. While I agree something seems wrong with the implementation as stated, I’d point out that the new dev kit with the Note 3 display shows how to make this work portably. Swap in a full Note 3.

    1. You’re better off thanking the software consultants who wrote their code and burdened it with non-disclosure agreements.

      NVIDIA has the same problem: they paid consultants to write code for them, and now they cannot release it as open source even if they want to, because the consultants protected their cushy jobs with contractual agreements to NEVER release the source to the public.

      If this torques your screws, then you should become a 3D graphics consultant and sell your services without these nasty non-disclosures.
