Testing VR Limits with a Raspberry Pi

Virtual reality by its very function pushes the boundaries of what we perceive as existence, tricking the mind into believing that a computer-generated environment is an actual place. So in the spirit of seeing what is possible in VR, a developer named [Jacques] hooked a Raspberry Pi up to an Oculus Rift. He used OpenGL ES, the same graphics rendering API found on just about every mobile platform these days, to render a floating, rotating cube.
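
For a sense of scale, a GLES 2.0 rotating-cube demo boils down to a pair of tiny shaders plus a per-frame matrix update. Below is a minimal, hypothetical sketch of that kind of shader pair; these are not [Jacques]'s actual shaders, and the names a_position, u_mvp, and u_color are made up for illustration:

    #include <GLES2/gl2.h>

    /* Vertex shader: transform each cube vertex by a
       model-view-projection matrix that is updated every frame
       to produce the rotation. */
    static const char *vertex_src =
        "attribute vec3 a_position;\n"
        "uniform mat4 u_mvp;\n"
        "void main() {\n"
        "    gl_Position = u_mvp * vec4(a_position, 1.0);\n"
        "}\n";

    /* Fragment shader: flat color per face. */
    static const char *fragment_src =
        "precision mediump float;\n"
        "uniform vec4 u_color;\n"
        "void main() {\n"
        "    gl_FragColor = u_color;\n"
        "}\n";

    /* Compile one shader stage and return its GL handle. */
    static GLuint compile_shader(GLenum type, const char *src)
    {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &src, NULL);
        glCompileShader(shader);
        return shader;
    }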

All his tests were done on a Release build that used the official vertex and fragment shaders, with no attempt at optimization; not that there would have been much to optimize anyway. Rendering the scene twice (once per eye) took 16 milliseconds per frame. With a texture applied that climbed to 27 ms per frame, then to 36 ms, and finally to 45 ms.
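
For reference, 16 ms per frame works out to about 62 fps, and 45 ms to about 22 fps. Numbers like these are straightforward to collect yourself: bracket the render loop with a monotonic clock and convert the difference to milliseconds. Here is a minimal, hypothetical harness, not taken from [Jacques]'s repository; the render_eye calls in the comment are placeholders for the real per-eye drawing:

    #include <stdio.h>
    #include <time.h>

    /* Elapsed milliseconds between two CLOCK_MONOTONIC samples. */
    static double ms_between(struct timespec a, struct timespec b)
    {
        return (b.tv_sec - a.tv_sec) * 1000.0 +
               (b.tv_nsec - a.tv_nsec) / 1.0e6;
    }

    int main(void)
    {
        struct timespec t0, t1;
        for (;;) {
            clock_gettime(CLOCK_MONOTONIC, &t0);
            /* render_eye(LEFT); render_eye(RIGHT);  (placeholders)
               eglSwapBuffers(display, surface); */
            clock_gettime(CLOCK_MONOTONIC, &t1);
            printf("frame: %.2f ms (%.0f fps)\n",
                   ms_between(t0, t1), 1000.0 / ms_between(t0, t1));
        }
    }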

The code used can be found on [Jacques]'s Github account. A simple improvement would be swapping in a Banana Pi for a bit more processing power, but don't expect spectacular results from this type of setup. Really, the project only proves that it's possible to shrink a VR experience down into something portable. In the same vein, the Pi-plus-Oculus combination can produce an uncomfortable lagging effect if things are not lined up properly.

But once the energy and computing-power issues are addressed, VR devices could evolve into a more fashionable product like Google Glass, where a simple flip of a switch would toggle the view between VR, AR, and something more mixed. And then a motion-sensing input camera like this Kinect-mapping space experiment could allow people all over the world to jump into the perspectives of other reality-pushing explorers. That's all far down the line, but this project lays the foundation for what the future might hold.

To see [Jacques]'s full setup, view the video after the break.

Comments

  1. John Ohno says:

    While trying a normal Oculus Rift demo on the Pi is likely to be VR-sickness-inducing, I'll bet running any of the many VR demos from the first VR hype boom (about 1987-1993) would go wonderfully (even in emulation, probably). If all else fails, you can probably play Doom 2 in glorious stereoscopy with a couple of code changes deep in the guts.

    • F says:

      The Raspberry Pi is pretty much screwed in the real-time response department: all of its I/O happens through a single USB port and the crappy USB hub built into its Ethernet chip. It's the modern version of the Commodore 64 floppy drive, but worse.

      Even if you can get the CPU to sing, it’s going to suffer terrible input lag.

  2. Technoshaman says:

    Nice attempt! However, there are quad-core and very powerful embedded Linux boards, such as the Chinese Radxa or the Korean ODROIDs (the XU3 is crazy), where this could be implemented much better.

    • F says:

      “Normally ODROID-XU3 consumes about 1~2Amp in most cases. But it can go up to 4A when the computing load is very high with a few USB devices.”

      “Typically, the model B uses between 700-1000mA depending on what peripherals are connected, and the model A can use as little as 500mA with no peripherals attached. The maximum power the Raspberry Pi can use is 1 Amp”

      These systems have very different power requirements. The ODROID is certainly much faster, but you're lugging twice the batteries for the same play time.

      The whole point here is to eliminate any tether cable. If you need a cable, then you might as well make it a video cable and put the computer hardware on the floor instead of carrying it.

  3. Yarr says:

    “All his tests were done on a Release build that used the official vertex and fragment shaders, with no attempt at optimization; not that there would have been much to optimize anyway. Rendering the scene twice (once per eye) took 16 milliseconds per frame. With a texture applied that climbed to 27 ms per frame, then to 36 ms, and finally to 45 ms.”

    So the Raspberry Pi can’t even do two textured cubes at 22 frames per second (1000/45) when applying a pretty simple full-screen VS/FS technique. Impressive! Even more impressive when considering that the Oculus devs suggest at least 100fps in order to minimize motion sickness.

    • tekkieneet says:

      “By stepping outside the rules of the game you can redefine the game”
      Rename the project to “RPi Motion Sickness Inducer” and it’s going to be a huge success.

    • Whatnot says:

      Clearly the guy is doing something wrong, because it seems highly unlikely that a Raspberry Pi can’t render a cube significantly faster than this.

      • SOI Sentinel says:

        While I agree something seems wrong with the implementation as described, I’d also think that the new dev kit with the Note 3 display shows how to make this work portably. Just swap in a full Note 3.

  4. Thank Broadcom for us not being able to really squeeze that dual-core graphics engine.

    • F says:

      You’re better off thanking the software consultants who wrote their code and burdened it with non-disclosure agreements.

      Nvidia has the same problem: they paid consultants to write code for them, and now they cannot release it as open source even if they want to, because the consultants protected their cushy jobs with contractual agreements to NEVER release the source to the public.

      If this torques your screws, then you should become a 3-D graphics consultant and sell your services without these nasty non-disclosures.

    • gregkennedy says:

      I thought they released the source to the graphics driver…
