Ask Hackaday: Is Anyone Sad Phone VR Is Dead?

It’s official: smartphone-based VR is dead. The two big players in this space were Samsung Gear VR (powered by Oculus, which is owned by Facebook) and Google Daydream. Both have called it quits, with Google omitting support from their newer phones and Oculus confirming that the Gear VR has reached the end of its road. Things aren’t entirely shut down quite yet, but when they are, it will sure leave a lot of empty headsets lying around. These things exist in the millions, but did anyone really use phone-based VR? Are any of you sad to see it go?

Google Cardboard, lowering the cost and barrier to entry about as far as they could go.

In case you’re unfamiliar with phone-based VR, this is how it works: the user drops their smartphone into a headset, puts it on their head, and optionally uses a wireless controller to interact with things. The smartphone takes care of tracking motion and displaying 3D content, while the headset itself takes care of the optics and holds everything in front of the user’s eyeballs. On the low end was Google Cardboard; on the higher end were Daydream and Gear VR. It works, and it’s both cheap and portable, so what happened?

In short, phone-based VR had constraints that limited just how far it could go in delivering a VR experience, and those constraints kept it from being viable in the long run. Here are some of the reasons smartphone-based VR hit the end of the road: Continue reading “Ask Hackaday: Is Anyone Sad Phone VR Is Dead?”

Hacklet 54 – Virtual Reality Projects

Virtual Reality is finally coming of age. Hackers, makers, and engineers have dreamed of creating immersive interfaces for years. From the first flight simulators to today’s cellphone-powered head-mounted displays, VR has always been an exciting field. Many of the advances today are being created by hackers who were inspired by systems like Virtuality from the early 1990s. Now, 25 years on, we’re seeing amazing advances – not only in commercial systems, but in open source VR projects. This week’s Hacklet is all about the best VR projects on Hackaday.io!

We start with [j0nno] and D.I.Y Virtual Reality. [J0nno] has become interested in VR and decided to build his own head-mounted display. His goal is to create a setup with full head tracking and an open source software stack, and he’s hoping to do it within a budget of just $200 AUD. [J0nno] started with the Ritech3d-V2 VR goggles, a plastic implementation of Google’s Project Cardboard. For the display he’s using a 5.6 inch 1280 x 800 TFT LCD. Tracking is optical, using IR LEDs and a PS3 Eye camera. [J0nno’s] background is in software, so he’s making great progress setting up OpenVR and Perception. The hardware side is a bit new to him, but that isn’t stopping [J0nno]! In true hacker spirit, he’s learning all about resistors and driving LEDs as he works on D.I.Y Virtual Reality.
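The optical tracking side of a build like this usually boils down to spotting bright IR LED blobs in each PS3 Eye frame and turning their pixel positions into a pose. Here’s a minimal sketch of that first step using OpenCV – purely illustrative, not [j0nno]’s code; the camera index, threshold, and blob parameters are all assumptions.

```cpp
// Minimal IR-LED blob finding sketch (illustrative only, not [j0nno]'s code).
// Assumes the PS3 Eye shows up as an ordinary UVC camera fitted with an IR-pass filter.
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    cv::VideoCapture cam(0);                      // camera index is an assumption
    if (!cam.isOpened()) return 1;

    cv::SimpleBlobDetector::Params p;
    p.filterByColor = true;  p.blobColor = 255;   // look for bright blobs
    p.filterByArea  = true;  p.minArea = 5; p.maxArea = 500;
    auto detector = cv::SimpleBlobDetector::create(p);

    cv::Mat frame, gray, mask;
    while (cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, mask, 200, 255, cv::THRESH_BINARY);  // threshold is a guess

        std::vector<cv::KeyPoint> leds;
        detector->detect(mask, leds);
        for (const auto& kp : leds)
            std::cout << "LED at " << kp.pt.x << ", " << kp.pt.y << "\n";
    }
    return 0;
}
```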

Next up is [Josh Lindsay] with Digitabulum: The last motion-capture glove. Digitabulum is a motion capture glove designed to emulate most other motion capture systems while remaining relatively low-cost. At $400 per hand, it is less expensive than most other offerings, though we’d still love to see something even cheaper. [Josh] is going with inertial sensors, and a lot of them: no fewer than 17 LSM9DS1 Inertial Measurement Unit (IMU) sensors from ST Microelectronics. IMU sensors like this combine rate gyros, accelerometers, and magnetometers into a single unit. Essentially every segment of every finger has its own sensor suite. As you might imagine, that is quite a bit of data to crunch. An Altera MAX II CPLD and an ST ARM processor help boil the data down to something a VR engine can process. [Josh] has been working on this project for over a year now, and he’s making great progress. The prototype glove looks terrific!
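To get a feel for the data each joint produces, here’s a sketch of reading a single LSM9DS1 over I2C with SparkFun’s Arduino library. It only illustrates the sensor itself – Digitabulum actually funnels 17 of them through the CPLD to the ARM processor, not an Arduino, and the wiring details below are assumptions.

```cpp
// Reading one LSM9DS1 over I2C with SparkFun's Arduino library -- purely
// illustrative; Digitabulum talks to 17 of these through a CPLD instead.
#include <Wire.h>
#include <SparkFunLSM9DS1.h>

LSM9DS1 imu;

void setup() {
  Serial.begin(115200);
  Wire.begin();
  if (!imu.begin()) {              // default I2C addresses; real wiring will differ
    Serial.println("LSM9DS1 not found");
    while (true) {}
  }
}

void loop() {
  if (imu.gyroAvailable())  imu.readGyro();
  if (imu.accelAvailable()) imu.readAccel();
  if (imu.magAvailable())   imu.readMag();

  // Convert raw counts to physical units (deg/s, g, gauss) and print one sample.
  Serial.print(imu.calcGyro(imu.gx));  Serial.print(", ");
  Serial.print(imu.calcAccel(imu.ax)); Serial.print(", ");
  Serial.println(imu.calcMag(imu.mx));
  delay(10);
}
```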

[Thomas] brings augmented reality to the table with Oculus Rift featured Crane control. What started as a hobby experiment became [Thomas’] major project at university. He’s connected an Oculus Rift to a toy crane. A stereo camera on the crane sends a video feed to the operator, and the camera is mounted on a pan/tilt mechanism driven by the Rift’s head tracking unit. Simple joystick controls allow [Thomas] to move the boom and lower the line, while on-screen displays show the current status of the crane. The use of the Rift makes this an immersive demonstration, and one could easily see how moving this system into the real world would make operations safer for crane operators.
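The slaved pan/tilt mount is the heart of the trick: the camera simply follows wherever the operator looks. [Thomas] hasn’t published his control code in the post, but the idea can be sketched as mapping head yaw and pitch onto two hobby servos – the serial format, pins, and angle ranges below are all hypothetical.

```cpp
// Hypothetical pan/tilt follower: map head yaw/pitch (degrees, sent over serial
// as "yaw pitch\n") onto two hobby servos. Not [Thomas]'s actual crane code.
#include <Servo.h>

Servo pan, tilt;

void setup() {
  Serial.begin(115200);
  pan.attach(9);     // pin choices are assumptions
  tilt.attach(10);
}

void loop() {
  if (Serial.available()) {
    float yaw   = Serial.parseFloat();   // assumed range: -90..90 degrees
    float pitch = Serial.parseFloat();
    pan.write(constrain(90 + yaw,   0, 180));    // servos centred at 90 degrees
    tilt.write(constrain(90 + pitch, 0, 180));
  }
}
```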

Finally we have [Arcadia Labs] with DIY Augmented Reality Device. This project, [Arcadia Labs]’ entry in the 2015 Hackaday Prize, uses two 320 x 240 screens to create an augmented reality head-mounted display. While the resolution can’t match that of the Oculus Rift or HTC Vive, [Arcadia Labs] is okay with that – they’re going for a lower-cost, open source alternative for augmented reality. Tracking is achieved with an IMU, while a PS3 Eye camera provides the video and a Raspberry Pi controls the show. [Arcadia Labs] was able to get 50 frames per second on the displays using just the Pi’s SPI interface; however, the USB PS3 Eye camera limits things to around 10 FPS. This project is under heavy development right now, so follow along with us to see where [Arcadia Labs] ends up!

If you want VR goodness, check out our new virtual reality projects list! Did I miss your project? Don’t be shy, just drop me a message on Hackaday.io. If you’re on the left coast of the USA, check out the SOCAL Virtual Reality Conference and Expo. Hackaday is a sponsor. The event happens on July 12 at the University of California, Irvine.

That’s it for this week’s Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!

Castrol Virtual Drift: Hacking Code At 80MPH With A Driver In A VR Helmet

Driving a brand new 670 horsepower Roush Stage 3 Mustang while wearing virtual reality goggles. Sounds nuts, right? That’s exactly what Castrol Oil’s advertising agency came up with. They didn’t want to just fake a commercial, though – they wanted to do the real thing. Enter [Adam and Glenn], the engineers who were tasked with getting data from the car into a high-end gaming PC. The computer was running a custom simulation under the Unreal Engine. El Toro field provided a vast expanse of empty tarmac to drive the car without worry of hitting any real-world obstacles.

The Oculus Rift was never designed to be operated inside a moving vehicle, so it presented a unique challenge for [Adam and Glenn]. Every time the car turned or spun, the Oculus’ on-board Inertial Measurement Unit (IMU) would think driver [Matt Powers] was turning his head. At one point [Matt] was trying to drive while the game engine had him sitting in the passenger seat turned sideways. The solution was to install a 9-degree-of-freedom IMU in the car, then subtract the movements of that IMU from the one in the Rift.
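In orientation terms, that “subtraction” amounts to composing the Rift’s reading with the inverse of the car IMU’s reading. A minimal sketch of the idea (not the production code, and assuming both sensors report unit quaternions in a shared reference frame) might look like this:

```cpp
// Cancel vehicle rotation out of the headset reading:
//   head-relative orientation = inverse(car) * rift
// Illustrative sketch only; assumes unit quaternions in a common frame.
struct Quat { float w, x, y, z; };

Quat conjugate(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

Quat multiply(const Quat& a, const Quat& b) {
    return {
        a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
        a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
        a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
        a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w
    };
}

// carToWorld: orientation from the 9-DOF IMU bolted to the car
// riftToWorld: orientation reported by the Rift's own tracker
Quat headRelativeToCar(const Quat& carToWorld, const Quat& riftToWorld) {
    return multiply(conjugate(carToWorld), riftToWorld);  // for unit quaternions, conjugate == inverse
}
```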

GPS data came from a Real Time Kinematic (RTK) GPS unit. Unfortunately, the GPS had a 5 Hz update rate – not nearly fast enough for a car moving close to 100 MPH – so it was relegated to aligning the virtual and real worlds at the start of the simulation. The rest of the data came from the IMUs and the car’s own CAN bus. [Adam and Glenn] used an Arduino with a Microchip MCP2515 CAN bus interface to read values such as steering angle, throttle position, brake pressure, and wheel spin, and passed the data on to the Unreal Engine. The Arduino code is up on Github, though the team had to sanitize some of Ford’s proprietary CAN message data to avoid a lawsuit. It’s worth noting that [Adam and Glenn] didn’t have any support from Ford on this; they just sniffed the CAN network to determine each message ID.
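Since the published code omits Ford’s message IDs, here’s a generic sketch of the same technique: an Arduino polling an MCP2515 for frames and decoding one field. It uses the common MCP_CAN Arduino library, and the 0x201 ID, scaling, and pin choices are made up for illustration, not Ford’s values.

```cpp
// Generic MCP2515 CAN sniffing sketch in the spirit of the project's Arduino code.
// Uses the widely available MCP_CAN Arduino library; ID 0x201 and the decoding
// below are hypothetical placeholders, not real Ford message definitions.
#include <SPI.h>
#include <mcp_can.h>

MCP_CAN CAN0(10);                      // chip-select pin is an assumption

void setup() {
  Serial.begin(115200);
  // 500 kbit/s is typical for automotive HS-CAN; crystal frequency depends on the board
  while (CAN0.begin(MCP_ANY, CAN_500KBPS, MCP_16MHZ) != CAN_OK) delay(100);
  CAN0.setMode(MCP_NORMAL);
}

void loop() {
  long unsigned int id;
  unsigned char len;
  unsigned char buf[8];

  if (CAN0.checkReceive() == CAN_MSGAVAIL) {
    CAN0.readMsgBuf(&id, &len, buf);
    if (id == 0x201) {                           // hypothetical steering-angle frame
      int16_t raw = (buf[0] << 8) | buf[1];
      Serial.print("steering angle (raw): ");
      Serial.println(raw);
    }
  }
}
```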

The final video has the Hollywood treatment: “in game” footage has been replaced with pre-rendered sequences, which look so good we’d think the whole thing was fake – that is, if we didn’t know better.

Click past the break for the final commercial and some behind the scenes footage.

Continue reading “Castrol Virtual Drift: Hacking Code At 80MPH With A Driver In A VR Helmet”

Oculus Rift Goes From Virtual To Augmented Reality

[William Steptoe] is a post-doctoral research associate at University College London. This means he gets to play with some really cool hardware. His most recent project is an augmented reality update to the Oculus Rift. This is much more than hacking a pair of cameras onto the Rift, though: [William] has created an entire AR/VR user interface, complete with dockable web browser screens. He started with a stock Rift and a room decked out with a professional motion capture system. The Rift was made wireless with the addition of an ASUS Wavi and a laptop battery system, and [William] found that the wireless link added no appreciable latency. To move into the realm of augmented reality, [William] added a pair of Logitech C310 cameras. The C310’s field of view was a bit narrow for what he needed, so lenses from a Genius WideCam F100 were swapped in. The Logitech cameras were stripped down to the board level and mounted on 3D-printed brackets that clip onto the Rift’s display. ShapeLock was added to the mounts to allow the convergence of the cameras to be easily set.

Stereo camera calibration is a difficult and processor-intensive process. Add to that multiple tracking systems (both the 6DOF head tracking on the Rift and the video tracker built into the room) and you’ve got quite a computational problem. [William] found that he needed to use a Unity shader running on his PC’s graphics card to get the system to operate in real time. The results are quite stunning. We didn’t have a Rift handy to view the 3D portions of [William’s] video, but the sense of presence in the room still showed through. Videos like this make us excited for the future of augmented reality applications, with the Rift, the upcoming castAR, and other systems.
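For anyone curious what the offline half of that looks like, here’s a sketch of stereo calibration using OpenCV’s chessboard routines – the general technique, not [William]’s actual pipeline (his real-time undistortion runs as a Unity shader). Board size, square size, and image filenames are placeholders.

```cpp
// Offline stereo calibration sketch with OpenCV -- illustrative only.
// Recovers each camera's intrinsics plus the rotation/translation between them.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    const cv::Size boardSize(9, 6);        // inner chessboard corners (assumed)
    const float squareSize = 0.025f;       // 25 mm squares (assumed)

    // One reference chessboard model, reused for every captured pair.
    std::vector<cv::Point3f> board;
    for (int y = 0; y < boardSize.height; ++y)
        for (int x = 0; x < boardSize.width; ++x)
            board.emplace_back(x * squareSize, y * squareSize, 0.0f);

    std::vector<std::vector<cv::Point3f>> objectPoints;
    std::vector<std::vector<cv::Point2f>> leftPoints, rightPoints;
    cv::Size imageSize;

    for (int i = 0; i < 20; ++i) {         // 20 image pairs assumed
        cv::Mat left  = cv::imread("left_"  + std::to_string(i) + ".png", cv::IMREAD_GRAYSCALE);
        cv::Mat right = cv::imread("right_" + std::to_string(i) + ".png", cv::IMREAD_GRAYSCALE);
        if (left.empty() || right.empty()) continue;
        imageSize = left.size();

        std::vector<cv::Point2f> cl, cr;
        if (cv::findChessboardCorners(left, boardSize, cl) &&
            cv::findChessboardCorners(right, boardSize, cr)) {
            objectPoints.push_back(board);
            leftPoints.push_back(cl);
            rightPoints.push_back(cr);
        }
    }

    // Calibrate each camera individually, then solve for the stereo extrinsics.
    cv::Mat K1, D1, K2, D2, R, T, E, F;
    std::vector<cv::Mat> rv, tv;
    cv::calibrateCamera(objectPoints, leftPoints,  imageSize, K1, D1, rv, tv);
    cv::calibrateCamera(objectPoints, rightPoints, imageSize, K2, D2, rv, tv);
    double rms = cv::stereoCalibrate(objectPoints, leftPoints, rightPoints,
                                     K1, D1, K2, D2, imageSize, R, T, E, F,
                                     cv::CALIB_FIX_INTRINSIC);
    std::cout << "stereo reprojection error: " << rms << " px\n";
    return 0;
}
```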

Continue reading “Oculus Rift Goes From Virtual To Augmented Reality”