Facebook To Slurp Oculus Rift Users’ Every Move

The web is abuzz with the news that the Facebook-owned Oculus Rift has buried in its terms of service a clause allowing the social media giant access to the “physical movements and dimensions” of its users. This is likely to be used to direct advertising at those users and, most importantly for the advertisers, to measure the degree of interaction between user and advert. It’s a dream come true for the advertising business: instead of relying on eye-tracking or other engagement studies on limited subsets of users, they can take these metrics from their entire user base and hone their offerings on an even more targeted basis, maximising interaction and with it their revenue.

Hardly a surprise, you might say, given that Facebook is no stranger to criticism on privacy matters. It does, however, represent a hitherto unseen level of intrusion into a user’s personal space, one from which the nature of their activities could even be guessed from their movements, and this opens up fresh potential for nefarious uses of the data.

Fortunately for us there is a choice, even if our community doesn’t circumvent the data-slurping powers of these headsets: a rash of other virtual reality products is in the offing at the moment from Samsung, HTC, and Sony among others, and of course there is Google’s budget offering. Sadly though, privacy concerns are unlikely to register with the non-tech-savvy end-user, so competition alone will not stop the relentless desire of big business to get this close to you. Instead vigilance is the key: to spot such attempts when they make their way into the small print, and to shine a light on them even when the organisations in question would prefer that they passed unnoticed.

Oculus Rift development kit 2 image: By Ats Kurvet – Own work, CC BY-SA 4.0, via Wikimedia Commons.

DIY Virtual Reality Snowboard

If you’re looking for a quick and easy project to get into virtual reality, making your own VR snowboard controller is actually pretty easy to do!

First you’ll need some kind of VR headset. You could buy a fancy one, like the Oculus, or a Samsung Gear VR — or you could use something as simple as Google Cardboard — and you could even make your own. All it takes is a phone, an Arduino, a Bluetooth module, and an accelerometer-plus-gyroscope IMU.
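The VR side runs on the phone; the board itself only has to report how you’re leaning. As a rough idea of the host-side plumbing, here is a minimal Python sketch that reads accelerometer values from the Arduino’s Bluetooth serial link and turns them into lean angles. The port name, baud rate, and comma-separated message format are assumptions for illustration, not details from the original build.

```python
# Host-side test sketch. Assumptions: the Arduino streams "ax,ay,az"
# accelerometer readings (in g) over a Bluetooth serial port at 115200 baud.
import math
import serial  # pyserial

PORT = "/dev/rfcomm0"   # hypothetical Bluetooth serial device
BAUD = 115200

def lean_angles(ax, ay, az):
    """Estimate board roll/pitch from gravity alone (static tilt)."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

with serial.Serial(PORT, BAUD, timeout=1) as link:
    while True:
        line = link.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            ax, ay, az = (float(v) for v in line.split(","))
        except ValueError:
            continue  # skip malformed packets
        roll, pitch = lean_angles(ax, ay, az)
        print(f"roll {roll:6.1f} deg, pitch {pitch:6.1f} deg")
```

From there, the lean angles can be fed into whatever steering input the VR game on the phone expects.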

Continue reading “DIY Virtual Reality Snowboard”

Brain Waves Can Answer Spock’s (and VR’s) Toughest Question

In Star Trek IV: The Voyage Home, the usually unflappable Spock found himself stumped by one question: How do you feel? If researchers at the University of Memphis and IBM are correct, computers by Spock’s era might not have to ask. They’d know.

[Pouya Bashivan] and his colleagues used a relatively inexpensive EEG headset and machine learning techniques to determine if, with limited hardware, the computer could derive a subject’s mental state. This has several potential applications including adapting virtual reality avatars to match the user’s mood. A more practical application might be an alarm that alerts a drowsy driver.
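The general recipe for this kind of work is well established, even if the paper’s exact pipeline isn’t spelled out here: cut the EEG into epochs, compute band-power features per channel, and hand them to a classifier. Below is a hedged Python sketch of that idea using scikit-learn; the sample rate, channel count, band choices, labels, and data are placeholders rather than values from the study.

```python
# Minimal sketch of the general approach (not the paper's pipeline):
# band-power features from EEG channels fed to an off-the-shelf classifier.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 128  # Hz, a typical consumer-EEG sample rate (assumption)

def band_power(channel, lo, hi):
    """Average power of one channel's signal in the [lo, hi] Hz band."""
    freqs, psd = welch(channel, fs=FS, nperseg=FS * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def features(epochs):
    """epochs: (n_trials, n_channels, n_samples) -> theta/alpha/beta powers."""
    bands = [(4, 8), (8, 13), (13, 30)]
    return np.array([[band_power(ch, lo, hi) for ch in trial for lo, hi in bands]
                     for trial in epochs])

# Placeholder data: 40 trials, 14 channels, 10 seconds of signal each.
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(40, 14, FS * 10))
y = rng.integers(0, 2, size=40)  # e.g. 0 = alert, 1 = drowsy (made-up labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features(X_raw), y)
print("training accuracy:", clf.score(features(X_raw), y))
```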

Continue reading “Brain Waves Can Answer Spock’s (and VR’s) Toughest Question”

Greased Lightning Shows 360 Degrees

A lot of people got drones for Christmas this year (and many Hackaday readers already had one, anyway). A lot of these drones have cameras on them. The expensive ones beam back live video via RF. The cheaper ones just record to an SD card that you can download later.

If you are NASA, of course, this just isn’t good enough. At the Langley Research Center in Virginia, they’ve been building the Greased Lightning (also known as the GL-10) which is a 10-engine tilt-prop unmanned aerial vehicle. The carbon fiber drone is impressive, sure, but what wows is the recent video NASA released (see below).

Continue reading “Greased Lightning Shows 360 Degrees”

Augmented Reality Ultrasound

Think of Virtual Reality and it’s mostly fun and games that come to mind. But there are a lot of useful, real-world applications that will soon open up exciting possibilities in areas such as medicine. [Victor] from the Shackspace hacker space in Stuttgart built an Augmented Reality Ultrasound scanning application to demonstrate such possibilities.

But first off, we cannot get over how it’s possible to go dumpster diving and return with a functional ultrasound machine! That’s what member [Alf] turned up with one day. After some initial excitement at its novelty, it was relegated to a corner gathering dust. When [Victor] spotted it, he asked to borrow it for a project. Shackspace were happy to donate it to him and free up some space. Some time later, [Victor] showed off what he did with the ultrasound machine.

As soon as the ultrasound scanner registers with the VR app, possibly using the image taped to the scan sensor, the scanner data is projected virtually under the echo sensor. There isn’t much detail on how he did it, but it was done using the Vuforia SDK, which helps build applications for mobile devices and digital eyewear, in conjunction with the Unity 5 cross-platform game engine. Check out the video to see it in action.

Thanks to [hadez] for sending in this link.

Continue reading “Augmented Reality Ultrasound”

Amazing IMU-based Motion Capture Suit Turns You Into A Cartoon

[Alvaro Ferrán Cifuentes] has built the coolest motion capture suit that we’ve seen outside of Hollywood. It’s based on tying a bunch of inertial measurement units (IMUs) to his body, sending the data to a computer, and doing some reasonably serious math. It’s nothing short of amazing, and entirely doable on a DIY budget. Check out the video below the break, and be amazed.

Cellphones all use IMUs to provide such useful functions as tap detection and screen rotation information. This means that the sensors have become cheap. The ability to measure nine degrees of freedom on a tiny chip, for chicken scratch, pretty much made this development inevitable, as we suggested back in 2013 after seeing a one-armed proof-of-concept.

But [Alvaro] has gone above and beyond. Everything is open source and documented on his GitHub. An Arduino reads the sensor boards (over multiplexed I2C lines) that are strapped to his limbs, and sends the data over Bluetooth to his computer. There, a Python script takes over and passes the data off to Blender, which renders a 3D model to match, in real time.
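To give a feel for the Blender end of a pipeline like this, here is a hedged Python sketch that applies incoming orientation samples to an armature. It assumes it runs inside Blender with pyserial available, an armature object named "Rig", and a simple "bone_name,w,x,y,z" quaternion line format on the serial link; none of these details come from [Alvaro]’s actual code.

```python
# Rough sketch of the Blender side of such a pipeline (run from Blender's
# Python environment). All names and the wire format are illustrative.
import bpy
import serial  # pyserial must be importable from Blender's Python

link = serial.Serial("/dev/rfcomm0", 115200, timeout=0.01)
rig = bpy.data.objects["Rig"]  # hypothetical armature name

def apply_sample(line):
    """Parse one "bone_name,w,x,y,z" line and pose the matching bone."""
    bone_name, w, x, y, z = line.split(",")
    bone = rig.pose.bones.get(bone_name)
    if bone is None:
        return
    bone.rotation_mode = "QUATERNION"
    bone.rotation_quaternion = (float(w), float(x), float(y), float(z))

def poll(scene, *args):
    """Drain whatever samples have arrived since the last frame."""
    while link.in_waiting:
        raw = link.readline().decode(errors="ignore").strip()
        if raw:
            apply_sample(raw)

bpy.app.handlers.frame_change_pre.append(poll)
```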

All of this means that you could replicate this incredible project at home right now, on the cheap. We have no idea where this is heading, but it’s going to be cool.

Continue reading “Amazing IMU-based Motion Capture Suit Turns You Into A Cartoon”

Hacklet 54 – Virtual Reality Projects

Virtual Reality is finally coming of age. Hackers, Makers and Engineers have dreamed of creating immersive interfaces for years. From the first flight simulators to today’s cellphone powered head mounted displays, VR has always been an exciting field. Many of the advances today are being created by hackers who were inspired by systems like Virtuality from the early 1990s. Now 25 years on, we’re seeing amazing advances – not only in commercial systems, but in open source VR projects. This week’s Hacklet is all about the best VR projects on Hackaday.io!

We start with [j0nno] and D.I.Y Virtual Reality. [J0nno] has become interested in VR, and decided to build his own head mounted display. His goal is to create a setup with full head tracking and an open source software stack. He’s hoping to do this within a budget of just $200 AUD. [J0nno] started with the Ritech3d-V2 VR Goggles, which are a plastic implementation of Google’s Project Cardboard. For display he’s using a 5.6 inch 1280 x 800 TFT LCD. Tracking is optical, using IR LEDs and a PS3 Eye camera. [J0nno’s] background is in software, so he’s doing great setting up OpenVR and Perception. The hardware side is a bit new to him. This isn’t stopping [J0nno] though! In true hacker spirit, he’s learning all about resistors and driving LEDs as he works on D.I.Y Virtual Reality.
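Optical head tracking of this sort usually boils down to spotting the IR LEDs as bright blobs in the camera image and then estimating pose from their known layout on the headset. As a hedged illustration of the first half of that job, here is a short OpenCV sketch in Python; the camera index, threshold, and blob parameters are guesses rather than values from [j0nno]’s setup.

```python
# Find bright IR-LED blobs in frames from a PS3 Eye (exposed as a normal
# UVC/V4L2 camera). Parameters are illustrative, not from the project.
import cv2

cap = cv2.VideoCapture(0)  # assumed camera index

params = cv2.SimpleBlobDetector_Params()
params.filterByColor = True
params.blobColor = 255        # look for bright blobs (the IR LEDs)
params.filterByArea = True
params.minArea = 5
detector = cv2.SimpleBlobDetector_create(params)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, bright = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    keypoints = detector.detect(bright)
    # Each keypoint.pt is an (x, y) LED position; a pose solver would map
    # these 2D points back onto the known LED geometry on the headset.
    print([kp.pt for kp in keypoints])
    cv2.imshow("leds", bright)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
```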

Next up is [Josh Lindsay] with Digitabulum: The last motion-capture glove. Digitabulum is a motion capture glove designed to be able to emulate most other motion capture systems. It is also designed to be relatively low-cost. At $400 per hand, it is less expensive than most other offerings, though we’d still love to see something even cheaper. [Josh] is going with inertial sensors, and a lot of them. Specifically, he’s using no fewer than 17 LSM9DS1 Inertial Measurement Unit (IMU) sensors from STMicroelectronics. IMU sensors like this combine multiple rate gyros, accelerometers, and magnetometers into a single unit. Essentially every segment of every finger has its own sensor suite. As you might imagine, that is quite a bit of data to crunch. An Altera MAX II CPLD and an ST ARM processor help boil down the data to something which a VR engine can process. [Josh] has been working on this project for over a year now, and he’s making great progress. The prototype glove looks terrific!
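Crunching data from seventeen nine-degree-of-freedom sensors means running some kind of orientation filter per sensor before anything reaches the VR engine. Just to illustrate the flavour of that per-sensor work (this is not Digitabulum’s firmware), here is a toy complementary filter in Python with an arbitrary sample rate and blend coefficient.

```python
# Toy single-axis complementary filter: integrate the gyro, and gently pull
# the estimate toward the accelerometer's gravity-derived angle.
import math

ALPHA = 0.98      # trust placed in the integrated gyro vs. the accelerometer
DT = 1.0 / 200.0  # assumed 200 Hz sample rate

def fuse(angle, gyro_rate_dps, ax, az):
    """One filter step; gyro rate in deg/s, accel components in g."""
    accel_angle = math.degrees(math.atan2(ax, az))
    return ALPHA * (angle + gyro_rate_dps * DT) + (1 - ALPHA) * accel_angle

angle = 0.0
for gyro_rate, ax, az in [(10.0, 0.02, 0.99), (12.0, 0.03, 0.99), (9.0, 0.05, 0.98)]:
    angle = fuse(angle, gyro_rate, ax, az)
    print(f"estimated angle: {angle:.2f} deg")
```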

[Thomas] brings augmented reality to the table with Oculus Rift featured Crane control. What started as a hobby experiment became [Thomas’] major project at university. He’s connected an Oculus Rift to a toy crane. A stereo camera on the crane sends a video image to the operator. The camera is mounted on a pan/tilt mechanism driven by the Rift’s head tracking unit. Simple joystick controls allow [Thomas] to move the boom and lower the line. On-screen displays show the current status of the crane. The use of the Rift makes this an immersive demonstration. One could easily see how moving this system into the real world would make crane operation safer for operators.
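The head-tracking-to-pan/tilt link is the part most people will want to replicate. As a rough Python sketch of one way to wire it up, the snippet below maps head yaw and pitch angles (from whatever tracking source you have) onto hobby-servo pulse widths and ships them to a microcontroller over a serial link; the wire protocol here is invented purely for illustration and is not [Thomas’] design.

```python
# Map head angles to servo pulse widths and send "pan,tilt" over serial.
import serial  # pyserial

link = serial.Serial("/dev/ttyUSB0", 57600, timeout=0.1)  # hypothetical port

def angle_to_pulse(angle_deg, lo=-90.0, hi=90.0):
    """Map an angle in [lo, hi] degrees to a 1000-2000 us hobby-servo pulse."""
    angle_deg = max(lo, min(hi, angle_deg))
    return int(1000 + (angle_deg - lo) / (hi - lo) * 1000)

def point_camera(yaw_deg, pitch_deg):
    pan = angle_to_pulse(yaw_deg)
    tilt = angle_to_pulse(pitch_deg)
    link.write(f"{pan},{tilt}\n".encode())

point_camera(15.0, -10.0)  # look slightly right and down
```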

Finally we have [Arcadia Labs] with DIY Augmented Reality Device. This project, which is the [Arcadia Labs] entry in the 2015 Hackaday Prize, uses two 320 x 240 screens to create an augmented reality head mounted display. While the resolution can’t match that of the Oculus Rift or HTC Vive, [Arcadia Labs] is OK with that. They’re going for a lower-cost open source alternative for augmented reality. Tracking is achieved with an IMU, while a PS3 Eye camera provides the video. A Raspberry Pi controls the show. [Arcadia Labs] was able to get 50 frames per second on the displays just using the Pi’s SPI interface; however, the USB PS3 Eye camera limits things to around 10 FPS. This project is under heavy development right now, so follow along with us to see where [Arcadia Labs] ends up!

If you want VR goodness, check out our new virtual reality projects list! Did I miss your project? Don’t be shy; just drop me a message on Hackaday.io. If you’re on the left coast of the USA, check out the SOCAL Virtual Reality Conference and Expo. Hackaday is a sponsor. The event happens on July 12 at the University of California, Irvine.

That’s it for this week’s Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!