Amazing IMU-based Motion Capture Suit Turns You Into A Cartoon

[Alvaro Ferrán Cifuentes] has built the coolest motion capture suit that we’ve seen outside of Hollywood. It’s based on tying a bunch of inertial measurement units (IMUs) to his body, sending the data to a computer, and doing some reasonably serious math. It’s nothing short of amazing, and entirely doable on a DIY budget. Check out the video below the break, and be amazed.

Cellphones all use IMUs to provide such useful functions as tap detection and screen rotation information. This means that they’ve become cheap. The ability to measure nine degrees of freedom on a tiny chip, for chicken scratch, pretty much made this development inevitable, as we suggested back in 2013 after seeing a one-armed proof-of-concept.

But [Alvaro] has gone above and beyond. Everything is open source and documented on his GitHub. An Arduino reads the sensor boards (over multiplexed I2C lines) that are strapped to his limbs, and sends the data over Bluetooth to his computer. There, a Python script takes over and passes the data off to Blender, which renders a matching 3D model in real time.
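To give a flavor of the receiving end, here's a minimal sketch of how a Python script inside Blender could push incoming quaternions onto an armature. The serial port, baud rate, armature name, and the one-line-per-bone wire format are all assumptions for illustration, not [Alvaro]'s actual protocol:

```python
import bpy       # Blender's Python API (run this inside Blender)
import serial    # pyserial, reading the Bluetooth serial port

ser = serial.Serial("/dev/rfcomm0", 115200)   # assumed port and baud rate
rig = bpy.data.objects["Armature"]            # assumed armature name

def apply_sample(line):
    # Assumed format: "bone_name,w,x,y,z" per IMU sample
    name, w, x, y, z = line.strip().split(",")
    bone = rig.pose.bones[name]
    bone.rotation_mode = "QUATERNION"
    bone.rotation_quaternion = (float(w), float(x), float(y), float(z))

# A real script would hook a Blender timer instead of blocking the UI.
while True:
    apply_sample(ser.readline().decode("ascii"))
```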

All of this means that you could replicate this incredible project at home right now, on the cheap. We have no idea where this is heading, but it’s going to be cool.

Continue reading “Amazing IMU-based Motion Capture Suit Turns You Into A Cartoon”

Hacklet 54 – Virtual Reality Projects

Virtual Reality is finally coming of age. Hackers, Makers and Engineers have dreamed of creating immersive interfaces for years. From the first flight simulators to today’s cellphone powered head mounted displays, VR has always been an exciting field. Many of the advances today are being created by hackers who were inspired by systems like Virtuality from the early 1990s. Now 25 years on, we’re seeing amazing advances – not only in commercial systems, but in open source VR projects. This week’s Hacklet is all about the best VR projects on Hackaday.io!

We start with [j0nno] and D.I.Y Virtual Reality. [J0nno] has become interested in VR, and decided to build his own head mounted display. His goal is to create a setup with full head tracking and an open source software stack, all within a budget of just $200 AUD. [J0nno] started with the Ritech3d-V2 VR Goggles, which are a plastic implementation of Google’s Project Cardboard. For display he’s using a 5.6 inch 1280 x 800 TFT LCD. Tracking is optical, using IR LEDs and a PS3 Eye camera. [J0nno’s] background is in software, so he’s doing great setting up OpenVR and Perception. The hardware side is a bit new to him. This isn’t stopping [J0nno] though! In true hacker spirit, he’s learning all about resistors and driving LEDs as he works on D.I.Y Virtual Reality.

Next up is [Josh Lindsay] with Digitabulum: The last motion-capture glove. Digitabulum is a motion capture glove designed to emulate most other motion capture systems while staying relatively low-cost. At $400 per hand, it is less expensive than most other offerings, though we’d still love to see something even cheaper. [Josh] is going with inertial sensors, and a lot of them. Specifically, he’s using no fewer than 17 LSM9DS1 inertial measurement unit (IMU) sensors from STMicroelectronics. IMU sensors like this combine multiple rate gyros, accelerometers, and magnetometers into a single unit. Essentially every segment of every finger has its own sensor suite. As you might imagine, that is quite a bit of data to crunch. An Altera Max II CPLD and an ST ARM processor help boil the data down to something a VR engine can process. [Josh] has been working on this project for over a year now, and he’s making great progress. The prototype glove looks terrific!
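To put "quite a bit of data" in perspective, a quick back-of-the-envelope calculation works out the raw sensor throughput. The 100 Hz output data rate below is our assumption for illustration; the glove's actual rate may differ:

```python
sensors = 17        # one LSM9DS1 per finger segment, per [Josh]'s design
axes = 9            # 3 gyro + 3 accelerometer + 3 magnetometer
bytes_per_axis = 2  # 16-bit raw readings
rate_hz = 100       # assumed output data rate; the real ODR may differ

bps = sensors * axes * bytes_per_axis * rate_hz
print(f"{bps} bytes/s, about {bps * 8 / 1000:.0f} kbit/s before fusion")
# -> 30600 bytes/s, about 245 kbit/s before fusion
```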

[Thomas] brings augmented reality to the table with Oculus Rift featured Crane control. What started as a hobby experiment became [Thomas’] major project at university. He’s connected an Oculus Rift to a toy crane. A stereo camera on the crane sends a video image to the operator. The camera is mounted on a pan/tilt mechanism driven by the Rift’s head tracking unit. Simple joystick controls allow [Thomas] to move the boom and lower the line. On-screen displays show the current status of the crane. The use of the Rift makes this an immersive demonstration. One could easily see how moving this system into the real world would make operations safer for crane operators.
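The head-tracking-to-gimbal link is conceptually simple: clamp the headset's yaw and pitch to the mechanism's range, then map the result onto servo pulse widths. This is a generic sketch of that idea under assumed servo limits, not [Thomas'] actual control code:

```python
import math

def head_to_servo_deg(angle_rad, lo=-90.0, hi=90.0):
    """Clamp a head angle (radians) to the servo's mechanical range."""
    return max(lo, min(hi, math.degrees(angle_rad)))

def servo_pulse_us(deg):
    """Map -90..+90 degrees onto a standard 1000-2000 us hobby-servo pulse."""
    return 1500 + (deg / 90.0) * 500

yaw, pitch = 0.35, -0.10   # example head pose from the tracker, radians
print(servo_pulse_us(head_to_servo_deg(yaw)))    # pan servo
print(servo_pulse_us(head_to_servo_deg(pitch)))  # tilt servo
```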

Finally we have [Arcadia Labs] with DIY Augmented Reality Device. This project, which is the [Arcadia Labs] entry in the 2015 Hackaday Prize, uses two 320 x 240 screens to create an augmented reality head mounted display. While the resolution can’t match that of the Oculus Rift or HTC Vive, [Arcadia Labs] is OK with that. They’re going for a lower cost open source alternative for augmented reality. Tracking is achieved with an IMU, while a PS3 Eye camera provides the video. A Raspberry Pi controls the show. [Arcadia Labs] was able to get 50 frames per second on the displays just using the Pi’s SPI interface, though the USB PS3 Eye camera limits things to around 10 FPS. This project is under heavy development right now, so follow along with us to see where [Arcadia Labs] ends up!
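Hitting 50 FPS over SPI is more impressive than it sounds once you tally the pixel bandwidth. Here's a rough estimate, assuming 16-bit color (a common format for small TFT panels, though the project's actual color depth isn't stated):

```python
# Raw pixel bandwidth two 320x240 panels demand at 50 FPS, 16-bit color.
w, h, bytes_px, fps, panels = 320, 240, 2, 50, 2
bits_per_s = w * h * bytes_px * fps * panels * 8
print(f"{bits_per_s / 1e6:.0f} Mbit/s of pixel data")  # -> 123 Mbit/s
```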

If you want VR goodness, check out our new virtual reality projects list! Did I miss your project? Don’t be shy, just drop me a message on Hackaday.io. If you’re on the left coast of the USA, check out SOCAL Virtual Reality Conference and Expo. Hackaday is a sponsor. The event happens on July 12 at the University of California Irvine.

That’s it for this week’s Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!

An Introduction To Valve’s Tracking Hardware

[Alan Yates] brought a demo of Valve’s new VR tech, the basis of the HTC Vive system, to Maker Faire this year. It’s exceptionally clever, and compared to existing VR headsets it’s probably one of the best head-tracking solutions out there.

With VR headsets, the problem isn’t putting two displays in front of the user’s eyes. The problem is determining where the user is looking quickly and accurately. IMUs and image processing techniques can be used with varying degrees of success, but to do it right, it needs to be really fast and really cheap.

[Alan] and [Valve]’s ‘Lighthouse’ tracking unit does this by placing a dozen or so IR photodiodes on the headset itself. On the tracking base station, IR lasers scan in the X and Y axes. By sweeping these IR lasers across the VR headset, the angle of the headset to the base station can be computed in just a few cycles of a microcontroller. For a bunch of one-cent photodiodes, absolute angles and the orientation to a base station can be determined very easily, something that has some pretty incredible applications for everything from VR to robotics.
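The core trick fits in a few lines: if a rotor sweeps the room at a known rate, the delay between the sync flash and a photodiode lighting up encodes that diode's angle from the base station. The 60 Hz rotor period below is our assumption for illustration:

```python
SWEEP_PERIOD = 1.0 / 60.0   # assumed seconds per rotor revolution

def sweep_angle_deg(t_hit, t_sync):
    """One-axis angle of a photodiode, from sync flash to laser hit."""
    return 360.0 * (t_hit - t_sync) / SWEEP_PERIOD

# A hit 2.5 ms after the sync flash puts the diode 54 degrees into the sweep.
print(sweep_angle_deg(0.0025, 0.0))   # -> 54.0
```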

Remember all of the position tracking hacks that came out as a result of the Nintendo Wii using IR beacons and a tracking camera? This seems like an evolutionary leap forward in the same realm, and we can’t wait to see people hacking on this tech!

VCF East X: Virtual Reality With PETSCII

What would happen if Oculus-quality virtual reality was created in the 80s on the Commodore PET? [Michael Hill] knows, because he created a stereoscopic video headset using a PET.

This build is an extension of [Michael]’s exhibit last year at VCF East where he displayed a video feed with PETSCII. Yes, that means displaying video with characters, not pixels.

This year, he’s doubling the number of screens, and sending everything to two iPhones in a Google Cardboard-like VR headset. Apart from the optics, the setup is pretty simple: cameras capture image data, it’s sent over to a PET, and a stream of characters is sent back.
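Displaying video with characters boils down to quantizing image blocks by brightness. Here's a minimal sketch of that conversion; the character ramp is illustrative, not [Michael]'s actual PETSCII table:

```python
RAMP = " .:-=+*#@"   # dark to bright, one character per brightness bin

def frame_to_chars(gray, block=8):
    """gray: 2D list of 0-255 pixels; returns rows of characters."""
    rows = []
    for y in range(0, len(gray) - block + 1, block):
        row = ""
        for x in range(0, len(gray[0]) - block + 1, block):
            cells = [gray[y + j][x + i] for j in range(block)
                                        for i in range(block)]
            level = sum(cells) // len(cells)       # mean block brightness
            row += RAMP[level * len(RAMP) // 256]  # pick the nearest char
        rows.append(row)
    return rows
```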

It’s impossible to film, and using it is interesting, to say the least. Video below.

Continue reading “VCF East X: Virtual Reality With PETSCII”

Putting Oculus Rift On A Robot

Many of the early applications for the much-anticipated Oculus Rift VR rig have been in gaming. But it’s interesting to see some more useful applications besides gaming before its commercial release sometime this year. [JoLau] at the Institute i4Ds of FHNW School of Engineering wanted to go a step beyond rendering virtual worlds, so he built the Intuitive Rift Explorer, a.k.a. IRE. The IRE is a moving-reality system consisting of a gimbaled stereo-vision camera rig transmitting video to the Rift, with matching head movements received from the Oculus Rift. The vision platform is mounted on a remote-controlled robot which is completely wireless.

One of the big challenges with using VR headsets is lag, which can cause motion sickness. He had to tackle the problem of latency – reducing the time from moving the head to getting a matching image on the headset – which the Oculus Rift team specified should be less than 20 ms. The other important requirement is a high frame rate, in this case 60 frames per second. [JoLau] succeeded in overcoming most of the problems, although in conclusion he does mention a couple of enhancements that he would like to add, given more time.
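For a feel of how tight 20 ms is, here's a toy motion-to-photon budget for a camera-to-headset pipeline like this one. Every per-stage number below is an assumption for illustration, not [JoLau]'s measured figures:

```python
# Hypothetical latency budget against the 20 ms motion-to-photon target.
stages_ms = {
    "camera exposure + readout": 8.0,
    "video transmit":            3.0,
    "decode + render":           4.0,
    "display scan-out":          4.0,
}
total = sum(stages_ms.values())
print(f"total {total:.1f} ms vs 20 ms budget")  # -> total 19.0 ms vs 20 ms budget
```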

[JoLau] provides a detailed description of the various sub-systems that make up the IRE – the stereo camera, audio and video transmission, media processing, the servo-driven gimbal for the stereo camera, and the control system code. His reasoning behind the hardware choices for several components makes for a good read. Watch a video of the IRE in action below.

Continue reading “Putting Oculus Rift On A Robot”

Build Your Own Gear VR

With Samsung’s new Gear VR announced, developers and VR enthusiasts are awaiting the release of the smartphone-connected VR headset. A few people couldn’t wait to get their hands on the platform, so they created OpenGear, a Gear VR-compatible headset.

The OpenGear starts off with a Samsung Galaxy Note 4, which is the target platform for the Gear VR headset. A cardboard enclosure, similar to the Google Cardboard headset, holds the lenses and straps the phone to your face.

The only missing part is the motion tracking electronics. Fortunately, ST’s STM32F3 Discovery development board has everything needed: a microcontroller with USB device support, an L3GD20 3-axis gyro, and an LSM303DLHC accelerometer/magnetometer. Together, these components provide a USB inertial measurement unit for tracking your head.
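One common way a gyro and accelerometer get fused into a stable head orientation is a complementary filter: integrate the fast-but-drifting gyro rate, then gently pull the result toward the accelerometer's gravity reference. This is a generic sketch of the technique for one axis, not the OpenGear firmware's actual math:

```python
import math

def complementary_pitch(pitch, gyro_rate, ax, ay, az, dt, alpha=0.98):
    """pitch in radians, gyro_rate in rad/s, accel in g, dt in seconds."""
    gyro_pitch = pitch + gyro_rate * dt                # fast, but drifts
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))  # slow, but absolute
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```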

With the Discovery board strapped to the cardboard headset, an open-source firmware is flashed. This emulates the messages sent by a legitimate Oculus Rift motion tracker. The Galaxy Note 4 sees the device as a VR headset, and lets you run VR apps.

If you’re interested, the OpenGear team is offering a development kit. This is a great way for developers to get a head start on their apps before the Gear VR is actually released. The main downside is how you’ll look with this thing affixed to your face. There’s a head-to-head against the real Gear VR after the break.

[via Road To VR]

Continue reading “Build Your Own Gear VR”

CastAR Hands-On And Off-Record Look At Next Version

At long last I had the opportunity to try out the CastAR, a glasses-based Augmented Reality system developed by Technical Illusions. The hardware has been in the works now for a couple of years, but every time we have come across a demo we were thwarted by the long lines that accompany them. This time I was really lucky. [Jeri] gave us a private demo in a suite at the Palazzo during CES 2015. Reflecting on the experience, CastAR is exactly the type of Virtual Reality hardware I’ve been longing for.

Continue reading “CastAR Hands-On And Off-Record Look At Next Version”