Amazing IMU-based Motion Capture Suit Turns You Into A Cartoon

[Alvaro Ferrán Cifuentes] has built the coolest motion capture suit that we’ve seen outside of Hollywood. It’s based on tying a bunch of inertial measurement units (IMUs) to his body, sending the data to a computer, and doing some reasonably serious math. It’s nothing short of amazing, and entirely doable on a DIY budget. Check out the video below the break, and be amazed.

Cellphones all use IMUs to provide useful functions like tap detection and screen rotation, which means they've become cheap. The ability to measure nine degrees of freedom on a tiny chip, for chicken scratch, pretty much made this development inevitable, as we suggested back in 2013 after seeing a one-armed proof-of-concept.

But [Alvaro] has gone above and beyond. Everything is open source and documented on his GitHub. An Arduino reads the sensor boards (over multiplexed I2C lines) that are strapped to his limbs and sends the data over Bluetooth to his computer. There, a Python script takes over and passes the data off to Blender, which renders a matching 3D model in real time.
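
If you're wondering what the computer side of that chain can look like, here's a minimal sketch. To be clear, this is not [Alvaro]'s actual script: we're assuming a simple "bone_name,w,x,y,z" ASCII line per reading over the serial link, and the port name and bone names are placeholders.

```python
# Minimal sketch of the computer-side glue. Assumes one ASCII line per
# reading in the form "bone_name,w,x,y,z"; the serial port, baud rate,
# and bone names are hypothetical -- adapt them to your own rig.
import serial                     # pyserial
import bpy                        # Blender's built-in Python API
from mathutils import Quaternion  # Blender's math types

port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
arm = bpy.data.objects["Armature"]

def apply_reading(line):
    """Parse one CSV reading and rotate the matching pose bone."""
    name, w, x, y, z = line.strip().split(",")
    bone = arm.pose.bones[name]
    bone.rotation_mode = "QUATERNION"
    bone.rotation_quaternion = Quaternion(
        (float(w), float(x), float(y), float(z)))

while True:
    raw = port.readline().decode("ascii", errors="ignore")
    if raw.strip():
        apply_reading(raw)
```

In practice you'd drive this from a modal operator or a timer so Blender's UI stays responsive, but the flow is the same: parse a reading, rotate a bone, let the viewport catch up.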

All of this means that you could replicate this incredible project at home right now, on the cheap. We have no idea where this is heading, but it’s going to be cool.

40 thoughts on “Amazing IMU-based Motion Capture Suit Turns You Into A Cartoon”

    1. Well, as with everything, it depends on quantity, but I’d estimate around 10€ per node, plus another euro for the multiplexer chip and your standard Arduino(-compatible?) microcontroller. So maybe about a hundred euros if you build it yourself (since this was developed at the same time as other projects in the department, it’s kind of difficult to keep track of costs).

      1. Thanks :)

        Unity has a free edition. I’ve used it for two years, and you’ll never miss the few, very few “pro” features.

        But being open source, Blender is also a good solution :)

  1. There are a lot of IMU-based motion capture systems out on the market – Xsens being one of the most famous. Perception Neuron uses “cheap” mobile sensors as well (and ridiculously expensive connectors that probably cost more than 4 times the price of the sensors). On Kickstarter there are some projects too (some failed, some more or less alive).

    So while this isn’t news, it’s nice that it is getting documented. Although, from my own experience, the HARDWARE is the easy side. What makes an el-cheapo IMU-based mocap system “work” is the software. You have to deal with A LOT OF NOISE ™, drift, constant semi-automated recalibration, EM interference… not to speak of some slightly complex IK helper systems to get the limb hierarchies to work somewhat consistently (a minimal flavour of that is sketched below). Ground-contact handling is another non-trivial task.
    Also, if you want somewhat accurate measurements, considerable oversampling is required, which ups the demands for data throughput on sensors, routers, wiring…

    Again: Good that it’s done. New? Not at all.
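
    For anyone wondering what an IK helper pass even looks like, here’s the simplest possible flavour: analytic two-bone IK in 2D (think upper arm plus forearm). The link lengths and target below are made up; real suits need the 3D version plus joint constraints, but the core is just trigonometry:

    ```python
    # Sketch: analytic two-bone IK in 2D (upper arm + forearm), the
    # simplest flavour of the IK helper passes mentioned above. Link
    # lengths and target are made up.
    import math

    def two_bone_ik(l1, l2, tx, ty):
        """Return (shoulder, elbow) angles placing the wrist at (tx, ty)."""
        d = math.hypot(tx, ty)
        d = max(min(d, l1 + l2 - 1e-6), abs(l1 - l2) + 1e-6)  # clamp to reach
        # Law of cosines gives the elbow bend; then aim the shoulder so the
        # chain's endpoint lands on the target direction.
        elbow = math.pi - math.acos((l1**2 + l2**2 - d**2) / (2 * l1 * l2))
        shoulder = math.atan2(ty, tx) - math.atan2(
            l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
        return shoulder, elbow

    print(two_bone_ik(0.3, 0.25, 0.4, 0.2))  # 30 cm + 25 cm arm, wrist target
    ```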

    1. While true, it’s an excellent first shot.

      It would be easy to expand this to make the hardware capture more consistent. He’d need dedicated sampling for each IMU to get full bandwidth. That would also allow for simultaneous sampling, which should make everything sync better, especially for fast movements. As for ground contact, yes, it is a pain, but I’d be curious about making a resistive matrix “shoe” to figure out ground contact points.

      1. A touch sensor on the base of the shoe indicates it’s down, so transform the root of the skeleton by values propagated up from the contact point; then the character will stick to the ground (see the sketch below).
        Not quite that simple, though: just before and just after contact, a second pass is required, unbundling the transforms to fade in the effect and minimise the effects of the noise this system is filled with.
        Doing a T-pose every now and then (and detecting it) could reset the drift of the IMUs…
        Great start – stick with it…
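
        Something like this shows the root-pinning idea (the Skeleton class here is a stand-in, not Blender’s API, and the fade-in second pass is left out):

        ```python
        # Sketch of the root-pinning pass: while the shoe switch reports
        # contact, translate the skeleton root so the foot's world position
        # stays where it was when contact began. Skeleton is a stand-in for
        # your real rig; foot_offset would come from the IMU pose each frame.
        from dataclasses import dataclass, field

        @dataclass
        class Skeleton:
            root: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
            foot_offset: list = field(default_factory=lambda: [0.0, 0.0, -0.9])

            def foot_world(self):
                return [r + f for r, f in zip(self.root, self.foot_offset)]

        def pin_foot(skel, contact, state):
            """Call once per frame with the shoe switch state."""
            if contact:
                if state.get("anchor") is None:
                    state["anchor"] = skel.foot_world()   # contact begins
                err = [a - f for a, f in zip(state["anchor"], skel.foot_world())]
                skel.root = [r + e for r, e in zip(skel.root, err)]
            else:
                state["anchor"] = None                    # foot lifted: release
        ```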

      1. Correct. But if the “news reporter” hasn’t gotten outdoors for a couple of decades (and therefore “hasn’t seen anything like this outside Bollywood”), one might need to explain to him the difference between real life ™ and the web.
        In real life a lot of things are “old stuff” that the web-nerds consider “freakin’ cool” :-)

  2. A Danish company named Rokoko has already perfected this and uses it to entertain children. They are even doing some tests with using it for therapy, where the therapist wears a suit and the child sees a cartoon character instead; it seems to make it easier for the children to relate to the therapist.
    http://rokoko.co/

  3. Great hack!

    Idea: replace the expensive BNO055 9-DOF sensor with an MPU-9250 (much cheaper on AliExpress, $3.52/10), add an ESP8266 ($1.68/20), a 10 ppm 26 MHz crystal ($0.26/50), a W25Q16 SPI flash ($0.18/20), and a TP4056 LiPo charger ($0.07/50), and strap it to a 601417 3.7 V 120 mAh LiPo battery (T 6 mm x W 14 mm x L 17 mm, $2.60/5)…

    What you get is a <$10, 14 mm x 17 mm x 9 mm Wi-Fi 9-DOF sensor with an hour of autonomy, small and cheap enough to put one on each joint.

    Now, THIS is the definitive 1/3 sq. inch project winner!

      1. Those hardware sensor fusion engines only manage to produce ~4 degree heading accuracy at an output rate of 100 Hz max. Moreover, the BNO055 hardware sensor fusion solution displays a few degrees of heading drift even when turned through 90° lying flat on the desk. The skew of the underlying BMX055 sensor, coupled with the automatic calibration, is somehow causing this heading drift in the BNO055 hardware sensor fusion mode:
        https://github.com/kriswiner/MPU-6050/wiki/Hardware-Sensor-Fusion-Solutions

        Use raw sensor data instead and run a simple Madgwick filter in software; you can achieve a 250 Hz update rate or more, and you don’t pay for the extra embedded Cortex-M0 inside the BNO055.
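
        For example, with raw MPU-9250 data the whole fusion step is a few lines using the open-source `ahrs` Python package as one option (the sensor readings below are placeholders, and 250 Hz is just the target rate):

        ```python
        # Sketch: software sensor fusion on raw 9-DOF samples with a
        # Madgwick filter, instead of the BNO055's on-chip Cortex-M0.
        # Needs the open-source `ahrs` package (pip install ahrs); the
        # sensor readings are placeholders for real MPU-9250 output.
        import numpy as np
        from ahrs.filters import Madgwick

        fs = 250.0                            # target update rate, Hz
        madgwick = Madgwick(frequency=fs)

        q = np.array([1.0, 0.0, 0.0, 0.0])    # orientation quaternion (w, x, y, z)
        gyr = np.array([0.01, -0.02, 0.005])  # rad/s, placeholder gyro sample
        acc = np.array([0.0, 0.0, 9.81])      # m/s^2, placeholder accel sample
        mag = np.array([20.0, 5.0, -40.0])    # placeholder mag sample (filter normalises it)

        # One fusion step per sample; loop this at fs for live tracking.
        q = madgwick.updateMARG(q, gyr=gyr, acc=acc, mag=mag)
        print(q)
        ```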

    1. Wow! I didn’t know about the MPU-9250, will definitely give it a look.
      As for the ESP, I thought about going wireless and decided against it because I thought charging all those batteries would be too much of a hassle, but after having to untangle all the sensors before each use I’ve changed my mind :)

    1. You can actually get quite acceptable data for longer periods of time (we have done tests with commercially available suits that ran for about 5–6 minutes without recalibration and with only minor drift that could easily be compensated for, because the noise and drift fingerprints of every single sensor were known). Oversampling is one important keyword: a sampling rate of 120 Hz and higher allows you to filter out most errors, so if the COMPASS is good, you can compensate for drift on the gyro, which is one of the biggest issues with those Chinese mocap systems using mobile-phone sensors (see the sketch below).
      Most projects like this want to do it “on the cheap”, being a cool “hack”, but they turn into frustration rather quickly, because you just don’t get the data quality to have much fun with it. As a proof of concept this project is “gold”. Now, when the actual work starts (getting the software to compensate for the garbage that the sensors provide you with), it really shows whether the “hacker” is just a “hack-kid” or has the power to finish a project :-D
      (nitpicking here, of course; it still is a nice hobby!)
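
      A toy example of the oversampling point (all numbers made up): sample at 960 Hz, average blocks of 8 down to a 120 Hz output, and uncorrelated noise drops by roughly √8 ≈ 2.8×:

      ```python
      # Toy demo: oversample a noisy gyro at 960 Hz, average blocks of 8
      # down to 120 Hz. Uncorrelated noise shrinks by about sqrt(8).
      import numpy as np

      fs_in, fs_out = 960, 120
      block = fs_in // fs_out                 # 8 raw samples per output sample

      rng = np.random.default_rng(0)
      true_rate = 0.5                         # deg/s, made-up constant rotation
      raw = true_rate + rng.normal(0.0, 0.2, size=fs_in)  # 1 s of noisy samples

      smoothed = raw.reshape(-1, block).mean(axis=1)      # 120 Hz output
      print(raw.std(), smoothed.std())        # noise drops by ~sqrt(block)
      ```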

  4. This looks great. It’s been an interest of mine too, but I’m not bright enough to work out the maths to animate bipedal 3D models with IMUs.
    One query I have: the frame rate *appears* to be pretty low, maybe 5 to 10 fps. I’m hoping this is just a limitation of your graphics hardware rather than the sample rate of the IMU hardware, because it’s quite an important factor for character animation.

  5. Hi Alvaro, I’m doing a similar project for my school. Can you explain how the data is received by the computer (I mean, how it is packaged so that the Python program in Blender can understand it), and how the Python program works? Sorry, but I’m only now learning how to use Blender and I don’t know how it uses the received sensor data to move the bones of the armature.
    Thanks, and many compliments on your project.
