Motion control photography allows for stunning imagery, but commercial robotic MoCo rigs are hardly affordable. Then again, what is money? Scratch-built from what used to be mechatronic junk and a hacked Canon EF-S lens, [Howard’s] DIY motion control camera rig produces cinematic footage that just blows us away.
[Howard] started this project about a year ago by carrying out some targeted experiments. These would not only assess the suitability of the components he had gathered from all directions, but also his own ability to pick up enough mechatronics knowledge to make the whole thing work. After making himself accustomed to stepper motors, Teensies and Arduinos, he converted an old moving-head disco light into a pan-and-tilt mount for the camera. A linear axis was added, and with more degrees of freedom, more sophisticated means of control became necessary.
Using the Swift programming language, [Howard] wrote a host program that automatically detects the rig’s numerous stepper- and servo-driven axes and streams position data to their individual Teensy LC-based controllers. To a professional motion graphics artist, these shots are more than just nice, steady footage: the real magic happens when he starts adding perfectly matched layers of CGI. To that end, he also wrote some Python scripts that let him control his MoCo rig manually from a virtual rig in Blender, and export camera trajectories directly from his 3D scenes.
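[Howard] hasn’t published the scripts, but the Blender half of such an export can be surprisingly compact. The sketch below samples the active camera’s world-space position and orientation on every frame of the scene; the CSV layout and output path are our own assumptions for illustration, not his actual format.

```python
# Minimal sketch: dump a Blender camera trajectory, one sample per frame.
# Run inside Blender; the file path and column layout are illustrative.
import csv
import bpy

scene = bpy.context.scene
cam = scene.camera  # the scene's active (virtual rig) camera

with open("/tmp/camera_path.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "x", "y", "z", "rx", "ry", "rz"])
    for frame in range(scene.frame_start, scene.frame_end + 1):
        scene.frame_set(frame)  # evaluate the animation at this frame
        loc = cam.matrix_world.to_translation()
        rot = cam.matrix_world.to_euler()  # XYZ Euler angles, radians
        writer.writerow([frame, loc.x, loc.y, loc.z, rot.x, rot.y, rot.z])
```

From there, the host application only has to interpolate between samples and stream positions out to the motor controllers in real time.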
On top of the 4-axis camera mount and a rotary stage, [Howard] also needed an electronic follow-focus mechanism to keep his subjects sharp while everything is in motion. Since the Canon EF-S protocol had already been reverse engineered, he decided to tap into the SPI control bus between the camera body and the lens to make use of the lens’s internal ring motor. Although the ultrasonic motors in autofocus lenses aren’t actually built for absolute positioning, a series of tests revealed that a Canon EF-S 17-55mm IS USM lens can be refocused a few hundred times and still return close enough to its starting position. The caveat: [Howard] had to hack open the £600 lens and drill holes in it. In retrospect, he tells us, it’s a miracle that his wife didn’t leave him during the project.
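We haven’t seen the Teensy firmware, but going by the publicly available reverse-engineering notes on the EF lens protocol, the lens end of such a hack would look roughly like the sketch below. To be clear: the opcode bytes, SPI mode, and delays are community findings and assumptions, not official specs, and this uses a Linux spidev port rather than [Howard]’s Teensy.

```python
# Illustrative sketch of driving an EF lens focus motor over SPI.
# Byte values and timing come from community reverse-engineering notes;
# verify everything against your own lens before trusting it.
import time
import spidev  # Linux userspace SPI, e.g. on a Raspberry Pi

spi = spidev.SpiDev()
spi.open(0, 0)             # bus 0, chip select 0
spi.mode = 0b11            # the EF bus reportedly idles its clock high
spi.max_speed_hz = 70_000  # the lens bus is slow by SPI standards

def lens_cmd(*data):
    """Clock out command bytes one at a time, with a gap between bytes."""
    for byte in data:
        spi.xfer2([byte])
        time.sleep(0.001)

def focus_move(steps):
    """Relative focus move: opcode 0x44 plus a signed 16-bit step count
    (per the reverse-engineering notes; treat it as an assumption)."""
    lens_cmd(0x44, (steps >> 8) & 0xFF, steps & 0xFF)
```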
After several iterations of mechanical improvements, the motion control rig is now finished, and the first clips have already been recorded and edited. They’re stunning. Only the 6-axis robot arm hiding in [Howard’s] basement tells us that he’s just warming up for the real game. Enjoy the video below, but don’t miss out on the full 3-part video documentary on how this project came to be.
There are two understood definitions of “money shot”: one is a scene that’s critical to the work but expensive (explosions, CGI, etc.) to produce. The other is best left to those who already know the meaning.
Ehh, why be so coy? Pretty sure they mean roughly the same thing in both filmmaking industries. More “edgy” HaD title sensationalism, nothing new here.
Also, the Canon EF-S 17-55mm f/2.8 IS USM lens is technically an ultrasonic lens, not piezo.
From Wikipedia:
“Ultrasonic motors differ from piezoelectric actuators in several ways, though both typically use some form of piezoelectric material, most often lead zirconate titanate and occasionally lithium niobate or other single-crystal materials. The most obvious difference is the use of resonance to amplify the vibration of the stator in contact with the rotor in ultrasonic motors. Ultrasonic motors also offer arbitrarily large rotation or sliding distances, while piezoelectric actuators are limited by the static strain that may be induced in the piezoelectric element.”
Nice that the positioning only drifts a limited amount over time. Do any EF or EF-S lenses do absolute positioning? Surely such a thing is technically possible.
It’s correct that the Canon EF-S protocol has already been reverse engineered (sort of), but good luck finding any real open source implementations of it. darktable and Entangle are about it, and they’re not exactly feature-rich for either Nikon or Canon.
Lastly, this is a very broad set of skills all pulled together for a fairly impressive build. The use of “on a budget” is relative though!
Hey there, I’m pretty sure the rig is genre agnostic. It’s still a piezo motor, of which ultrasonic motors are a subtype. But you’re right, why not write ultrasonic motor then, I’ll edit this. More Wikipedia: https://en.wikipedia.org/wiki/Piezoelectric_motor
If we are in proofreading mode, these errors also stand out at first glance:
“Using the Swift programming language, [Howard] wrote a host program [that?] automatically detects the numerous stepper and servo motor based axis and streams the position data to their individual Teensy LC based controllers.”
“After making [ ]himself accustomed to stepper motors” – redundant space?
Done, thanks!
re. Absolute / relative positioning – nope, no EF lenses offer absolute positioning. Best you can do is ask the lens to go to an endstop (near or far) and then send a relative move command from there. On expensive lenses (like my 17-55), as long as you don’t touch the manual ring, repeatability is extremely good. No perceivable drift after dozens of moves, although for safety I always return to a hard end-stop before going to the animation start position, just in case.
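In code, that discipline boils down to something like the fragment below, reusing the hypothetical lens_cmd/focus_move helpers from the sketch in the article above. The 0x05 “drive focus to the infinity end stop” opcode again comes from the public reverse-engineering notes, so treat it as an assumption.

```python
# Hypothetical repeatable-move routine built on the earlier helpers.
def home_lens():
    lens_cmd(0x05)           # reportedly: run focus to the far hard stop
    time.sleep(0.5)          # give the motor time to get there

def goto_animation_start(start_offset_steps):
    home_lens()              # always begin from a known mechanical reference
    focus_move(start_offset_steps)

goto_animation_start(-350)   # example offset, found by trial on the lens
```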
re. budget: hehehe – it’s hard to put a bottom line figure on the project as I “borrowed” lots of bits and pieces from other projects (the stepper drivers and motors came off my CNC mill… gonna have to replace them when I have some money, can’t cut any PCBs at the mo [sniff]). The linear rail was the single largest expense – about £100 once you count the bearings and wheels and shipping. The DC servos all came out of bits of junk I got from eBay (always nice to spend £10 on junk and find it contains sexy servos you’d have to pay >£50 for)
Thanks for the kind comments :)
Hey, what kind of £10 junk is that?!
re. Absolute lens positioning: technically possible, but in practice the lenses don’t have any fine-grained feedback in them. I eventually found out the reason I was having repeatability issues with the cheaper (kit) lenses I started development with; they use tiny DC motors, but unlike a normal DC servo, they don’t have a quad encoder for feedback, just a single slotted ring and opto-interrupter. So the lens’s logic can estimate how far the motor’s driven, but because it can’t derive direction, even the slightest jitter or jiggle of the mechanics reduces the accuracy.
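To illustrate why that second channel matters: with two sensors in quadrature, the order of the edge transitions encodes direction as well as distance. A toy decoder, nothing to do with the actual lens firmware:

```python
# Toy quadrature decoder. Each sample is a 2-bit state (A << 1 | B); the
# transition between successive states gives a signed step. With only one
# channel (a single slotted ring), jitter at an edge still counts as
# movement, but there's no way to tell which direction it went.
QUAD_STEP = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode(states):
    """Sum signed steps over a sequence of (A, B) samples."""
    position = 0
    for prev, cur in zip(states, states[1:]):
        position += QUAD_STEP.get((prev, cur), 0)  # 0 = no change/invalid
    return position

# Two steps forward, then one back: net +1, direction fully recovered.
print(decode([0b00, 0b01, 0b11, 0b01]))
```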
re. Budget: It may not have cost a lot for me to build, but as I nicked tons of stuff from other projects (like my CNC mill, which is now lacking motors and drivers and stopping me cutting any PCBs [sniff]) it’ll end up costing a bit as I’ll have to start buying replacements… still, way cheaper than hiring a motion control Milo rig even just for a day :)
Agreed, but don’t underestimate the time required to sort all of this out and also the opportunity cost. Then again, renting this type of commercially developed equipment is decidedly not cheap either. Curious about the robot arm. Would it make much of this hardware superfluous or at least mostly redundant? I assume the movement translation becomes, in concept, easier?
Also, reminds me of this:
https://www.youtube.com/watch?v=lX6JcybgDFo
Ahh, man, I remember that video! Blew me away.
I can’t wait to get the arm working again, this time with the benefit of an awful lot of new knowledge. It’d certainly make a lot of my rig redundant, although the app development wouldn’t be wasted; it took a lot of buggering about to get my head round how to handle real-time control. And with a robot arm, the fact I can use Blender to generate the animation sequences becomes all the more important, as it means I can use Blender’s built-in Inverse Kinematics tools to work out what joint rotations you need in order to move the camera in (say) a straight line.
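As a rough sketch of that workflow (object names assumed, and not my actual script): give the arm’s armature an IK constraint targeting an empty that follows the camera path, let Blender’s solver do the hard work, and read the resulting joint angles back out per frame.

```python
# Read IK-solved joint angles out of an armature, frame by frame.
# "Robot" and the local-Z joint axis are assumptions for illustration.
import bpy
from math import degrees

scene = bpy.context.scene
arm = bpy.data.objects["Robot"]  # assumed armature name

def joint_angle(pbone):
    """Local rotation of a bone relative to its parent, constraints included."""
    m = pbone.matrix  # final armature-space pose matrix, IK applied
    if pbone.parent:
        m = pbone.parent.matrix.inverted() @ m
    return degrees(m.to_euler().z)  # assumes each joint rotates about local Z

for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)  # evaluate the animation and the IK solver
    print(frame, [round(joint_angle(b), 2) for b in arm.pose.bones])
```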
Luckily I wrote the app to be extendable, so I could use the current rig to move objects around in front of the arm… or maybe, if I beef up the camera slider a bit, I could mount the arm on the slider to extend its reach.
That said, the arm has a fairly small payload capacity as well as reach, so there would be other limitations. I’d have to find an alternative camera body. Ideally one that could still use my Canon lenses. If I remember right, there are some dinky little Blackmagic cameras that may fit the bill, and would give me 4K at the same time… but that’s for another day :)
Ha. Haha. Hahaha.
Really cool. If I was starting such a project though I’d start with one of the brushless gimbal controller boards and gimbal motors, then look at adding the 4th axis and the lens control. These cheap controllers (e.g. STorM32) already have python scripting and extremely precise movement without step size limitation, people use them a lot for time lapse video, etc. And I think people do use them for cable camera control too.
I did have a play with some gimbal motors, but hit too many snags. Main thing was that I was trying to drive them without feedback (gimbals normally use a gyro/accelerometer) but that meant I couldn’t overcome the cogging effect of the cheap motors I was using. Feed a nice 3-phase drive voltage in and you’d hope to get smooth rotation out, but what I got was faster and slower motion depending on which pole was where. And even with feedback, if you want to do fast accel/decel with a heavy camera, you’ve gotta have your PID tuned extremely well to avoid bounce (or worse still, jumping round a pole on the motor)
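For the curious, open-loop drive is conceptually simple: three sinusoidal phase voltages 120° apart, with the electrical angle swept at the speed you want. Cogging shows up as the rotor lagging and leading that commanded angle. A bare-bones sketch of the waveform maths, not tied to any particular driver board:

```python
# Open-loop sinusoidal commutation: three PWM duties 120 degrees apart in
# electrical angle. On real hardware these feed three half-bridges; the
# pole-pair count and drive amplitude here are illustrative assumptions.
from math import sin, pi

POLE_PAIRS = 7   # typical for small gimbal motors (assumption)
AMPLITUDE = 0.3  # fraction of full drive; keep it low or the motor cooks

def phase_duties(mech_angle_rad):
    """PWM duty cycles (0..1) for phases A, B, C at a given rotor angle."""
    e = mech_angle_rad * POLE_PAIRS  # electrical angle
    return tuple(
        0.5 + 0.5 * AMPLITUDE * sin(e + k * 2 * pi / 3) for k in range(3)
    )

# Pre-compute a 1-degree lookup table for one mechanical revolution; a
# cogging motor follows these commanded angles unevenly.
table = [phase_duties(n * pi / 180) for n in range(360)]
```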
Oh yeah – and the balancing / tuning was the other thing that made me go a different way; to get the gimbals to work right you really do need the camera balanced perfectly. Even changing the zoom on a big lens can be enough to require time-consuming rebalancing, let alone me wanting to be able to swap lenses quickly…
Could be a good approach for someone with better understanding than me, though. And those cheapo Chinese gimbals are bloody cheap for what you get, and all you need to do is add a little joystick to turn them into a pretty cool remote pan/tilt head, once you’ve got them balanced and tuned.
I hope that Lego man is doing this through his own choice. Seems a bit sadistic to be screwing his leg into place to stop him legging it.
You’ll be horrified to hear that not only was he not screwed down, but he had no safety restraints at all. Had to take him off the rig because of union-related legal issues
Man, that’s beautiful. I grok the motion control, I do that every day, so that’s mundane. But the integration with the virtual parts, that’s the magical stuff to me.
Amazing work! Really the ingenuity and results are blowing me away. It’s simultaneously inspiring and depressing, really.
It is cool.
This is excellent. Really like some of the results.
Fantastic use of Blender. Only try to solve the bits of the problem that are missing.
> Only try to solve the bits of the problem that are missing.
That is advice for life. Gonna have to print that and frame it :)
Stunning! I was planning on building something like this for my stop motion work, also for integration into Blender.
Instead of nitpicking the article apart maybe we can just focus on the fact that someone made something AMAZING out of basically random crap.
That camera rig is truly amazing. Home built, home programmed and integrates into exactly what he needs it to be without a big company deciding what he can do with it.
Seriously, as someone who does some product photography, mechanical engineering, and software development, this thing blows my mind. I’m simply in awe.
Camera matching is built into Blender; motion control is not required, as any footage will work once you have defined the points that need tracking. This rig seems ideal for claymation-type work.
https://group-ssjp.wikispaces.com/file/view/w%26g.jpg/137706741/w%26g.jpg