Stop motion animation is notoriously difficult to pull off well, in large part because it’s a mind-numbingly slow process. Each frame in the final video is a separate photograph, and for each one of those, the characters and props need to be moved the appropriate amount so that the final result looks smooth. You don’t even want to know how long Ben Wyatt spent working on Requiem for a Tuesday, though to be fair, it might still get done before the next Avatar.
But [Nick Bild] thinks his latest project might be able to improve on the classic technique with a dash of artificial intelligence provided by a Jetson Xavier NX. Basically, the Jetson watches the live feed from the camera, and using a hand pose detection model, waits until there’s no human hand in the frame. Once the coast is clear, it takes a shot and then goes back to waiting for the next hands-free opportunity. With the photographs being taken automatically, you’re free to focus on getting your characters moving around in a convincing way.
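The watch-and-shoot loop is simple enough to sketch out. Below is a minimal, hypothetical version using OpenCV and MediaPipe's off-the-shelf hand detector standing in for whatever model [Nick] actually runs on the Jetson; the ten-frame debounce is our own invention.

```python
# Sketch: take one photo per hands-free window in the camera feed
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)

frame_num = 0
hand_was_present = True   # wait for a hand to show up before the first shot
clear_streak = 0          # consecutive hand-free frames, for debouncing

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:        # a hand is in the shot
        hand_was_present = True
        clear_streak = 0
    else:
        clear_streak += 1
        # Shoot once per hands-free window, after a short debounce
        if hand_was_present and clear_streak >= 10:
            cv2.imwrite(f"frame_{frame_num:05d}.png", frame)
            frame_num += 1
            hand_was_present = False

cap.release()
```

Stitching the stills back into video is then the easy part, e.g. ffmpeg -framerate 12 -i frame_%05d.png out.mp4.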
If it’s still not clicking for you, check out the video below. [Nick] first shows the raw unedited video, which primarily consists of him moving three LEGO figures around, and then the final product produced by his system. All the images of him fiddling with the scene have been automatically trimmed, leaving behind a short animated clip of the characters moving on their own.
Now don’t be fooled, it’s still going to take a while. By our count, it took two solid minutes of moving Minifigs around to produce just a few seconds of animation. So while we can say it’s a quicker pace than traditional stop motion production, it certainly isn’t fast.
As hackers, we naturally see the beauty of technology. We often talk in terms of the aesthetics of a particular hack, or the elegance of one solution over another, and we can marvel at the craftsmanship involved in everything from a well-designed PCB to a particularly clever reverse-engineering effort. Actually using technology to create art is something that’s often harder for us to appreciate, though, and looking at technological art from the artist’s side can be pretty instructive.
Cory Collins is an animator and artist with a long history of not only putting tech to work to create art, but also using it as the subject of his pieces. Cory’s work has brought life to video games, movies, and TV shows for years; more recently, he has turned his animation skills to developing interactive educational material for medical training. He has worked in just about every physical and digital medium imaginable, and the characters and scenes he has created are sometimes whimsical, sometimes terrifying, but always engaging.
Cory will stop by the Hack Chat to talk about what he has learned about technology from the artist’s perspective. Join us as we dive into the creative process, look at how art influences technology and vice versa, and learn how artistic considerations can help us address the technical problems every project eventually faces.
Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.
[Captain Disillusion] has earned a reputation on YouTube for debunking hoaxes and spreading a healthy sense of skepticism while having some of the highest production value on the platform and pretending to be some kind of inter-dimensional superhero. You’ve likely seen him give a careful explanation of how some viral video was faked alongside a generous dose of sarcastic humor and his own impressive visual effects. VFXcool is a series on his channel that takes deep dives into movies that are historically significant in the effects industry. For this installment, [Captain Disillusion]’s “intern”, [Alan], takes over to break down how filmmakers brought a futuristic spaceship to life in 1986’s Flight of the Navigator.
Making a movie requires hacks upon hacks, and that goes double in the era when the technology and techniques we now take for granted were being developed even as they were being put to film. The range of topics covered here is extreme: from full-scale props to models; from robotic motion control rigs to stop motion animation; from early computer graphics to the convoluted optical compositing that was necessary before digital workflows were possible. The tools themselves may be outdated, but understanding the history and the processes allows for a deeper insight into how we accomplish these kinds of effects today. And, really, it’s just so… cool.
Lithophanes are nothing new, with examples going back to the 1800s. But they’ve become popular again thanks to the ease with which these pieces of artwork can be 3D printed. While the Internet would be more than happy to see somebody press a 3D image of their cat into a thin piece of translucent porcelain, ready to have a light shone through it, that’s quite a bit harder than just firing up the Monoprice.
The method here is pretty simple: [The Mad Maker] disassembles his favorite GIF to get the individual frame images, converts each one into a lithophane STL via an online tool, prints it out, photographs it, and then stitches all those photographs back into a new GIF. Given the incredibly time-consuming nature of this process, you’ll want to limit it to short animations, and even then, probably do only every second or third frame to preserve your sanity.
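The two scriptable ends of that pipeline are easy enough to sketch with Pillow; the filenames and the every-third-frame stride below are our own placeholders, and the STL conversion in the middle still happens through the online tool by hand.

```python
# Bookend steps of the lithophane GIF pipeline (frame extraction and
# final reassembly); the lithophane printing happens in between
from PIL import Image, ImageSequence
import glob

# Step 1: pull every third frame out of the source GIF for printing
src = Image.open("meme.gif")
for i, frame in enumerate(ImageSequence.Iterator(src)):
    if i % 3 == 0:
        frame.convert("L").save(f"frame_{i:03d}.png")   # grayscale, since lithophanes map brightness to thickness

# Step 2: stitch photographs of the printed lithophanes into a new GIF
photos = [Image.open(p) for p in sorted(glob.glob("photo_*.jpg"))]
photos[0].save("stopmotion.gif", save_all=True,
               append_images=photos[1:], duration=200, loop=0)
```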
In the video after the break you can see the entire process, as well as check out the final result. While there weren’t really any technical hurdles to overcome in this project, we did like seeing how [The Mad Maker] experimented to find the ideal position for the backlight and camera. The wooden frame he came up with to hold everything in position should make subsequent meme conversions a lot easier; now he just needs to add a little color.
It happens to everyone. You get your hands on an Etch-A-Sketch for the first time, and armed with the knowledge of how it works, you’re sure you can draw things other than rectangles and staircases. And then you find out the awful truth: you are not as precise as you think you are, and if you’re [QuintBUILDS], the circles you try to draw look like lemons, potatoes, or microbes.
[QuintBUILDS]’s answer was to motorize the knobs. Turn the Etch-A-Sketch over and you’ll find a Raspberry Pi 3 and a CNC hat. The knobs are belt-driven from a pair of NEMA-17 size stepper motors that couple to them with tight-fitting pulleys. Power comes from four 18650s, metered by a battery management board that provides both overcharge and over-discharge protection. At some point in the future, [QuintBUILDS] plans to move to a battery pack, because the cell holder is mechanically unstable.
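To get a feel for why freehand circles come out as lemons while the motorized version’s don’t, here’s a purely illustrative Python sketch of the coordination problem: two steppers, one per knob, walking a circle as a series of tiny single-step staircase moves. The pin numbers are invented, and the real build drives its motors through the CNC hat rather than bit-banged GPIO.

```python
# Illustrative only: trace a circle by coordinating two stepper-driven knobs
import math
import time
import RPi.GPIO as GPIO

X_STEP, X_DIR = 17, 27   # hypothetical BCM pins for the horizontal knob
Y_STEP, Y_DIR = 22, 23   # hypothetical BCM pins for the vertical knob

GPIO.setmode(GPIO.BCM)
for pin in (X_STEP, X_DIR, Y_STEP, Y_DIR):
    GPIO.setup(pin, GPIO.OUT)

def step(step_pin, dir_pin, forward):
    """Issue one step pulse in the given direction."""
    GPIO.output(dir_pin, forward)
    GPIO.output(step_pin, True)
    time.sleep(0.0005)
    GPIO.output(step_pin, False)
    time.sleep(0.0005)

# Walk a circle of radius 200 steps as lots of one-step X/Y moves;
# each knob owns one axis, which is why freehand curves are so hard
x, y = 200, 0
for deg in range(1, 361):
    tx = round(200 * math.cos(math.radians(deg)))
    ty = round(200 * math.sin(math.radians(deg)))
    while x != tx:
        forward = tx > x
        step(X_STEP, X_DIR, forward)
        x += 1 if forward else -1
    while y != ty:
        forward = ty > y
        step(Y_STEP, Y_DIR, forward)
        y += 1 if forward else -1

GPIO.cleanup()
```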
Most importantly, you can still pick it up and shake it to clear the screen, a feature sorely lacking in many of the auto-sketchers we scratch about. And if you’re not fully satisfied by this hack, be sure to check out the stop-motion video after the break that turns this baby into a touch-screen video player for Flatlanders.
We love the welded frame and acrylic enclosure because they make the thing sturdy and portable. Also, we’re suckers for see-through enclosures. They’re clearly superior if you want to do what [QuintBUILDS] did and take it to an elementary school science fair to show the kids just how cool science can be if you stick with it.
If you don’t think motorized Etch-A-Sketches can be useful, maybe you just haven’t seen this clock build yet.
Most displays are looking to play things faster. We’ve got movies at 60 frames per second, and gaming displays that run at 144 Hz. But what about moving in the other direction? [Bryan Boyer] wanted to try this out, so he built the VSMP, or Very Slow Movie Player. It’s a neat device that plays back a movie at about 24 fph (frames per hour) on an e-ink display to demonstrate something that [Bryan] calls Slow Seeing, which, he says, “helps you see yourself against the smear of time.” A traditional epic-length movie will now run you more than 8,000 hours of viewing.
Artistic considerations aside, it’s an interesting device from a technical point of view. [Bryan] built it around a 7.4-inch e-ink display from Pervasive Displays, driven by a Raspberry Pi Zero running a Python script that converts a frame of the movie file into a dithered image and sends it to the display. Because the Pi Zero isn’t a very fast computer, this takes some time, hence the slow speed of the VSMP.

Originally, [Bryan] had it running as fast as the system could manage, which was about 25 seconds per frame, or roughly 2 frames per minute. He decided to slow it down further to the more fitting figure of 24 frames per hour, echoing the 24 frames per second of the original movie, using a cron job that kicks off the conversion script once every 2.5 minutes and increments the frame counter. All of this is topped off with a nice 3D-printed case with a lovely interference pattern, making for a rather neat and intriguing project.
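For flavor, here’s roughly what one tick of such a player could look like. This is our own sketch, not [Bryan]’s code; the panel resolution and the final display call are placeholders for the real Pervasive Displays driver.

```python
# One update tick of a hypothetical very slow movie player.
# Assumes ffmpeg and Pillow; display_show() is a stand-in for the
# actual e-ink driver call, and 800x480 is an assumed panel size.
import subprocess
from pathlib import Path
from PIL import Image

COUNTER = Path("frame.txt")
n = int(COUNTER.read_text()) if COUNTER.exists() else 0

# Pull a single frame (number n) out of the movie with ffmpeg
subprocess.run([
    "ffmpeg", "-y", "-i", "movie.mp4",
    "-vf", f"select=gte(n\\,{n})", "-vframes", "1", "frame.png",
], check=True)

# Pillow's 1-bit conversion applies Floyd-Steinberg dithering by
# default, which reads surprisingly well on e-ink
img = Image.open("frame.png").resize((800, 480)).convert("1")
img.save("frame.bmp")
# display_show("frame.bmp")  # placeholder for the panel driver

COUNTER.write_text(str(n + 1))
```

One wrinkle: plain cron only resolves to whole minutes, so hitting a 2.5-minute beat takes either two staggered entries (one prefixed with sleep 150) or a systemd timer.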
Perhaps the best part of this is seeing a time-lapse of the VSMP — life moves quickly around it while 2001: A Space Odyssey plays at normal speed.
Lip syncing computer-animated characters has long been straightforward: you draw a set of lip shapes for the vowels and other sounds your character makes, and let the computer interpolate how to get from one shape to the next. But with physical, real-world puppets, all those movements have to be posed manually, frame by frame. Or do they?
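The digital half of that trick is small enough to sketch outright. Treating each lip shape as a tuple of servo angles (the values here are invented), the in-betweening is plain linear interpolation:

```python
# Keyframed lip shapes as five servo angles (degrees); values are invented
SHAPES = {
    "rest": (90, 90, 90, 90, 90),
    "ah":   (60, 120, 70, 110, 130),   # jaw open wide
    "oo":   (80, 100, 95, 85, 100),    # lips pursed
}

def tween(a, b, t):
    """Linearly interpolate between two lip shapes, t in [0, 1]."""
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

# Generate the in-between poses from "rest" to "ah" over ten frames
frames = [tween(SHAPES["rest"], SHAPES["ah"], i / 9) for i in range(10)]
```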
The puppet’s builder toyed around with a number of approaches for the lip mechanism before coming up with one that worked the way he wanted. The lips are shaped using guitar wire soldered to other wires that run back to servos deeper in the head. Altogether there are four servos for the lips and one more for the jaw. There isn’t much sideways movement, but it does enough, and the brain fills in the rest.
On the software side, he borrows heavily from the tools used for lip syncing computer-drawn characters. He created virtual versions of the five servo motors in Adobe Animate and manipulates them to define the different lip shapes. Animate then does the interpolation between the shapes, producing the servo positions needed for each frame. An AS3 script sends those positions off to an Arduino, where a sketch uses the Firmata library to receive them and move the servos. The result is entirely convincing, as you can see in the trailer below. We’ve also included a video summarizing the iterations he went through to get to the finished Billy Whiskers, or you can just check out his detailed website.
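For the curious, the same send-the-positions step could be done from Python instead of AS3. A hypothetical host-side sketch, assuming the stock StandardFirmata firmware on the Arduino, with guessed serial port and pin assignments:

```python
import time
from pyfirmata import Arduino

# A few lip poses as servo angles, like the tweening sketch above produces
FRAMES = [(90, 90, 90, 90, 90), (75, 105, 80, 100, 110), (60, 120, 70, 110, 130)]

board = Arduino("/dev/ttyACM0")   # guessed serial port
# Four lip servos plus the jaw; the ':s' suffix puts a pin into servo mode
servos = [board.get_pin(f"d:{pin}:s") for pin in (3, 5, 6, 9, 10)]

for pose in FRAMES:
    for servo, angle in zip(servos, pose):
        servo.write(angle)        # servo-mode pins take an angle in degrees
    time.sleep(1 / 24)            # hold each pose for one film frame
```

With Firmata, all the animation intelligence stays on the host, which is the same division of labor his Animate-based pipeline uses.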