How Jurassic Park’s Dinosaur Input Device Bridged The Stop-Motion And CGI Worlds

In a double blast from the past, [Ian Failes]’ 2018 interview with [Phil Tippett] and others who worked on Jurassic Park is a great look at how the dinosaurs in this 1993 blockbuster movie came to be. The dinosaurs were originally conceived as stop-motion puppets with motion blurring applied using a method called go-motion, and a large team of puppeteers was already hard at work turning the book into a movie when [Steven Spielberg] decided to go in a different direction after seeing a computer-generated Tyrannosaurus rex test made by Industrial Light & Magic (ILM).

Naturally, this left [Phil Tippett] and his crew rather flabbergasted, leading to a range of puppeteering-related extinction jokes. Of course, this was the early 90s, and computer-generated imagery (CGI) animators were still very scarce. This led to an interesting hybrid solution in which [Tippett]’s team was put in charge of the dinosaur motion using a custom gadget called the Dinosaur Input Device (DID). This was effectively a stop-motion puppet, but tricked out with motion capture sensors.

This way the puppeteers could provide motion data for the CG dinosaurs using their stop-motion skills, albeit with the computer handling a lot of interpolation. Meanwhile, ILM could handle the integration and sprucing up of the final result using their existing pool of artists. As a bridge between the old and the new, DIDs provided the means for puppeteers and CGI artists to cooperate, creating the first major CGI production that still holds up today.

Even if DIDs went the way of the non-avian dinosaurs, their legacy will forever leave dino-sized footprints on the movie industry.

Thanks to [Aaron] for the tip.


Top image: Raptor DID. Photo by Matt Mechtley.

Oh, The Places You’ll Go With Stop Motion Animation

Robots made of broken toy parts, stop-motion animation, and a great song to tie it all together were not on our bingo card for 2023, but the results are perfect. [Mootroidxproductions] recently released the official music video for I Fight Dragons’ 2019 song “Oh the Places You’ll Go”.

The song was written by lead vocalist [Brian Mazzaferri] with inspiration from the classic Dr. Seuss book. [Brian] wrote it for his newborn daughter, and we’re pretty sure it will hit any parent right in the feels.

[Mootroidxproductions] isn’t a parent himself, but he expanded on the theme to create a video about sacrificing oneself to save a loved one. With a self-deprecating wit, he takes us through the process of turning broken Bionicle parts, bits of Gundam kits, Lego, and armature wire into the two robots in the film. He also explains how he converted garbage into sets, greebles, and lighting effects.

The robots had to be designed so that they could fulfill their roles in the film. From the size of their hands down to their individual walking gaits, he thought of everything. His encyclopedic knowledge of Bionicle parts is also on full display as he explains the origin of the major parts used to build “Little Blue” and “Sherman”.

Click through the break for both the main video and the behind-the-scenes production.

Continue reading “Oh, The Places You’ll Go With Stop Motion Animation”

Closeup of a film restorer's hand holding a 35mm film print to check for defects as it goes into a film scanner

35mm Film Restoration Process Explained

For a large part of the 20th century, motion pictures were distributed on nitrate film. Although cheaper for the studios, this film was highly flammable and prone to decay. On top of that, most film prints were simply discarded once they had been through their run at the cinema, so a lot of film history has been lost.

Sometimes, the rolls of projected film would be kept by the projectionist and eventually found by a collector. If the film was too badly damaged to project again, it might still get tossed. Pushing against this tide of decay and destruction are small groups of experts who scan and restore these films for the digital age.

The quality difference between a smaller-format print and the original restored negative can be startling

The process is quite involved – starting with checking every single frame of film by hand and repairing any damaged perforations or splices that could come apart in the scanner. Each frame is then automatically scanned at up to 10K resolution to future-proof the process before being painstakingly digitally cleaned.

The real expertise is in knowing what is damage or dirt, and what is the character of the original film. Especially in stop-motion movies, the subtle changes between frames are really part of the original, so the automatic clean-up tools need to be selectively reined in so as not to lose the charm and art of the film-makers.
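The professional tools are specialised and often proprietary, but the core idea behind automatic dust removal is simple enough to sketch. Below is a toy illustration of temporal median filtering, assuming consecutive scanned frames as NumPy arrays: dirt usually lives on a single frame only, so a pixel that differs sharply from the median of its temporal neighbours gets patched. It also shows exactly why the tools must be reined in — a deliberate stop-motion change between frames looks just like “damage” to this filter:

```python
import numpy as np

def despeckle(prev, cur, nxt, threshold=30):
    """Toy single-frame dust removal via temporal median.

    prev, cur, nxt: consecutive grayscale frames as uint8 NumPy arrays.
    Pixels in `cur` that differ sharply from the median of the three
    frames are assumed to be dirt and replaced by that median. On
    stop-motion footage this threshold must be much more conservative,
    since frame-to-frame changes there are legitimate content.
    """
    stack = np.stack([prev, cur, nxt]).astype(np.int16)
    median = np.median(stack, axis=0)
    dirt_mask = np.abs(cur.astype(np.int16) - median) > threshold
    cleaned = cur.copy()
    cleaned[dirt_mask] = median[dirt_mask].astype(np.uint8)
    return cleaned, dirt_mask

# Example on synthetic frames: a static scene with one "dust" blotch.
frames = [np.full((8, 8), 128, dtype=np.uint8) for _ in range(3)]
frames[1][3:5, 3:5] = 255          # simulated dirt on the middle frame
cleaned, mask = despeckle(*frames)
print(mask.sum(), "pixels flagged as dirt")   # -> 4
```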

The results are quite astonishing and we all have teams like this to thank for protecting our cultural heritage.

If you’re interested in watching the process, then check out the video after the break. If you fancy a go at automatic film digitising yourself (preferably not on unique historical prints!) then we’ve shown projects to do just that in the past.

Thanks to [Cliff Claven] for the tip.

Continue reading “35mm Film Restoration Process Explained”

Photo rail setup for stop motion

Stop-Motion Angels In The Light Field

Baseball jokes aside, holograms have been a dream for decades, and with devices finally around that support something like them, we have finally started to wonder how to make content for them. [Mike Rigsby] recently entered his stop-motion holographic setup into our sci-fi contest, and we love the idea.

Rather than a three-dimensional model or a 2D picture of pixels, the Looking Glass light field display works from a series of images taken from quantized viewpoints (hence “light field”). As you move around an object, views are interpolated between the frames it does have, giving a pretty convincing effect. In traditional stop-motion animation you need somewhere between 12 and 24 frames for about one second of animation; if each of those frames now needs 48 pictures taken from different positions, that’s up to 1,152 photographs for a single second of animation. Two problems quickly appear: how do you take photographs from exactly the same positions every time, and how do you manage the deluge of photos sensibly?

[Mike] started with a wooden stage for his actors. A magnet mounted to the photo rail carriage lets a sensor confirm the carriage has returned to the same spot, and an Arduino controls the rail, reads that sensor, and fires the camera shutter. The DSLR he’s using can’t do that many frames per second, but that’s a problem for another sci-fi contest.
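We don’t have [Mike]’s firmware to hand, so here’s a hypothetical Python sketch of the capture loop; `move_rail_to`, `magnet_detected`, and `trigger_shutter` are stand-in stubs for the Arduino’s motor driver, magnetic home sensor, and shutter release, not a real API:

```python
import time

VIEWS_PER_FRAME = 48    # camera positions per Looking Glass frame
SETTLE_SECONDS = 2      # wait for rail vibration to die down

def move_rail_to(view_index):
    """Stub: step the rail carriage to camera position `view_index`."""

def magnet_detected():
    """Stub: read the home sensor; True when the carriage sits at the
    reference magnet, confirming positions repeat shot-to-shot."""
    return True

def trigger_shutter():
    """Stub: fire the DSLR, e.g. via a wired remote release."""

def capture_animation_frame(frame_number):
    # Re-home against the magnet so every animation frame starts
    # from exactly the same physical position.
    move_rail_to(0)
    assert magnet_detected(), "carriage lost its home position!"
    for view in range(VIEWS_PER_FRAME):
        move_rail_to(view)
        time.sleep(SETTLE_SECONDS)  # let everything stop shaking
        trigger_shutter()           # one of 48 views for this frame
    print(f"frame {frame_number}: {VIEWS_PER_FRAME} views captured")

# 24 animation frames -> 24 * 48 = 1,152 photographs per second of film.
for frame in range(24):
    input("Move the puppets for the next frame, then press Enter…")
    capture_animation_frame(frame)
```

Re-homing against the magnet before every frame is the important trick: any drift between animation frames would show up as jitter in the final light field.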

Holographic-ish displays are finally here, and they’re getting better. But if a display isn’t your speed, perhaps some laser-powered glasses can be the holographic experience you’re looking for?

This project was an entry into the 2022 Sci-Fi Contest. Check out all of the winning entries here.

Line of electromechanical water valves dispensing a pattern of water droplets

Gravity-Defying Water Drop Display Shows Potential

[3DPrintedLife aka Andrew DeGonge] saw that advert for Gatorade that shows some slick stop-motion animation using a so-called ‘liquid printer’ and wondered how they built the machine and got it to work so well. The answer, it would seem, involves a lot of hard work and experimentation.

Conceptually it’s not hard to grasp. A water reservoir sits at the top, which gravity-feeds into a series of electromechanical valves below, which feed into nozzles. From there, the timing of the valve and the water pressure dictate the droplet size. The droplets fall under the influence of gravity, to be collected at the bottom. From that point it’s a ‘simple’ matter of timing the droplets with respect to a lighting strobe or camera shutter and, hey presto, instant animation.
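The strobe timing is where the physics comes in: a droplet meant to appear lower in the image must be released earlier, by exactly its free-fall time to that height. Here’s a minimal sketch of that scheduling (our own illustration, not [Andrew]’s code), ignoring air drag and assuming droplets start from rest at the nozzle:

```python
import math

G = 9.81             # m/s^2
NOZZLE_HEIGHT = 0.5  # metres from nozzle to bottom of the visible frame

def release_offset(drop_height):
    """Seconds before the strobe fires that the valve must release a
    droplet so it has fallen `drop_height` metres below the nozzle at
    flash time (free fall from rest: h = g * t^2 / 2)."""
    return math.sqrt(2 * drop_height / G)

# One vertical column of a 5-row "image": True = droplet wanted in row.
column = [True, False, True, True, False]
row_spacing = NOZZLE_HEIGHT / len(column)

for row, lit in enumerate(column):
    if lit:
        h = row * row_spacing                 # row 0 sits at the nozzle
        print(f"row {row}: open valve "
              f"{release_offset(h) * 1000:5.1f} ms before the flash")
```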

As will become evident from the video, it’s just not as easy as that. After an initial wobble when [Andrew] discovered that cheap “air-only” solenoid valves really are air-only — they rusted up with water — he took a slight detour to design and 3D print his own valve body. Using a resin printer to produce finely detailed prints enabled small internal passages, including an ‘air spring’, which is just a small trapped chamber of air that, after a lot of testing, proved to be a step in the right direction. Whether this could have been achieved with an FDM printer is open to speculation, but we suspect the superior fine-detail capabilities of modern resin printers are a big help here.

In a nice twist, [Andrew] ripped open a fluorescent marker pen and dissolved its ink, using that in place of plain water, so when illuminated with suitably triggered UV LED strips, discernible animation was achieved, with an eerie green glow which we think looks pretty neat. All he needs to do now is upgrade the hardware to make a 3D array with more resolution, and he can start approaching the capability of the thing that inspired him. Work on some custom electronics to drive it has started, so this is one to watch in the coming months!

We’ve seen many water-based display devices before, like this one that projects directly onto a thin stream of water, and this strangely satisfying hack using paraffin and water, but a full 3D Open Source display device seems elusive so far.

All project details can be found on the associated GitHub.

Continue reading “Gravity-Defying Water Drop Display Shows Potential”

Mastering Stop Motion Through Machine Learning

Stop motion animation is notoriously difficult to pull off well, in large part because it’s a mind-numbingly slow process. Each frame in the final video is a separate photograph, and for each one of those, the characters and props need to be moved the appropriate amount so that the final result looks smooth. You don’t even want to know how long Ben Wyatt spent working on Requiem for a Tuesday, though to be fair, it might still get done before the next Avatar.

But [Nick Bild] thinks his latest project might be able to improve on the classic technique with a dash of artificial intelligence provided by a Jetson Xavier NX. Basically, the Jetson watches the live feed from the camera, and using a hand pose detection model, waits until there’s no human hand in the frame. Once the coast is clear, it takes a shot and then goes back to waiting for the next hands-free opportunity. With the photographs being taken automatically, you’re free to focus on getting your characters moving around in a convincing way.
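We don’t know exactly which model [Nick] ran on the Jetson, but the gating logic is easy to reproduce on any machine with a webcam. Here’s a minimal sketch using OpenCV and MediaPipe’s off-the-shelf hand detector as a stand-in; the debounce counter stops a half-withdrawn hand from triggering a premature shot:

```python
import cv2
import mediapipe as mp

HANDS_FREE_FRAMES = 30        # consecutive clear frames before we shoot
hands = mp.solutions.hands.Hands(max_num_hands=2,
                                 min_detection_confidence=0.5)
cam = cv2.VideoCapture(0)
clear_count = 0
shot_number = 0
armed = True                  # take one shot per hands-free interval

while True:
    ok, frame = cam.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV delivers BGR.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        clear_count = 0
        armed = True          # a hand appeared: re-arm for the next shot
    else:
        clear_count += 1
        if armed and clear_count >= HANDS_FREE_FRAMES:
            cv2.imwrite(f"frame_{shot_number:04d}.png", frame)
            shot_number += 1
            armed = False     # wait for the animator's next visit
    cv2.imshow("preview", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cam.release()
cv2.destroyAllWindows()
```

The `armed` flag matters: without it, a long hands-free stretch would fill the disk with duplicate shots instead of capturing exactly one frame per puppet adjustment.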

If it’s still not clicking for you, check out the video below. [Nick] first shows the raw unedited video, which primarily consists of him moving three LEGO figures around, and then the final product produced by his system. All the images of him fiddling with the scene have been automatically trimmed, leaving behind a short animated clip of the characters moving on their own.

Now don’t be fooled, it’s still going to take a while. By our count, it took two solid minutes of moving around Minifigs to produce just a few seconds of animation. So while we can say it’s a quicker pace than with traditional stop motion production, it certainly isn’t fast.

Machine learning isn’t the only modern technology that can simplify stop motion production. We’ve seen a few examples of using 3D printed objects instead of manually-adjusted figures. It still takes a long time to print, and of course it eats up a ton of filament, but the mechanical precision of the printed scenes makes for a very clean final result.

Continue reading “Mastering Stop Motion Through Machine Learning”

Art With Technology Hack Chat

Join us on Wednesday, June 16 at noon Pacific for the Art with Technology Hack Chat with Cory Collins!

As hackers, we naturally see the beauty of technology. We often talk in terms of the aesthetics of a particular hack, or the elegance of one solution over another, and we can marvel at the craftsmanship involved in everything from a well-designed PCB to a particularly clever reverse-engineering effort. Actually using technology to create art is something that’s often harder for us to appreciate, though, and looking at technological art from the artist’s side can be pretty instructive.

Cory Collins is an animator and artist with a long history of not only putting tech to work to create art, but also using it as the subject of his pieces. Cory’s work has brought life to video games, movies, and TV shows for years; more recently, he has turned his animation skills to developing interactive educational material for medical training. He has worked in just about every physical and digital medium imaginable, and the characters and scenes he has created are sometimes whimsical, sometimes terrifying, but always engaging.

Cory will stop by the Hack Chat to talk about what he has learned about technology from the artist’s perspective. Join us as we dive into the creative process, look at how art influences technology and vice versa, and learn how artistic considerations can help us address the technical problems every project eventually faces.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, June 16 at 12:00 PM Pacific time. If time zones have you tied up, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.