A 3D Printed 35mm Movie Camera

Making a camera can be as easy as taking a cardboard box with a bit of film and a pinhole, but making a more accomplished camera requires considerably more work. A movie camera has all the engineering challenges of a regular camera, with the added complication of a continuous film transport mechanism and shutter. Too much work? Not if you are [Yuta Ikeya], whose 3D printed movie camera uses commonly available 35 mm film stock rather than the 8 mm or 16 mm film you might expect.

3D printing might not seem to lend itself to the complex mechanism of a movie camera, but with the tech of the 2020s in hand he’s sidestepped the mechanical complexity in favour of an Arduino and a pair of motors. The camera is hardly petite, but is still small enough to carry comfortably on a shoulder. The film must be loaded into a pair of cassettes, which are pleasingly designed to be reversible, with either able to function as take-up or dispensing spool.

The resulting images have an extreme wide-screen format and a pleasing artistic feel. Looking at them we’re guessing there may be a light leak or two, but it’s fair to say the leaks enhance the result rather than detract from it. Those of us who dabble in movie cameras can be forgiven for feeling rather envious.

We’ve reached out to him asking whether the files might one day be made available; meanwhile, you can see the camera in action in the video below the break.

Continue reading “A 3D Printed 35mm Movie Camera”

How Did Dolby Digital Sound Work On Film?

When we go to the cinema and see a film in 2022, it’s very unlikely that what we’re seeing will in fact be a film. Instead of large reels of transparent film fed through a projector, we’ll be watching the output of a high-quality digital projector. The advantages for the cinema industry in terms of easier distribution and consistent quality are obvious. There was a period in the 1990s, though, when theatres still had film projectors but digital technology was starting to edge in for the sound. [Nava Whiteford] has found some 35mm trailer film from the 1990s and analysed the Dolby Digital sound information from it.

The film is an interesting exercise in backward compatibility, with every part of it outside the picture used to encode information. There is an analogue soundtrack as well as two digital formats, but what we’re interested in here are the Dolby Digital packets. These are encoded as patterns superficially similar to a QR code, in the space between the sprocket holes.

Looking at the patent, he found that the packets use Reed-Solomon error correction, making them relatively easy to decode. The patent makes for fascinating reading, as it explains how the data was read using early-1990s technology, with each line scanned by a linear CCD, before detailing the signal processing steps that retrieve the audio data. If you remember your first experience of Dolby cinema sound three decades ago, now you know how the system worked.
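
The principle is easy to play with in Python using the reedsolo library (pip install reedsolo): each code block carries data plus parity symbols that survive a few corrupted bytes. The block sizes and parity counts below are our own illustration, not the parameters from the patent:

```python
# A minimal sketch of Reed-Solomon error correction with reedsolo;
# symbol counts here are illustrative only.
from reedsolo import RSCodec

rsc = RSCodec(10)  # 10 parity symbols: corrects up to 5 corrupted bytes

encoded = rsc.encode(b"one block of compressed audio data")

# Simulate damage to the print: dirt or a scratch flips a few bytes
damaged = bytearray(encoded)
damaged[3] ^= 0xFF
damaged[17] ^= 0xFF

# reedsolo >= 1.0 returns (message, message + ecc, errata positions)
decoded, _, _ = rsc.decode(bytes(damaged))
print(decoded)  # b'one block of compressed audio data'
```

Systems like this typically also interleave many such blocks across the film, so a single scratch never takes out more symbols than one block can correct.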

The film featured here also has an analogue soundtrack, and if you’d like to know how those worked, we’ve got you covered!

It’s Bad Apple, But On A 32K EPROM

The Bad Apple!! video with its silhouette animation style has long been a staple graphics demo for low-end hardware, a more stylish alternative to the question “Will it run DOOM?”. It’s normal for it to be rendered onto a screen by a small microcomputer or similar, but as [Ian Ward] demonstrates in an unusual project, it’s possible to display the video without any processor being involved. Instead he’s used a clever arrangement involving a 32K byte EPROM driving an HD44780-compatible parallel alphanumeric LCD.

While 32K bytes would have seemed enormous back in the days of 8-bit computing, it’s still something of a struggle to express the required graphics characters, even when driving only a small section of an alphanumeric LCD. The feat is achieved with the help of a second EPROM, which carries a look-up table.
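
We don’t know [Ian Ward]’s exact data layout, but a small Python sketch shows the flavour of the trick: the frame EPROM stores each character cell as a compact code, and the second EPROM expands that code plus a row counter into the row bytes of an HD44780 custom character, so the 32K never has to hold full bitmaps. The quadrant encoding below is purely our illustration:

```python
# Hypothetical look-up table: a 4-bit "quadrant" code (top-left, top-right,
# bottom-left, bottom-right filled) expands to the 8 CGRAM row bytes of a
# 5x8 HD44780 custom character.

def cgram_rows(code):
    """Return the 8 CGRAM row bytes for a 4-bit quadrant code."""
    tl, tr = (code >> 3) & 1, (code >> 2) & 1
    bl, br = (code >> 1) & 1, code & 1
    rows = []
    for row in range(8):
        left, right = (tl, tr) if row < 4 else (bl, br)
        rows.append((0b11100 if left else 0) | (0b00011 if right else 0))
    return rows

# Build the LUT image: address = (code << 3) | row counter, data = row byte.
# The real table is surely larger, but the principle is the same.
lut = bytearray(16 * 8)
for code in range(16):
    for row, bits in enumerate(cgram_rows(code)):
        lut[(code << 3) | row] = bits

with open("lut.bin", "wb") as f:
    f.write(lut)
```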

It’s fair to say that the result, which can be seen in the video below the break, isn’t the most accomplished rendition of Bad Apple!! we’ve seen, but given the rudimentary hardware upon which it’s playing, we think that shouldn’t matter. Why didn’t we think of doing this in 1988!

Continue reading “It’s Bad Apple, But On A 32K EPROM”

Invisible 3D Printed Codes Make Objects Interactive

An interesting research project out of MIT shows that it’s possible to embed machine-readable labels into 3D printed objects using nothing more than an FDM printer and a filament that is transparent to IR. The method is called InfraredTags: by embedding something like a QR code or an ArUco marker into an object’s structure, the label can be detected by a camera and interactive possibilities open up.

One simple proof of concept is a wireless router with its SSID embedded into the side of the device, and the password embedded in a separate code on the bottom, ensuring that physical access is required to read it. Mundane objects can have metadata embedded into them, or can carry markers for augmented reality functionality, like tracking the object in 3D.

How are the codes actually embedded? The process is straightforward with the right tools. The team used a specialty filament from vendor 3dk.berlin that looks nearly opaque in the visible spectrum, but transmits roughly 45% of IR light. The machine-readable label gets embedded within the walls of a printed object, either by using a combination of IR PLA and air gaps to represent the geometry of the code, or by making a multi-material print using IR PLA and regular (non-IR-transmitting) PLA. Both provide enough contrast for an IR-sensitive camera to detect the label, although the multi-material version works a little better overall. Sadly, the average mobile phone camera isn’t sufficiently IR-sensitive to read these embedded tags passively, so the researchers used easily available cameras with no IR-blocking filter, like the Raspberry Pi NoIR.
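
For a first experiment on the reading side, OpenCV’s ArUco module (the OpenCV 4.7+ API from the opencv-contrib-python package) will do the job; the camera index, marker dictionary, and simple contrast stretch below are our assumptions rather than the paper’s exact pipeline:

```python
# A minimal sketch of reading an embedded ArUco marker with an IR-sensitive
# camera such as the Raspberry Pi NoIR. Requires opencv-contrib-python 4.7+.
import cv2

cap = cv2.VideoCapture(0)  # NoIR camera, ideally behind an IR-pass filter
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Stretch contrast: the tag is faint behind the IR PLA wall
    gray = cv2.equalizeHist(gray)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is not None:
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("InfraredTags", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```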

The PDF has deeper details of the implementation for those of you who want to know more, and you can see a demonstration of a few different applications in the video embedded below. Determining the provenance of 3D printed objects is a topic of some debate in the industry, and it’s not hard to see how technology like this could be used to covertly identify objects without compromising their appearance.

Continue reading “Invisible 3D Printed Codes Make Objects Interactive”

Commodore Promotional Film From 1984 Enhanced

Over on the Retro Recipes YouTube channel, [Perifractic] has been busy restoring an old promotional video showing how Commodore computers were made back in 1984 (video below the break). He cleaned up the old VHS-quality version that’s been around for years, translated the German to English, and trimmed some bits here and there. The result is a fascinating look into the MOS factory, Commodore’s German factory, and a few other facilities around the globe. The film shows the chip design engineers in action, wafer manufacturing, chip dicing, and some serious micro-probing of bare die. We also see PCB production and the final assembly, test, and burn-in of Commodore PETs and C64s in Germany.

Check out the video description, where [Perifractic] goes over the processes he used to clean up the video and audio using machine learning. If restoration interests you, have a look at the piece we wrote last year about using these techniques to restore old photographs. Are there any similar factory-tour films, restored or not, lurking around the web? Let us know in the comments below.

Continue reading “Commodore Promotional Film From 1984 Enhanced”

AI Camera Knows Its S**t

[Caleb] shares a problem with most dog owners. Dogs leave their… byproducts… all over your yard. Some people pick them up right away and some just leave them. But what if your dog has the run of the yard? How do you know where these piles are hiding? A security camera and AI image detection are the answer, but probably not in the way that you think.

You might think, as we did, that you could train the system to recognize the, um, piles. But instead, [Caleb] elected to have the AI perform animal pose estimation, detecting the dog’s posture while it produces the target. This is probably easier than recognizing a nondescript pile, and it also means it doesn’t matter if the pile is, say, covered with snow.
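
[Caleb]’s classifier is his own, but the idea is easy to sketch: given keypoints from an animal pose estimation model such as DeepLabCut, check for the characteristic hunched, tucked stance. The keypoint choices and thresholds below are entirely our guesses:

```python
# A hedged sketch of posture detection from pose keypoints. Coordinates
# would come from an animal pose model; the keypoint names, 35-degree
# threshold, and 30-pixel window are hypothetical.
import math

def back_angle(shoulders, tail_base):
    """Spine angle from horizontal, in degrees (image y grows downward)."""
    dx = tail_base[0] - shoulders[0]
    dy = tail_base[1] - shoulders[1]
    return math.degrees(math.atan2(dy, dx))

def looks_suspicious(shoulders, tail_base, rear_paw, threshold_deg=35):
    """Hunched spine plus a rear paw tucked up under the tail base."""
    hunched = back_angle(shoulders, tail_base) > threshold_deg
    tucked = abs(rear_paw[0] - tail_base[0]) < 30  # pixels
    return hunched and tucked

# Hypothetical keypoints (x, y) from one video frame:
print(looks_suspicious((200, 180), (300, 270), (310, 330)))  # True
```

Require the posture to persist for a second or two before logging the dog’s position in the frame, and you have a map of spots to avoid.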

Continue reading “AI Camera Knows Its S**t”

OpenCV Knows Where Your Hand Is

We have to say, [Murtaza]’s example game in his latest video isn’t very exciting. However, the OpenCV technique he uses to track a hand and determine its distance from a single camera is pretty interesting. The demo shows a random button on the screen, and you have to use your hand to press the button, which then moves so you can try again. The hand measurement seems accurate to a few centimeters, which is good enough for many applications.

The Python code is actually quite straightforward. Essentially, the software tracks your hand and, by estimating its apparent size in the image, determines how far away it is. Of course, your hand might also rotate, and [Murtaza] works through all the cases step by step. If we wanted to know a distance, we’d probably turn to ultrasonics or a time-of-flight sensor. The problem is, those sensors can’t tell your hand from anything else that happens to be in front of them. The use of a single camera to track and locate is pretty impressive.
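
In the video the mapping from pixel size to distance is calibrated against real measurements; here’s a minimal sketch of the same idea using MediaPipe Hands (pip install mediapipe) and the simpler pinhole model, with the knuckle width and focal length below as assumed calibration constants:

```python
# A minimal sketch of the size-to-distance trick with MediaPipe Hands.
# KNUCKLE_WIDTH_CM and FOCAL_PX are assumed calibration constants,
# not values from the video.
import cv2
import mediapipe as mp

KNUCKLE_WIDTH_CM = 8.0  # real-world index-to-pinky knuckle spacing
FOCAL_PX = 700.0        # calibrate: pixel_width * known_distance / KNUCKLE_WIDTH_CM

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        h, w = frame.shape[:2]
        # Landmarks 5 and 17 are the index and pinky knuckles; their pixel
        # separation shrinks predictably as the hand moves away
        x1, y1 = lm[5].x * w, lm[5].y * h
        x2, y2 = lm[17].x * w, lm[17].y * h
        pixel_width = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        distance_cm = KNUCKLE_WIDTH_CM * FOCAL_PX / max(pixel_width, 1.0)
        cv2.putText(frame, f"{distance_cm:.0f} cm", (30, 60),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.5, (0, 255, 0), 3)
    cv2.imshow("hand distance", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The pinhole model breaks down as the hand rotates and its apparent width shrinks, which is exactly the case [Murtaza] works through in the video.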

If you haven’t used OpenCV before, the channel has a lot of tutorials, and they are all worth watching. Computer vision is a powerful technique that can stand in for other sensors in some applications: GPS, for example. Or try this creepier tracking application next Halloween.

Continue reading “OpenCV Knows Where Your Hand Is”