Listening To Long Forgotten Voices: An Optical Audio Decoder For 16 Mm Film

Like many of us, [Emily] found herself on COVID-19 lockdown over the summer. To make the most of her time in isolation, she put together an optical audio decoder for old 16 mm film, built using modern components and a bit of 3D printing.

It all started with a broken 16 mm projector that [Emily] got from a friend. After repairing and testing the projector with a roll of film bought at a flea market, she discovered that the film contained an audio track that her projector couldn’t play. The audio track is encoded as a translucent strip of varying width; when a mask with a narrow slit is placed over it, the strip modulates the amount of light that passes through to a light sensor, which drives speakers via an amplifier.
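As a rough sketch of how a variable-area soundtrack works: audio amplitude maps to the width of the translucent strip, and the light passing the slit maps back to amplitude. This is a simplified model of the encoding described above, not the project's actual signal chain; real tracks also contend with film speed, noise, and sensor nonlinearity:

```python
import math

TRACK_WIDTH = 100.0  # maximum translucent width, arbitrary units

def encode(samples):
    """Map audio samples in [-1, 1] to translucent strip widths."""
    return [(s + 1.0) / 2.0 * TRACK_WIDTH for s in samples]

def decode(widths):
    """Light through the slit is proportional to width; map back to audio."""
    return [2.0 * w / TRACK_WIDTH - 1.0 for w in widths]

# A short 440 Hz tone sampled at 8 kHz survives the round trip
tone = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(32)]
recovered = decode(encode(tone))
```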

[Emily] used a pair of razor blades mounted to a 3D printed bracket to create the mask, and a TI OPT101 light sensor together with a light source to decode the optical signal. She tried a photoresistor and a discrete photodiode first, but neither had the required sensitivity. She built a frame with adjustable positions for an idler pulley and the optical reader unit, an electronics box on one end, and another pulley attached to a stepper motor to cycle a short loop of the film.

Most of the projects we see involving film these days are for creating digital copies. You can digitize your old 35 mm photo film using a Raspberry Pi, some Lego pieces, and a DSLR camera, or do the same for 8 mm film with a 3D printed rig. Continue reading “Listening To Long Forgotten Voices: An Optical Audio Decoder For 16 Mm Film”

A Gesture Recognizing Armband

Gesture recognition usually involves some sort of optical system watching your hands, but researchers at UC Berkeley took a different approach. Instead they are monitoring the electrical signals in the forearm that control the muscles, and creating a machine learning model to recognize hand gestures.

The sensor system is a flexible PET armband with 64 electrodes screen printed onto it in silver conductive ink, attached to a standalone AI processing module. Since everyone’s arm is slightly different, the system needs to be trained for each user, but this also means the specific electrical signals don’t have to be isolated, since the model simply learns to recognize that user’s patterns.

The challenging part is that the patterns don’t remain constant over time, changing with factors such as sweat, arm position, and even natural biological changes. To deal with this, the model can update itself on the device as the signals change. Another part of this research that we appreciate is that all the inferencing, training, and updating happens locally on the AI chip in the armband. There is no need to send data to an external device or the “cloud” for processing, updating, or third-party data mining. Unfortunately, the research paper with all the details is behind a paywall.
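The adaptation idea can be illustrated with a toy nearest-centroid classifier over per-electrode RMS features. This is purely a sketch of the concept, not the researchers' actual model, which runs on a dedicated AI chip:

```python
def rms(window):
    """Root-mean-square of one electrode's signal window, a common EMG feature."""
    return (sum(x * x for x in window) / len(window)) ** 0.5

class GestureModel:
    """Toy nearest-centroid classifier with online centroid updates."""

    def __init__(self):
        self.centroids = {}  # gesture name -> feature vector

    def train(self, gesture, features):
        self.centroids[gesture] = list(features)

    def predict(self, features):
        # Pick the gesture whose stored centroid is closest (squared distance)
        return min(
            self.centroids,
            key=lambda g: sum((a - b) ** 2 for a, b in zip(self.centroids[g], features)),
        )

    def adapt(self, gesture, features, alpha=0.1):
        """Nudge a centroid toward fresh data as the signal drifts over time."""
        old = self.centroids[gesture]
        self.centroids[gesture] = [(1 - alpha) * a + alpha * b for a, b in zip(old, features)]
```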

Continue reading “A Gesture Recognizing Armband”

Transforming Drone Can Be A Square Or A Dragon

When flying drones in and around structures, the size of the drone is generally limited by the openings you want to fit through. Researchers at the University of Tokyo got around this problem by using an articulating structure for the drone frame, allowing the drone to transform from a large square to a narrow, elongated form to fit through smaller gaps.

The drone is called DRAGON, which is somehow an acronym for the tongue-twisting description “Dual-Rotor Embedded Multilink Robot with the Ability of Multi-Degree-of-Freedom Aerial Transformation”. The drone consists of four segments, with a 2-DOF actuated joint between each segment. A pair of ducted fan motors is attached to the middle of each segment on a 2-DOF gimbal that allows it to direct thrust in any direction relative to the segment. For normal flight the segments are arranged in a square shape, with minimal movement between them. When a small gap is encountered, as demonstrated in the video after the break, the segments rearrange into a dragon-like shape that can pass through a gap in any plane.
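A planar sketch of the idea: treating the four links as a 2D chain with relative joint angles, three 90° bends close the square outline, while zero angles give the straight form that fits through a narrow gap. This ignores the second joint axis and all of the thrust-allocation control, and the unit link length is just a normalization:

```python
import cmath
import math

SEG_LEN = 1.0  # normalized link length

def chain_endpoints(joint_angles):
    """Endpoints of a planar serial chain; each angle is relative to the previous link."""
    pos, heading = complex(0, 0), 0.0
    points = [pos]
    for angle in [0.0] + list(joint_angles):  # first link sets the reference heading
        heading += angle
        pos += SEG_LEN * cmath.exp(1j * heading)
        points.append(pos)
    return points

square = chain_endpoints([math.pi / 2] * 3)  # three 90-degree bends close the square
line = chain_endpoints([0.0, 0.0, 0.0])      # straight configuration for narrow gaps
```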

Each segment has its own power source and controller, and the control software required to make everything work together is rather complex. The full research paper is unfortunately behind a paywall. The small diameter of the propellers and all the added components would be severe limiting factors in terms of lifting capacity and flight time, but the concept is definitely interesting.

The idea of shape-shifting robots has been around for a while, and it becomes even more interesting when the segments can detach and reattach themselves to become modular robots. The 2016 Hackaday Grand Prize winner DTTO is a perfect example of this, although it did lack the ability to fly. Continue reading “Transforming Drone Can Be A Square Or A Dragon”

Robotic Melodica Student Is Enthusiastic But Terrible

Anyone who has been through the process of learning to play a musical instrument for the first time, or listened to someone attempting to do so, will know that it can be a rather painful and frustrating experience. [Alessandro Perini] apparently couldn’t get enough of the sound of a first-time musician, so he created a robot to play the melodica badly for hours on end, as demonstrated in the video after the break.

The project is appropriately named “AI’ve just started to learn to play”, and attempts to copy every melody it hears in real time. The robot consists of the cartridge carriage from an old printer, mounted on a wooden frame that holds the melodica. The original carriage used a DC motor with an encoder for accurate movement, but since positional accuracy was not desired here, [Alessandro] ditched the encoder. Two small trolley wheels are mounted on the cartridge holder to push down on the melodica’s keys. A bistable solenoid valve controls airflow to the melodica from an air compressor. The DC motor and solenoid valve are controlled by an Arduino via a pair of L298 motor drivers.

A host computer running software written in Cycling ’74 Max listens to the melody it’s trying to imitate, and sends serial commands to the Arduino to move the carriage and open the solenoid to try and match the notes. Of course, it keeps hitting a series of wrong notes in the process. The Arduino code and build instructions have been published, but the main Max software is only described briefly. [Alessandro] demonstrated the robot at a local festival, where it played YouTube tutorial snippets and jammed with a local band for a full 24 hours. You have to respect that level of endurance.
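A pitch-to-carriage mapping might look something like the sketch below. The command format, lowest key, and equal-tempered quantization are all placeholders, since the actual Max patch is only briefly described:

```python
import math

A4_HZ = 440.0
LOWEST_MIDI = 65  # assume F4 is the leftmost melodica key (placeholder)

def freq_to_key(freq_hz):
    """Quantize a detected frequency to the nearest key index from the left."""
    midi = round(69 + 12 * math.log2(freq_hz / A4_HZ))
    return midi - LOWEST_MIDI

def serial_command(freq_hz):
    """Build a hypothetical 'move carriage, open valve' command string."""
    return f"M{freq_to_key(freq_hz)};V1"
```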

If listening to less error-prone electronically controlled instruments is more to your taste, listen to this building-sized pipe organ play MIDI files.

Continue reading “Robotic Melodica Student Is Enthusiastic But Terrible”

Augmented Reality On The Cheap With ESP32

Augmented reality (AR) technology hasn’t enjoyed the same amount of attention as VR, and seriously lags in terms of open source development and accessibility. Frustrated by this, [Arnaud Atchimon] created CheApR, an open source, low cost AR headset that anyone can build at home and use as a platform for further development.

[Arnaud] was impressed by the Tilt Five AR goggles, but the price of this cutting edge hardware simply put it out of reach of most people. Instead, he designed and built his own around a 3D printed frame, an ESP32, cheap LCDs, and lenses from a pair of sunglasses. The electronics are packed horizontally into the top of the frame, with the displays pointed down into a pair of angled mirrors, which reflect the image onto the sunglasses lenses and into the user’s eyes. [Arnaud] tested a number of different lenses and found that a thin lens with a slight curve worked best. The ESP32 doesn’t actually run the main software; it just handles displaying the images on the LCDs. The images are sent from a computer running software written in Processing. Besides just displaying images, the software can also integrate inputs from an MPU6050 IMU and an ESP32 camera module mounted on the goggles. This allows the images to shift perspective as the goggles move, and to recognize faces and AR markers in the environment.
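The IMU-driven perspective shift can be approximated with a pinhole model: shift the rendered overlay opposite to head rotation so virtual content appears anchored in the world. The focal length here is an assumed placeholder, not a value from the project:

```python
import math

FOCAL_PX = 300.0  # assumed display focal length in pixels (placeholder)

def overlay_offset(yaw_deg, pitch_deg):
    """Pixel shift that counteracts head rotation (small-angle pinhole model)."""
    dx = -FOCAL_PX * math.tan(math.radians(yaw_deg))   # turn right -> content slides left
    dy = FOCAL_PX * math.tan(math.radians(pitch_deg))  # look up -> content slides down
    return dx, dy
```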

All the design files and software are available on GitHub, and we’re excited to see where this project goes. We’ve seen another pair of affordable augmented reality glasses that uses a smartphone as a display, but it seems the headset it relied on is no longer available.

Magnetic Motorized Plasma Cutter Track

Affordable plasma cutters are becoming a popular step up from an angle grinder for cutting sheet metal in the home workshop, but cutting long straight lines can be laborious and less than accurate. [Workshop From Scratch] was faced with this problem, so he built a motorized magnetic track for his plasma cutter.

Thanks to a pair of repurposed electromagnetic door locks and an adjustable base width, the track can be mounted on any piece of magnetic steel. The track itself consists of a pair of linear rods, with the torch mounts sliding along on linear bearings. A lead screw sits between the two linear rods and is powered by an old cordless drill with the handle cut off. Its trigger switch was replaced by a speed controller and a two-way switch for direction control, and a power supply took the place of the battery. The mounting bracket for the plasma torch is adjustable, allowing the edge of the steel to be cut at an angle if required.
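Cut speed falls directly out of the screw geometry: the carriage advances one screw lead per revolution, so feed rate is drill speed times lead. The 2 mm lead below is an assumption for illustration, not a measured value from the build:

```python
def feed_rate(rpm, lead_mm=2.0):
    """Carriage feed rate in mm/min for a lead screw turning at `rpm`.

    lead_mm is the screw's linear advance per revolution (assumed 2 mm).
    """
    return rpm * lead_mm
```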

While limit switches at the ends of the track might be a preferable way to keep the sliding base from hitting the ends of the rails, the clutch in the electric drill should be good enough to prevent damage if the operator is distracted.

[Workshop From Scratch] is really living up to the name of his YouTube channel, having built many of the other tools used in the video himself. Just a few examples are the XY-table, hydraulic adjustable workbench, and hydraulic shop crane. Continue reading “Magnetic Motorized Plasma Cutter Track”

A Mechanical Edge-Avoiding Robot

In the age of cheap sensors and microcontrollers, it’s easy to forget that there might be a very simple mechanical solution to a problem. [gzumwalt]’s 3D printed mechanical edge-avoiding robot is a beautifully elegant example of this.

The only electric components on this robot are a small geared DC motor and a LiPo battery. The motor drives a shaft fixed to a wheel on one side, while the opposite wheel is free-spinning. A third wheel is mounted perpendicular to the other two in the center of the robot, and is driven from the shaft by a bevel gear. The third wheel is lifted off the surface by a pair of conical wheels on a pivoting axle. When one of these conical wheels goes over the edge of whatever surface the robot is driving on, the front lowers and brings the third wheel into contact with the surface, spinning the robot around until both front wheels are back on the surface.

Mechanical alternatives for electronic systems are easily overlooked, but are often more reliable and rugged in hostile environments. NASA is looking at sending a rover to Venus, but with surface temperatures in excess of 450 °C and atmospheric pressure 92 times that of Earth, conventional electronics won’t survive. Earlier in the year NASA ran a design competition for a completely mechanical obstacle detection system for use on Venus.

[gzumwalt] is a very prolific designer of ingenious 3D printed mechanical devices. This mechanism could also be integrated into his walking fridge rover to explore the front of your fridge without falling off. Continue reading “A Mechanical Edge-Avoiding Robot”