TMD-2: A Bigger, Better, More Collaborative Turing Machine

One of the things we love best about the articles we publish on Hackaday is the dynamic that can develop between the hacker and the readers. At its best, the comment section of an article can be a model of collaborative effort, with readers’ ideas and suggestions making their way into version 2.0 of a build.

This collegial dynamic is very much on display with TMD-2, [Michael Gardi]’s latest iteration of his Turing machine demonstrator. We covered the original TMD-1 back in late summer; the idea was to serve as a physical embodiment of the Turing machine concept. Briefly, TMD-1 represented the key “tape and head” concepts of the Turing machine with a console of servo-controlled flip tiles, all of it driven by a three-state, three-symbol finite state machine.
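For readers who would like to see the tape-and-head idea in running form, here’s a minimal sketch in Python. The two-state program below is the classic “busy beaver” toy, chosen purely for brevity; it is not [Mike]’s actual tile program, and TMD-1’s real machine offers three states and three symbols.

```python
# A minimal "tape and head" simulator in the spirit of TMD-1. The two-state,
# two-symbol program here is the classic busy beaver, used only as a toy
# example; it is not [Mike]'s tile program.
from collections import defaultdict

# (state, symbol) -> (symbol to write, head move, next state); 'H' halts
RULES = {
    ('A', '0'): ('1', +1, 'B'),
    ('A', '1'): ('1', -1, 'B'),
    ('B', '0'): ('1', -1, 'A'),
    ('B', '1'): ('1', +1, 'H'),
}

def run(tape_str='0', state='A', head=0, max_steps=1000):
    # The tape is unbounded in both directions; unwritten cells read as '0'.
    tape = defaultdict(lambda: '0', enumerate(tape_str))
    for _ in range(max_steps):
        if state == 'H':
            break
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return ''.join(tape[i] for i in range(min(tape), max(tape) + 1))

print(run())  # prints '1111' (the machine leaves four marks, then halts)
```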

TMD-1

TMD-1 was capable of running simple programs that nicely demonstrated the principles of Turing machines, and it really seemed to catch on with readers. Based on the comments of one reader, [Newspaperman5], [Mike] started thinking bigger and better for TMD-2. He expanded the finite state machine to six states and six symbols, which meant coming up with something more scalable than the Hall-effect sensors and magnetic tiles of TMD-1.

TMD-2 has a camera for computer vision of the state machine tiles

[Mike] opted for optical character recognition using a Raspberry Pi cam along with OpenCV and the Tesseract OCR engine. The original servo-driven tape didn’t scale well either, so it was replaced by a virtual tape displayed on a 7″ LCD. The best part of the original, the tile-based FSM, was expanded while keeping the tactile programming experience intact.
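For the curious, the recognition side of a setup like this can be surprisingly compact. Here’s a rough sketch of reading a single tile with OpenCV and Tesseract (via pytesseract); the tile coordinates, thresholding, and symbol whitelist are our placeholders, not values from [Mike]’s code.

```python
# Rough sketch of tile recognition with OpenCV and Tesseract (via pytesseract).
# Region coordinates, threshold settings, and the whitelist are placeholders,
# not [Mike]'s actual values.
import cv2
import pytesseract

frame = cv2.imread('state_machine_panel.jpg')   # or grab a frame from the Pi cam
tile = frame[100:180, 220:300]                  # hypothetical location of one tile
gray = cv2.cvtColor(tile, cv2.COLOR_BGR2GRAY)
_, thresh = cv2.threshold(gray, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Treat the tile as a single character and restrict Tesseract to the symbol set.
config = '--psm 10 -c tessedit_char_whitelist=0123456789ABCDEFLRH'
symbol = pytesseract.image_to_string(thresh, config=config).strip()
print('Tile reads:', symbol)
```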

Hats off to [Mike] for tackling a project with so many technologies that were previously new to him, and for pulling off another great build. And kudos to [Newspaperman5] for the great suggestions that spurred him on.

“Hey, You Left The Peanut Out Of My Peanut M&Ms!”

Candy-sorting robots are in plentiful supply on these pages, and with good reason — they’re a great test of the complete suite of hacker tools, from electronics to machine vision to mechatronics. So we see lots of sorters for Skittles, jelly beans, and occasionally even Reese’s Pieces, but it always seems that the M&M sorters are the most popular.

This M&M sorter has a twist, though — it finds the elusive and coveted peanutless candies lurking in most bags of Peanut M&Ms. To be honest, we’d never run into this manufacturing defect before; being chiefly devoted to the plain old original M&Ms, perhaps our sample size has just been too small. Regardless, [Harrison McIntyre] knows they’re there and wants them all to himself, hence his impressive build.

To detect the squib confections, he built a tiny 3D scanner from a line laser, a turntable, and a Raspberry Pi camera. After the scan yields the candy’s volume, a servo sweeps it onto a scale, allowing the density to be calculated. Peanut-free candies will be somewhat denser than their leguminous counterparts, allowing another servo to move the candy to the proper exit chute. The video below shows you all the details, and more than you ever wanted to know about the population statistics of Peanut M&Ms.
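The sorting decision itself comes down to a little arithmetic once the scanner and scale have done their jobs. A sketch of that check might look like the following, with a density threshold that is purely illustrative rather than [Harrison]’s calibrated value.

```python
# Sketch of the density check: scanned volume plus measured mass -> sort decision.
# The threshold below is invented for illustration, not a calibrated value.

DENSITY_THRESHOLD = 1.25   # g/cm^3, hypothetical cutoff between peanut and no-peanut

def classify(mass_g: float, volume_cm3: float) -> str:
    """Peanut-free candies pack more chocolate per unit volume, so they're denser."""
    density = mass_g / volume_cm3
    return 'peanutless' if density > DENSITY_THRESHOLD else 'peanut'

print(classify(mass_g=2.4, volume_cm3=1.7))   # example reading -> 'peanutless'
```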

We think this is pretty slick, and a nice departure from the sorters that primarily rely on color to sort candies. Of course, we still love those too — take your pick of quick and easy, compact and sleek, or a model of industrial design.

Old Polaroid Gets A Pi And A Printer

There’s nothing like a little diversion project to clear the cobwebs — something to carry you through the summer doldrums and charge you up for the rest of the hacking year. At least that’s what we think was up with [Sam Zeloof]’s printing Polaroid retro-conversion project.

Normally occupied with the business of learning how to make semiconductors in his garage, or more recently working on his undergraduate degree in electrical engineering, [Sam], like many of us, found himself with time to spare this summer. In search of a simple, fun project that wouldn’t glaze over the eyes of people when he showed it off, he settled on a printing party camera. The guts are pretty standard fare: a Raspberry Pi and Pi cam, coupled with a thermal receipt printer for instant hardcopy. The donor camera was a Polaroid Pronto from eBay, in good shape on the outside and mostly complete on the inside. A Dremel took care of the latter, freeing up space occupied by all the plastic bits that held the film cartridge and running gear of the film handling system.

The surgery made enough room to squeeze in the Pi Zero and a LiPo battery pack, along with a buck converter. Adding in the receipt printer and its drive board and mounting the Pi cam presented some challenges, but everything fit without breaking the original look and feel of the Polaroid. The camera now produces low-res hardcopy instantly using a dithering algorithm, and stores high-resolution images on an SD card for later download. As a bonus, [Sam] included a simulated time and date stamp in the lower corner of the saved images, like those that used to show up on film.
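If you’re tempted to build something similar, the hardcopy path is only a few lines with Pillow. This sketch shows one way to dither a capture for a receipt printer and stamp the saved copy with a film-style date; the printer width, font path, and file names are our assumptions, not [Sam]’s actual settings.

```python
# Sketch of the hardcopy path: dither a capture for the thermal printer and
# stamp the saved high-res copy with a retro date. Printer width, font path,
# and file names are stand-ins, not [Sam]'s settings.
from datetime import datetime
from PIL import Image, ImageDraw, ImageFont

PRINTER_WIDTH = 384   # dots per line on many 58 mm receipt printers (assumption)

img = Image.open('capture.jpg')

# Low-res hardcopy: scale to printer width, then dither down to 1-bit.
w, h = img.size
hardcopy = img.resize((PRINTER_WIDTH, h * PRINTER_WIDTH // w))
hardcopy = hardcopy.convert('1')      # Pillow applies Floyd-Steinberg dithering here
hardcopy.save('hardcopy.png')         # hand this off to the printer driver

# High-res keeper: simulated film-style date stamp in the lower corner.
stamp = datetime.now().strftime("%m %d '%y")
draw = ImageDraw.Draw(img)
font = ImageFont.truetype('/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf', 48)
draw.text((img.width - 320, img.height - 80), stamp, fill='orange', font=font)
img.save('keeper.jpg')
```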

[Sam]’s camera looks like a ton of fun. We’ve seen other Polaroid conversions, including a stunning SX-70 digital upgrade, but this one shines for its simplicity and instant hardcopy.

[via Tom’s Hardware]

Pi Cam Replaces Pinhole And Film For Digital Solargraphy

Solargraph from a one-year exposure on film. Elekes Andor / CC BY-SA

Have you ever heard of solargraphy? The name tells you much of what you need to know, but the images created with a homemade pinhole camera and a piece of photographic film can be visually arresting, showing as they do the cumulative tracks of the sun’s daily journey across the sky over many months. But what if you don’t want to use film? Is solargraphy out of reach for the digital photographers of the world?

Not at all, thanks to this digital solargraphy setup. [volzo] searched for a way to make a digital camera perform like a film-based solargraphic camera, first thinking to take a series of images during the day and average them together. He found that this just averaged out the sun from the final image. His solution was to take a pair of photos at each timepoint — one correctly exposed to capture the scene, and one stopped way down to just capture the position of the sun as a pinprick of light. All the foreground images are averaged, while the stopped-down sun images are overlaid upon each other, producing the track of the sun across the sky. Add the two resulting images and you’ve got a solargraph.
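In code, that boils down to an average and an overlay. Here’s a minimal sketch with NumPy and OpenCV, using a per-pixel maximum as one way to “overlay” the stopped-down frames; the file naming is ours, not [volzo]’s.

```python
# Minimal sketch of the stacking step: average the normally exposed frames,
# take a per-pixel maximum of the stopped-down frames to keep the sun trail,
# then add the two. File naming is illustrative, not [volzo]'s layout.
import glob
import cv2
import numpy as np

scene_files = sorted(glob.glob('scene_*.jpg'))   # correctly exposed shots
sun_files   = sorted(glob.glob('sun_*.jpg'))     # heavily stopped-down shots

scene_avg = np.mean([cv2.imread(f).astype(np.float32) for f in scene_files], axis=0)
sun_trail = np.max([cv2.imread(f).astype(np.float32) for f in sun_files], axis=0)

solargraph = np.clip(scene_avg + sun_trail, 0, 255).astype(np.uint8)
cv2.imwrite('solargraph.jpg', solargraph)
```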

To automate the process, [volzo] used a Raspberry Pi and a Pi cam fitted in a weatherproof 3D-printed box. A custom HAT powers up the Pi every few minutes; it boots and takes the two pictures. Sadly, the batteries only last for a couple of days, so those long six-month exposures aren’t possible yet. But [volzo] has made all the sources available, so feel free to build on his work. If you prefer to use a DSLR for the job, this Bluetooth intervalometer might help.

Machine Vision Keeps Track Of Grubby Hands

Can you remember everything you’ve touched in a given day? If you’re being honest, the answer is, “Probably not.” We humans are a tactile species, with an outsized proportion of both our motor and sensory nerves sent directly to our hands. We interact with the world through our hands, and unfortunately that may mean inadvertently spreading disease.

[Nick Bild] has a potential solution: a machine-vision system called Deep Clean, which monitors a scene and records anything in it that has been touched. [Nick]’s system uses a Jetson Xavier and a stereo camera to detect depth in a scene; he built his camera from a pair of Raspberry Pi cams and a Pi 3B+, but other depth cameras like a Kinect could probably do the job. The idea is to watch the scene for human hands — OpenPose is the tool he chose for that job — and correlate their depth in the scene with the depth of objects. Touch a doorknob or a light switch, and a marker is left on the scene. A cleaning crew could then look at the scene to determine which areas need extra attention. We can think of plenty of applications that extend beyond the current crisis, as the ability to map areas that have been touched seems to be generally useful.
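The bookkeeping at the heart of that idea is fairly simple once the hard parts are done. Here’s a sketch of the touch-marking step, assuming a depth map of the static scene and fingertip keypoints are already in hand; the tolerance value is a guess, and none of this is [Nick]’s actual code.

```python
# Sketch of the touch-marking bookkeeping, assuming a depth map of the static
# scene and per-frame fingertip keypoints are already available (the heavy
# lifting done with OpenPose and the stereo pair is not shown here).
import numpy as np
import cv2

TOUCH_TOLERANCE_MM = 30   # hypothetical "close enough to be touching" distance

def mark_touches(scene_depth_mm, fingertips_px, fingertip_depth_mm, touch_mask):
    """Flag scene pixels where a fingertip's depth matches the scene's depth."""
    for (x, y), d in zip(fingertips_px, fingertip_depth_mm):
        if abs(scene_depth_mm[y, x] - d) < TOUCH_TOLERANCE_MM:
            cv2.circle(touch_mask, (x, y), radius=15, color=255, thickness=-1)
    return touch_mask

# touch_mask accumulates over the day; blend it onto the scene image so the
# cleaning crew can see at a glance which spots were handled.
```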

[Nick] has been getting some mileage out of that Xavier lately — he’s used it to build an AI umpire and shades that help you find lost stuff. Who knows what else he’ll find to do with them during this time of confinement?

3D-Printed Film Scanner Brings Family Memories Back To Life

There is a treasure trove of history locked away in closets and attics, where old shoeboxes hold reels of movie film shot by amateur cinematographers. They captured children’s first steps, family vacations, and parties where [Uncle Bill] was getting up to his usual antics. Little of what was captured on thousands of miles of 8-mm and Super 8 film is consequential, but giving a family the means to see long lost loved ones again can be a powerful thing indeed.

That was the goal of [Anton Gutscher]’s automated 8-mm film scanner. Yes, commercial services exist that will digitize movies, slides, and snapshots, but where’s the challenge in that? And a challenge is what it ended up being. Aside from designing and printing something like 27 custom parts, [Anton] also had a custom PCB fabricated for the control electronics. Film handling is done with a stepper motor that moves one frame into the scanner at a time for scanning and cropping. An LCD display allows the archivist to move the cropping window around manually, and individual images are strung together with ffmpeg running on the embedded Raspberry Pi. There’s a brief clip of film from a 1976 trip to Singapore in the video below; we find the quality of the digitized film remarkably good.
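Stringing the cropped frames back into a movie is a short job for ffmpeg on the Pi. Something along these lines would do it, though the frame rate, filename pattern, and codec options here are our guesses rather than [Anton]’s exact command.

```python
# Sketch of the final assembly step: hand the cropped frames to ffmpeg.
# Frame rate, filename pattern, and codec flags are guesses, not [Anton]'s
# exact command (Super 8 is nominally 18 fps; regular 8 mm is typically 16 fps).
import subprocess

subprocess.run([
    'ffmpeg',
    '-framerate', '18',                 # assumed Super 8 projection speed
    '-i', 'frames/frame_%05d.png',      # assumed output naming from the scanner
    '-c:v', 'libx264',
    '-pix_fmt', 'yuv420p',              # keeps the file playable almost anywhere
    'digitized_reel.mp4',
], check=True)
```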

Hats off to [Anton] for stepping up as the family historian with this build. We’ve seen ad hoc 8-mm digitizers before, but few this polished looking. We’ve also featured other archival attempts before, like this high-speed slide scanner.

Raspberry Pi Tracks Humans, Blasts Them With Heat Rays

Given how long humans have been warming themselves up, you’d think we would have worked out all the kinks by now. But even with central heating, and indeed sometimes because of it, some places we frequent just aren’t that cozy. In such cases, it often pays to heat the person, not the room, but that can be awkward, to say the least.

Hacking polymath [Matthias Wandel] worked out a solution to his cold shop with this target-tracking infrared heater. The heater is one of those radiant deals with the parabolic dish, and as anyone who’s walked past one on demo in Costco knows, they throw a lot of heat in a very narrow beam. [Matthias] used software he whipped up for a previous offline surveillance project as the core of this build. Running on a Raspberry Pi with a camera, the custom software analyzes images and locates motion across the width of a frame. That drives a stepper that swivels a platform for the heater. The video below shows the build and the successful tests; however, fans of [Matthias] should prepare themselves for a shock as he very nearly purchases a lazy susan to serve as the base for the heater rather than building one.
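The control loop amounts to “find the motion, point the dish at it.” Here’s a rough sketch of that idea with OpenCV; the camera index, step scaling, and the move_stepper_to() helper are hypothetical stand-ins, not [Matthias]’s actual surveillance-derived code.

```python
# Rough sketch of the "find the motion, aim the heater" loop: frame differencing
# locates the horizontal position of movement, which maps to a stepper target.
# Camera index, step scaling, and move_stepper_to() are hypothetical stand-ins.
import cv2

STEPS_PER_FRAME_WIDTH = 400   # assumed steps to sweep the camera's field of view

def move_stepper_to(target_step):
    # Hypothetical stand-in for the driver that swivels the heater platform.
    print('aiming heater at step', target_step)

def motion_centroid_x(prev_gray, gray):
    """Return the x coordinate of detected motion, or None if the scene is still."""
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    return None if m['m00'] == 0 else m['m10'] / m['m00']

cap = cv2.VideoCapture(0)
_, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cx = motion_centroid_x(prev, gray)
    if cx is not None:
        move_stepper_to(int(cx / gray.shape[1] * STEPS_PER_FRAME_WIDTH))
    prev = gray
```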

We’re never disappointed by [Matthias]’ videos, and we’re always impressed by his range as a hacker. From DIY power tools to wooden logic circuits to his recent Lego chocolate engraver, he always finds ways to make things interesting.
