Playing Rock, Paper, Scissors With A Time Of Flight Sensor

You can do all kinds of wonderful things with cameras and image recognition. However, sometimes spatial data is useful, too. As [madmcu] demonstrates, you can use depth data from a time-of-flight sensor for gesture recognition, as seen in this rock-paper-scissors demo.

If you’re unfamiliar with time-of-flight sensors, they’re easy enough to understand. They measure distance by determining the time it takes photons to travel from one place to another. For example, by shooting out light from the sensor and measuring how long it takes to bounce back, the sensor can determine how far away an object is. Take an array of time-of-flight measurements, and you can get simple spatial data for further analysis.
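If you want a feel for the math, it is just a light pulse's round-trip time divided by two. Here is a quick back-of-the-envelope sketch; the numbers are purely illustrative:

```python
# Minimal sketch of the time-of-flight principle: one-way distance from a
# photon's round-trip time. Values here are illustrative only.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(t_seconds: float) -> float:
    """Return the one-way distance for a measured round-trip time."""
    return SPEED_OF_LIGHT * t_seconds / 2

# A return pulse arriving after ~6.7 nanoseconds corresponds to
# roughly one meter of one-way distance.
print(distance_from_round_trip(6.67e-9))  # ~1.0 m
```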

The build uses an Arduino Uno R4 Minima, paired with a demo board for the VL53L5CX time-of-flight sensor. The software is developed using NanoEdge AI Studio. In a basic sense, the system uses a machine learning model to classify data captured by the time-of-flight sensor into gestures matching rock, paper, or scissors—or nothing, if no hand is present. If you don’t find [madmcu]’s tutorial enough, you can take a look at the original version from STMicroelectronics, too.
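We should note that NanoEdge AI Studio generates its own classification library, so the snippet below is not the project's actual code. It is just a rough sketch of the idea: flatten each 8×8 depth frame from the VL53L5CX and match it against per-class reference patterns built from training captures. The nearest-centroid matching is our assumption, not the tool's method:

```python
# Hedged sketch of the classification idea, not the NanoEdge AI Studio
# library the project actually uses: each 8x8 depth frame is flattened
# to 64 values and matched against per-class reference "centroids"
# averaged from training captures.
import numpy as np

# The four outcomes the article describes.
CLASSES = ["rock", "paper", "scissors", "nothing"]

def train_centroids(frames_by_class):
    """frames_by_class maps a label to an (N, 64) array of flattened training frames."""
    return {label: frames.mean(axis=0) for label, frames in frames_by_class.items()}

def classify(frame, centroids):
    """Pick the class whose average depth pattern is closest to this 8x8 frame."""
    flat = frame.reshape(-1).astype(float)
    return min(centroids, key=lambda label: np.linalg.norm(flat - centroids[label]))
```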

It takes some training, and it only works in the right lighting conditions, but this is a functional system that can recognize real hand signs and play the game. We’ve seen similar techniques help more advanced robots cheat at this game before, too! What a time to be alive.

Machine Learning Baby Monitor Prevents The Hunger Games

Newborn babies can be tricky to figure out, especially for first-time parents. Despite the abundance of unsolicited advice proffered by anyone who ever had a baby before — and many who haven’t — most new parents quickly get in sync with the baby’s often ambiguous signals. But [Caleb] took his observations of his newborn a step further and built a machine-learning hungry baby early warning system that’s pretty slick.

Normally, babies are pretty unsubtle about being hungry, and loudly announce their needs to the world. But it turns out that crying is a lagging indicator of hunger, and that there are a host of face, head, and hand cues that precede the wailing. [Caleb] based his system on Google’s MediaPipe library, using his baby monitor’s camera to track such behaviors as lip smacking, pacifier rejection, fist mouthing, and rooting, all signs that someone’s tummy needs filling. By putting together a system to recognize these cues and assign a weight to them, [Caleb] now gets a text before the baby gets to the screaming phase, to the benefit of not only the little nipper but also his sleep-deprived servants. The video below has some priceless bits in it; don’t miss [Baby Caleb] at 5:11 or the hilarious automatic feeder gag at the end.
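For a flavor of how weighted cue scoring might look, here is a deliberately simplified sketch. The cue names, weights, and threshold are our own inventions, and the real detectors would be MediaPipe-based checks on face and hand landmarks rather than the booleans shown here:

```python
# Sketch of weighted hunger-cue scoring. The cues, weights, and threshold
# are invented for illustration; in practice each flag would come from a
# MediaPipe-based detector (lip motion, hand near mouth, head turning, etc.).

CUE_WEIGHTS = {
    "lip_smacking": 0.3,
    "pacifier_rejection": 0.25,
    "fist_mouthing": 0.25,
    "rooting": 0.2,
}
ALERT_THRESHOLD = 0.5  # made-up value; tune per baby

def hunger_score(cues_present):
    """Sum the weights of whichever cues were seen in the recent window."""
    return sum(w for cue, w in CUE_WEIGHTS.items() if cues_present.get(cue))

def should_text_parent(cues_present):
    return hunger_score(cues_present) >= ALERT_THRESHOLD

# e.g. lip smacking plus rooting pushes the score to 0.5 -> send the text
print(should_text_parent({"lip_smacking": True, "rooting": True}))
```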

We’ve seen some interesting videos from [Caleb] recently, mostly having to do with his dog’s bathroom habits and getting help cleaning up afterward. We can only guess how those projects will be leveraged when this kid gets a little older and starts potty training.


A Gesture Recognizing Armband

Gesture recognition usually involves some sort of optical system watching your hands, but researchers at UC Berkeley took a different approach. Instead, they monitor the electrical signals in the forearm that control the muscles and use a machine learning model to recognize hand gestures.

The sensor system is a flexible PET armband with 64 electrodes screen printed onto it in silver conductive ink, attached to a standalone AI processing module. Since everyone’s arm is slightly different, the system needs to be trained for a specific user, but that also means it doesn’t have to isolate specific electrical signals; it simply learns to recognize that user’s patterns.

The challenging part is that the patterns don’t remain constant over time, changing with factors such as sweat, arm position, and even normal biological variation. To deal with this, the model can update itself on the device as the signals drift. Another part of this research that we appreciate is that all the inferencing, training, and updating happens locally on the AI chip in the armband. There is no need to send data to an external device or the “cloud” for processing, updating, or third-party data mining. Unfortunately, the research paper with all the details is behind a paywall.
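Since the paper is paywalled, we can only sketch the general flavor of that on-device adaptation. The snippet below is a simplified stand-in, not the authors' algorithm: a nearest-centroid classifier over 64-channel sEMG feature vectors that nudges the winning class centroid toward each new sample, so the model tracks slow signal drift:

```python
# Hedged sketch of on-device adaptation, not the published algorithm:
# a nearest-centroid classifier over 64-channel sEMG feature vectors
# that slowly moves the matched centroid toward each new sample.
import numpy as np

class AdaptiveGestureClassifier:
    def __init__(self, centroids, learning_rate=0.05):
        # centroids: gesture label -> 64-element feature vector
        self.centroids = {k: v.astype(float) for k, v in centroids.items()}
        self.lr = learning_rate  # how quickly centroids follow drifting signals

    def predict(self, features):
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(features - self.centroids[g]))

    def predict_and_adapt(self, features):
        gesture = self.predict(features)
        # Nudge the matched centroid slightly toward the new sample.
        self.centroids[gesture] += self.lr * (features - self.centroids[gesture])
        return gesture
```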


Generate Positivity With Machine Learning

Gesture recognition and machine learning are getting a lot of air time these days, as people understand them better and begin to develop methods to implement them on many different platforms. Of course, this puts the new tools within reach of people outside strictly academic or business environments. For example, rollerblading down the streets of Atlanta with a gesture-recognizing, streaming TV that [nate.damen] wears over his head.

He’s known as [atltvhead] and the TV he wears has a functional LED screen on the front. The whole setup reminds us a little of Deep Thought. The screen can display various animations, which are controlled through Twitch chat as he streams his journeys around town. He wanted to add a little more interaction to the animations, though, and simplify his user interface, so he set up a gesture-sensing sleeve that augments the animations based on how he’s moving his arm. He uses an Arduino in the arm sensor as well as a Raspberry Pi in the backpack to tie it all together, and he goes deep into the weeds explaining how to use TensorFlow to recognize the gestures. The video linked below shows a lot of his training runs for the machine learning system he used as well.
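For a taste of what the TensorFlow side of such a project might look like, here is a minimal sketch of a gesture classifier over windows of motion-sensor samples. The window size, channel count, and gesture labels are our assumptions, not [nate.damen]'s actual model:

```python
# Minimal TensorFlow sketch of a gesture classifier over windows of
# motion-sensor samples. Shapes and gesture names are assumptions, not
# the project's real network.
import tensorflow as tf

WINDOW = 50        # assumed: 50 samples per gesture window
CHANNELS = 3       # assumed: x, y, z motion channels
GESTURES = ["wave", "pump", "idle"]  # placeholder labels

model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW, CHANNELS)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(len(GESTURES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_windows, train_labels, epochs=30)  # after recording training runs
```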

[nate.damen] didn’t stop at the cheerful TV head either. He also wears a backpack that displays uplifting messages to people as he passes them by on his rollerblades, not wanting to leave out those who don’t get to see him coming. We think this is a great uplifting project, and the amount of work that went into getting the gesture recognition machine learning algorithm right is impressive on its own. If you’re new to TensorFlow, though, we have featured some projects that can do reliable object recognition using little more than a Raspberry Pi and a camera.


Machine Learning Algorithm Runs On A Breadboard 6502

When it comes to machine learning algorithms, one’s thoughts do not naturally flow to the 6502, the processor that powered some of the machines in the first wave of the PC revolution. And one definitely does not think of gesture recognition running on a homebrew breadboard version of a 6502 machine, and yet that’s exactly what [Nick Bild] has accomplished.

Before anyone gets too worked up in the comments, we realize that [Nick]’s Vectron breadboard computer is getting a lot of help from other, more modern machines. He’s got a pair of Raspberry Pi 3s in the mix, one to capture and downscale images from a Pi cam, and one that interfaces to an Atari 2600 emulator and sends keypresses to control games based on the gestures seen by the camera. But the logic to convert gesture to control signals is all Vectron, and uses a k-nearest neighbor algorithm executed in 6502 assembly. Fifty gesture images are stored in ROM and act as references for the four known gesture classes: up, down, left, and right. When a match between the camera image and a gesture class is found, the corresponding keypress is sent to the game. The video below shows that the whole thing is pretty responsive.
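For reference, here is roughly what that k-nearest-neighbor step looks like when restated in Python rather than 6502 assembly. The image size and the value of k are assumptions on our part, and the real reference images of course live in ROM:

```python
# Python restatement of the k-nearest-neighbor step the Vectron runs in
# 6502 assembly: compare a downscaled camera frame against stored
# reference images (labeled up, down, left, or right) and vote among the
# k closest matches. Image size and k are assumptions.
import numpy as np

def knn_classify(frame, references, k=3):
    """references is a list of (image, label) pairs, e.g. the 50 stored gestures."""
    flat = frame.reshape(-1).astype(float)
    # Sort stored references by distance to the new frame.
    scored = sorted(references,
                    key=lambda ref: np.linalg.norm(flat - ref[0].reshape(-1)))
    # Majority vote among the k nearest references.
    votes = [label for _, label in scored[:k]]
    return max(set(votes), key=votes.count)
```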

In our original article on [Nick]’s Vectron breadboard computer, [Tom Nardi] said that “You won’t be playing Prince of Persia on it.” That may be true, but a machine learning system running on the Vectron is not too shabby either.


Gesture Sensing With A Temperature Sensor

Good science fiction has sound scientific fact behind it, and when Tony Stark first made his debut on the big screen with design tools that worked at the wave of a hand, makers and hackers were not far behind with DIY solutions. Over the years the ideas have become much more polished, as we can see with this Gesture Recognition with PIR sensors project.

The project uses the TPA81 8-pixel thermopile array, which detects the change in heat levels across 8 adjacent points. An Arduino reads these temperature points over I2C, and a simple thresholding function detects the movement of the fingers. These movements are then used to do a number of things, including turning the volume up or down as shown in the image alongside.
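To make the thresholding idea concrete, here is a rough Python sketch rather than the project's Arduino code: flag which of the eight pixels sit above a baseline temperature, then watch which way the warm spot drifts between frames. The threshold value is made up:

```python
# Sketch of the thresholding idea: flag which of the 8 thermopile pixels
# are warmer than a baseline, then see which way the warm region's center
# drifts between frames to call it a left or right swipe. Threshold is
# invented for illustration.
THRESHOLD_C = 2.0  # degrees above baseline that counts as "finger present"

def hot_center(pixels, baseline):
    """Return the average index of pixels warmer than baseline, or None."""
    hot = [i for i, (p, b) in enumerate(zip(pixels, baseline)) if p - b > THRESHOLD_C]
    return sum(hot) / len(hot) if hot else None

def swipe_direction(prev_frame, curr_frame, baseline):
    before, after = hot_center(prev_frame, baseline), hot_center(curr_frame, baseline)
    if before is None or after is None:
        return "none"
    if after > before:
        return "right"   # e.g. volume up
    if after < before:
        return "left"    # e.g. volume down
    return "hold"
```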

The brilliant part is that the TPA81 8-pixel sensor has been around for a number of years. It is a bit expensive, though it can detect small thermal variations, such as candle flames, at up to 2 meters. More recent parts, such as the Panasonic AMG8834 with its 8×8 grid of such sensors, are much more capable for your hacking/making pleasure, but come with an increased price tag.

This technique is not just limited to gestures, and can be used in Heat-Seeking Robots that can very well be trained to follow the cat around just to annoy it.

Sonar In Your Hand

Sonar measures distance by emitting a sound and clocking how long it takes the sound to travel. This works in any medium capable of transmitting sound, such as water, air, or, in the case of FingerPing, flesh and bone. FingerPing is a project at Georgia Tech headed by [Cheng Zhang] which measures hand position by sending sound waves through the thumb and measuring the arrival time at four different receivers. These readings tell which bones the sound travels through and allow the device to figure out where the thumb is touching. Recognizable hand positions include the American Sign Language signs for one through ten.
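The classification details are in the paper, but the basic idea can be sketched as template matching: treat one chirp's measurements across the four receivers as a feature vector and pick the stored hand pose it most resembles. The similarity measure and feature contents below are our assumptions, not the FingerPing authors' actual classifier:

```python
# Hedged sketch, not the FingerPing authors' classifier: match the
# four-receiver feature vector for one chirp against stored per-pose
# templates and return the best match.
import numpy as np

def classify_pose(reading, templates):
    """reading: features from the four receivers; templates: pose label -> averaged reading."""
    def similarity(pose):
        t = templates[pose]
        return float(np.dot(reading, t) / (np.linalg.norm(reading) * np.linalg.norm(t)))
    return max(templates, key=similarity)

# Usage: templates are built by averaging labeled training readings per pose,
# e.g. templates = {"one": np.array([...]), "two": np.array([...]), ...}
```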

From the perspective of discreetly signing one through ten on a mobile device, this opens up a lot of possibilities for computer input while remaining pretty unobtrusive. We’ve seen prototypes that can read more gestures, but they would also draw attention if you wore them on a bus. It is a classic trade-off between convenience and function, but this type of sensing is unique and could be combined with other biosignals for finer results.
