Machine Learning Takes The Embarrassment Out Of Videoconference Wardrobe Malfunctions

Telecommuters: tired of the constant embarrassment of showing up to video conferences wearing nothing but your underwear? Save the humiliation and all those pesky trips down to HR with Safe Meeting, the new system that uses the power of artificial intelligence to turn off your camera if you forget that casual Friday isn’t supposed to be that casual.

The following infomercial is brought to you by [Nick Bild], who says the whole thing is tongue-in-cheek but we sense a certain degree of “necessity is the mother of invention” here. It’s true that the sudden throng of remote-work newbies certainly increases the chance of videoconference mishaps and the resulting mortification, so whatever the impetus, Safe Meeting seems like a great idea. It uses a Pi cam connected to a Jetson Nano to capture images of you during videoconferences, which are conducted over another camera. The stream is classified by a convolutional neural net (CNN) that determines whether it can see your underwear. If it can, it makes a REST API call to the conferencing app to turn off the camera. The video below shows it in action, and that it douses the camera quickly enough to spare your modesty.
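For a sense of how few moving parts such a system needs, here is a minimal sketch of the monitoring loop in Python. It assumes a pre-trained binary Keras classifier saved as underwear_classifier.h5 and a hypothetical REST endpoint on the conferencing side at /camera/off; neither detail is taken from [Nick]’s actual code.

```python
# Minimal sketch of a Safe-Meeting-style control loop. The model file and
# the REST endpoint below are hypothetical placeholders, not [Nick]'s code.
import cv2
import numpy as np
import requests
import tensorflow as tf

MODEL_PATH = "underwear_classifier.h5"               # hypothetical model file
CAMERA_OFF_URL = "http://localhost:8080/camera/off"  # hypothetical endpoint

model = tf.keras.models.load_model(MODEL_PATH)
cap = cv2.VideoCapture(0)                            # the monitoring Pi cam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Resize and normalise the frame to the classifier's input shape
    x = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
    prob = float(model.predict(x[np.newaxis], verbose=0)[0][0])
    if prob > 0.5:                                    # "underwear visible" class
        requests.post(CAMERA_OFF_URL, timeout=1)
```

In the real build the inference runs on the Jetson Nano, but the flow is the same: grab a frame, classify it, and fire a REST call the moment the forbidden class shows up.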

We shudder to think about how [Nick] developed an underwear-specific training set, but we applaud him for doing so and coming up with a neat application for machine learning. He’s been doing some fun work in this space lately, from monitoring where surfaces have been touched to a 6502-based gesture recognition system.

Continue reading “Machine Learning Takes The Embarrassment Out Of Videoconference Wardrobe Malfunctions”

Crunching Giant Data From The Large Hadron Collider

Modern physics experiments are often complex, ambitious, and costly. The days when scientific progress could be made with a small tabletop experiment in your lab are mostly over, especially in fields like astrophysics or particle physics, where you need huge telescopes, expensive satellite missions, or giant colliders run by international collaborations with hundreds or thousands of participants. To drive this point home: the largest machine ever built by humankind is the Large Hadron Collider (LHC). You won’t be surprised to hear that even just managing the data it produces is a super-sized task.

Since its start in 2008, the LHC at CERN has received several upgrades to stay at the cutting edge of technology. Currently, the machine is in its second long shutdown and is being prepared to restart in May 2021. One of the improvements for Run 3 will be to deliver particle collisions at a higher rate, quantified by the so-called luminosity. This enables the experiments to gather more statistics and to better study rare processes. At the end of 2024, the LHC will be upgraded to the High-Luminosity LHC, which will increase the luminosity by up to a factor of 10 beyond the LHC’s original design value.

Currently, the major experiments ALICE, ATLAS, CMS, and LHCb are preparing themselves to cope with expected data rates in the range of terabytes per second. It is a perfect time to take a closer look at the data acquisition, storage, and analysis of modern high-energy physics experiments. Continue reading “Crunching Giant Data From The Large Hadron Collider”

Automate Your Xbox

First the robots took our jobs, then they came for our video games. This dystopian future is brought to you by [Little French Kev] who designed this adorable 3D-printed robot arm to interface with an Xbox One controller joystick. He shows it off in the video after the break, controlling a ball-balancing physics demonstration written in Unity.

Hats off to him on the quality of the design. There are two parts that cradle the knob of the thumbstick from either side. He mates those pieces with each other using screws, firmly hugging the stick. Bearings are used at the joints for smooth action of the two servo motors that control the arm. The base of the robotic appendage is zip-tied to the controller itself.

The build targets experimentation with machine learning. Since the computer can control the arm via an Arduino, and the computer has access to metrics of what’s happening in the virtual environment, it’s a perfect setup for training a neural network. Are you thinking what we’re thinking? This is the beginning of hardware speed-running your favorite video games like [SethBling] did for Super Mario World half a decade ago. It would be even more impressive, since it would be done by automating the mechanical part of the controller rather than operating purely in the software realm. You’ll just need to do your own hack to implement button control.
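To make the idea concrete, here’s a rough sketch of what the PC side of such a rig could look like in Python, assuming the Arduino accepts “pan,tilt” servo angle pairs over serial and the Unity demo streams the ball position over a local UDP socket. The port, message format, and the trivial placeholder policy are our own assumptions, not [Little French Kev]’s code.

```python
# Sketch of a PC-side bridge: read game state from Unity, decide servo
# angles, and send them to an Arduino driving the two arm servos.
import socket
import serial  # pyserial

arm = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)  # Arduino with two servos

state = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
state.bind(("127.0.0.1", 9000))   # assume Unity sends "ball_x,ball_y" datagrams here

def policy(ball_x, ball_y):
    """Placeholder controller -- a trained neural network would go here."""
    pan = 90 + int(20 * ball_x)   # nudge the stick toward the ball
    tilt = 90 + int(20 * ball_y)
    return max(0, min(180, pan)), max(0, min(180, tilt))

while True:
    data, _ = state.recvfrom(64)
    ball_x, ball_y = (float(v) for v in data.decode().split(","))
    pan, tilt = policy(ball_x, ball_y)
    arm.write(f"{pan},{tilt}\n".encode())
```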

Continue reading “Automate Your Xbox”

Machine Learning Algorithm Runs On A Breadboard 6502

When it comes to machine learning algorithms, one’s thoughts do not naturally flow to the 6502, the processor that powered some of the machines in the first wave of the PC revolution. And one definitely does not think of gesture recognition running on a homebrew breadboard version of a 6502 machine, and yet that’s exactly what [Nick Bild] has accomplished.

Before anyone gets too worked up in the comments, we realize that [Nick]’s Vectron breadboard computer is getting a lot of help from other, more modern machines. He’s got a pair of Raspberry Pi 3s in the mix, one to capture and downscale images from a Pi cam, and one that interfaces to an Atari 2600 emulator and sends keypresses to control games based on the gestures seen by the camera. But the logic to convert gesture to control signals is all Vectron, and uses a k-nearest neighbor algorithm executed in 6502 assembly. Fifty gesture images are stored in ROM and act as references for the four known gesture classes: up, down, left, and right. When a match between the camera image and a gesture class is found, the corresponding keypress is sent to the game. The video below shows that the whole thing is pretty responsive.
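For clarity, here is the k-nearest neighbor idea restated as a short Python sketch; the real implementation is hand-written 6502 assembly with the fifty reference images sitting in ROM, and the image size and distance metric below are assumptions on our part.

```python
# k-nearest-neighbour gesture classification, restated in Python.
import numpy as np

CLASSES = ["up", "down", "left", "right"]

# Fifty downscaled reference images (assumed 16x16 binary, flattened) plus labels.
refs = np.random.randint(0, 2, size=(50, 256))   # placeholder reference data
labels = np.random.randint(0, 4, size=50)        # placeholder class labels

def classify(frame, k=3):
    """Return the gesture class whose stored references are nearest to `frame`."""
    dists = np.sum(np.abs(refs - frame), axis=1)  # Manhattan distance to each reference
    nearest = labels[np.argsort(dists)[:k]]       # labels of the k closest references
    return CLASSES[np.bincount(nearest, minlength=4).argmax()]

print(classify(np.random.randint(0, 2, size=256)))
```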

In our original article on [Nick]’s Vectron breadboard computer, [Tom Nardi] said that “You won’t be playing Prince of Persia on it.” That may be true, but a machine learning system running on the Vectron is not too shabby either.

Continue reading “Machine Learning Algorithm Runs On A Breadboard 6502”

Contest Winners: Machine Learning On All Kinds Of Gadgets

With nearly sixty exciting entries, the Train All the Things contest, presented in partnership with Digi-Key, has drawn to a close, and today we are happy to share news of the winning projects. The challenge at hand was to show off a project using some type of machine learning, and plenty of different takes on that theme were on display.

Perhaps the most impressive project is the Intelligent Bat Detector by [Tegwyn☠Twmffat], which claims the “ML on the Edge” award. His project, seen above, seeks not only to detect the presence of bats through the sounds they make during echolocation, but to identify the type of bat as well. Having been through a number of iterations, the bat detector, based on an Nvidia Jetson Nano and a Raspberry Pi, can classify several types of bats, as well as a set of house keys (as a “control”). It’s also been impeccably documented and serves as a great example of how to get into machine learning.

The Soldering LightSaber takes the “ML Blinky” award for using machine learning in the microcontroller realm. This clever use of the concept seeks one thing: eliminating the wait for your soldering iron to heat up. Taking temperature readings while the iron heats up costs time, so doing away with that step speeds things up greatly. By sampling the results of different voltages and heating times, the machine learning algorithm establishes its own guidelines for how to pour electricity into the heating element without checking for feedback, coming out the other side at the perfect temperature.

Rounding out the field, our final two winners are the AI Powered Bull**** Detector, which claims the “ML on the Gateway” award, and Hacking Wearables for Mental Health and More, which won in the “ML on the Cloud” category.

The idea behind the illuminated poop emoji project is to detect human speech and make a judgement on whether the comment is valid or BS. It does this by leveraging a training set of comments that have previously been identified as BS and making an association with the currently uttered words.
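As a toy illustration of the approach, matching new utterances against previously labelled comments, a bag-of-words classifier a few lines long will do. The training phrases below are invented, and the actual project’s model may look nothing like this.

```python
# Toy text classifier: learn from labelled comments, then judge new ones.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

comments = [
    "this synergy will disrupt the paradigm",      # BS
    "our blockchain AI cures everything",          # BS
    "the resistor limits current to the LED",      # valid
    "the sensor reads temperature every second",   # valid
]
labels = ["BS", "BS", "valid", "valid"]

detector = make_pipeline(TfidfVectorizer(), MultinomialNB())
detector.fit(comments, labels)

print(detector.predict(["this quantum wellness gadget disrupts everything"]))
```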

Wearables for mental health is a wonderful project that was previously recognized in the 2018 Hackaday Prize. Economies of scale have made these wearables quite affordable as a way to add a sensor suite to behavior analysis. But of course you need a way to process all of the sensor data, a perfect task for a cloud-based machine learning application.

All four winners received a $100 gift code to Tindie. Don’t forget to check out all of the other interesting projects that were entered in this contest!

Harry Potter Wand Hack Makes Magic Real

Any sufficiently advanced hack is indistinguishable from magic, a wise man once observed. That’s true with this cool build from [Jasmeet Singh] that magically opens a box when you wave a Harry Potter magic wand in the right way. Is it magic? No, it’s a neat hack that uses computer vision to track the wand and recognize when you make the magic gesture.

The trick is based on the same technique that Universal Studios use in their Harry Potter theme park, as detailed in a patent with the snappy title of “System and method for tracking a passive wand and actuating an effect based on a detected wand path“. The basic idea is that a retroreflective dot on the end of the wand reflects light from a set of infra-red LEDs around the camera. An infra-red sensitive camera detects this reflected light as a bright dot. This camera is tied into a computer vision system that tracks the path of the dot, then triggers the action if it follows a certain pattern.

The version that [Jasmeet] built uses a Raspberry Pi NoIR camera and a Raspberry Pi 3 running OpenCV. This feeds into a machine learning graph that detects the letters of the alphabet. If the detected letter is an A (for Alohomora, the Harry Potter unlocking spell), then the box opens. If it is a C, the box closes. This is all tied together using Python.
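The wand-tip tracking stage is easy to sketch with OpenCV: threshold the NoIR image so only the retroreflective tip survives, then log the blob’s centroid from frame to frame. The threshold value below is illustrative, and the letter-recognition step that consumes the accumulated path is left out.

```python
# Track the bright IR reflection from the wand tip and accumulate its path.
import cv2

cap = cv2.VideoCapture(0)
path = []                                   # accumulated wand-tip positions

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 230, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        m = cv2.moments(c)
        if m["m00"] > 0:
            path.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    # ...once the wand stops moving, the accumulated `path` would be rasterised
    # and handed to the letter classifier that decides A (open) or C (close).
```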

It’s a neat build that ties together a number of interesting techniques, and which could keep the kids amused for a while. You could also expand it further, such as adding a death ray that triggers if you trace an S for Sectumsempra. That’ll teach them not to mess with the dark arts.

Continue reading “Harry Potter Wand Hack Makes Magic Real”

A Soldering LightSaber For The Speedy Worker

We all have our preferences when it comes to soldering irons, and for [Marius Taciuc] the strongest of them all is a quick heat-up. The iron has to be at full temperature in the time it takes him to get to work, or it simply won’t cut the mustard. His solution is a temperature controlled iron, but one with no ordinary temperature control. Instead of a normal feedback loop, it uses a machine learning algorithm to find the quickest warm-up.

The elements he’s using have a thermocouple in series with the element itself, meaning that to measure the temperature, power to the element must be cut. That duty cycle cannot be made too short or the measurements become noisy, so under a traditional temperature control regimen there is a limit on how quickly the iron can be heated up. His approach is to switch the element on full-time for a period without stopping to measure the temperature, only measuring after it has had a chance to heat up. The algorithm constantly learns how long the element must be switched on to reach a given temperature, and is able to interpolate between its samples to arrive at the desired reading. It’s a clever way to make existing hardware perform new tricks, and we like that.
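The learn-and-interpolate idea fits in a few lines of pseudo-firmware. The sketch below, in Python with the hardware I/O stubbed out, keeps a table of (on-time, measured temperature) pairs and interpolates over it to choose the next open-loop burst; the seed values and function names are illustrative, not taken from [Marius]’s code.

```python
import numpy as np

# Learned samples: open-loop on-time (s) -> temperature measured afterwards (deg C).
# The two seed points are made-up starting guesses.
samples = {0.0: 25.0, 10.0: 300.0}

def heat_for(seconds):
    """Stub: drive the element at full power for `seconds`, with no measuring."""

def read_temperature():
    """Stub: cut element power briefly and read the series thermocouple."""
    return 25.0

def burst_for(target_temp):
    """Interpolate the learned curve to pick an on-time for target_temp."""
    times = sorted(samples)
    temps = [samples[t] for t in times]
    return float(np.interp(target_temp, temps, times))

# One heating cycle: pick a burst length, heat open-loop, then measure and learn.
t_on = burst_for(350.0)
heat_for(t_on)
samples[t_on] = read_temperature()
```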

He’s appeared on these pages quite a few times over the years, but perhaps you’d like to see the first version of the same hardware. Meanwhile, watch the quick heat-up in action, with a fuller explanation, in the video below.

Continue reading “A Soldering LightSaber For The Speedy Worker”