Adding A Laser Blaster To Classic Atari 2600 Games With Machine Vision

Remember the pistol controller for the original Atari 2600? No? Perhaps that’s because it never existed. But now that we’re living in the future, adding a pistol to the classic games of the 2600 is actually possible.

Possible, but not exactly easy. [Nick Bild]’s approach to the problem is based on machine vision, using an NVIDIA Jetson Xavier NX to run an Atari 2600 emulator. The game is projected on a wall while a camera watches the playfield. A toy pistol with a laser pointer attached to it blasts away at targets, and OpenCV picks out the spots hit by the laser. A Python program matches the coordinates of each laser blast to coordinates within the game, then sends the keyboard commands that trigger the in-game blasters. Basically, the game plays itself based on where it sees the laser shots. You can check out the system in the video below.
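For the curious, the detection step boils down to thresholding for the overexposed laser dot and warping its pixel coordinates into the game’s coordinate space. The sketch below is our own illustration of that idea, not [Nick]’s code; the corner coordinates, the color bounds, and the use of pyautogui for the keystrokes are all assumptions.

```python
# Minimal sketch: find the laser dot, map it into game space, fire.
# All constants here are hypothetical and would need tuning on a real rig.
import cv2
import numpy as np
import pyautogui  # assumption: keyboard events reach the emulator this way

# Camera-space corners of the projected image, measured once at setup,
# mapped onto the emulator's playfield (illustrative values only).
WALL_CORNERS = np.float32([[112, 60], [532, 55], [540, 410], [105, 415]])
GAME_CORNERS = np.float32([[0, 0], [160, 0], [160, 192], [0, 192]])
H = cv2.getPerspectiveTransform(WALL_CORNERS, GAME_CORNERS)

def find_laser_hit(frame):
    """Return the laser dot's game-space (x, y), or None if no dot is seen."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # A laser dot saturates the camera to near-white; these bounds are a
    # guess and depend on the actual camera and projector brightness.
    mask = cv2.inRange(hsv, (0, 0, 240), (180, 60, 255))
    m = cv2.moments(mask)
    if m["m00"] < 1:  # nothing bright enough in view
        return None
    cam_pt = np.float32([[[m["m10"] / m["m00"], m["m01"] / m["m00"]]]])
    gx, gy = cv2.perspectiveTransform(cam_pt, H)[0][0]
    return gx, gy

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if find_laser_hit(frame) is not None:
        # The real build translates the hit position into emulator input;
        # here we just mash the fire button as a stand-in.
        pyautogui.press("space")
```

In practice you would also want to debounce the detection, so a single trigger pull doesn’t register as a burst of shots across consecutive frames.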

[Nick Bild] had a busy weekend of hacking. This was the third project write-up he sent us, after his big-screen Arduboy build and his C64 smartwatch.

Continue reading “Adding A Laser Blaster To Classic Atari 2600 Games With Machine Vision”

Machine Learning Helps You Track Your Internet Misery Index

We all seem to intuitively know that a lot of what we do online is not great for our mental health. Hang out on enough social media platforms and you can practically feel the changes your mind inflicts on your body as a result of what you see — the racing heart, the tight facial expression, the clenched fists raised in seething rage. Not on Hackaday, of course — nothing but sweetness and light here.

That’s all highly subjective, of course. If you’d like to quantify your online misery more objectively, take a look at the aptly named BrowZen, a machine learning application by [Nick Bild]. Built around an NVIDIA Jetson Xavier NX and a web camera, BrowZen periodically captures images of the user’s face. Each expression is classified by a model trained to recognize the facial postures associated with emotions like anger, surprise, fear, and happiness. The app records your mood alongside the website you’re currently looking at and stores the results in a database. Handy charts let you know which sites are best for your state of mind; it’s not much of a surprise that Twitter induces rage while Hackaday pushes [Nick]’s happiness button. See? Sweetness and light.
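The plumbing behind those charts can be quite simple: snap a frame, classify the expression, note the active site, and append a row to a database. Here’s a minimal sketch of that loop, not [Nick]’s actual code — classify_expression() and active_url() are hypothetical stand-ins for the trained model and the browser hook.

```python
# Sketch of a BrowZen-style logging loop. The model and browser
# integration are stubbed out; only the capture/log plumbing is real.
import sqlite3
import time
import cv2

db = sqlite3.connect("browzen.db")
db.execute("""CREATE TABLE IF NOT EXISTS mood_log (
                ts REAL, url TEXT, emotion TEXT)""")

def classify_expression(frame):
    # Hypothetical stand-in for the trained model, which maps a face
    # crop to labels like anger, surprise, fear, and happiness.
    return "happiness"

def active_url():
    # Hypothetical stand-in for querying the browser's current tab.
    return "https://hackaday.com"

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if ok:
        db.execute("INSERT INTO mood_log VALUES (?, ?, ?)",
                   (time.time(), active_url(), classify_expression(frame)))
        db.commit()
    time.sleep(60)  # sample once a minute; the interval is a guess
```

With the rows in place, the per-site misery chart falls out of a simple GROUP BY over the table.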

Seriously, we could see something like this being very useful for psychological testing, marketing research, or even medical assessments. This adds to [Nick]’s array of AI apps, which range from tracking which surfaces you touch in a room to preventing you from committing a fireable offense on a video conference.

Continue reading “Machine Learning Helps You Track Your Internet Misery Index”

Machine Vision Keeps Track Of Grubby Hands

Can you remember everything you’ve touched in a given day? If you’re being honest, the answer is, “Probably not.” We humans are a tactile species, with an outsized proportion of both our motor and sensory nerves sent directly to our hands. We interact with the world through our hands, and unfortunately that may mean inadvertently spreading disease.

[Nick Bild] has a potential solution: a machine-vision system called Deep Clean, which monitors a scene and records anything in it that has been touched. [Nick]’s system uses a Jetson Xavier and a stereo camera to extract depth from the scene; he built his camera from a pair of Raspberry Pi cams and a Pi 3B+, but other depth cameras like a Kinect could probably do the job. The idea is to watch the scene for human hands — OpenPose is the tool he chose for that job — and correlate their depth with the depth of objects. Touch a doorknob or a light switch, and a marker is left on the scene, so a cleaning crew can see at a glance which areas need extra attention. We can think of plenty of applications beyond the current crisis, as the ability to map what has been touched seems generally useful.
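The core trick is the depth comparison: a fingertip whose depth matches the background’s depth at the same pixel has, by definition, arrived at the surface. A sketch of that test might look like the snippet below; the thresholds and the fingertip-list input are our assumptions, with OpenPose and the stereo rig doing the hard work upstream.

```python
# Sketch of the touch test, assuming the pose model has already supplied
# fingertip keypoints and the stereo rig has supplied depth maps in mm.
# TOUCH_MM and MARK_RADIUS are illustrative guesses.
import numpy as np

TOUCH_MM = 30          # how close a fingertip must be to count as a touch
MARK_RADIUS = 15       # size of the marker painted onto the touch mask

def update_touch_mask(touch_mask, scene_depth, frame_depth, fingertips):
    """Paint a blob on touch_mask wherever a fingertip meets the scene.

    touch_mask  -- uint8 image accumulating everything touched so far
    scene_depth -- depth map of the empty room, captured once at setup
    frame_depth -- depth map of the current frame
    fingertips  -- list of (x, y) pixel coordinates from the pose model
    """
    h, w = touch_mask.shape
    for x, y in fingertips:
        if not (0 <= x < w and 0 <= y < h):
            continue
        # A touch means the fingertip's depth matches the background
        # surface's depth at that pixel, i.e. the finger reached the object.
        # Cast to int to avoid unsigned underflow in the subtraction.
        if abs(int(frame_depth[y, x]) - int(scene_depth[y, x])) < TOUCH_MM:
            yy, xx = np.ogrid[:h, :w]
            blob = (yy - y) ** 2 + (xx - x) ** 2 <= MARK_RADIUS ** 2
            touch_mask[blob] = 255
    return touch_mask
```

Accumulating the markers in a persistent mask, rather than per frame, is what lets the cleaning crew review everything touched over a whole day at a glance.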

[Nick] has been getting some mileage out of that Xavier lately — he’s used it to build an AI umpire and shades that help you find lost stuff. Who knows what else he’ll find to do with them during this time of confinement?

Continue reading “Machine Vision Keeps Track Of Grubby Hands”