Hackaday Prize Entry: Remote Control By Head Gestures

Some people may think they’re having a bad day when they can’t find the TV remote. Yet there are some people who can’t even hold a remote, let alone root around in the couch cushions where the remote inevitably winds up. This entry in the Assistive Technologies phase of the 2017 Hackaday Prize seeks to help such folks, with a universal remote triggered by head gestures.

Mobility impairments can range from fine motor control issues to quadriplegia, and people who suffer from them are often cut off from technology by the inability to operate devices. [Cassio Batista] concentrated on controlling a TV for his project, but it’s easy to see how his method could interface with other IR remotes to achieve control over everything from alarm systems to windows and drapes. His open-source project uses a webcam to watch a user’s head gestures, and OpenCV running on a CHIP SBC looks for motion in the pitch, yaw, and roll axes to control volume, channel, and power. An Arduino takes care of sending the IR commands to the TV. The prototype works well in the video below; with the power of OpenCV we can imagine mouth gestures and even eye blinks adding to the controller’s repertoire.
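For a flavor of how the vision side might hang together, here is a minimal Python/OpenCV sketch of the idea. To be clear, this is our simplified take, not [Cassio]'s code: it tracks the face with a stock Haar cascade and treats a big enough jump in position as a gesture, whereas real roll detection would need proper head-pose estimation. The thresholds, command names, and the send_ir() stub are all placeholders.

```python
# Simplified head-gesture sketch (not the project's actual code): track the
# face and map sustained movement along an axis to a TV command.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)
prev_center = None

def send_ir(command):
    """Placeholder: hand the command to the Arduino over serial."""
    print("IR:", command)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces):
        x, y, w, h = faces[0]
        center = (x + w // 2, y + h // 2)
        if prev_center is not None:
            dx = center[0] - prev_center[0]
            dy = center[1] - prev_center[1]
            # Thresholds are guesses; tune for your camera and distance.
            if dx > 40:
                send_ir("CHANNEL_UP")    # head moved right (yaw)
            elif dx < -40:
                send_ir("CHANNEL_DOWN")  # head moved left
            elif dy < -40:
                send_ir("VOLUME_UP")     # head moved up (pitch)
            elif dy > 40:
                send_ir("VOLUME_DOWN")   # head moved down
        prev_center = center
    cv2.imshow("head gestures", frame)
    if cv2.waitKey(30) & 0xFF == 27:     # Esc quits
        break
```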

The Assistive Tech phase wraps up tomorrow, so be sure to get your entries in. You’ll have some stiff competition, like this robotic exoskeleton. But don’t let that discourage you.

Continue reading “Hackaday Prize Entry: Remote Control By Head Gestures”

OpenCV Turret Tracks Motion, Busts Airsoft Pellets

In the eternal struggle for office dominance, the autonomous, motion-tracking Airsoft/Nerf/whatever turret seems to be the nuclear option. [Aaron] and [Davis] built a motion-tracking turret that uses OpenCV to detect movement before hitting a relay to trigger the gun.

There’s a Raspberry Pi controlling a Logitech C210 Pi-compatible webcam, with a stepper hat for the Pi controlling two NEMA steppers that aim the gun. The design is simple but elegant, with a rotating base and an assembly that raises and lowers the weapon.
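The detect-then-fire loop can be surprisingly small. Here's a hedged sketch of the idea in Python: frame differencing in OpenCV, with a Raspberry Pi GPIO pin standing in for the trigger relay. The pin number and thresholds are our guesses, not values from the build.

```python
# Minimal motion-triggered relay: difference consecutive frames and fire
# when enough pixels change. Pin and thresholds are assumptions.
import time

import cv2
import RPi.GPIO as GPIO

RELAY_PIN = 17                     # hypothetical BCM pin wired to the relay
GPIO.setmode(GPIO.BCM)
GPIO.setup(RELAY_PIN, GPIO.OUT)

cap = cv2.VideoCapture(0)
_, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    _, frame = cap.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    delta = cv2.absdiff(prev, gray)
    prev = gray
    # Count pixels that changed noticeably between frames.
    moving = cv2.countNonZero(
        cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1])
    if moving > 5000:              # tune to taste
        GPIO.output(RELAY_PIN, GPIO.HIGH)   # squeeze...
        time.sleep(0.1)
        GPIO.output(RELAY_PIN, GPIO.LOW)    # ...and release
```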

The OpenCV intrigues us. We want to see an OpenCV-powered turret with color detection, so your own team doesn’t get blasted along with your hapless enemies. Or, if guarding your cubicle, how about a little OpenCV facial recognition?

If you want to take a stab at your own, [Aaron] and [Davis] show how they built their project on their Hackaday.io page, and their Python script can be found on GitHub. Otherwise, check out the Counter-Strike Airsoft robot, the Airsoft sentry gun, and the Nerf turret powered by Slack we published previously. Continue reading “OpenCV Turret Tracks Motion, Busts Airsoft Pellets”

RoGeorge Attacks A Pulse Meter

The “Crivit Sports” is an inexpensive chest-strap monitor that displays your current pulse rate on a dedicated wristwatch. This would be much more useful, and presumably more expensive, if it had a logging option, or any way to export your pulse data to a more capable device. So [RoGeorge] got to work. Each post of the (so-far) three-part series is worth a read, not least because of the cool techniques used.

In part one, [RoGeorge] starts out by intercepting the signals. His RF sniffer? An oscilloscope probe shorted out in a loop around the heart monitor. Once he could read the signals, it was time to decode them. Doing pushups and decoding on-off keyed RF signals sounds like the ideal hacker training regimen, but instead [RoGeorge] used a signal generator, clipped to the chest monitor, to generate nice steady “heartbeats” and then read the codes off the scope without breaking a sweat.

With the encoding in hand, and some help from the Internet, he tested out his hypothesis in part two. Using an Arduino to replay the pulses logged in part one, he drove a coil and managed to get heart rates displayed on the watch.
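[RoGeorge] did this on an Arduino, but the same replay idea sketched in Python on a Pi looks something like the following. The pin, bit period, and code value below are placeholders; the real on-off keying timings come from the scope captures in part one.

```python
# Replay a captured code by pulsing a coil from a GPIO pin. All timings
# and pin numbers here are invented for illustration.
import time

import RPi.GPIO as GPIO

COIL_PIN = 18          # hypothetical pin driving the coil
BIT_PERIOD = 0.001     # placeholder on-off keying bit time, in seconds

GPIO.setmode(GPIO.BCM)
GPIO.setup(COIL_PIN, GPIO.OUT)

def send_code(code, bits=13):
    """Shift out `bits` bits of `code`, MSB first, as on-off keying."""
    for i in range(bits - 1, -1, -1):
        GPIO.output(COIL_PIN, (code >> i) & 1)
        time.sleep(BIT_PERIOD)
    GPIO.output(COIL_PIN, 0)

# Repeat a code at a steady rate and the watch reads it as a heartbeat.
while True:
    send_code(0x0AAA)
    time.sleep(0.5)
```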

Which brings us to part three. What if there were other secrets to be discovered? Brute-forcing every possible RF signal and looking at the watch to see the result would be useful, but doing so by eye for all 8,192 possible 13-bit codes would drive anyone insane. So [RoGeorge] taught himself OpenCV in Python and pointed a webcam at the watch. He wrote a routine that detected the heart icon blinking, a sign that the watch received a valid code, and then transmitted all possible codes to see which ones were valid. Besides discovering a few redundant codes, he didn’t learn much new from this exercise, but it’s a great technique.
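The feedback trick is worth a sketch of its own. Something along these lines would do it: watch a small patch of the frame where the heart icon lives, and call a code valid if the icon shows up after you send it. The ROI coordinates and threshold are invented for illustration, and send_code() is the transmitter stub from the sketch above.

```python
# Brute-force with OpenCV as the eyes: sample a region of interest over the
# heart icon and flag codes that make it appear. Coordinates are made up.
import cv2

cap = cv2.VideoCapture(0)
ROI = (100, 80, 20, 20)        # x, y, w, h of the heart icon (assumed)

def icon_lit():
    _, frame = cap.read()
    x, y, w, h = ROI
    patch = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    return patch.mean() < 60   # icon segments are dark on the LCD

valid = []
for code in range(8192):       # every 13-bit code
    send_code(code)            # send_code() as in the transmitter sketch
    if any(icon_lit() for _ in range(10)):   # sample a few frames
        valid.append(code)
print(len(valid), "codes accepted")
```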

We’re not sure what’s left to do on the Crivit. [RoGeorge] has already figured out the heart-rate data protocol, and could easily make his own logger. We are sure that we liked his thorough and automated approach to testing it all, from signal-generator-as-heartbeat to OpenCV as feedback in a brute-force routine. We can’t wait to see what’s up next.

Robot Car Follows Wherever You Go

Having a pet can really make a difference to your happiness at the end of the day, but they’re also a lot of work. This project by [Ioannis Stoltidis] does something similar — minus all the responsibility. The Smart Car Follower Project is designed to track people using Bluetooth and IR and follow them around from room to room.

Submitted as part of a Master’s thesis, this project hacks a toy car and uses a key chain transmitter that sends the tracking signals. A Raspberry Pi 3 combines the Bluetooth RSSI and IR signals to create an estimate of the beacon’s position. Arduinos handle the IR signaling as well as the motor control, allowing the robot to chase the user around like a puppy. The whole thing also comes with obstacle avoidance using ultrasonic sensors on all sides, which is handy if you have a lot of furniture in the house.
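How do you turn an RSSI reading into a distance? A common approach, and our assumption about the kind of thing going on here rather than anything from the thesis, is the log-distance path-loss model:

```python
# Rough Bluetooth ranging via the log-distance path-loss model. The
# calibration constants below are typical placeholder values.
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Estimate distance in meters from a Bluetooth RSSI reading.

    tx_power_dbm is the RSSI measured at 1 m; path_loss_exp is about 2
    in free space and higher indoors. Both need calibrating per room.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

print(rssi_to_distance(-71))   # roughly 4 m with the defaults above
```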

You can also choose to go manual and drive it around the block using a PC and gamepad. A webcam connected to the onboard computer allows a first-person view of the vehicle by sending the video feed over WiFi to a PC application. OpenCV is used to create the final GUI, which allows you to see and control the project remotely. The source code is available for download for anyone who wants to replicate the project. Check out the video of it in action below.
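On the PC side, if the Pi serves the feed as an MJPEG stream, displaying it in OpenCV takes only a handful of lines. The stream URL below is hypothetical:

```python
# View a network video feed in an OpenCV window; 'q' quits.
import cv2

cap = cv2.VideoCapture("http://raspberrypi.local:8080/stream.mjpg")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("Robot car FPV", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```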

Continue reading “Robot Car Follows Wherever You Go”

Robot Solves Sudoku On Paper

Sudoku is a great way to pass some time, especially on a long flight. However, we don’t think the airlines will let [Sanahm] board with his sudoku-solving robot. The basic machine looks like a 2D plotter made with aluminum extrusion, with the addition of a Raspberry Pi and a camera. The machine can read a sudoku puzzle, solve it, and then fill in the puzzle with a pen. Unlike humans, it should never need to erase its work.

The software uses OpenCV to process the camera data, find the grid, and pick out the cells the puzzle has already filled in. TensorFlow recognizes the numbers. From there, it is all just math to solve the puzzle. Once solved, the plotter part of the robot takes over and fills in the blanks. After all that image processing, the plotting seems like the easy part.
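“Just math” in practice usually means backtracking. A compact solver, independent of whatever [Sanahm]’s robot actually runs, fits in a couple of functions:

```python
# Classic backtracking sudoku solver; 0 marks an empty cell.
def solve(grid):
    """Solve a 9x9 sudoku in place; returns True on success."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for n in range(1, 10):
                    if allowed(grid, r, c, n):
                        grid[r][c] = n
                        if solve(grid):
                            return True
                        grid[r][c] = 0     # undo and try the next digit
                return False               # dead end, backtrack
    return True                            # no empties left: solved

def allowed(grid, r, c, n):
    """Check the row, column, and 3x3 box constraints for digit n."""
    if n in grid[r] or n in (grid[i][c] for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != n
               for i in range(3) for j in range(3))
```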

Continue reading “Robot Solves Sudoku On Paper”

Hackaday Prize Entry: Automated Wildlife Recognition

Trail and wildlife cameras are commonly available nowadays, but the Wild Eye project aims to go beyond simply taking digital snapshots of critters. [Brenda Armour] uses a Raspberry Pi to not only take photos of wildlife that wanders into the camera’s field of view, but to also automatically identify and categorize the animals seen using a visual recognition API from IBM via the Node-RED infrastructure. The result is a system that captures an image when motion is detected, sends the image to the visual recognition API, and attempts to identify any wildlife based on the returned data.
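The capture-and-classify loop is simple to sketch. Below is our guess at the shape of it in Python, using OpenCV background subtraction for the motion detection; classify() is a hypothetical stand-in for the Node-RED round trip to IBM’s service, not its real API.

```python
# Capture a frame when something moves, then hand it off for labeling.
import cv2

def classify(path):
    """Hypothetical stand-in for the visual recognition round trip."""
    return []          # the real call would return labels and scores

cap = cv2.VideoCapture(0)
subtractor = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    # Enough foreground pixels means something wandered into view.
    if cv2.countNonZero(mask) > 10000:
        cv2.imwrite("/tmp/critter.jpg", frame)
        print(classify("/tmp/critter.jpg"))
```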

The visual recognition isn’t flawless, but a recent proof of concept shows promising results with crows, a cat, and a dog having been successfully identified. Perhaps when the project is ready to move deeper into the woods, elements from these solar-powered networked birdhouses (which also use the Raspberry Pi) could help cut some cords.

JeVois Machine Vision Camera Nails Demo Mode

JeVois is a small, open-source, smart machine vision camera that was funded on Kickstarter in early 2017. I backed it because cameras that embed machine vision elements are steadily growing more capable, and JeVois boasts an impressive range of features. It runs embedded Linux and can process video at high frame rates using OpenCV algorithms. It can run standalone, or as a USB camera streaming raw or pre-processed video to a host computer for further action. In either case it can communicate to (and be controlled by) other devices via serial port.
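That serial control is pleasantly low-tech. Talking to the camera from a host with pyserial might look like the snippet below; the port name is system-dependent and the command names are from memory of the JeVois docs, so treat both as assumptions.

```python
# Poke the camera's command-line interface over its serial port.
import serial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    for cmd in (b"info\n", b"listmappings\n"):
        port.write(cmd)
        # Read back whatever the camera prints until the timeout.
        while (line := port.readline()):
            print(line.decode(errors="replace").rstrip())
```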

But none of that is what really struck me about the camera when I received my unit. What really stood out was the demo mode. The team behind JeVois nailed an effective demo mode for a complex device. That didn’t happen by accident, and the results are worth sharing.

Continue reading “JeVois Machine Vision Camera Nails Demo Mode”