Engineering Vs Pigeons

We’ve all been there. Pigeons are generally pretty innocuous, but they do leave a mess. If you have a convertible or a bicycle or even just a clean car, you probably don’t want them hanging around. [Max] was tired of a messy balcony, so like you might approach any engineering problem, he worked his way through several possible solutions, starting with plastic crows and naturally ending with an automated water gun.

The resulting robotic water gun, which targets pigeons with OpenCV, is a dandy project, and while we don’t usually advocate shooting at neighborhood animals, we don’t think a little water will be any worse for the pigeons than the rain. The build started with a cheap electric water pistol. A Wemos D1 Mini ESP8266 development board provides the brainpower. The water pistol wouldn’t easily take rechargeable batteries, and it’s a good idea to separate the logic supply from the pump motors anyway, so the D1 gets power from a USB power bank separate from the gun’s batteries.

That leaves the camera. An old iPhone 6S with a 3D printed bracket feeds video to a Python script that uses OpenCV. The script looks for frame-to-frame changes to detect that something is moving, and fires the gun. It doesn’t appear that it actually tracks the pigeons, so maybe that’s a thought for version 2.
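For the curious, the bones of such a motion detector fit in a few lines of OpenCV. The sketch below is our own minimal take on frame differencing, not [Max]’s actual script; the camera index, threshold values, and the fire_gun() stub are all placeholders to tune for your own balcony.

```python
import cv2

def fire_gun():
    print("squirt!")  # placeholder for whatever signals the D1 Mini

cap = cv2.VideoCapture(0)  # camera index is a placeholder
ret, prev = cap.read()
prev_gray = cv2.GaussianBlur(
    cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), (21, 21), 0)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.GaussianBlur(
        cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)

    # Absolute difference against the previous frame highlights motion
    delta = cv2.absdiff(prev_gray, gray)
    thresh = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]

    # If enough pixels changed, assume a pigeon has landed
    if cv2.countNonZero(thresh) > 5000:
        fire_gun()

    prev_gray = gray
    cv2.imshow("motion", thresh)
    if cv2.waitKey(30) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```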

Was it successful? Maybe, but it does seem like the pigeons learned to avoid it. We still think adding azimuth and elevation control to the gun would help.

Most of the time when we see pigeon hacking it is to use them for nefarious purposes. [Max] should be glad he doesn’t have to deal with lions.

Learn Sign Language Using Machine Vision

Learning a new language is a great way to exercise the mind and learn about different cultures, and it’s great to have a native speaker around to improve the learning experience. Without one, it’s still possible to learn via videos, books, and software. The task gets much more complicated, though, when the language in question isn’t spoken, like American Sign Language. This project allows users to learn the ASL alphabet with the help of computer vision and some machine learning algorithms.

The build uses a MobileNetV2 computer vision model trained on each sign in the ASL alphabet. A sign is shown to the user on a screen, and the user needs to demonstrate it back to the computer in order to progress. To do this, OpenCV running on a Raspberry Pi with a PiCamera analyzes frames of the user in real time. The user is shown pictures of the correct sign, and is rewarded when the correct sign is made.
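To give a flavour of what that loop looks like, here’s a hedged sketch of classifying a webcam frame with a fine-tuned MobileNetV2 in Keras. The model file and label list are hypothetical stand-ins, not the Glasgow team’s code.

```python
import cv2
import numpy as np
import tensorflow as tf

# Hypothetical fine-tuned model and A-Z label list, not the team's files
model = tf.keras.models.load_model("asl_mobilenetv2.h5")
labels = [chr(c) for c in range(ord("A"), ord("Z") + 1)]

cap = cv2.VideoCapture(0)  # the PiCamera can appear as /dev/video0
target = "A"               # the sign the learner has been asked to make

while True:
    ret, frame = cap.read()
    if not ret:
        break
    # MobileNetV2 expects 224x224 RGB input scaled to the [-1, 1] range
    img = cv2.cvtColor(cv2.resize(frame, (224, 224)), cv2.COLOR_BGR2RGB)
    x = tf.keras.applications.mobilenet_v2.preprocess_input(
        np.expand_dims(img.astype(np.float32), axis=0))
    pred = labels[int(np.argmax(model.predict(x, verbose=0)))]
    if pred == target:
        print(f"Correct! You signed {target}")
        break

cap.release()
```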

While this only works for alphabet signs in ASL currently, the team at the University of Glasgow that built this project is planning on expanding it to include other signs as well. We have seen other machines built to teach ASL in the past, like this one which relies on a specialized glove rather than computer vision.

Continue reading “Learn Sign Language Using Machine Vision”

Twitch And Blink Your Way Through Typing With This Facial Keyboard

For those that haven’t experienced it, the early days of parenthood are challenging, to say the least. Trying to get anything accomplished with a raging case of sleep deprivation is hard enough, but the little bundle of joy who always seems to need to be in physical contact with you makes doing things with your hands nigh impossible. What’s the new parent to do when it comes time to be gainfully employed?

Finding himself in such a boat, [Fletcher]’s solution was to build a face-activated keyboard to work around his offspring’s needs. Before you ask: no, voice recognition software wouldn’t work, at least according to the sleepy little boss who protests noisy awakenings. The solution instead was to first try OpenCV and the dlib facial recognition library to watch [Fletcher] blinking out Morse code. While that sorta-kinda worked, one’s blinkers can’t long endure such a workout, so he moved on to an easier set of gestures. Mouthing Morse code covers most of the keyboard, while a combination of eye, eyebrow, and other facial twitches and tics cover the rest, with MediaPipe’s Face Mesh doing the heavy lifting in terms of landmark detection.
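As a taste of how little code the landmark side takes, here’s a minimal sketch using MediaPipe’s Face Mesh to decide whether the mouth is open in each frame, which is the raw signal you’d time into dots and dashes. The landmark indices are the standard inner-lip points; the threshold is a guess to tune, and none of this is [Fletcher]’s actual CheekyKeys code.

```python
import cv2
import mediapipe as mp

mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)
cap = cv2.VideoCapture(0)

UPPER_LIP, LOWER_LIP = 13, 14  # Face Mesh inner-lip landmark indices

while True:
    ret, frame = cap.read()
    if not ret:
        break
    results = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        gap = abs(lm[UPPER_LIP].y - lm[LOWER_LIP].y)
        # Rough threshold, tune per face and camera; timing the open/closed
        # spans into dots and dashes is the next step
        print("open" if gap > 0.03 else "closed")
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
```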

The resulting facial keyboard, aptly dubbed “CheekyKeys,” performed well enough for [Fletcher] to use for a skills test during an interview with a Big Tech Company. Imagining the interviewer on the other end watching him convulse his way through the interview was worth the price of admission, and we don’t even care if it was a put-on. Video after the break.

CheekyKeys is pretty cool, doing something with a webcam and Python that we thought would have needed a dedicated AI depth camera to accomplish. But perhaps the real hack here was how [Fletcher] taught himself Morse in fifteen minutes.

Continue reading “Twitch And Blink Your Way Through Typing With This Facial Keyboard”

CNC Toolpath Visualisation With OpenCV

[Tony Liechty] has been having a few issues getting into CNC machining — starting with a simple router, he’s tripped over the usual beginners’ problems, you know, things like alignment of the design to the workpiece shape, axis clipping, and workpiece/clamp collisions. He did the decent hacker thing and turned to some other technologies to help out, coming up with a rather neat way of using machine vision with OpenCV to preview the toolpath against an image of the workpiece in-situ (video, embedded below).

ChArUco boards (a combined chessboard and ArUco marker pattern) taped to the machine rails give OpenCV a spatial reference: once the pattern is detected, pixel locations within the image of the rails can be tied to known positions in space. A homography transformation then links the two side references to an image of the workpiece, allowing the system to determine the physical location of any pixel in the workpiece image, which can then be overlaid with the desired toolpath. The user can then adjust the path, applying shifts or rotations to counter any problem that can be seen. Fewer ‘silly’ clamping and positioning mistakes means less time wasted and less material in the scrap bin, and that can only be a good thing.
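To make the geometry concrete, here’s a compressed sketch of the ChArUco-to-homography step in OpenCV. It uses the pre-4.7 opencv-contrib aruco API, and the board dimensions and image file are made up for illustration; [Tony]’s realWorldGcodeSender does considerably more.

```python
import cv2
import numpy as np

# Pre-4.7 opencv-contrib aruco API; board dimensions are illustrative
dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
board = cv2.aruco.CharucoBoard_create(5, 7, 0.03, 0.022, dictionary)

img = cv2.imread("workpiece.jpg")  # hypothetical capture of the machine bed
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
n, ch_corners, ch_ids = cv2.aruco.interpolateCornersCharuco(
    corners, ids, gray, board)

# Pair each detected corner's pixel position with its known physical
# position on the board (board.chessboardCorners is in board units)
obj_pts = board.chessboardCorners[ch_ids.flatten()][:, :2]
H, _ = cv2.findHomography(ch_corners.reshape(-1, 2), obj_pts)

# Map any pixel from the image into physical board coordinates
px = np.float32([[[640, 360]]])
print(cv2.perspectiveTransform(px, H))
```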

[Tony] says the code and setup are just a demo of the concept, but such ‘rough’ code could well be the start of something great; we shall see. Check out the realWorldGcodeSender GitHub if you want to play along at home!

We’ve seen a few uses of OpenCV for assisting with CNC applications, like this cool ‘you draw it, I’ll cut it’ hack, and this method for using machine vision to zero a CNC mill in on the centre of a large hole.

Continue reading “CNC Toolpath Visualisation With OpenCV”

You Draw It, CNC Cuts It

[Jamie] aka [vector76] hit us with a line-tracing plugin for OctoPrint that cuts out whatever 2D shape you draw on a piece of wood. The plugin lets you skip the modeling step entirely, going straight from a CNC-mounted webcam that reads your scribbles and gives you a Gcode toolpath in return. The code is on GitHub and there’s a demo video embedded below.

Under the hood, OpenCV is doing a lot of the image processing, including line detection, and the iterative “find the line” and “move the toolhead” steps really show off what computer vision can do. It starts off with a fiducial arrow for scale and orientation, then it moves the webcam around the scene. The user can enter the usual milling parameters: speeds, feeds, depth of cut, tool offset, milling direction, etc. And then it gets to work.
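The core idea (threshold the image, pull the drawn line out as a contour, and walk it as a toolpath) fits in a toy example. This is not [Jamie]’s plugin code; the scale factor and cutting parameters below are arbitrary stand-ins for what the fiducial arrow and user settings provide.

```python
import cv2

img = cv2.imread("scribble.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical capture
# Dark marker on light wood: invert-threshold so the line becomes white
_, mask = cv2.threshold(img, 100, 255, cv2.THRESH_BINARY_INV)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)

path = max(contours, key=cv2.contourArea)  # assume the biggest blob is the line
MM_PER_PX = 0.2  # the real plugin derives scale from the fiducial arrow

print("G21 G90")      # millimetres, absolute positioning
print("G0 Z5")        # retract to a safe height
x0, y0 = path[0][0]
print(f"G0 X{x0 * MM_PER_PX:.2f} Y{y0 * MM_PER_PX:.2f}")
print("G1 Z-1 F100")  # plunge to cutting depth
for pt in path[1:]:
    x, y = pt[0]
    print(f"G1 X{x * MM_PER_PX:.2f} Y{y * MM_PER_PX:.2f} F600")
print("G0 Z5")
```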

Right now, it’s limited to paths with non-crossing lines, and probably with good contrast and a nice dark line, all the usual CV restrictions. But mounting a webcam to a CNC toolhead and using it for various pathing problems really opens up tons of possibilities: visual homing, workpiece edge finding, copying parts, custom fitting odd shapes, and more. This project is clearly an appetizer, an invitation to keep on hacking. Once you see the girl pirate robot that [Jamie]’s daughter made, you’ll get the idea.

We’ve seen a similar OpenCV approach used for center-finding bore holes, but while we’ve seen a few webcams used with laser cutters, the CNC mill applications seem largely untapped. Let us know in the comments if you’ve got some other good examples.

Continue reading “You Draw It, CNC Cuts It”

OpenCV Brings Pinch To Zoom Into The Real World

Gesture controls arrived in the public consciousness a little over a decade ago as touchpads and touchscreens became more popular. The main limitation to gesture controls, at least as far as [Norbert] is concerned, is that they can only control objects in a virtual space. He was hoping to use gestures to control a real-world object instead, and created this device which uses gestures to control an actual picture.

In this unique augmented reality device, not only is the object being controlled in the real world, but the gestures are monitored there as well, thanks to an OpenCV-based computer vision system watching his hand. The position data is fed into an algorithm which controls a physical picture mounted on a slender robotic arm. Now, when [Norbert] “pinches to zoom”, the servo attached to the picture physically brings it closer to or further from his field of view. He can also use other gestures to move the picture around.
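For anyone wanting to play along, here’s one way the pinch measurement could work, using MediaPipe’s hand tracker; we don’t know exactly which hand-tracking approach [Norbert] used, so treat this as a generic stand-in. The thumb-to-index distance is mapped onto a hypothetical 0 to 180 degree servo angle.

```python
import cv2
import math
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)
THUMB_TIP, INDEX_TIP = 4, 8  # MediaPipe hand landmark indices

while True:
    ret, frame = cap.read()
    if not ret:
        break
    res = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if res.multi_hand_landmarks:
        lm = res.multi_hand_landmarks[0].landmark
        pinch = math.dist((lm[THUMB_TIP].x, lm[THUMB_TIP].y),
                          (lm[INDEX_TIP].x, lm[INDEX_TIP].y))
        # Map a 0..0.3 normalised pinch span onto a 0..180 degree angle
        angle = max(0, min(180, int(pinch / 0.3 * 180)))
        print(f"servo angle: {angle}")  # stand-in for the arm's real protocol
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
```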

While this gesture-controlled machine is certainly a proof-of-concept, there are plenty of other uses for gesture controls of real-world objects. Any robotics platform could benefit from an interface like this, or even something slightly more mundane like an office PowerPoint presentation. Opportunity abounds, but if you need a primer for OpenCV take a look at this build which tracks a hand in minute detail.

Continue reading “OpenCV Brings Pinch To Zoom Into The Real World”

ElectronBot: A Sweet Mini Desktop Robot That Ticks All The Boxes

[Peng Zhihui] seems to have found some spare time and energy to crack out another sweet robot build, this time a much smaller, and cuter, emoji-bot (original GitHub link) with the usual production-ready levels of attention to detail. With a lot of fine detail in the 3D printed models, this is one for SLS printing in nylon, but that can be done for a reasonable outlay, in China at least. The electronics package consists of a few fully custom, and tiny, PCBs designed with Altium Designer, with off-the-shelf modules for the circular LCD and camera. The main board hosts an STM32F405 and deals with the display and SD card; this particular STM32 was chosen because it can connect to an external USB3300 high-speed USB PHY. A sensor PCB handles the gesture sensor, a USB hub, an MPU6050 6-axis IMU, and the USB camera module. This board attaches to the USB-C connector in the base via an FFC cable, allowing the robot to rotate on its base.

Cunning two-servo shoulder mechanism

[Peng] clearly has exacting standards as to how things should work, and we guess he wanted the arms to be back-driveable in a way that enables the host computer to track and record the motor positions for replaying later on. The connection back to the controller is via I2C, allowing all five servos to hang on the same bus and saving precious resources. Smart! Getting a processor and motor driver into such a tiny space was a bit of a challenge, but a walk in the park for [Peng], as he demonstrates in the video embedded below (we believe English subtitles are pending!) The arm mechanism is particularly interesting and rather elegantly executed, and he does seem rather proud of this part of the design, as well he should! As with [Peng]’s other projects, there is a lot to see, and plenty of scope for feature explosion. It was nice to see the ‘bot being used as an input device, not only with gesture sensing via the dedicated sensor, but also using the camera with OpenCV to track user posture and act accordingly. This thing could act as a genuinely useful AI device, while being darn cute at the same time!
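We can only guess at how the posture-tracking loop is put together, but a plain OpenCV face detector is enough to sketch the idea: find the user’s face, then turn its offset from the frame centre into a command for the base. The command format below is invented; [Peng]’s firmware speaks its own protocol.

```python
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    faces = cascade.detectMultiScale(
        cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 1.3, 5)
    if len(faces):
        x, y, w, h = faces[0]
        # Horizontal offset of the face centre from frame centre, -1..1
        offset = ((x + w / 2) - frame.shape[1] / 2) / (frame.shape[1] / 2)
        print(f"turn base by {offset:+.2f}")  # invented command format
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
```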

We know you come to Hackaday for your cute robot fix, and we’re not going to disappoint. Here’s a cute robot lamp, an obligatory Spot-style robot dog project, and if you’re more of a cat person, then we’ve got that base covered as well.

Continue reading “ElectronBot: A Sweet Mini Desktop Robot That Ticks All The Boxes”