OpenCV Running On A Tiny Microcontroller

At first blush, it might seem like projects that make extensive use of computer vision or machine learning need to be based on powerful computing platforms with plenty of clock cycles and memory. While there is some truth to that, as the field progresses it becomes possible to experiment with these tools on low-power devices as well. Take, for example, this OpenCV project built entirely around an ESP32.

With that being said, there are some modifications that need to be made to the ESP32 in order to use OpenCV in any meaningful way. The most important of these is the use of the ESP32-DOWDQ6 module, which increases the available memory of the ESP32 so it can make better use of camera functions. Even then, the ESP32 can’t run the full OpenCV library, so a slimmed-down build of OpenCV is required before the device can run it natively. Once those two obstacles are out of the way, though, things like edge detection, as this project demonstrates, are well within the realm of possibility.
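For a sense of what that edge detection step involves, here is a minimal sketch using OpenCV’s desktop Python bindings rather than the trimmed C++ build that runs on the microcontroller; the file name and Canny thresholds are placeholders, not values from the project.

```python
import cv2

# Stand-in for a frame grabbed from the camera; on the ESP32 this would
# come straight from the camera driver rather than a file on disk.
frame = cv2.imread("camera_frame.jpg")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # Canny expects a single-channel image
blurred = cv2.GaussianBlur(gray, (5, 5), 0)      # a light blur suppresses sensor noise
edges = cv2.Canny(blurred, 50, 150)              # hysteresis thresholds are scene-dependent
cv2.imwrite("edges.png", edges)
```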

If running OpenCV on something as small as an ESP32 is possible, it is even easier to run on something orders of magnitude more powerful yet still inexpensive, such as the Raspberry Pi. The project’s code is available on its GitHub page for those interested, and we have featured plenty of other OpenCV projects on more powerful platforms as well, like this clock which falls off the wall whenever someone looks at it.

Continue reading “OpenCV Running On A Tiny Microcontroller”

Bot Makes Etch A Sketch Art In One Continuous Line

Introduced in 1960 for the princely sum of $2.99 ($25.00 today), Etch A Sketch was to become a standard-issue item for the Baby Boomers’ toy box. As enchanting as the toy seems, it’s hard to see why it had such staying power: it was hard for young fingers to twirl the knobs, diagonal lines and smooth curves required a concert pianist’s fine motor control, and whatever drawings we managed to make were erased at the slightest jostle of the tablet.

Intent on righting these wrongs, [Sunny Balasubramanian] not only motorized an Etch A Sketch but also gave it a mind of its own, in a way. For those unfamiliar with the toy, it’s basically a manual X-Y plotter that drags a stylus across the underside of a glass screen, scraping off a silver powder clinging to the glass to make dark lines. Replacing the knobs with steppers is straightforward, of course, but driving them is the trick. [Sunny] hooked his up to a Raspberry Pi and wrote some Python code to drive them. The Pi also accepts input image files and processes them for rendering through the plotter, first doing Canny edge detection in OpenCV, then plotting a single path through the largest collection of connected pixels in the image. From there it’s just a matter of spinning the motors to create surprisingly detailed images. Check out the short video below to see it in action.
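To make that image-processing step concrete, here is a rough Python sketch of the idea described above: run Canny, keep the largest connected blob of edge pixels, then order those pixels into one continuous stroke. [Sunny]’s actual implementation may differ; the function name, thresholds, and the greedy nearest-neighbour ordering here are illustrative assumptions.

```python
import cv2
import numpy as np

def image_to_single_path(image_file, canny_lo=50, canny_hi=150):
    """Turn an image into one continuous list of (x, y) points to plot."""
    gray = cv2.imread(image_file, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(gray, canny_lo, canny_hi)

    # Label connected groups of edge pixels and keep only the largest one.
    num_labels, labels = cv2.connectedComponents(edges)
    if num_labels < 2:
        return []
    sizes = [(labels == i).sum() for i in range(1, num_labels)]
    biggest = 1 + int(np.argmax(sizes))
    ys, xs = np.nonzero(labels == biggest)
    points = list(zip(xs.tolist(), ys.tolist()))

    # Greedy nearest-neighbour walk turns the pixel cloud into one stroke.
    path = [points.pop(0)]
    while points:
        cx, cy = path[-1]
        nearest = min(range(len(points)),
                      key=lambda i: (points[i][0] - cx) ** 2 + (points[i][1] - cy) ** 2)
        path.append(points.pop(nearest))
    return path
```

Each (x, y) step in the returned path would then map to a pair of relative stepper moves on the horizontal and vertical knobs.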

It’s hardly the first automatic Etch A Sketch we’ve seen – here’s one that automates everything, including the shake to erase the drawing. That one cheats a little, though, in that it rasters across the screen like a CRT. We really like how this one just draws a single path. Pretty clever.

Continue reading “Bot Makes Etch A Sketch Art In One Continuous Line”

Hack Your Brain: The McCollough Effect

There is a fascinating brain reaction known as the McCollough Effect which is like side-loading malicious code through your eyeballs. Although this looks and smells like an optical illusion, the science would argue otherwise. What Celeste McCollough observed in 1965 can be described as a contingent aftereffect, although we call it “The McCollough Effect” because McCollough was the first to recognize the phenomenon. It’s something that can’t be unseen… sometimes affecting your vision for months!

I am not suggesting that you experience the McCollough Effect yourself. We’ll look at the phenomenon behind the McCollough Effect, and it can be understood without subjecting yourself to it. If you must experience the McCollough Effect, you do so at your own risk (here it is presented as a video). But read on to understand what is happening before you take the plunge.

Continue reading “Hack Your Brain: The McCollough Effect”

Roboartist Draws What It Sees


The perfect balance of simplicity and complexity has been struck with this automated artist. The Roboartist is a vector-drawing robot project that [Niazangels], [Maxarjun], and [Ashwin] have been documenting for the last few days. The killer feature of the build is the ability to process what is seen through a webcam so that it can be sketched as ink on paper by the robotic arm.

The arm itself has four stages and, as you can see in the video below, remarkably little slop. The slight wiggle that remains is just enough to make the images seem as if they were not printed to perfection, and we like that effect!

Above is a still of Roboartist working on a portrait of [Heath Ledger] in his role as the Joker from The Dark Knight. The image import feature, an alternative to capturing the subject through the webcam, was used for this. The software runs a tweaked version of the Canny Edge Detector to determine where the pen strokes go. For now, MATLAB is part of the software chain, but future work aims to move to more open-source tools. The hardware itself uses an Arduino Mega to take input via USB or Bluetooth and drive the quartet of servo motors accordingly.
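The wire protocol between the host and the Arduino Mega isn’t spelled out here, so the snippet below is only a hypothetical illustration of how a host-side Python script might stream pen-path coordinates to the board over a serial link with pyserial; the port name, baud rate, and “x,y” framing are all assumptions, not details from the project.

```python
import time
import serial  # pyserial

def stream_path(points, port="/dev/ttyACM0", baud=115200):
    """Send a list of (x, y) targets to the drawing firmware, one per line."""
    with serial.Serial(port, baud, timeout=1) as link:
        time.sleep(2)  # the Mega resets when the port opens; give it a moment
        for x, y in points:
            link.write(f"{x},{y}\n".encode())  # made-up framing: one coordinate pair per line
            link.readline()                    # wait for the firmware to acknowledge the move
```

A Bluetooth link would look much the same from the host side, just pointed at a different serial port.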

Continue reading “Roboartist Draws What It Sees”