Introduced in 1960 for the princely sum of $2.99 ($25.00 today), Etch A Sketch was to become a standard-issue item in the Baby Boomer toy box. As enchanting as the toy seems, it’s hard to see why it had staying power: it was hard for young fingers to twirl the knobs, diagonal lines and smooth curves required a concert pianist’s fine motor control, and whatever drawings we managed to make were erased at the slightest jostle of the tablet.
Intent on righting these wrongs, [Sunny Balasubramanian] not only motorized an Etch A Sketch, but he’s also given it a mind of its own in a way. For those unfamiliar with the toy, it’s basically a manual X-Y plotter that drags a stylus across the underside of a glass screen, scraping off a silver powder clinging to the glass to make dark lines. Replacing the knobs with steppers is straightforward, of course, but driving them is the trick. [Sunny] hooked his up to a Raspberry Pi and wrote some Python code to drive them. The Pi also accepts input image files and processes them for rendering through the plotter, first doing Canny edge detection in OpenCV, then plotting a single path through the largest collection of connected pixels in the image. From there it’s just a matter of spinning the motors to create surprisingly detailed images. Check out the short video below to see it in action.
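We don’t have [Sunny]’s code in front of us, but the image-processing half boils down to something like the rough Python sketch below: run Canny edge detection, keep the largest connected clump of edge pixels, then walk those pixels into one continuous path. The thresholds and the greedy nearest-neighbor walk here are our own stand-ins, not his actual path planner.

```python
# Hedged sketch (not [Sunny]'s actual code): Canny edges -> largest connected
# component -> one continuous stylus path, ready to translate into knob turns.
import cv2
import numpy as np

def edge_path(image_path):
    # Read the image as grayscale and find edges with Canny.
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(gray, 100, 200)  # threshold values are a guess

    # Label connected groups of edge pixels and keep the biggest one.
    num_labels, labels = cv2.connectedComponents(edges)
    sizes = [(labels == i).sum() for i in range(1, num_labels)]
    biggest = labels == (int(np.argmax(sizes)) + 1)

    # Greedy nearest-neighbor walk through that component's pixels gives a
    # single unbroken path, the Etch A Sketch's one hard constraint.
    points = [tuple(p) for p in np.argwhere(biggest)]
    path = [points.pop(0)]
    while points:
        last = np.array(path[-1])
        dists = [np.hypot(*(np.array(p) - last)) for p in points]
        path.append(points.pop(int(np.argmin(dists))))
    return path  # list of (row, col) coordinates to feed to the stepper driver
```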
It’s hardly the first automatic Etch A Sketch we’ve seen – here’s one that automates everything, including the shake to erase the drawing. That one cheats a little though, in that it rasters across the screen like a CRT. We really like how this one draws everything as a single continuous path instead. Pretty clever.
Continue reading “Bot Makes Etch A Sketch Art In One Continuous Line”



Fifteen sensor boards, called K-Ceptors, are attached to various points on the body, each containing an LSM9DS1 IMU (Inertial Measurement Unit). The K-Ceptors are wired together while still allowing plenty of freedom to move around. Communication is via I2C to a Raspberry Pi, which then sends the collected data over WiFi to a desktop machine. As you move around, a 3D model of a human figure follows in real time, displayed on the desktop’s screen using Blender, the popular free 3D modeling software. Of course, you can do something else with the data if you want – perhaps make a robot move? Check out the overview and the performance by a clearly experienced dancer putting the system through its paces in the video below.
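To give a flavor of what the Pi side involves, here’s a hedged Python sketch of one sensor being polled over I2C and streamed to the desktop over WiFi. The register addresses come from the LSM9DS1 datasheet; the smbus2 library, the UDP transport, and the HOST/PORT values are our own assumptions, not the project’s actual firmware.

```python
# Rough sketch, not the project's code: poll one LSM9DS1 accel/gyro over I2C
# on a Raspberry Pi and stream readings to a desktop over UDP/WiFi.
import json, socket, struct, time
from smbus2 import SMBus

AG_ADDR = 0x6B          # common LSM9DS1 accel/gyro I2C address
CTRL_REG1_G = 0x10      # gyro output data rate / full-scale control
CTRL_REG6_XL = 0x20     # accel output data rate / full-scale control
OUT_X_L_G = 0x18        # first of six gyro output bytes (X, Y, Z)
OUT_X_L_XL = 0x28       # first of six accel output bytes (X, Y, Z)

HOST, PORT = "192.168.1.50", 6000   # placeholder address of the desktop machine

def read_vec(bus, start_reg):
    # Read six bytes starting at start_reg and unpack as little-endian int16s.
    raw = bus.read_i2c_block_data(AG_ADDR, start_reg, 6)
    return struct.unpack("<hhh", bytes(raw))

with SMBus(1) as bus, socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    bus.write_byte_data(AG_ADDR, CTRL_REG1_G, 0x60)   # gyro on at 119 Hz
    bus.write_byte_data(AG_ADDR, CTRL_REG6_XL, 0x60)  # accel on at 119 Hz
    while True:
        sample = {"gyro": read_vec(bus, OUT_X_L_G),
                  "accel": read_vec(bus, OUT_X_L_XL)}
        sock.sendto(json.dumps(sample).encode(), (HOST, PORT))
        time.sleep(1 / 60)   # roughly 60 samples per second
```

On the desktop, the same socket idea in reverse – a small listener feeding bone rotations into Blender via its Python API – would get you the live-updating figure described above.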




