Making Music With A Go Board Step Sequencer

Ever wonder what your favorite board game sounds like? Neither did we. Thankfully [Sara Adkins] did, and created a step sequencer called Let’s Go that uses the classic board game Go as input.

In the game Go, two players place black and white tokens on a grid, vying for control of the board. As the game progresses, the configuration of game pieces gets more complex and coincidentally begins to resemble Conway’s Game of Life (or a weird QR Code). Sara saw music in the evolving arrangement of circles and transformed the ancient board game into a modern instrument so others could hear it too.

To an observer, [Sara's] adaptation looks nearly indistinguishable from the version played in China 2,500 years ago, with the exception of an overhead webcam and nearby laptop, of course. The laptop uses OpenCV to digitize the board layout. It feeds that information via Open Sound Control (OSC) into the popular music creation software Max/MSP (though an open-source version could probably be implemented in Pure Data), where it's used to control a step sequencer. Each row on the board represents an instrumental voice (melodic for white pieces, percussive for black ones), and each column corresponds to a beat.
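[Sara]'s actual code is on GitHub; as a rough illustration of the vision-to-music handoff, here's a minimal, hypothetical sketch of the OpenCV-to-OSC step using the python-osc library. The grid sampling, OSC address, and port are our assumptions, and it presumes the camera view is already cropped and aligned to the board:

```python
import cv2
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

GRID = 19                                      # standard Go board size
client = SimpleUDPClient("127.0.0.1", 8000)    # assumed OSC host and port

def board_state(frame):
    """Sample the frame at each grid intersection and classify the stone."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    state = np.zeros((GRID, GRID), dtype=int)  # 0 empty, 1 black, 2 white
    for r in range(GRID):
        for c in range(GRID):
            y = int((r + 0.5) * h / GRID)
            x = int((c + 0.5) * w / GRID)
            patch = gray[y - 5:y + 5, x - 5:x + 5].mean()
            if patch < 60:                     # dark stone
                state[r, c] = 1
            elif patch > 200:                  # light stone
                state[r, c] = 2
    return state

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # One OSC message per row: the row index, then its cells
    for r, row in enumerate(board_state(frame)):
        client.send_message("/go/row", [r] + row.tolist())
```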

Every new game is a new piece of music that starts out simple and gradually increases in complexity. The music evolves with the board, and adds a new dimension for players to interact with the game. If you want to try it out yourself, [Sara] has the project fully documented on her website, and all of the code is available on GitHub. Now we’re just left wondering what other games sound like — [tinkartank] already answered that question for chess, but what about Settlers of Catan?

TOBOT Is Your Tic Tac Toe Opponent With A Bad Attitude

[3dprintedlife] is apparently a little bored. Instead of whiling away the time playing tic tac toe, he built an impressive tic tac toe robot named TOBOT. The robot uses a Raspberry Pi Zero and a Feather to control a two-axis robot arm that can draw the board and make moves using a pen. It also uses a simple computer vision system to look at the board to understand your move, and it has a voice too.

The other thing TOBOT has is a bad attitude. The robot wants to win. Badly. Check out the video below and you’ll see what we mean.
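The video doesn't spell out the move-selection logic, but a tic tac toe robot that badly wants to win is almost certainly searching the full game tree. A minimal minimax sketch in Python (our illustration, not necessarily [3dprintedlife]'s actual code):

```python
WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),      # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),      # columns
             (0, 4, 8), (2, 4, 6)]                 # diagonals

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, best move) for `player`; X maximizes, O minimizes."""
    w = winner(board)
    if w:
        return (1 if w == "X" else -1), None
    moves = [i for i, cell in enumerate(board) if cell == " "]
    if not moves:
        return 0, None                             # draw
    best_score, best_move = None, None
    for m in moves:
        board[m] = player
        score, _ = minimax(board, "O" if player == "X" else "X")
        board[m] = " "
        if best_score is None or (score > best_score if player == "X"
                                  else score < best_score):
            best_score, best_move = score, m
    return best_score, best_move

# TOBOT's turn: pick a perfect move for X on the current board
_, move = minimax(list(" " * 9), "X")
```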

OAK Vision Modules Help You See The Forest And The Trees

OpenCV is an open source library of computer vision algorithms whose power and flexibility have made many machine vision projects possible. But even with code highly optimized for maximum performance, we always wish for more. That's why our ears perk up whenever we hear about a hardware-accelerated vision module, and the latest buzz is coming out of the OpenCV AI Kit (OAK) Kickstarter campaign.

Two vision modules launched with this campaign: the OAK-1, with a single color camera for two-dimensional vision applications, and the OAK-D, which adds stereo cameras for that third dimension. The onboard brain is a Movidius Myriad X processor which, according to team members who have dug through its datasheet, has been massively underutilized in other products. They believe OAK modules will help the chip fulfill its potential for vision applications, delivering high performance while consuming little power in a small form factor. Reading over the spec sheet, we think it's fair to call these “Ultimate Myriad X Dev Boards”, but we must concede “OpenCV AI Kit” sounds better. It does not provide hardware acceleration for the entire OpenCV library (likely an impossible task), but it does cover the highly demanding subset suitable for Myriad X acceleration.

Since the campaign launched a few weeks ago, additional information has been released to help assure backers that this project has real substance. It turns out OAK is an evolution of a project we covered almost exactly one year ago that became a real product, DepthAI, so at least this is not their first rodeo. It is also encouraging that their invitation to the open hardware community has already borne fruit. Check out this thread discussing OAK for robot vision, where a question was met with an honest “we don't have expertise there” from the OAK team, but then ArduCam pitched in with their camera module experience to help.
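For a sense of what programming these modules looks like, here's a minimal frame-grab sketch using the team's depthai Python library. The API has been through major revisions, so treat this as illustrative of the library's current shape rather than a guaranteed match for the campaign hardware:

```python
import cv2
import depthai as dai

# Build a pipeline: color camera preview -> XLink stream back to the host.
# The Myriad X on the OAK module runs the pipeline; the host just reads frames.
pipeline = dai.Pipeline()
cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("preview")
cam.preview.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="preview", maxSize=4, blocking=False)
    while True:
        frame = q.get().getCvFrame()
        cv2.imshow("OAK preview", frame)
        if cv2.waitKey(1) == ord("q"):
            break
```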

We wish them success for their planned December 2020 delivery. They have already far surpassed their funding goals, they’ve shipped hardware before, and we see a good start to a development community. We look forward to the OAK-1 and OAK-D joining the ranks of other hacking friendly vision modules like OpenMV, JeVois, StereoPi, and AIY Vision.

Pinball Machine Needs No Wizard

Ever since he was a young boy, [Tyler] has played the silver ball. And like us, he’s had a lifelong fascination with the intricate electromechanical beasts that surround them. In his recently-completed senior year of college, [Tyler] assembled a mechatronics dream team of [Kevin, Cody, and Omar] to help turn those visions into self-playing pinball reality.

You can indeed play the machine manually, and the Arduino Mega will keep track of your score just like a regular cabinet. If you need to scratch an itch, ignore a phone call, or just plain want to watch a pinball machine play itself, it can switch back and forth on the fly. The USB camera mounted over the playfield tracks the ball as it speeds around. Whenever the ball enters one of the flipper trigger zones (the “flipper vectors”), the appropriate flipper engages automatically to bat it away.
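The build logs cover the real implementation; as a hypothetical sketch of that track-and-trigger loop, the vision side could be as simple as background subtraction plus a bounding-box test, with a serial byte to the Arduino Mega when the ball enters a zone. The zone coordinates and serial protocol here are invented for illustration:

```python
import cv2
import serial

# Flipper trigger zones as (x, y, w, h) in camera pixels -- placeholder values
ZONES = {"L": (100, 400, 120, 80), "R": (300, 400, 120, 80)}

port = serial.Serial("/dev/ttyACM0", 115200)   # Arduino Mega driving the solenoids
subtractor = cv2.createBackgroundSubtractorMOG2()
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Isolate moving blobs and assume the largest one is the ball
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        continue
    (bx, by), _ = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    for name, (x, y, w, h) in ZONES.items():
        if x <= bx <= x + w and y <= by <= y + h:
            port.write(name.encode())          # 'L' or 'R' fires that flipper
```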

Our favorite part of this build (aside from the fact that it can play itself) is the pachinko multi-ball feature that manages to squeeze in a second game and a second level. This project is wide open, and even if you’re not interested in replicating it, [Tyler] sprinkled a ton of good info and links to more throughout the build logs. Take a tour after the break while we have it set on free play.

[Tyler]’s machine uses actual pinball machine parts, which could quickly ramp up the cost. If you roll your own targets and get creative with solenoid sourcing, building a pinball machine doesn’t have to be a drain on your wallet.

Machine Vision Keeps Track Of Grubby Hands

Can you remember everything you’ve touched in a given day? If you’re being honest, the answer is, “Probably not.” We humans are a tactile species, with an outsized proportion of both our motor and sensory nerves sent directly to our hands. We interact with the world through our hands, and unfortunately that may mean inadvertently spreading disease.

[Nick Bild] has a potential solution: a machine-vision system called Deep Clean, which monitors a scene and records anything in it that has been touched. [Nick]'s system uses a Jetson Xavier and a stereo camera to detect depth in a scene; he built his camera from a pair of Raspberry Pi cams and a Pi 3B+, but other depth cameras like a Kinect could probably do the job. The idea is to watch the scene for human hands (OpenPose is the tool he chose for that job) and correlate their depth in the scene with the depth of objects. Touch a doorknob or a light switch, and a marker is left on the scene. A cleaning crew could then look at the marked-up scene to determine which areas need extra attention. We can think of plenty of applications that extend beyond the current crisis, as the ability to map areas that have been touched seems generally useful.
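Conceptually, the touch test boils down to comparing the depth of each hand keypoint with the depth of the static scene at the same pixel: if the two agree, the hand is on the surface rather than passing in front of it. A simplified sketch of that correlation step, with the keypoints and depth maps assumed to come from OpenPose and the stereo rig:

```python
import numpy as np

TOUCH_TOLERANCE_MM = 30    # how closely hand and surface depth must agree (assumed)

def update_touch_map(touch_map, depth_scene, depth_now, fingertips):
    """Mark pixels where a fingertip's depth matches the static scene depth.

    touch_map   -- accumulating 2D boolean array of touched locations
    depth_scene -- depth map of the empty scene, in millimeters
    depth_now   -- live depth map for the current frame
    fingertips  -- (x, y) pixel coordinates of hand keypoints, e.g. from OpenPose
    """
    for x, y in fingertips:
        hand = int(depth_now[y, x])
        surface = int(depth_scene[y, x])
        # Fingertip sits at the surface, not in front of it: record a touch
        if abs(hand - surface) < TOUCH_TOLERANCE_MM:
            touch_map[y - 5:y + 5, x - 5:x + 5] = True
    return touch_map

# Usage: touch_map = update_touch_map(np.zeros(scene.shape, bool), scene, live, tips)
```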

[Nick] has been getting some mileage out of that Xavier lately — he’s used it to build an AI umpire and shades that help you find lost stuff. Who knows what else he’ll find to do with them during this time of confinement?

Upgrade Your Shades To Find Lost Items

Ever wish you could augment your sense of sight?

[Nick Bild]’s latest hack helps you find objects (or people) by locating their position and tracking them with a laser. The device, dubbed Artemis, latches onto your eyeglasses and can be configured to locate a specific object.

Images collected from the device are streamed to an NVIDIA Jetson AGX Xavier board, which uses an SSD300 (Single Shot MultiBox Detection) model to locate objects. The model was pre-trained on the COCO dataset to recognize and localize 80 different object types, working from images thresholded in OpenCV. Once the desired object is identified and located, a laser diode activates.
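We don't know exactly which SSD300 implementation Artemis runs, but torchvision ships a comparable COCO-pretrained ssd300_vgg16, so a detection pass might look something like this sketch:

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# COCO-pretrained SSD300; a stand-in for whatever checkpoint Artemis actually uses
model = torchvision.models.detection.ssd300_vgg16(weights="DEFAULT")
model.eval()

def locate(image, target_label, score_threshold=0.5):
    """Return the (x1, y1, x2, y2) box of the first confident match, or None."""
    with torch.no_grad():
        out = model([to_tensor(image)])[0]
    for box, label, score in zip(out["boxes"], out["labels"], out["scores"]):
        if int(label) == target_label and float(score) > score_threshold:
            return box.tolist()   # center this box to know where to aim the laser
    return None
```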

Probably due to the current thresholds, the demo runs work best on objects placed well apart against a neutral background. It's an interesting look at applications that combine computer vision with physical devices to augment experiences, rather than simply processing and analyzing data.

The device uses two servos to steer the laser: one for the X axis and one for the Y axis. Both are driven by an Adafruit ItsyBitsy M4 Express microcontroller.
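On the microcontroller side, steering a laser with two hobby servos is only a few lines of CircuitPython with Adafruit's motor library. The pin assignments below are guesses rather than the actual Artemis wiring:

```python
# CircuitPython on the ItsyBitsy M4: aim the laser with two hobby servos
import board
import pwmio
from adafruit_motor import servo

pan_pwm = pwmio.PWMOut(board.D9, frequency=50)    # X-axis servo (pin assumed)
tilt_pwm = pwmio.PWMOut(board.D10, frequency=50)  # Y-axis servo (pin assumed)
pan = servo.Servo(pan_pwm)
tilt = servo.Servo(tilt_pwm)

def aim(x_angle, y_angle):
    """Point the laser; angles in degrees, clamped to the servos' 0-180 range."""
    pan.angle = max(0, min(180, x_angle))
    tilt.angle = max(0, min(180, y_angle))

aim(90, 90)   # center the laser
```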

Perhaps with a bit more training, we might not have so much trouble with “Where’s Waldo” puzzles anymore.

Check out some of our other sunglasses hacks, from home automation to using LCDs to lessen the glare from headlights.

Open Source Kitchen Helps You Watch What You Eat

Every appliance business wants to be the one that invents the patented, licensable, and profitable standard that all the other companies have to use. Open Source Kitchen wants to beat them to it.

Every new standard needs a test case, and OSK's is a simple one: a bowl that tracks what you eat. While the bowl itself is a simple concept, the real goal is the way the data is shared, tracked, logged, and communicated.

The current demo uses an NVIDIA Jetson Nano as its processing center. This $100 board packs a bit of a punch in its weight class. It processes video from a camera held above the bowl of fruit, which hangs from a scale inside a squirrel-shaped hanger, to determine the calories in and calories out.
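Hypothetically, the calories-in half of that equation reduces to classifying what's in the bowl, reading the weight change from the scale, and multiplying by a calorie density. A toy sketch of that bookkeeping, with a made-up density table and stubbed-out inputs:

```python
# Toy sketch of the calories-in bookkeeping; the classifier and scale are stubbed out
CALORIES_PER_GRAM = {"apple": 0.52, "banana": 0.89, "orange": 0.47}  # rough figures

def calories_removed(food_label, grams_before, grams_after):
    """Estimate calories taken from the bowl from the scale's weight delta."""
    grams_taken = grams_before - grams_after
    if grams_taken <= 0:
        return 0.0                 # the bowl was refilled, not eaten from
    return grams_taken * CALORIES_PER_GRAM.get(food_label, 0.0)

# e.g. an apple weighing 182 g leaves the bowl: ~95 kcal logged as eaten
print(calories_removed("apple", 850, 668))
```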

It’s an interesting idea. One wonders how the IoT boom might have played out if there had been a widespread standard ready to go before people started walling their gardens.