Solving Rubik’s Cube With An FPGA

For their final project for ECE 5760 at Cornell, [Alex], [Sungjoon], and [Rameez] are solving Rubik’s Cubes. They’re doing it with an FPGA, with homebrew robot arms to twist and turn a rainbow cube into the correct position.

First, the mechanical portion of the build. The team uses a system of three robot arms positioned at the left, right, and back faces of the cube, relative to a camera. When a cube is placed in the jaws of this robot, the NTSC camera data is fed into an FPGA, where a Nios II soft core handles the actual detection of the cube faces, the solving algorithm, and the controller that sends servo commands to the robot arms.

The algorithm used for solving the cube is CFOP: solve the white cross, the white corners, the middle layer, the top face, and finally the entire cube. In practice, the robot ended up taking between 60 and 70 moves. This is not the most efficient approach; the Thistlethwaite algorithm needs at most 52 moves. There's a reason for this apparent inefficiency, though: the Thistlethwaite algorithm requires large look-up tables.

Once the cube is scanned and the correct moves are computed, the soft core sends commands out through the FPGA's GPIO pins. Each cube can be solved in under three minutes after it has been scanned, but the team ran into problems with scanning accuracy. That's something the right lighting setup and better detection of aberrant cubies could fix, and it doesn't take away from a great final project built around an FPGA.
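The write-up doesn't publish the command protocol, but the general shape of the move-to-servo hand-off is easy to picture. Here's a minimal C sketch, assuming hypothetical pwm_set_us() and delay_ms() helpers standing in for whatever the Nios II actually drives through the GPIO pins:

```c
/* Minimal sketch of turning a solver move list into arm commands.
 * pwm_set_us() and delay_ms() are hypothetical helpers; the real project
 * drives the servos from the Nios II through FPGA GPIO. */
#include <stdint.h>

typedef enum { ARM_LEFT, ARM_RIGHT, ARM_BACK } arm_t;
typedef struct { arm_t arm; int quarter_turns; } move_t;   /* +1 = CW, -1 = CCW */

extern void pwm_set_us(arm_t arm, uint16_t pulse_us);  /* hypothetical servo driver */
extern void delay_ms(uint32_t ms);                     /* hypothetical delay        */

static void execute_move(move_t m)
{
    /* Typical hobby servos map roughly 1000-2000 us of pulse width to
     * 0-180 degrees, so a 90 degree face turn is about 500 us of change. */
    const uint16_t center = 1500;
    pwm_set_us(m.arm, (uint16_t)(center + m.quarter_turns * 500));
    delay_ms(600);                 /* wait for the twist to finish   */
    pwm_set_us(m.arm, center);     /* return the gripper to neutral  */
    delay_ms(600);
}

void execute_solution(const move_t *moves, int count)
{
    for (int i = 0; i < count; i++)
        execute_move(moves[i]);
}
```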

Video demo below.

Continue reading “Solving Rubik’s Cube With An FPGA”

FPGAs Keep Track of your Ping Pong Game

It’s graduation time, and you know what that means! Another great round of senior design projects tackling something out of the ordinary. [Bruce Land] sent in a great one from Cornell where the students have been working on a project that uses FPGAs and a few video cameras to keep score of a ping-pong game.

The system works by processing a live NTSC feed of a ping pong game. The ball is painted a particular color to aid in detection, and the FPGAs that process the video can keep track of where the net is, how many times the ball bounces, and if the ball has been hit by a player. With all of this information, the system can keep track of the score of the game, which is displayed on a monitor near the table. Now, the players are free to concentrate on their game and don’t have to worry about keeping score!
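The actual tracking happens in FPGA logic on the live NTSC stream, but the idea is easy to sketch in C. The frame format, color thresholds, and bounce test below are all assumptions, not the students' implementation:

```c
/* Rough sketch of color-keyed ball tracking over an 8-bit RGB frame. */
#include <stdint.h>

#define W 320
#define H 240

typedef struct { int x, y, found; } ball_t;

/* Find the centroid of pixels matching the painted ball color
 * (an orange-ish threshold is assumed here). */
ball_t find_ball(const uint8_t *rgb)
{
    long sx = 0, sy = 0, n = 0;
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            const uint8_t *p = rgb + 3 * (y * W + x);
            if (p[0] > 200 && p[1] > 80 && p[1] < 180 && p[2] < 80) {
                sx += x; sy += y; n++;
            }
        }
    }
    ball_t b = { 0, 0, 0 };
    if (n > 20) { b.x = (int)(sx / n); b.y = (int)(sy / n); b.found = 1; }
    return b;
}

/* In image coordinates y grows downward, so a bounce shows up as the
 * ball's vertical motion flipping from downward (dy > 0) to upward. */
int bounced(int prev_dy, int dy) { return prev_dy > 0 && dy < 0; }
```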

This is a pretty impressive demonstration of FPGAs and video processing that has applications beyond just ping pong. What would you use it for? It’s always interesting to see what students are working on; core concepts from these experiments tend to make their way into their professional lives later on. Maybe they’ll even take this project to the next level and build an actual working ping pong robot to pair with their scoring system!

Continue reading “FPGAs Keep Track of your Ping Pong Game”

DIY Electrical Body Fat Analyzer

Whether you are trying to drop some fat or build some muscle, it’s important to track progress. It’s easy enough to track your weight, but weight doesn’t tell the whole story. You might be burning fat but also building muscle, which can make it appear as though you aren’t losing weight at all. A more useful number is body fat percentage. Students from Cornell have developed their own version of an electrical body fat analyzer to help track body fat percentage.

Fat-free body mass contains mostly water, whereas fat contains very little. This means that if you pass an electrical current through a body, the overall bioelectrical impedance will vary depending on how much fat and water there is. This isn’t a perfect system, but it can give a rough approximation in a relatively easy way.

The students’ system places an electrode on one hand and another on the opposite foot. This provides the longest possible electrical path through the body, for the most accurate measurement possible. An ATmega1284P generates a 50 kHz square wave, which is opto-isolated for user safety. Another stage of the circuit then uses this source signal to produce a 10 µA current source at 50 kHz. This current is passed through the body, and the resulting voltage is fed back to the microcontroller for analysis.
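The write-up doesn't give the timer configuration, but on an ATmega1284P a hardware-toggled compare output is an obvious way to get a clean 50 kHz square wave. A minimal sketch, assuming a 16 MHz clock:

```c
/* Minimal sketch: 50 kHz square wave on an ATmega1284P using Timer1 in
 * CTC mode with the OC1A pin (PD5) toggling in hardware. A 16 MHz clock
 * is assumed; the project's actual timer setup may differ. */
#include <avr/io.h>

#define F_CPU 16000000UL

void start_50khz_square_wave(void)
{
    DDRD  |= (1 << PD5);                       /* OC1A (PD5) as output         */
    TCCR1A = (1 << COM1A0);                    /* toggle OC1A on compare match */
    TCCR1B = (1 << WGM12) | (1 << CS10);       /* CTC mode, no prescaler       */
    OCR1A  = F_CPU / (2UL * 50000UL) - 1;      /* 159 -> 50 kHz output         */
}
```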

The voltage reading is sent to a MATLAB script over serial. The user also enters their weight and age, and the script combines these numbers with the voltage reading to estimate body fat percentage. To calibrate the system, the students measured the body fat of 12 of their peers using body fat calipers. They admit that their sample size is too small: all of the subjects are about 21 years old with similar body fat percentages, so the system is currently quite accurate for people in that range but likely less so for anyone else. Continue reading “DIY Electrical Body Fat Analyzer”
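The exact MATLAB model isn't spelled out, but bioimpedance estimates generally follow the same shape: compute impedance from the measured voltage, predict fat-free mass from a regression on 1/Z, weight, and age, and take the remainder as fat. The sketch below only illustrates that shape; the coefficients are hypothetical stand-ins for whatever the team fit against their caliper data:

```c
/* Illustrative only - NOT the students' fitted model. A typical
 * bioimpedance regression predicts fat-free mass from 1/Z, weight, and
 * age; the fit coefficients come from calibration (here, calipers). */
typedef struct { double k_z, k_weight, k_age, offset; } bia_fit_t;

double body_fat_percent(double v_rms, double i_rms,
                        double weight_kg, double age_years,
                        const bia_fit_t *fit)
{
    double z = v_rms / i_rms;                  /* impedance in ohms            */
    double ffm = fit->k_z * (1.0 / z)          /* more body water, lower Z     */
               + fit->k_weight * weight_kg
               + fit->k_age * age_years
               + fit->offset;                  /* regression intercept         */
    double fat_mass = weight_kg - ffm;
    return 100.0 * fat_mass / weight_kg;
}
```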

Boxing Trainer Uses DIY Force Sensors

A team of Cornell students have designed and built their own electronic boxing trainer system. The product of their work is a game similar to Whack-A-Mole. There are five square pads organized roughly into the shape of a human torso and head. Each pad will light up based on a pre-programmed pattern. When the pad lights up, it’s the player’s job to punch it! The game keeps track of the player’s accuracy as well as their reaction time.

The team was trying to keep their budget under $100, which meant that off-the-shelf components would be too costly. To remedy this, they designed their own force sensors. The sensors are basically a sandwich of a few different materials. In the center is a 10″ by 10″ square of ESD foam. Pressed against it is a 1/2″ thick sheet of insulating foam rubber. This foam rubber sheet has 1/4″ slits cut into it, resulting in something that looks like jail bars. Sandwiching these two pieces of foam is fine aluminum window screen. Copper wire is fixed to the screen with conductive glue. Finally, the whole thing is sandwiched between flattened pieces of corrugated cardboard to protect the screen.

The sensors are mounted flat against a wall. When a user punches a sensor, it compresses, and the compression changes the resistance between the two pieces of aluminum screen. Measuring that resistance detects a hit. The students found that a harder hit compresses more surface area, which produces a greater change in resistance and can be read as a more powerful punch. Unfortunately, the sensor has to be calibrated for whatever is hitting it, since the size of the striking object throws off the readings.
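Reading such a sensor is straightforward: put it in a voltage divider, watch the ADC, and treat a big enough swing from the resting level as a hit. The divider, pin assignments, and threshold in this sketch are assumptions rather than the students' numbers:

```c
/* Sketch of reading one DIY force sensor through a voltage divider on
 * the ATmega1284P ADC. Channel, resting level, and threshold are assumed. */
#include <avr/io.h>
#include <stdint.h>

static uint16_t adc_read(uint8_t channel)
{
    ADMUX  = (1 << REFS0) | (channel & 0x07);   /* AVcc reference            */
    ADCSRA = (1 << ADEN) | (1 << ADSC) | 7;     /* enable, start, /128 clock */
    while (ADCSRA & (1 << ADSC)) ;              /* wait for conversion       */
    return ADC;
}

/* Returns 0 for no hit, otherwise a rough strength value: a harder punch
 * compresses more foam, changes the sensor resistance further, and pulls
 * the divider reading further from its resting level. */
uint16_t read_hit(uint8_t channel, uint16_t rest_level)
{
    uint16_t v = adc_read(channel);
    uint16_t delta = (v > rest_level) ? v - rest_level : rest_level - v;
    return (delta > 50) ? delta : 0;            /* 50 counts = assumed threshold */
}
```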

Each sensor pad is surrounded by a strip of LEDs. The LEDs light up to indicate which pad the user is supposed to hit. Everything is controlled by an ATmega1284P microcontroller. This is the latest in a string of student projects to come out of Cornell. Make sure to watch the demonstration video below. Continue reading “Boxing Trainer Uses DIY Force Sensors”

Beating the Skins of Oatmeal Tins

Ithaca-based power trio [Nick, Roshun, and Ian] share a love of music and beating on things with drum sticks. To that end (and for class credit), they built a Digitally-Recordable, User-Modifiable Sound Emitting Tool (DRUMSET) using force-sensing resistors housed in oatmeal cans.

Anyone who has dealt with FSRs knows how persnickety they can be. In order to direct the force and avoid false positives, these enterprising beat purveyors suspended a sawed-off 2-liter bottle from the underside of each lid. This directs the force coming in from their patent-pending foam-enhanced drum sticks to the small, round sensing area of the FSR. There’s just enough space between the cap and the FSR to account for the play in the oatmeal can lid drum head when struck.

DRUMSET offers different-sounding kits at the push of a momentary switch. At present, there are four pre-programmed kits: the acoustic and electronic foursomes you’d expect, and a kit of miscellaneous sounds like hand claps and wooden claves that sound like something They Might Be Giants would have used on their first album. The fourth is called ‘Smoke on Water’, and is exactly what it sounds like. Should you tire of these, DRUMSET has a program mode with around 20 samples. These can be cycled through on the LCD and assigned to any of the four drums.
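The write-up doesn't show the firmware, but the kit bookkeeping can be pictured as a small table of sample indices, one per drum, with the momentary switch cycling between tables and program mode overwriting single entries. A hypothetical sketch:

```c
/* Hypothetical kit/sample bookkeeping for DRUMSET: names and sizes are
 * assumptions, not taken from the project. */
#include <stdint.h>

#define N_DRUMS    4
#define N_KITS     4
#define N_SAMPLES 20

typedef struct { uint8_t sample_for_drum[N_DRUMS]; } kit_t;

static kit_t   kits[N_KITS];        /* the four pre-programmed kits       */
static uint8_t current_kit = 0;

void next_kit(void)                 /* momentary switch cycles kits       */
{
    current_kit = (current_kit + 1) % N_KITS;
}

void assign_sample(uint8_t drum, uint8_t sample)   /* program mode        */
{
    if (drum < N_DRUMS && sample < N_SAMPLES)
        kits[current_kit].sample_for_drum[drum] = sample;
}
```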

The microphone is for record mode, and whatever is recorded can be mapped to any drum. The memory limitations of the ‘1284P make for a 0.2-second sample of whatever is barked into the mic, but that’s plenty of time for shouting ‘hack!’ or firing off whatever hilarious bodily sound one can muster. We think this four-track-like functionality of DRUMSET has interesting recording and live performance implications. The team’s future plans include space for longer samples and more robust drum construction (although it is possible to do this without any drums whatsoever). They’d also like to add more drums in case Neil Peart calls. The beat goes on after the break.
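Record mode presumably boils down to stuffing ADC readings into a fixed SRAM buffer at a steady rate. The sample rate, buffer size, and helper functions in this sketch are assumptions chosen to fit a 0.2-second clip into the ‘1284P’s 16 KB of SRAM:

```c
/* Sketch of record mode: fill a fixed SRAM buffer from the ADC, then play
 * it back when the mapped drum is struck. Rate and size are assumptions. */
#include <stdint.h>

#define SAMPLE_RATE   8000u
#define CLIP_SAMPLES  (SAMPLE_RATE / 5)     /* 0.2 s of 8-bit audio = 1600 bytes */

static uint8_t clip[CLIP_SAMPLES];

extern uint8_t adc_read8(void);             /* hypothetical: top 8 bits of ADC   */
extern void    wait_for_sample_tick(void);  /* hypothetical: timer-paced delay   */

void record_clip(void)
{
    for (uint16_t i = 0; i < CLIP_SAMPLES; i++) {
        wait_for_sample_tick();             /* pace the loop at SAMPLE_RATE      */
        clip[i] = adc_read8();              /* microphone on an ADC channel      */
    }
}
```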

Continue reading “Beating the Skins of Oatmeal Tins”

Electronic Glove Detects Sign Language

A team of Cornell students recently built a prototype electronic glove that can detect sign language and speak the characters out loud. The glove is designed to work with a variety of hand sizes, but currently only fits on the right hand.

The glove uses several different sensors to detect hand motion and position. Perhaps the most obvious are the flex sensors that cover each finger. Each sensor’s resistance changes with the degree of bend, so the glove can tell how far each finger is curled. The glove also contains an MPU-6050, which combines a 3-axis accelerometer and a 3-axis gyroscope to capture the hand’s orientation as well as rotational movement.
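One full sensor sweep might look something like the sketch below: five flex sensors read through the ADC plus a burst read of the MPU-6050's accelerometer and gyro registers. The adc_read() and i2c_read() helpers are hypothetical; the register addresses come from the MPU-6050 datasheet:

```c
/* Sketch of one sensor sweep for the glove. Helper functions are
 * hypothetical; MPU-6050 register layout is from its datasheet. */
#include <stdint.h>

#define MPU_ADDR        0x68
#define MPU_ACCEL_XOUT  0x3B   /* 14 bytes: accel XYZ, temp, gyro XYZ */

extern uint16_t adc_read(uint8_t channel);                               /* hypothetical */
extern void     i2c_read(uint8_t addr, uint8_t reg, uint8_t *buf, uint8_t len); /* hypothetical */

typedef struct {
    uint16_t flex[5];          /* one flex reading per finger        */
    int16_t  accel[3];         /* hand orientation (gravity vector)  */
    int16_t  gyro[3];          /* rotational movement                */
} hand_sample_t;

void read_hand(hand_sample_t *s)
{
    for (uint8_t f = 0; f < 5; f++)
        s->flex[f] = adc_read(f);          /* flex sensor voltage dividers */

    uint8_t raw[14];
    i2c_read(MPU_ADDR, MPU_ACCEL_XOUT, raw, sizeof raw);
    for (uint8_t i = 0; i < 3; i++) {
        s->accel[i] = (int16_t)((raw[2 * i] << 8) | raw[2 * i + 1]);
        s->gyro[i]  = (int16_t)((raw[8 + 2 * i] << 8) | raw[8 + 2 * i + 1]);
    }
}
```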

While the more high-tech sensors are used to detect most characters, there are a few letters that are similar enough to trick the system. Specifically, they had trouble with the letters R, U, and V. To get around this, the students strategically placed copper tape in several locations on the fingers. When two pieces of tape come together, it closes a circuit and acts as a momentary switch.

The sensor data is collected by an ATmega1284P microcontroller and compiled into a packet, which is sent to a PC that does the heavy processing. The system uses a machine learning algorithm: the user trains it by gesturing for each letter of the alphabet multiple times, and the system stores all of this data in a data set that can then be used for detection.
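The post doesn't say which learning algorithm the PC runs, so as a stand-in, here's a minimal nearest-neighbor sketch over the per-letter training examples. The eleven-value feature layout (five flex readings, three accelerometer axes, three gyro axes) is an assumption:

```c
/* Stand-in classifier, not the students' algorithm: label each new
 * sample with the letter of its nearest training example. */
#include <float.h>

#define N_FEATURES 11          /* 5 flex + 3 accel + 3 gyro (assumed) */

typedef struct {
    float features[N_FEATURES];
    char  letter;
} example_t;

char classify(const float *sample, const example_t *train, int n_train)
{
    float best  = FLT_MAX;
    char  label = '?';
    for (int i = 0; i < n_train; i++) {
        float d = 0.0f;
        for (int j = 0; j < N_FEATURES; j++) {
            float diff = sample[j] - train[i].features[j];
            d += diff * diff;                 /* squared Euclidean distance */
        }
        if (d < best) { best = d; label = train[i].letter; }
    }
    return label;
}
```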

This is a great project to take on. If you need more inspiration, there’s a lot to be found, including another Cornell project that speaks the letters you sign, as well as this one, which straps all the needed parts to your forearm.
Continue reading “Electronic Glove Detects Sign Language”

Touching Light with Haptic Feedback

Many of us have gone on a stationary romp through some virtual or augmented scape with one of the few headsets out in the wild today. While the experience of viewing a convincing figment of reality is an exciting sensation in itself, [Mark Lee] and [Kevin Wang] are figuring out how to tie other senses into the mix.

The duo from Cornell University have built a mechanical exoskeleton that responds to light with haptic feedback. This means the wearer can touch the sphere of light around a source as if it were a solid object. Photoresistors are mounted like antennae on the tip of each finger; the sensors are filed down around the edges so they pick up more diffuse light. When the wearer moves their hand towards a light source, the sensors trigger servo motors mounted on the back of the hand, which retract a series of 3D-printed tendons that arch upward and connect to the individual fingers. This way, as each resistor receives a different amount of light, the fingers can react independently to simulate physical contours.
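Per finger, the control loop reduces to mapping a light reading onto a servo position. The sketch below assumes hypothetical adc_read() and servo_write_us() helpers and a simple linear mapping, which is almost certainly cruder than the real firmware:

```c
/* Sketch of the light-to-force mapping for one finger: the brighter the
 * photoresistor reading, the further its servo pulls the printed tendon
 * back against the fingertip. Pin mapping and scaling are assumptions. */
#include <stdint.h>

extern uint16_t adc_read(uint8_t channel);                  /* hypothetical */
extern void     servo_write_us(uint8_t servo, uint16_t us); /* hypothetical */

void update_finger(uint8_t finger)
{
    uint16_t light = adc_read(finger);        /* 0..1023, higher = brighter */

    /* Map the light level onto roughly 1000-2000 us of servo pulse width,
     * so each finger resists independently as it nears the source. */
    uint16_t pulse = 1000 + (uint16_t)(((uint32_t)light * 1000) / 1023);
    servo_write_us(finger, pulse);
}
```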

One of the goals of the project was to produce a working proof of concept with no more than $100 worth of materials, a target [Mark] and [Kevin] hit with some cash to spare. Their list of parts can be found on their blog along with some more details on the project.

Continue reading “Touching Light with Haptic Feedback”