Solving Rubik’s Cube With An FPGA

For their final project for ECE 5760 at Cornell, [Alex], [Sungjoon], and [Rameez] are solving Rubik’s Cubes. They’re doing it with an FPGA, with homebrew robot arms to twist and turn a rainbow cube into the correct position.

First, the mechanical portion of the build. The team is using a system of three robot arms positioned on the left, right, and back faces of the cube relative to a camera. When a cube is placed in the jaws of this robot, the NTSC camera data is fed into an FPGA, where a Nios II soft core handles the actual detection of the cube faces, the solver algorithm, and the controller that sends servo commands to the robot arms.

The algorithm used for solving the cube is CFOP – solve the white cross, the white corners, the middle layer, the top face, and finally the entire cube. In practice, the robot ended up taking between 60 and 70 moves. This is not the most efficient algorithm; the Thistlethwaite algorithm only requires 52 moves. There’s a reason for this apparent inefficiency – the Thistlethwaite algorithm requires large look-up tables.

Once the cube is scanned and the correct moves are computed, the soft core sends commands out through the FPGA’s GPIO pins. Each cube can be solved in under three minutes after it has been scanned, but the team ran into problems with scanning accuracy. It’s a problem that could be fixed with the right lighting setup and better detection of aberrant cubies, and it’s still a great final project using FPGAs.
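
The write-up doesn’t include the team’s firmware, so as a rough illustration, here’s a minimal C++ sketch of how a computed move list might be dispatched to the three grippers. The `Arm` names, the `send_servo()` stub, and the move encoding are our own assumptions, not the team’s code.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical arm interface -- the real project drives servos from the
// Nios II soft core through the FPGA's GPIO pins.
enum class Arm { Left, Right, Back };

void send_servo(Arm arm, int quarter_turns) {
    // Stand-in for a GPIO/PWM write; prints instead of twisting anything.
    std::printf("arm %d: rotate %d quarter turn(s)\n",
                static_cast<int>(arm), quarter_turns);
}

// Dispatch a move list in face notation ("R", "L'", "B2", ...) to the three
// arms. Faces without an arm (U, D, F) would need whole-cube re-grips, which
// this toy example ignores.
void execute(const std::vector<std::string>& moves) {
    for (const auto& m : moves) {
        Arm arm = (m[0] == 'L') ? Arm::Left
                : (m[0] == 'R') ? Arm::Right
                :                 Arm::Back;
        int turns = 1;                                     // plain move = 90 deg CW
        if (m.size() > 1) turns = (m[1] == '2') ? 2 : -1;  // "2" = 180 deg, "'" = CCW
        send_servo(arm, turns);
    }
}

int main() {
    execute({"R", "L'", "B2", "R'"});
}
```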

Video demo below.

Continue reading “Solving Rubik’s Cube With An FPGA”

My Robot Army @ Maker Faire

For a few years now I’ve been developing an interactive army of delta robots. This ongoing project is fueled by my desire to control many mechanical extremities like an extension of my body (I’m assuming I’m not the only one who fantasizes about robots here).

Since my army doesn’t have a practical application… other than producing pretty light patterns and making the user feel extremely cool for a minute, I guess you’d call it art. In the past I’ve held a Kickstarter to fund the production of my art, which I can now happily show at cool events with interesting people; Maker Faire being one of them.

Interactivity and Sprawling Crowds

Last year, for our debut at the big Bay Area Maker Faire, my collaborator, [Mark], and I displayed a smaller sampling of 30 robots for our installation. We also decided to create an interactive aspect for others to experience. After the end of our crowdfunding period last March, we had a little over a month to do any development before the big event, so our options were slim. The easy solution was to jam our delta code into the hand tracking demo that comes with the Xbox Kinect’s OpenNI library within Processing. This was cool enough to exhibit, but we hadn’t really anticipated how it would go over in an environment as densely packed as the dark room at Maker Faire.

We should have known better. Both of us were aware that there would be many, many children… all with micro hands to confuse and bewilder the Kinect, but we did it anyway. Our only recourse was to implement a feature that would force the Kinect to track one hand at a time, and only after being waved at in a very particular fashion. After needing to explain this stipulation to every person who stopped by our booth over the course of the weekend, we decided never to use the Kinect for crowds ever again; lesson learned.

Delta Robots and DMX

Over the past year since that experience, we’ve tripled the size of the installation and brainstormed some better demo ideas. As of now, the robots are all individually addressable over an RS485 bus, and we use the DMX protocol over a CAT5 cable to send commands. If you aren’t familiar with it, DMX is used in show production to control stage lighting, and there is a super neat and free application called QLC+ that allows you to effectively orchestrate the motion and color of many individual light units; perfect for our cause.

Functionally, each of the 84 delta robots in the installation believes that it is a stage light (robots with identity issues). We mapped the X and Y axes of the end effector to the existing pan and tilt values, and the Z axis to the beam focus value. The RGB of the LED mounted in the end effector of each delta maps directly to the RGB value of the stage light.
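
To give a flavor of how that mapping works, here’s a simplified C++ sketch that decodes one robot’s slice of a DMX frame into an end-effector target. The six-channel layout and the travel ranges are illustrative stand-ins, not the values from our actual firmware.

```cpp
#include <array>
#include <cstdint>
#include <cstdio>

// A DMX universe is 512 byte-wide channels. Assume each delta occupies six
// consecutive channels: pan, tilt, focus, R, G, B -- reused here as X, Y, Z
// and LED color.
struct DeltaTarget {
    float x_mm, y_mm, z_mm;
    uint8_t r, g, b;
};

DeltaTarget decode(const std::array<uint8_t, 512>& frame, int start_channel) {
    const uint8_t* c = &frame[start_channel];
    DeltaTarget t;
    t.x_mm = (c[0] - 127.5f) * (100.0f / 255.0f);  // 0..255 -> roughly -50..+50 mm
    t.y_mm = (c[1] - 127.5f) * (100.0f / 255.0f);
    t.z_mm =  c[2] * (80.0f / 255.0f);             // 0..255 -> 0..80 mm of plunge
    t.r = c[3]; t.g = c[4]; t.b = c[5];
    return t;
}

int main() {
    std::array<uint8_t, 512> frame{};                // all channels start at zero
    frame[0] = 255; frame[2] = 128; frame[3] = 200;  // robot 1: full +X, half Z, red-ish
    DeltaTarget t = decode(frame, 0);
    std::printf("x=%.1f y=%.1f z=%.1f rgb=(%d,%d,%d)\n",
                t.x_mm, t.y_mm, t.z_mm, t.r, t.g, t.b);
}
```

A nice side effect of a six-channel layout like this is that all 84 robots fit comfortably within a single 512-channel DMX universe.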

By using the sliders in the QLC+ GUI, I could select groups of robots and create presets for position and color. This was great; someone like me who doesn’t really write a lot of code could whip up impressive choreography with little sweat. Additionally, the program comes with a nice visualizer, where you can lay out virtual nodes and view your effects as you develop them.

This is the layout of our installation mapped in QLC+. The teal and purple sliders around each light represent pan and tilt (or in our case X and Y):


Lighting control was an interesting solution. Having autonomous robots this year changed how people responded to them, as they were less like an army you’d command and more like a hypnotic field of glowing grass.

[Mark] and I are considering picking up some flex sensors and maybe playing with the Leap or an EEG headset as a means to reintroduce the interactive aspect. Bottom line, I have this cool new toy that I can’t wait to play with over the summer!

Continue reading “My Robot Army @ Maker Faire”

Simple Autonomy with an RC Boat

[Vlad] wrote in to tell us about his latest project—an RC boat that autonomously navigates between waypoints. Building an autonomous vehicle seems like a really complicated project, but [Vlad]’s build shows how you can make a simple waypoint-following vehicle without a background in autonomy and control systems. His design is inspired by the Scout autonomous vehicle that we’ve covered before.

[Vlad] started prototyping with an Arduino, a GPS module, and a digital compass. He wrote a quick sketch that uses the compass and GPS readings to control a servo that steers towards a waypoint. [Vlad] took his prototype outside and walked around to make sure that steering and navigation were working correctly before putting it in a boat. After a bit of tweaking, his controller steered correctly and advanced to the next waypoint after the GPS position was within 5 meters of its goal.
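
The core of a sketch like that is just two calculations: the bearing from the current GPS fix to the waypoint, and the signed error between that bearing and the compass heading. Here’s a minimal C++ version of that math; the names and the proportional servo mapping are ours, and [Vlad]’s Arduino code may be organized differently.

```cpp
#include <cmath>
#include <cstdio>

constexpr double kPi = 3.14159265358979323846;
constexpr double kDegToRad = kPi / 180.0;

// Initial great-circle bearing from (lat1,lon1) to (lat2,lon2), in degrees 0..360.
double bearing_deg(double lat1, double lon1, double lat2, double lon2) {
    double phi1 = lat1 * kDegToRad, phi2 = lat2 * kDegToRad;
    double dlon = (lon2 - lon1) * kDegToRad;
    double y = std::sin(dlon) * std::cos(phi2);
    double x = std::cos(phi1) * std::sin(phi2) -
               std::sin(phi1) * std::cos(phi2) * std::cos(dlon);
    return std::fmod(std::atan2(y, x) / kDegToRad + 360.0, 360.0);
}

// Signed heading error in -180..180 degrees; positive means "steer right".
double heading_error(double target_deg, double compass_deg) {
    return std::fmod(target_deg - compass_deg + 540.0, 360.0) - 180.0;
}

int main() {
    double target = bearing_deg(37.0, -122.0, 37.0005, -121.9995);  // waypoint to the NE
    double error  = heading_error(target, 10.0);  // compass says we point 10 deg E of N
    // Proportional steering: servo centered at 90 deg, limited to +/-45 deg of travel.
    double servo = 90.0 + std::fmax(-45.0, std::fmin(45.0, 0.5 * error));
    std::printf("bearing %.1f deg, error %.1f deg, servo %.1f deg\n", target, error, servo);
}
```

The 5-meter arrival check is then just a distance comparison against the active waypoint before advancing to the next one.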

Next [Vlad] took to the water. His first attempt was a home-built airboat, which looked awesome but unfortunately didn’t work very well. Finally he ended up buying a $20 boat off eBay and made a MOSFET-based motor controller to drive its dual thrusters. This design worked much better, and after a bit of PID tuning the boat was autonomously navigating between waypoints in the water. In the future [Vlad] plans to use the skills he learned on this project to make an autopilot for the 38-foot catamaran his dad is building (an awesome project by itself!). Watch the video after the break for more details and to see the boat in action.
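
The PID part is worth a quick sketch too. Below is a bare-bones C++ heading controller that turns the error into differential thrust for the two props; the gains, limits, and update period are illustrative numbers, not [Vlad]’s tuning.

```cpp
#include <algorithm>
#include <cstdio>

// Minimal PID controller: heading error in, steering correction out.
struct Pid {
    double kp, ki, kd;
    double integral = 0.0, prev_error = 0.0;

    double update(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prev_error) / dt;
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

int main() {
    Pid heading_pid{0.8, 0.05, 0.02};
    double base_throttle = 0.6;   // 0..1 forward command shared by both thrusters
    double error_deg = 25.0;      // waypoint is 25 degrees to our right

    for (int step = 0; step < 5; ++step) {
        double turn = heading_pid.update(error_deg, 0.05) / 100.0;  // scale to roughly -1..1
        double left  = std::clamp(base_throttle + turn, 0.0, 1.0);  // more left thrust -> turn right
        double right = std::clamp(base_throttle - turn, 0.0, 1.0);
        std::printf("step %d: left=%.2f right=%.2f\n", step, left, right);
        error_deg *= 0.7;         // pretend the boat swings toward the waypoint
    }
}
```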

Continue reading “Simple Autonomy with an RC Boat”

Tiny Robot Jazz

Microcontroller-based projects don’t have to be fancy to be fantastic. Case in point: [r0d0t]’s “Musicomatic: the random jazz machine”. Clever programming and a nice case can transform a few servos and a microcontroller into something delightful.

Hardware-wise, there’s really nothing to see here; a speaker and some servos are hooked up to an ATmega328. We think it’s cute to have the microcontroller control its own power supply through a relay, but honestly a MOSFET in place of the relay, or better still the AVR’s power-down sleep mode, would be the way to go.
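
For reference, here’s roughly what we mean by the sleep-mode option: a few avr-libc calls park the chip in its power-down state until an interrupt wakes it. This is just the shape of the idea; the wake source, pins, and the rest of [r0d0t]’s firmware are left out.

```cpp
#include <avr/interrupt.h>
#include <avr/sleep.h>

// Park the ATmega328 in power-down sleep (microamp territory) instead of
// having it switch its own supply through a relay. Assumes some interrupt
// source -- a button on INT0, say -- has already been configured to wake it.
void sleep_until_interrupt() {
    set_sleep_mode(SLEEP_MODE_PWR_DOWN);  // deepest AVR sleep mode
    cli();
    sleep_enable();
    sei();                                // let the wake-up interrupt through
    sleep_cpu();                          // execution stops here until it fires
    sleep_disable();                      // continue once awake
}
```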

Nope, where this project shines is the programming. Technically, it might make some of you cringe — full of blocking delays and other coding “taboos”. But none of that matters, because [r0d0t] put his work in where it counts: the music. You simply must hear it for yourself in the clip after the break.

The basis of making music that humans like is rhythm, so [r0d0t] doesn’t leave this entirely to chance. The array “rhythms” has seven beat patterns that get randomly selected. The other thing humans like is predictability and repetition, so choruses and “improvs” repeat as well. All of the random notes are constrained to the pentatonic scale, which keeps it from ever sounding too bad. (The secret sauce of Kenny G.)
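
Here’s a tiny C++ sketch of that recipe: pick one of a few fixed rhythm patterns, then fill the “on” beats with notes drawn from a pentatonic scale. The patterns and scale below are our own, not [r0d0t]’s actual tables.

```cpp
#include <array>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng{std::random_device{}()};

    // Eight-step rhythm patterns: 1 = play a note, 0 = rest.
    const std::array<std::array<int, 8>, 3> rhythms{{
        {1, 0, 1, 1, 0, 1, 0, 1},
        {1, 1, 0, 0, 1, 0, 1, 0},
        {1, 0, 0, 1, 0, 0, 1, 1},
    }};
    // C major pentatonic over one octave, as MIDI note numbers.
    const std::array<int, 6> pentatonic{60, 62, 64, 67, 69, 72};

    std::uniform_int_distribution<int> pick_rhythm(0, static_cast<int>(rhythms.size()) - 1);
    std::uniform_int_distribution<int> pick_note(0, static_cast<int>(pentatonic.size()) - 1);

    // Choose a rhythm once, then improvise notes over it -- keeping the rhythm
    // fixed is what makes the randomness sound intentional.
    const auto& pattern = rhythms[pick_rhythm(rng)];
    for (int beat = 0; beat < 8; ++beat) {
        if (pattern[beat])
            std::printf("beat %d: note %d\n", beat, pentatonic[pick_note(rng)]);
        else
            std::printf("beat %d: rest\n", beat);
    }
}
```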

In short, [r0d0t] packs a lot of basic music theory into a very basic device, and comes up with something transcendent. We’re a bit reminded of the Yellow Drum Machine robot, and that’s high praise. Both projects are testaments to building something simple and then investing the time and effort into the code to make the project awesome.

For another slice of [r0d0t]’s excellent minimalist pie, check out his take on the classic Snake game: Twisted Snake.

Continue reading “Tiny Robot Jazz”

Robot Camel Jockeys

You might think we’re sinking to lowest-common-denominator, click-bait headlines like the rest of the online press. We’re not. The New York Times Video Notebook series has a story on camel racing that you’ve just got to see in the video after the break.

Previously, the camel races in Abu Dhabi had used small children as jockeys because they’re lightweight. Unfortunately, this led to illegal trafficking of small children, mostly orphans. That won’t do. So they came up with a technological solution.

Strap a cordless drill with a purpose-built whip in the chuck onto the back of your camel. Add a car-remote keyfob to activate it, and a two-way radio so that you can shout encouragement into your animal’s ear at just the right times. Now just chase the racers down the highway in an SUV and it’s like you’re there on the camel’s back!

We love the little silk suits that the drillbot-jockeys get to wear, but we’re not sure that cordless drills with walkie-talkies and remote controls count as “robots” really, because they don’t do anything autonomous. We think they’re more accurately described as “telepresence agents”.

Continue reading “Robot Camel Jockeys”

Interactive Robot: Project Naughty Ball

A month before the Bay Area Maker Faire, there were ominous predictions the entire faire would be filled with BB-8 droids, the cute astromech ball bot we’ll be seeing more of when The Force Awakens this December. This prediction proved to be premature. There were plenty of R2 units droiding around the faire, but not a single BB-8. Perhaps at the NYC Maker Faire this September.

Regarding ball bots, we did have one friendly rolling companion at Maker Faire this year. It was a project by UC Davis students [Henjiu Kang], [Yi Lu], and [Yunan Song] that rolls around, seeking out whoever is wearing an infrared ankle strap. The team is calling it Project Naughty Ball, but we’re going to call it the first step towards a miniature BB-8 droid.

The design of the Naughty Ball is somewhat ingenious; it’s set up as a two-wheel balancing bot inside a clear plastic sphere. A ton of batteries works well enough as ballast, stepper motors and machined plastic wheels balance and steer the ball bot, and the structure on the top hemisphere of the ball houses all the interesting electronics.

There is a BeagleBone Black with a WiFi adapter, a few motor drivers, an IMU, and a very interesting 3D printed mount for the robot’s eyes – infrared cameras that spin around inside the ball and track whoever is wearing that IR-transmitting ankle band.

As far as robotics projects go, you really can’t do better at Maker Faire than a semi-autonomous ball bot that follows its owner, and the amount of work these guys have put into this project sends it to the next level. You can check out a video description of their project below.

Tin Spider is 13-foot Rideable Strandbeest

Arguably our best find at Bay Area Maker Faire this year was the Tin Spider built by [Scott Parenteau]. He constructed the 13-foot-tall vehicle to take with him on his very first trip to Burning Man back in 2012. There’s very little information available online, so we were excited that [Scott] spent some time speaking with us on Saturday.

Continue reading “Tin Spider is 13-foot Rideable Strandbeest”