Protoboard Line Following Robot


We love a good line-following robot project, and this one really hits the spot. It’s got sharp edges, gobs of solder bridging, and look at all those jumper wires! Despite its appearance, it puts in a performance that won’t disappoint.

It uses a dsPIC33 to read half a dozen analog sensors on the bottom of the board. We’re not all that familiar with the chip’s features, but [Exapod] says it has an auto-scan feature he uses to read the sensors. This lets him sample all six of them at 12-bit resolution at about 30 kHz. No wonder the thing is so responsive in the demo video embedded below. The track he’s using is just some white printer paper with a fat loop of black electrical tape laid down in a somewhat squiggly pattern.
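For the curious, here’s a minimal sketch (ours, not [Exapod]’s firmware) of how six analog readings typically get turned into a steering command. The `read_sensors()` call is a hypothetical stand-in for whatever the auto-scan ADC delivers, and the weights and gains are made-up tuning values:

```cpp
// Minimal sketch, not [Exapod]'s firmware: turn six line-sensor readings
// into a steering command. read_sensors() is a hypothetical stand-in for
// the values the dsPIC's auto-scan ADC deposits in its result buffer.
#include <stdint.h>

extern void read_sensors(uint16_t out[6]);          // placeholder: 12-bit readings
extern void set_motor_speeds(int left, int right);  // placeholder: PWM duty per side

void line_follow_step(void)
{
    uint16_t s[6];
    read_sensors(s);

    // Weighted average of sensor positions gives the line's offset from
    // center; here we assume darker tape produces a higher reading.
    static const int weights[6] = { -5, -3, -1, 1, 3, 5 };
    int32_t num = 0, den = 0;
    for (int i = 0; i < 6; ++i) {
        num += (int32_t)s[i] * weights[i];
        den += s[i];
    }
    int32_t error = den ? (num * 100) / den : 0;     // scaled, signed line offset

    // Simple proportional steering: slow one side, speed up the other
    // (clamping and the rest of a real PID loop are omitted here).
    const int base = 600, kp = 2;                    // made-up tuning values
    set_motor_speeds(base - kp * error, base + kp * error);
}
```

The weighted average gives the line’s offset from center on a signed scale, so steering boils down to slowing one wheel and speeding up the other in proportion to that error.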

This is also a fun challenge with toys. Here’s one that hacks a hexapod to follow the lines.

Continue reading “Protoboard Line Following Robot”

Quadruple Backflip And Sticks The Landing

This must have been a coding nightmare, and let’s not even mention the particulars of the mechanical build. The blurred ball near the center of this image is a robot doing a quadruple backflip before sticking the landing.

To the right is a high bar supported by a wood column and some guy-wires. At the beginning of the video below [Hinamitetu] hangs the robot from the bar, where it starts its performance nearly motionless. The servo motors whine as it gets ready, quickly building up to full revolutions around the bar. Oh how we wish there were more background info on the hardware! But we’re perfectly happy making our way through [Hinamitetu’s] video collection, which includes other gymnastics disciplines like the floor routine. He even posted his own blooper reel that shows the high bar isn’t always a rosy experience.

If you’re thirsting for more amazing performances you won’t be disappointed by this high wire act.

Continue reading “Quadruple Backflip And Sticks The Landing”

Robot Theater Isn’t So Much For The Actors As The Stagehands


[Chris Rybitski] developed this low-profile robot to help move scenery on stage. The test footage shows it to be spry and able to move hundreds of pounds of cargo. For the demo he added a wooden platform, about twice the length of the metal chassis, with casters at each end to support the extra weight. It seems to have no problem moving around with a couple of human passengers on board.

Crafty systems for changing huge sets have long made the theater a natural breeding ground for hacks. Balanced turntables, rail systems, and the like are commonplace. But we think this has a ton of potential. Right now the electronics seem convoluted: an Arduino runs the motors and connects to the LAN through an Ethernet shield and the Linksys wireless router.

We think he should patch directly into the serial port of the router. If he loads DD-WRT or OpenWRT he could easily turn the remote control into a web interface. We also wonder about the possibility of making it a line follower that positions itself precisely and automatically using patterns on the floor.
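If he went that route, the Arduino side could be as simple as listening for single-character drive commands on its serial port. Here’s a rough sketch of the idea; the pin assignments and motor-driver wiring are hypothetical, and this isn’t [Chris]’s actual code:

```cpp
// Hypothetical Arduino-side sketch: the router (running DD-WRT/OpenWRT with
// a small web UI) sends one-byte drive commands over its serial header.
const int LEFT_PWM = 5, LEFT_DIR = 4;     // invented pin assignments
const int RIGHT_PWM = 6, RIGHT_DIR = 7;

void drive(int left, int right) {         // -255..255 per side
  digitalWrite(LEFT_DIR,  left  >= 0 ? HIGH : LOW);
  digitalWrite(RIGHT_DIR, right >= 0 ? HIGH : LOW);
  analogWrite(LEFT_PWM,  abs(left));
  analogWrite(RIGHT_PWM, abs(right));
}

void setup() {
  pinMode(LEFT_PWM, OUTPUT);  pinMode(LEFT_DIR, OUTPUT);
  pinMode(RIGHT_PWM, OUTPUT); pinMode(RIGHT_DIR, OUTPUT);
  Serial.begin(115200);                   // wired to the router's serial port
}

void loop() {
  if (!Serial.available()) return;
  switch (Serial.read()) {                // one byte per command from the web UI
    case 'f': drive( 200,  200); break;   // forward
    case 'b': drive(-200, -200); break;   // reverse
    case 'l': drive(-150,  150); break;   // spin left
    case 'r': drive( 150, -150); break;   // spin right
    default:  drive(   0,    0); break;   // anything else: stop
  }
}
```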

Continue reading “Robot Theater Isn’t So Much For The Actors As The Stagehands”

Atlas Humanoid Robot Standing On His Own

Boston Dynamics likes to show off… which is good, because we like to see the scary-looking robots they come up with. This is Atlas, the culmination of their humanoid robotics research. The unveiling video includes a development-process montage which is quite enjoyable to watch.

You should remember the feature in October which showed the Robot Ninja Warrior doing the Spider Climb. That was the prototype for Atlas. It was impressive then, but it has come a long way since. Atlas is the object of affection for the DARPA Robotics Challenge, which seeks to drop a humanoid robot into an environment designed for people and have it perform a gauntlet of tasks. Research teams participating in the challenge are tasked with teaching Atlas how to succeed. Development will happen on a virtual representation of the robot, but to win the challenge you have to succeed with the real deal at the end of the year.

Continue reading “Atlas Humanoid Robot Standing On His Own”

A Robotic Tattoo Artist


Here’s something we thought we’d never see: a robot that turns a computer drawing into a tattoo on the user’s arm.

The basic design of the robot is a frame that moves linearly along two axes, and rotates around a third. The tattoo design is imported into a 3D modeling program, and with the help of a few motors and microcontrollers a tattoo can be robotically inked on an arm.

Since the arm isn’t a regular surface, [Luke] needed a way to calibrate his forearm-drawing robot to the weird curves and bends of his arm. The solution is a simple calibration process: the mechanism scans along the length of [Luke]’s arm while the ‘depth’ servo is manually adjusted. This data is imported into Rhino 3D so the robot can take the curve of the arm into account when inking the new tat.
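In other words, the calibration pass boils down to a table of position-versus-depth samples that the toolpath gets offset by. Here’s a minimal sketch of that idea using plain linear interpolation between samples; the real project does its surface fitting in Rhino 3D rather than in code like this:

```cpp
// Illustrative only: look up a depth offset for any position along the arm,
// given the (position, depth) pairs recorded during the calibration pass.
#include <vector>

struct DepthSample { double pos; double depth; };   // mm along arm, servo depth

// Interpolate between the nearest calibration samples.
// Assumes the samples are sorted by pos.
double depth_at(const std::vector<DepthSample>& cal, double pos)
{
    if (pos <= cal.front().pos) return cal.front().depth;
    if (pos >= cal.back().pos)  return cal.back().depth;
    for (size_t i = 1; i < cal.size(); ++i) {
        if (pos <= cal[i].pos) {
            double t = (pos - cal[i - 1].pos) / (cal[i].pos - cal[i - 1].pos);
            return cal[i - 1].depth + t * (cal[i].depth - cal[i - 1].depth);
        }
    }
    return cal.back().depth;   // unreachable if samples are sorted
}
```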

Right now [Luke] is only inking his skin with a marker, but as far as automated tattoo machines go, it’s the best – and only – one we’ve ever seen.

Flying With A Little Help From Friends


A single cell of this distributed flight system can spin its propeller, but on its own the chassis just tumbles out of control. To achieve any kind of stable flight it must partner up with other cells. The more astute reader will be wondering: how can it pair up autonomously if it’s incapable of controlled solo flight? The designers of the project thought of that, and gave each frame a way to propel itself along the ground.

Along the bottom rails of each cage there are several small knobby wheels. These seem to function similarly to omniwheels, since they are not aligned parallel to one another. Pairing is accomplished mechanically with magnets, which also help align the pogo pins that connect the cells electrically.

Flight tests are shown in the video below. The array can be arranged in symmetrical or asymmetrical patterns and still fly just fine. With 3D camera feedback it can hold position and navigate quite accurately, but it can also be piloted by remote control in the absence of such a feedback system.
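Why does an asymmetric layout still fly? Each cell’s contribution to roll and pitch depends only on where it sits relative to the array’s center, so a controller can always solve for a thrust distribution that balances the whole assembly. Here’s a simplified, back-of-the-envelope sketch of that kind of thrust allocation; it’s not the project’s actual controller, and it ignores yaw entirely:

```cpp
// Illustrative thrust allocation for single-prop cells at arbitrary (x, y)
// positions: start from an even share of lift, then add a correction
// proportional to each cell's lever arm about the layout's center.
#include <cstdio>

struct Cell { double x, y; };   // position relative to a chosen origin, meters

void allocate(const Cell* cells, int n, double totalThrust,
              double rollTorque, double pitchTorque, double* out)
{
    // Center of the layout, used as the reference for lever arms
    double cx = 0, cy = 0;
    for (int i = 0; i < n; ++i) { cx += cells[i].x; cy += cells[i].y; }
    cx /= n; cy /= n;

    // Sums of squared lever arms scale the torque corrections
    double sx = 0, sy = 0;
    for (int i = 0; i < n; ++i) {
        double dx = cells[i].x - cx, dy = cells[i].y - cy;
        sx += dx * dx; sy += dy * dy;
    }

    for (int i = 0; i < n; ++i) {
        double dx = cells[i].x - cx, dy = cells[i].y - cy;
        double f = totalThrust / n;                // even share of lift
        if (sy > 0) f += rollTorque  * dy / sy;    // torque about the x axis
        if (sx > 0) f += pitchTorque * dx / sx;    // torque about the y axis
        out[i] = f;
    }
}

int main() {
    // An asymmetric four-cell layout still balances just fine
    Cell layout[] = { {0, 0}, {0.2, 0}, {0.2, 0.2}, {0.4, 0.2} };
    double thrust[4];
    allocate(layout, 4, 8.0 /* N total lift */, 0.0, 0.1 /* N*m pitch */, thrust);
    for (double f : thrust) printf("%.3f N\n", f);
}
```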

Continue reading “Flying With A Little Help From Friends”

How Do You Think This Quadcopter Feels?


You don’t speak the language of dogs and yet you can tell when one is angry, excited, or down in the dumps. How does that work, and can it be replicated by a robot? That’s the question which [Megha Sharma] set out to study as part of her graduate research at the University of Manitoba in Winnipeg.

The experiment starts by training the robot in a series of patterns meant to mimic emotion. How, you might ask? Apparently you hire an actor trained in Laban Movement. This is a method of describing and dealing with how the human body moves. It’s no surprise that the technique is included in the arsenal of some actors. The training phase uses stationary cameras (kind of like those acrobatic quadcopter projects) to record the device as it is moved by the actor.

Phase two of the experiment involves playing back the recorded motion with the quadcopter under its own power. A human test subject watches each performance and is asked to describe how the quadcopter feels. The surprising thing is that, when asked, subjects end up anthropomorphising the inanimate device even further, making up small stories about what the thing actually wants.

It’s an interesting way to approach the problem of the uncanny valley in robotic projects.

Continue reading “How Do You Think This Quadcopter Feels?”