Atlas Humanoid Robot Standing On His Own

Boston Dynamics likes to show off… which is good, because we like seeing the scary-looking robots they come up with. This is Atlas, the culmination of their humanoid robotics research. The unveiling video includes a development-process montage which is quite enjoyable to watch.

You may remember the feature back in October showing the Robot Ninja Warrior doing the Spider Climb. That was the prototype for Atlas. It was impressive then, but the hardware has come a long way since. Atlas is the platform for the DARPA Robotics Challenge, which seeks to drop a humanoid robot into an environment designed for people and have it perform a gauntlet of tasks. Research teams participating in the challenge are responsible for teaching Atlas how to succeed. Development will happen on a virtual representation of the robot, but to win the challenge you have to succeed with the real deal at the end of the year.

Continue reading “Atlas Humanoid Robot Standing On His Own”

A Robotic Tattoo Artist

Here’s something we thought we’d never see: a robot that turns a computer drawing into a tattoo on the user’s arm.

The basic design of the robot is a frame that moves linearly along two axes, and rotates around a third. The tattoo design is imported into a 3D modeling program, and with the help of a few motors and microcontrollers a tattoo can be robotically inked on an arm.

Since the arm isn’t a regular surface, [Luke] needed a way to calibrate his forearm-drawing robot to the weird curves and bends of his ar.  The solution to this problem is a simple calibration process where the mechanism scans along the length of [Luke]’s arm, while the ‘depth’ servo is manually adjusted. This data is imported into Rhino 3D and the robot takes the curve of the arm into account when inking the new tat.

Right now [Luke] is only inking his skin with a marker, but as far as automated tattoo machines go, it’s the best – and only – one we’ve ever seen.

Flying With A Little Help From Friends

A single cell of this distributed flight system can spin its propeller, but on its own the chassis simply tumbles out of control. To achieve any kind of stable flight it must partner up with other cells. The more astute reader will be wondering how the cells can pair up autonomously if they're incapable of controlled solo flight. The designers of the project thought of that, and gave each frame a way to propel itself along the ground.

Along the bottom rails of each cage are several small knobby wheels. They appear to function much like omni wheels, since they aren't aligned parallel to one another. Pairing is accomplished mechanically with magnets, which also help align the pogo pins that connect the cells electrically.

Flight tests are shown in the video below. The array can be arranged in symmetrical or asymmetrical patterns and still fly just fine. With 3D camera feedback the cells can hold position and navigate quite accurately, but the array can also be piloted by remote control in the absence of such a feedback system.
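
If you're wondering how a lumpy, asymmetrical array can still hover, it boils down to a thrust-allocation problem: find a set of per-propeller thrusts that adds up to the lift and torques you want. Below is a minimal numpy sketch of that idea, with made-up cell positions and a least-squares solve standing in for whatever the real controller does; none of it comes from the project itself.

```python
# A minimal sketch of thrust allocation for an arbitrary array of single-prop
# cells: given where each cell sits relative to the center of mass, find
# per-propeller thrusts that produce a desired total lift and roll/pitch torque.
import numpy as np

# (x, y) position of each cell relative to the array's center of mass, meters.
cells = np.array([
    [ 0.10,  0.10],
    [-0.10,  0.10],
    [-0.10, -0.10],
    [ 0.10, -0.05],   # asymmetric placement still works
])

# Each column maps one cell's thrust to [total lift, roll torque, pitch torque].
A = np.vstack([
    np.ones(len(cells)),   # lift contribution
    cells[:, 1],           # roll torque  ~  y * thrust
    -cells[:, 0],          # pitch torque ~ -x * thrust
])

desired = np.array([8.0, 0.0, 0.0])   # hover: 8 N total lift, zero net torque

# Least-squares allocation; a real controller would also clamp to motor limits.
thrusts, *_ = np.linalg.lstsq(A, desired, rcond=None)
print(np.round(thrusts, 3))
```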

Continue reading “Flying With A Little Help From Friends”

How Do You Think This Quadcopter Feels?

You don’t speak the language of dogs and yet you can tell when one is angry, excited, or down in the dumps. How does that work, and can it be replicated by a robot? That’s the question which [Megha Sharma] set out to study as part of her graduate research at the University of Manitoba in Winnipeg.

The experiment starts by training the robot in a series of patterns meant to mimic emotion. How, you might ask? Apparently you hire an actor trained in Laban Movement. This is a method of describing and dealing with how the human body moves. It’s no surprise that the technique is included in the arsenal of some actors. The training phase uses stationary cameras (kind of like those acrobatic quadcopter projects) to record the device as it is moved by the actor.

Phase two of the experiment involves playing back the recorded motion with the quadcopter under its own power. A human test subject watches each performance and is asked to describe how the quadcopter feels. The surprising thing is that the subjects end up anthropomorphising the inanimate device even further, making up small stories about what the thing actually wants.
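
We don't have the study's actual code, but the playback phase presumably amounts to streaming the recorded poses back to the flight controller as timed setpoints. A purely hypothetical sketch, with a made-up file format and a send_setpoint() helper that stands in for whatever interface the real system uses:

```python
# Rough sketch of the playback phase: feed a recording of the actor-driven
# flight back to the quadcopter as timed position setpoints.
import csv
import time

def replay(path, send_setpoint):
    """Stream recorded (t, x, y, z, yaw) rows to the flight controller."""
    with open(path) as f:
        rows = [tuple(map(float, r)) for r in csv.reader(f)]
    start = time.time()
    for t, x, y, z, yaw in rows:
        # Wait until the recording's timestamp, then command that pose.
        time.sleep(max(0.0, t - (time.time() - start)))
        send_setpoint(x, y, z, yaw)

# Example: print the setpoints instead of flying.
# replay("angry_gesture.csv", lambda x, y, z, yaw: print(x, y, z, yaw))
```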

It’s an interesting way to approach the problem of the uncanny valley in robotic projects.

Continue reading “How Do You Think This Quadcopter Feels?”

Salvaging Parts From Broken Roomba Robots

The great thing about hacking on Roombas is that iRobot used quality parts to build them. [Jason] got his hands on a broken 5XX series Roomba and posted an article about how he reused the salvaged parts.

What you see above is one of the results of his work. This little bot takes commands from an IR television remote control. But he also used the setup to make a self-balancing bot. The two motors from the Roomba have magnetic rotary encoders with 8-bit resolution. Pair this with a well-tuned PID algorithm and you’re in business. The video below shows him testing a motor with his PID code.
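
For anyone wanting to try something similar, the basic recipe is to count encoder ticks, turn them into a speed estimate, and run a PID loop on the error. Here's a bare-bones Python sketch with placeholder gains and helper functions; it's an illustration of the technique, not [Jason]'s code.

```python
# Sketch of a speed-control PID loop fed by an 8-bit wheel encoder.
# read_encoder() and set_pwm() are placeholders for the actual hardware I/O.
import time

KP, KI, KD = 0.8, 0.2, 0.05          # gains you would tune on the real motor
TARGET = 120.0                        # desired speed, encoder counts/second

def pid_loop(read_encoder, set_pwm, dt=0.02):
    last_count = read_encoder()       # 8-bit counter, wraps at 256
    integral = 0.0
    last_error = 0.0
    while True:
        count = read_encoder()
        delta = (count - last_count) % 256   # unwrap the 8-bit rollover
        last_count = count
        speed = delta / dt                   # counts per second

        error = TARGET - speed
        integral += error * dt
        derivative = (error - last_error) / dt
        last_error = error

        set_pwm(KP * error + KI * integral + KD * derivative)
        time.sleep(dt)
```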

You don’t get very much info on the guts of the donor robot. If that’s what you’re looking for, check out [Dino’s] Roomba 4000 teardown.

Continue reading “Salvaging Parts From Broken Roomba Robots”

Automata And Wooden Gears

While most animated machines we deal with every day – everything from clocks to cars to computers – are made of metal, there is an art to creating automated objects out of wood. [Dug North] is a creator of such inventions, making automata out of wooden gears, cogs, and cams.

[Dug]’s inventions are simple compared to turbine engines, but they still retain an artistry all their own. With just simple woodworking tools, he’s able to create moving vignettes of everyday scenes, everything from a dog barking at a bird to Santa Claus gracefully soaring over a house on Christmas Eve.

Below, you’ll find a video of [Dug]’s creation, ‘An Unwelcome Dinner Guest’ – an automated dog barking at a wooden bird. There’s also a video of him being interviewed by the awesome people at Tested last year at the World Maker Faire.

Continue reading “Automata And Wooden Gears”

Build A Light Following Bristlebot As A Way To Teach Science

[Ben Finio] designed this project as a way to get kids interested in science and engineering. Is it bad that we just want to build one of our own? It’s a light-following bristlebot which in itself is quite simple to build and understand. We think the platform has a lot of potential for leading to other things, like adding a microcontroller and a wireless module for remote control.

Right now it’s basically two bristlebots combined into one package. The screen capture seen above makes it hard to pick out the two toothbrush heads on either side of the battery pack. The chassis of the build is a blue mini-breadboard. The circuit that makes it follow light is the definition of simple: [Ben] uses two MOSFETs to control two vibration motors mounted on the rear corners of the chassis. The gate of each MOSFET is driven by a voltage divider which includes a photoresistor. When the light on one side is brighter than the other, the bot turns toward the brighter sensor. When viewing the project log above, make sure to click through the tabs to see all of the available info.
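
Here's some quick back-of-the-envelope math on how the steering works, assuming the photoresistor sits on the top leg of each divider. The component values are just plausible guesses, not the ones from [Ben]'s build.

```python
# Each MOSFET gate sits on a divider made from a photoresistor (LDR) on top and
# a fixed resistor on the bottom. More light lowers the LDR's resistance, which
# raises the gate voltage and makes that divider's motor vibrate harder; the
# imbalance between the two sides is what steers the bot toward the light.
VCC = 3.0          # battery pack voltage (assumed)
R_FIXED = 10_000   # ohms, bottom leg of the divider (assumed)

def gate_voltage(r_ldr):
    """Voltage at the MOSFET gate for a given LDR resistance."""
    return VCC * R_FIXED / (R_FIXED + r_ldr)

# Bright side: LDR drops to ~5k ohms; dim side stays near ~50k ohms.
print(gate_voltage(5_000))    # ~2.0 V -> this motor vibrates hard
print(gate_voltage(50_000))   # ~0.5 V -> this motor barely turns
```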

This directional control seems quite good. We’ve also seen other versions which shift the weight of the bot to change direction.

Continue reading “Build A Light Following Bristlebot As A Way To Teach Science”