LEGO Pick And Place

[youtube=http://www.youtube.com/watch?v=YoXCn4Gh_HA&w=470]

Turn your volume down and take a look at the brick sorting robot in the video above. It’s built using LEGO and powered by four different NXT modules. It sorts differently colored bricks on the intake conveyor and places them on three output conveyors. The build is solid and was [Chris Shepherd’s] impetus for starting a blog. We appreciate the pneumatic tricks that he detailed in some of his earlier posts such as a compressor, pressure switch, and air tank system. His advice is “build, build, build” and that’s what you’d have to do to perfect a monster of this size and scope.
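The compressor, pressure switch, and air tank combination works like any bang-bang pressure regulator: pump until the tank hits a high cutoff, rest until it falls below a low one. A minimal sketch of that hysteresis logic, with cutoff values that are purely illustrative and not from [Chris Shepherd]'s build:

```python
# Hysteresis (bang-bang) control for a pneumatic supply: the dead band
# between the two cutoffs keeps the compressor from rapidly cycling
# around a single setpoint. Pressures here are made-up example values.

HIGH_CUTOFF = 30.0  # psi: tank full, stop the compressor
LOW_CUTOFF = 20.0   # psi: tank low, restart the compressor

def compressor_on(pressure, currently_on):
    """Decide whether the compressor motor should run this cycle."""
    if pressure >= HIGH_CUTOFF:
        return False
    if pressure <= LOW_CUTOFF:
        return True
    return currently_on  # inside the dead band: keep the current state

print(compressor_on(15.0, False))  # True  (tank low, start pumping)
print(compressor_on(25.0, True))   # True  (still filling)
print(compressor_on(31.0, True))   # False (tank full, rest)
```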

The Story Of Mr. Stabby

[youtube=http://www.youtube.com/watch?v=XHvXPOSaNbg]

Mr. Stabby was once a broken-down bum, sitting on the curb, waiting for an eternity in the city dump. Luckily, someone found him and brought him to the NYC Resistor hacker group. They promptly performed some modifications and brought him back to life.

He’s now a happy-go-lucky stab-bot with a twinkle in his eye and a skip in his step. His repertoire includes gouging, slashing, and of course stabbing. He can be controlled via a cell phone and has a nifty voice interface.

The video above is from when Mr. Stabby was runner-up at the TechCrunch Hackday.

Poor MABEL

[youtube=http://www.youtube.com/v/IlWIWf4daNs]

At first, watching this video of MABEL, a bipedal robot for studying dynamic gaits, we didn’t know whether to be scared or sad. By the end, sadness prevailed. Poor MABEL, forced into a grueling routine, is not even allowed to rest when her leg breaks.

To be serious though, MABEL is quite impressive. Instead of using a direct drive on the legs, motors are attached to springs that act like tendons. This helps compensate for variances in the walking surface, hopefully allowing for smoother transitions between gaits as well. As you can see, MABEL handles the height differences quite well, albeit a bit slowly. It is worth noting that there are no visual sensors on MABEL and everything is done through feedback from her gait.
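The spring-as-tendon idea is what robotics folks call series-elastic actuation: instead of commanding joint torque directly, the motor winds up a spring and the torque follows from the spring's deflection. A hedged sketch of that relationship, with a spring constant that is illustrative rather than MABEL's actual parameter:

```python
# Series-elastic actuation in miniature: joint torque comes from the
# deflection of a spring between motor and joint, tau = k * (theta_m - theta_j).
# The spring constant k is a made-up example value, not from MABEL.

def spring_torque(theta_motor, theta_joint, k=120.0):
    """Torque transmitted through the 'tendon' spring (N*m per rad of deflection)."""
    return k * (theta_motor - theta_joint)

def motor_command(torque_desired, theta_joint, k=120.0):
    """Motor angle needed to produce a desired torque at the current joint angle."""
    return theta_joint + torque_desired / k

# Example: to apply 6 N*m at a joint sitting at 0.1 rad, the motor
# winds the spring 0.05 rad past the joint angle.
cmd = motor_command(6.0, 0.1)
print(round(cmd, 3))  # 0.15
```

Because the spring absorbs impacts before the motor sees them, uneven ground shows up as extra deflection rather than a shock load, which is why this compensates for variances in the walking surface.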

BAMF2010: QB Goes To Meetings, Shoots Lasers From Eyes

No, it’s not an extra from Wall-E. “QB” is the latest telepresence robot from Silicon Valley firm Anybots. QB combines two-way videoconferencing with a Segway-style self-balancing platform. The idea is to provide mobility and more natural interaction than desktop-tethered conferencing can provide.

The 35-pound robot’s battery runs for six to eight hours, and the telescoping head allows the eye level to be adjusted to match the user’s natural viewpoint. What looks like stereo vision is actually a single camera on the left eye and a steerable laser pointer on the right.

Shipping this October for $15,000, QB will appeal mostly to businesses with specific telepresence needs. This is half the price of their prior QA model — and in time the technology may reach the mass-market level. Until then, we’ll just have to amuse ourselves by remotely attending meetings with our ankle-nipping Rovio robots.

Cathode Ray Tube Leads The Way On This Bot

[Daqq’s] latest creation is this little robot with a CRT mounted on the front. Obviously ‘why?’ is the wrong question here, but we know this is right up his alley considering his propensity for the less common, like his plasma ball Nixie tube. The solidly-built bot uses two stepper-motor-driven wheels and an omni-wheel on the front to form a trike. An ATmega128 controls the system, but the real story here is the CRT. It requires a hefty voltage regulator to supply the -600V to +200V the Tungsram DG7-123 tube needs. Trouble along the way ranged from stray magnetic fields coming off the power supply to mounting the fragile tube itself. Take a look at his detailed writeup linked above and join us after the break for the demo videos.

Continue reading “Cathode Ray Tube Leads The Way On This Bot”

More Glove-based Interfaces

You may remember seeing the golf glove air guitar hack last month. Here are two more uses for gloves with sensors on them.

On the left is a glove interface with a flex sensor on each digit as well as an accelerometer. A VEX module reads the flex sensors to detect sign language as a command set, while a shake of the hand, picked up by the accelerometer, switches between command sets. See it controlling a little robot after the break. This comes from [Amnon Demri], who was also involved in the EMG prosthesis.
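The decoding step reduces to thresholding each digit's flex reading into bent/straight and looking the resulting pattern up in a gesture table. A minimal sketch of that idea, where the threshold, gesture tuples, and command names are all assumptions for illustration and not [Amnon Demri]'s actual code:

```python
# Map five flex-sensor ADC readings to a command. Each finger is
# thresholded to bent (1) or straight (0); the tuple of finger states
# indexes a gesture table. A hand-shake (detected separately by the
# accelerometer) would bump command_set to select a different table.

BENT_THRESHOLD = 512  # hypothetical ADC cutoff for a "bent" finger

COMMAND_SETS = [
    {  # set 0: example driving commands
        (1, 0, 0, 0, 0): "forward",
        (1, 1, 0, 0, 0): "reverse",
        (0, 0, 0, 0, 1): "stop",
    },
    {  # set 1: example turning commands, selected by shaking the glove
        (1, 0, 0, 0, 0): "left",
        (1, 1, 0, 0, 0): "right",
        (0, 0, 0, 0, 1): "stop",
    },
]

def decode(flex_readings, command_set=0):
    """Threshold each digit's reading, then look up the gesture."""
    fingers = tuple(int(r > BENT_THRESHOLD) for r in flex_readings)
    return COMMAND_SETS[command_set].get(fingers, "idle")

print(decode([900, 100, 80, 90, 100]))     # forward
print(decode([900, 800, 80, 90, 100], 1))  # right
```

Unrecognized finger patterns fall through to "idle" so sensor noise between gestures doesn't issue spurious commands.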

Straight out of Cornell we have the SudoGlove, seen on the right. [Jeremy Blum] and his fellow engineering students bring together a mess of different sensors, using an Arduino and an XBee module to control a small RC car with added lights and a siren. There’s embedded video after the break; you may want to jump past the music video to the description that starts at about 3:52.

Continue reading “More Glove-based Interfaces”

Mad Machinist Masterpieces

If a picture is worth 1000 words, then by our count [Ryan Commbes] has said 1.68×10^6 different things about his custom robot, airsoft, and monster truck builds. While we’re not ones to pick favorites, we agree his Alpine TPG-1 build (pictured at the top) is a step above the rest. Sadly, the forum with his build log doesn’t seem to be loading, but he says the basic process, if you want to make your own, is to gather pictures, measure, and create.

[Thanks Andrew]