An Autonomous Car Using A “Webcam”

This autonomous remote control-style car from Cornell students was designed for a senior-level engineering course there. Its main “sensor” is a low-res webcam-style camera. As shown in the video after the break, the car does quite well at staying between two black lines on a white surface using its vision processing. It also has an IR sensor to detect objects in front of the car and stop before crashing.
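As a rough illustration of what that takes, line following can boil down to scanning a single row of the image for the two dark lines and steering toward their midpoint. Here’s a minimal sketch of that idea; the image width, threshold, and everything else below are invented for illustration and aren’t taken from the students’ actual code:

```c
#include <stdint.h>
#include <stdio.h>

#define ROW_WIDTH   128   /* low-res camera line width (assumed) */
#define DARK_THRESH 60    /* 8-bit grayscale threshold (made up) */

/* Return a signed steering value: negative = steer left, positive = steer right.
 * Find the leftmost and rightmost dark pixels (the two track lines) and compare
 * their midpoint with the center of the image. */
static int8_t steer_from_row(const uint8_t row[ROW_WIDTH])
{
    int left = -1, right = -1;

    for (int i = 0; i < ROW_WIDTH; i++) {
        if (row[i] < DARK_THRESH) {
            if (left < 0)
                left = i;   /* first dark pixel from the left */
            right = i;      /* last dark pixel seen so far    */
        }
    }

    if (left < 0)
        return 0;           /* no lines visible: hold course  */

    int midpoint = (left + right) / 2;
    return (int8_t)(midpoint - ROW_WIDTH / 2);
}

int main(void)
{
    /* Fake image row: dark lines near pixels 20 and 90, white elsewhere. */
    uint8_t row[ROW_WIDTH];
    for (int i = 0; i < ROW_WIDTH; i++)
        row[i] = 255;
    row[20] = row[21] = 30;
    row[90] = row[91] = 30;

    printf("steering offset: %d\n", steer_from_row(row));
    return 0;
}
```

A scan like that is cheap enough to run every frame on an 8-bit part, which is why this kind of approach can work without a beefier processor.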

All of the “vision” computation is handled by an Atmel Mega644, an 8-bit microcontroller. Because of the processing limits of this chip, a lot of work went into making the pipeline computationally efficient. The students give an incredibly detailed account of their project, focusing on the code and the electrical design. Check out the video of their car in action after the break. Continue reading “An Autonomous Car Using A “Webcam””

AVRcam For Small Robot Machine Vision

It’s neat how a project from 2004 can still be relevant if it’s done really well. That’s the case with the AVRcam. It uses an Atmel AVR mega8 and can do some pretty impressive things, like track up to eight objects at 30 fps. The hardware and software are also open source, so it should be possible to build one yourself. There are many projects like it on the internet, though they often require much beefier hardware. Then again, these days you can fit a computer inside a matchbox, so we see more and more projects simply throwing a full USB camera on a robot to do simple things like line following. It’s debatable which solution is more elegant, but maybe not which one is more impressive.

Spherical Military Drone Coming To A Sky Near You


We’re always fascinated by flying drones around here, and this latest creation from Japan’s Ministry of Defense is no exception. The spherical drone, which is far simpler in construction than this drone we saw several months back, looks pretty benign at first glance. Once it starts moving, however, you can see just how slick it is.

Reports say that it can hit a top speed of 40 mph, but it seems that the fun is relatively short-lived, as the drone runs out of juice after about 8 minutes. While it is flying, the drone appears to be incredibly agile and fairly easy to control. The built-in camera isn’t top end, but it looks more than sufficient for general surveillance use.

While we love quadrocopters and all of the cool acrobatics they pull off, there’s something awesome about a drone that can hit the ground at speed, roll, and take off again without incurring any serious damage.

Anyone care to start work on a civilian prototype with a longer battery life?

$14 Swarm Robot, Kilobot, Is Extremely Cool

Reader [Michael Rubenstein] sent in a project he’s been working on. Kilobot, as described in the paper (PDF), overcomes the big problems with real-world swarm robotics research: cost, experiment setup time, and maintenance. The robots can be communicated with wirelessly, charged in bulk, and mass-programmed in under a minute. Typically, robots used for swarm research cost well over $100, so large-scale experiments are left to software simulation. Simulations, however, rarely include the real-world physics, sensor error, and other factors that only show up in a physical robot. Impressively enough, the Kilobot comes in far under a hundred dollars and still has many of the features of its costlier brothers. It can sense other robots, report its status, and has full differential steering (achieved, surprisingly, through bristle locomotion). There are a few cool videos of the robot in operation on the project site that are definitely worth a look.
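The neighbor sensing is done over infrared, with distance estimated from how strong the received signal is. As a loose sketch of that idea (the calibration numbers below are entirely made up, and a real table would be measured per robot), it amounts to interpolating an ADC-reading-to-distance lookup:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical calibration: raw IR ADC reading -> distance in mm.
 * Stronger signal (higher reading) means a closer neighbor. */
static const uint16_t cal_adc[]  = { 900, 700, 500, 350, 250, 180 };
static const uint16_t cal_dist[] = {  10,  20,  40,  60,  80, 100 };
#define CAL_POINTS (sizeof(cal_adc) / sizeof(cal_adc[0]))

/* Linearly interpolate between the two nearest calibration points. */
static uint16_t ir_to_distance_mm(uint16_t adc)
{
    if (adc >= cal_adc[0])
        return cal_dist[0];
    for (unsigned i = 1; i < CAL_POINTS; i++) {
        if (adc >= cal_adc[i]) {
            uint16_t span_adc  = cal_adc[i - 1] - cal_adc[i];
            uint16_t span_dist = cal_dist[i] - cal_dist[i - 1];
            return cal_dist[i - 1] +
                   (uint32_t)(cal_adc[i - 1] - adc) * span_dist / span_adc;
        }
    }
    return cal_dist[CAL_POINTS - 1];   /* too dim: clamp to max range */
}

int main(void)
{
    printf("adc 600 -> ~%u mm away\n", (unsigned)ir_to_distance_mm(600));
    return 0;
}
```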

Tiny Transforming Beer Can Robot


The next time you reach for a cold one, you might want to take a look at the can to ensure that your beer won’t suddenly sprout legs and start skittering across the table.

You might remember [Ron Tajima] from some of his previous creations, including this Roomba-based baby cradle and the PacMan Roomba mod. This time around, he has created a cool little transforming robot that fits inside a beer can.

The robot’s brains reside on a custom-built board just underneath the top of the can. On one side of the board is an mbed controller that manages all of the robot’s functions; on the other side, four batteries provide the power. The robot’s three legs are driven by six servos, allowing movement in several different planes. The beer-bot is controlled with a Wiimote, so we’re assuming he has crammed a Bluetooth module in there somewhere as well.
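The transforming trick is really just coordinated servo motion between a stowed pose and a standing pose. Here’s a minimal sketch of that kind of sequencing; the pose angles are invented for illustration, since [Ron]’s actual geometry and code aren’t published here:

```c
#include <stdio.h>

#define NUM_SERVOS 6

/* Purely illustrative servo angles (degrees) for a folded "can" pose
 * and a deployed "standing" pose. */
static const float folded[NUM_SERVOS]   = {  0,  0,  0,  0,  0,  0 };
static const float deployed[NUM_SERVOS] = { 80, 45, 80, 45, 80, 45 };

/* Linearly interpolate every servo between the two poses.
 * t = 0.0 -> fully folded, t = 1.0 -> fully deployed. */
static void pose_at(float t, float out[NUM_SERVOS])
{
    for (int i = 0; i < NUM_SERVOS; i++)
        out[i] = folded[i] + t * (deployed[i] - folded[i]);
}

int main(void)
{
    float angles[NUM_SERVOS];

    /* Step through the unfolding motion in ten increments. */
    for (int step = 0; step <= 10; step++) {
        pose_at(step / 10.0f, angles);
        printf("t=%.1f:", step / 10.0f);
        for (int i = 0; i < NUM_SERVOS; i++)
            printf(" %5.1f", angles[i]);
        printf("\n");
    }
    return 0;
}
```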

[Ron] mentions that it moves a bit slowly when standing on end, but we think the robot is pretty awesome as is, and we can’t wait to see what improvements the next version might bring.

Stick around to see a video demonstration of the robot in action.

[Thanks Sascha]

Continue reading “Tiny Transforming Beer Can Robot”

Trobot: Kickstarting The 6-axis Miniature Robot Arm

Having already built three hardware development versions, [Toby Baumgartner] is looking for some financial backing to make version four of this robot arm possible.

He’s modeling the arm after much larger ABB industrial robots. Like those, it mounts on a stationary base and features movement along six axes. The first couple of iterations even used ABB’s RobotStudio software for control. This is the same software used by the full-sized robots, and it features a special programming language for integrating the robots into just about any production facility.

We don’t see a great need for high-end industrial software on manipulator arms this small, but we could see the finished product used for small-scale assembly-line work some day. In the meantime, these might be useful in your own projects. [Toby] has been using an mbed microcontroller board as the hardware driver. It communicates with the computer over Ethernet, and he’s working on an Android interface right now.
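An Ethernet link to the controller means the host side can be as simple as firing joint-angle packets at the arm. Here’s a hypothetical host-side sketch of that; the address, port, and packet format are all invented for illustration, since [Toby]’s actual protocol isn’t known to us:

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    const float joints[6] = { 0.0f, -30.0f, 45.0f, 0.0f, 60.0f, 90.0f };
    char packet[128];

    /* Pack the six axis angles (degrees) into a simple text message. */
    snprintf(packet, sizeof(packet), "J,%.1f,%.1f,%.1f,%.1f,%.1f,%.1f",
             joints[0], joints[1], joints[2],
             joints[3], joints[4], joints[5]);

    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) {
        perror("socket");
        return 1;
    }

    struct sockaddr_in arm = { 0 };
    arm.sin_family = AF_INET;
    arm.sin_port = htons(5000);                        /* made-up port    */
    inet_pton(AF_INET, "192.168.1.50", &arm.sin_addr); /* made-up address */

    sendto(sock, packet, strlen(packet), 0,
           (struct sockaddr *)&arm, sizeof(arm));
    close(sock);
    return 0;
}
```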

Check out video demonstrations of versions 2 and 3 embedded after the break.

Continue reading “Trobot: Kickstarting The 6-axis Miniature Robot Arm”

Programming Robots Like You Would Train A Pet

[Jim] has been working with a team from several universities to develop an intuitive way to guide and train assistance robots. They focused on one particular technique: training a robot to follow on a leash, the same way you would walk a pet dog (PDF).

He was inspired to send in a link to his research after reading about the Kinect-powered shopping cart robot. He figures that project is similar to his own, but his has several added benefits. The first is that if a robot is on a leash, everyone knows who that bot is following or assisting. The second is that the user needs no training whatsoever. The act of walking a dog on a leash is commonplace in developed societies; you may never have owned a dog, but you’ve seen others walking them on leashes plenty of times and could do so yourself without any instruction.

The leash connects to a sensor-filled turret in the center of the robot’s body. The bot can sense when, and in which direction, the user is pulling the leash. There’s also an emergency kill switch on the handle for added safety. Take a look at some of the test video after the break to see how quickly humans adapt to this type of user interface.
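Turning that sensed pull into motion can be as simple as treating it as a steering vector for a differential-drive base. Here’s a minimal sketch of that mapping; the (x, y) pull representation, deadband, and gains are our own assumptions and not from the team’s code:

```c
#include <math.h>
#include <stdio.h>

/* Map a sensed leash pull into differential-drive commands.
 * The turret is assumed to report the pull as an (x, y) vector:
 * y = forward/backward, x = left/right, both roughly in [-1, 1]. */
static void leash_to_wheels(float pull_x, float pull_y,
                            float *left, float *right)
{
    const float deadband = 0.1f;   /* ignore tiny tugs */
    float magnitude = sqrtf(pull_x * pull_x + pull_y * pull_y);

    if (magnitude < deadband) {
        *left = *right = 0.0f;     /* slack leash: stay put */
        return;
    }

    /* Drive toward the pull: forward component plus a turning term. */
    *left  = pull_y + pull_x;
    *right = pull_y - pull_x;

    /* Clamp to the motor range [-1, 1]. */
    if (*left  >  1.0f) *left  =  1.0f;
    if (*left  < -1.0f) *left  = -1.0f;
    if (*right >  1.0f) *right =  1.0f;
    if (*right < -1.0f) *right = -1.0f;
}

int main(void)
{
    float l, r;
    leash_to_wheels(0.3f, 0.8f, &l, &r);   /* pulled forward and to the right */
    printf("left %.2f  right %.2f\n", l, r);
    return 0;
}
```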

Continue reading “Programming Robots Like You Would Train A Pet”