A group of developers called [OpenWorm] have mapped the 302 neurons of the Caenorhabditis elegans species of roundworm and created a virtual neural network that can be used to solve all the types of problems a worm might encounter. Which, when you think about it, aren’t much different from those a floor-crawling robot would be confronted with.
In a demo video released by one of the project’s founders, [Timothy Busbice], the network is used to control a small Lego rover equipped with a forward sonar sensor. The robot is able to sense a wall before it hits it and determine an appropriate response, which may be to stop, back up, or turn. This is all pretty fantastic when you compare these 302 neurons to any code you’ve ever written to accomplish the same task! It might be a much more complex route to the same outcome, but it’s uniquely organic… which makes watching the little Lego bot fascinating; its stumbling around even looks more like thinking than executing.
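To get a feel for what "neurons driving motors" means, here is a drastically simplified sketch of connectome-style control. The real project simulates all 302 neurons with proper dynamics; the neuron names, weights, and threshold below are entirely made up for illustration.

```python
# Hypothetical weighted connections: sensory -> inter -> motor neurons.
# (Invented for illustration; OpenWorm's real connectome is far larger.)
CONNECTOME = {
    "nose_touch": {"interneuron": 1.0},
    "interneuron": {"motor_left": 0.8, "motor_right": -0.6},
}

NEURONS = ("nose_touch", "interneuron", "motor_left", "motor_right")

def step(activations):
    """Propagate activation one tick through the weighted network."""
    nxt = {n: 0.0 for n in NEURONS}
    for src, targets in CONNECTOME.items():
        for dst, weight in targets.items():
            nxt[dst] += activations.get(src, 0.0) * weight
    return nxt

def motor_command(sonar_cm, threshold_cm=20):
    """Fire the touch sensor when an obstacle is close, then read motors."""
    act = {"nose_touch": 1.0 if sonar_cm < threshold_cm else 0.0}
    act = step(step(act))          # two ticks: sensory -> inter -> motor
    left, right = act["motor_left"], act["motor_right"]
    if left == 0.0 and right == 0.0:
        return "forward"           # nothing sensed, keep crawling
    return "turn" if left > right else "back_up"

print(motor_command(100))  # open floor ahead: keep going forward
print(motor_command(10))   # wall ahead: an avoidance response fires
```

The charm of the real thing is that no one wrote the `if` statements: the behavior emerges from the mapped connections, which is exactly why the Lego bot's wandering looks so organic.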
I feel obligated to bring up the implications of this project. Since we’re all thinking about it now, let’s all imagine the human brain similarly mapped and able to simulate complex thought processes. If we can pull this off one day, not only will we learn a lot more about how our squishy grey hard drives process information, but artificial intelligence will also improve by leaps and bounds. An effort to do this, the Human Connectome Project, is already underway; however, since there are a few more connections to map than in C. elegans’ brain, it’s a feat that will take a while yet.
The project is called “open”worm, which of course means you can download the code from their website and potentially dabble in neuro-robotics yourself. If you do, we want to hear about your wormy brain bot.
Continue reading “Gift Your Next Robot With the Brain of a Roundworm”
Mikey is [Mike’s] autonomous robot. Like any good father, he’s given the robot his name. Mikey is an Arduino based robot, which uses a Pixy camera for vision.
[Mike] started with a common 4WD robot platform. He added an Arduino Uno, a motor controller, and a Pixy. The Pixy sends directions to the Arduino via a serial link. Mikey’s original task was driving around and finding frogs on the floor. Since then, [Mike] has found a higher calling for Mikey: self charging.
One of the most basic features of life is eating. In the case of autonomous robots, that means self charging. [Mike] gave Mikey the ability to self charge by training the Pixy to detect a green square. The green square identifies Mikey’s charging station. Probes mounted on 3D printed brackets hold the positive leads while springs on the base of the station make contact with conductive tape on Mikey’s belly. Once the circuit is complete, Mikey stops moving and starts charging.
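The docking logic boils down to steering so the green square stays centered in the Pixy's frame, then stopping when the belly contacts close the circuit. Here is a rough sketch of that decision loop, written in Python for readability (on the real robot it would be Arduino C, and the frame width and deadband values here are our guesses, not [Mike]'s):

```python
# The Pixy does the color detection onboard; only the blob's x
# coordinate arrives over the serial link. Values below are assumed.
FRAME_WIDTH = 320   # Pixy's horizontal resolution, in pixels
DEADBAND = 30       # how far off-center the blob may drift, in pixels

def dock_step(blob_x, charging):
    """Pick a drive command from the green square's x position."""
    if charging:                   # belly contacts closed the circuit
        return "stop"
    if blob_x is None:             # square not in view: search for it
        return "spin"
    error = blob_x - FRAME_WIDTH // 2
    if error < -DEADBAND:
        return "turn_left"
    if error > DEADBAND:
        return "turn_right"
    return "forward"               # square centered: drive onto the dock
```

Each pass through the main loop would poll the Pixy, call `dock_step`, and translate the result into motor controller commands.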
Continue reading “Mikey, the Robot That Charges Itself”
[Jia Wu, Mary Sek, and Jeff Maeshiro], students at the California College of the Arts (CCA) in San Francisco, took on the task of developing a walking 3D printer. The result is Geoweaver, a hexapod robot with a glue gun extruder system. Hackaday has seen walking CNC machines before, but not a walking 3D printer. Geoweaver uses two servos on each of its six legs to traverse the land. The team was able to program several gaits into the robot, allowing it to cross uneven terrain. Walking is hard enough on its own, but Geoweaver also uses a glue gun based extruder to make 3D prints. The extruder head uses two servos to swing in a hemispherical arc. The arc is mapped in software to a flat plane, allowing the robot to drop a dollop of glue exactly where it is programmed to. Geoweaver doesn’t include much in the way of onboard processing: an Arduino Uno is used to drive the 15 servos. Those servos, coupled with a glue gun style heater, pull quite a bit of power, which has earned Geoweaver nicknames such as Servo Killer, Eater of Shields, Melter of Wires, and Destroyer of Regulators.
Geoweaver’s prints may not be much to look at yet, however the important thing to remember is that one of the future visions for this robot is to print on a planetary scale. Geoweaver currently uses reacTIVision to provide computer control via an “eye in the sky”. ReacTIVision tracks a fiducial marker on the robot and maps its position onto a topographical map of the terrain. This allows Geoweaver to change its height and print parameters depending on the flatness of the ground it is printing on. On a scaled-up Geoweaver, reacTIVision would be replaced by GPS or a similar satellite based navigation system. Most of the software used in Geoweaver is open source, including Grasshopper and Firefly, written by the team’s professor, [Jason Kelly Johnson]. The exception is Rhino 5. We would love to see an option for a free or open source alternative to laying out ~$1000 USD in software for our own Geoweaver.
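The arc-to-plane mapping is the interesting software bit: the two extruder servos sweep the glue tip over a hemisphere, and the firmware has to convert commanded angles into the flat x/y point where glue actually lands. Here is a hedged guess at that geometry; the arm length, angle conventions, and pivot arrangement are invented for illustration, not taken from the team's code.

```python
import math

# Assumed geometry: the glue tip hangs ARM_LENGTH from a pivot, tilt is
# measured from straight down (0 = directly under the pivot), and pan
# rotates the tilted arm around the vertical axis.
ARM_LENGTH = 8.0   # cm from the servo pivot to the glue tip (assumed)

def tip_position(pan_deg, tilt_deg):
    """Project the glue tip onto the ground plane below the pivot."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    r = ARM_LENGTH * math.sin(tilt)    # horizontal reach at this tilt
    return (r * math.cos(pan), r * math.sin(pan))

print(tip_position(0, 0))    # straight down: lands at the origin
print(tip_position(0, 30))   # tilted 30 degrees: roughly 4 cm out
```

Inverting this mapping (from a desired x/y dollop position back to servo angles) is what lets a toolpath planner treat the hemispherical wrist as if it were a flat plotter.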
Continue reading “Roving Hexapod Poops Out 3D Prints”
The DARPA robotics challenge trials 2013 have finished up. The big winner is Team Schaft, seen above preparing to drive in the vehicle trial. This isn’t the end of the line for DARPA’s robotics challenge – there is still one more major event ahead. The DARPA robotics finals will be held at the end of 2014. The tasks will be similar to what we saw today, however this time the teams’ and robots’ communications will be intentionally degraded to simulate real world disaster situations. The teams today were competing for DARPA funding. Each of the top eight teams is eligible for up to $1 million USD from DARPA. The teams not making the cut are still welcome to compete in the finals using other sources of funding.
The trials were broken up into 8 events: Door, Debris, Valve, Wall, Hose, Terrain, Ladder, and Vehicle. Each trial was further divided into 3 parts, each with one point available. If a robot completed the entire task with no human intervention it would earn a bonus point. With all bonuses, 32 points were available. Team Schaft won the event with an incredible total of 27 points. In second place was Team IHMC (Institute for Human Machine Cognition) with 20 points. Team IHMC deserves special praise as they were using a DARPA provided Boston Dynamics Atlas robot. Teams using Atlas only had a few short weeks to go from a complete software simulation to interacting with a real world robot. In third place was Carnegie Mellon University’s Team Tartan Rescue and their Chimp robot with 18 points.
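The scoring arithmetic above, spelled out: eight tasks, three one-point parts each, plus one bonus point per task completed without intervention.

```python
# Maximum score at the DARPA trials: 8 tasks x (3 parts + 1 bonus).
TASKS = ["Door", "Debris", "Valve", "Wall",
         "Hose", "Terrain", "Ladder", "Vehicle"]
POINTS_PER_TASK = 3 + 1            # three parts plus the autonomy bonus

max_score = len(TASKS) * POINTS_PER_TASK
print(max_score)                   # 32, matching DARPA's maximum
print(27 / max_score)              # Team Schaft took ~84% of the points
```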
The expo portion of the challenge was also exciting, with first responders and robotics researchers working together to understand the problems robots will face in real world disaster situations. Google’s recent acquisition, Boston Dynamics, was also on hand, running their WildCat and LS3 robots. The only real downside to the competition was the coverage provided by DARPA. The live stream left quite a bit to be desired. The majority of videos on DARPA’s YouTube channel currently consist of 9-10 hour recordings of some of the event cameras. The wrap-up videos also contain very little information on how the robots actually performed during the trials. Hopefully as the days progress, more information and video will come out. For now, please share the timestamp and a description of your favorite part in the comments.
Continue reading “DARPA Robotics Challenge Trials Wrap Up”
Today was the first of two days of trials at the DARPA Robotics Challenge at Homestead-Miami Speedway in Florida. Created after Japan’s Fukushima nuclear disaster, the robotics challenge is designed to advance the state of the art of robotics. The trials range from driving a car to clearing a debris field, to cutting through a wall. Robots score points based on their performance in the trials. Much of the day was spent waiting for teams to prepare their robots. There were some exciting moments however, with one challenger falling through a stacked cinder block wall.
Pictured above is Valkyrie from NASA JSC. We reported on Valkyrie earlier this month. Arguably one of the better looking robots of the bunch, Valkyrie proved to be all show and no go today, failing to score any points in its day 1 trials. The day one lead went to Team Schaft, a new robot from Tokyo based startup Schaft Inc. Schaft scored 18 points on its first day. In second place is the MIT team with 12 points. Third place is currently held by Team TRACLabs with 9 points. All this can change tomorrow as the second day of trials takes place. The live stream will be available from 8am to 7pm EST on DARPA’s robotics challenge page.
Continue reading “DARPA Robotics Challenge Trials Day 1”
[Chiprobot] has created an amazing compliant gripper. Designing robot hands (or end effectors) can be a perilous task. It is easy to give robots big, strong hands. Strong grippers have to be controlled by sensors. However, sensors can’t always be relied upon to ensure those hands don’t crush anything they touch. Hardware fails, software has bugs. Sometimes the best solution is a clever mechanical design, one which ensures a gripper will conform to the object it is gripping. We’ve seen “jamming” grippers (so named for their use of a granular substance which jams around the object being gripped) before.
[Chiprobot’s] gripper is something entirely different. He designed his gripper in Blender, and printed it out with his Ultimaker 3D printer. The material is flexible PLA. Three plastic “fingers” wrap around the object being gripped. The fingers are made up of two strips of printed plastic connected by wire linkages. The flexible plastic of the fingers creates a leaf spring design. The fingers are attached to a linear actuator at the center point of the gripper. The linear actuator itself is another great hack: [Chiprobot] created it from a servo and an empty glue stick. As the linear actuator is pulled in, the fingers pull around any object in their grip. The end result is a grip strong enough to hold an egg while shaking it, but not strong enough to break the egg.
We would like to see the gripper gripping other objects, as eggs can be surprisingly strong. We’ve all seen the physics trick where squeezing an egg with bare hands doesn’t break it, yet squeezing an egg while wearing a ring causes it to crack… like an egg.
Continue reading “Compliant Robot Gripper Won’t Scramble Your Eggs”
[Sarah Petkus] has a simple dream. She wants to build and command her own delta robot army. It all began with an illustration she drew of a woman hovering over a field of flowers. The flowers in this case had incandescent light bulbs as blooms. [Sarah] decided to create her image in the real world as an interactive art installation. Her first attempts at moving light flowers were based on a pulley system, which was unreliable and not exactly the graceful movement she imagined. Eventually [Sarah] discovered inverted delta robots. She changed her flower design to a delta, and began building her own delta robots out of parts she had around the house.
A chance meeting with the folks at SYN Shop hackerspace in Las Vegas, NV kicked the project into high gear. [Sarah] switched from using R/C ball links as joints to a simple ball bearing joint. She created her entire design in CAD software and printed it on the hackerspace’s 3D printer. She now has six working prototypes. The robots are all controlled via I2C by an Arduino compatible Nymph board. Six robots doesn’t exactly constitute an army, so [Sarah] had to find a new way to fund her project. She’s currently setting up a project for Kickstarter. [Sarah] will be selling kits for her robots, with the proceeds going toward the realization of her dream of a field of robotic light bulb flowers – assuming the deltas don’t become sentient and try to take over the world first. [Sarah] posts progress updates to her blog, and has a dedicated site (which we featured on Sunday as part of a Links post) for information about her upcoming Kickstarter campaign.
Continue reading “Build and Control Your Own Robot Army”