Terra Spider Repairs and Resurfaces New Frontiers

Is your landscape contaminated with toxic waste, parched, or otherwise abandoned? The Terra Spider may be your answer to new life in otherwise barren wastelands.

Bred in the Digital Craft Lab at the California College of the Arts, the project in its current state demonstrates the principle of deploying multiple eight-legged drones that can drill into the ground and release a liquid payload, intended to “repair or maintain” the landing site.

To bring their project to life, students [Manali Chitre], [Anh Vu], and [Mallory Van Ness] designed and assembled a laser-cut octopod chassis, an actuated drilling mechanism, and a liquid deployment system, all from easily available stock components and raw materials. While project details are sparse, the comprehensive bill of materials gives us a window into the process of putting together the pieces of a Terra Spider. Servos actuate the legs, a SparkFun gear-reduced motor drives the drill, and a peristaltic pump handles payload deployment.
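For a sense of how those three actuators might be sequenced, here is a minimal Python sketch of one repair cycle. The shim functions are hypothetical stand-ins for whatever motor drivers the real build uses, not the students' actual code:

```python
import time

# Hypothetical hardware shims -- stand-ins for the drivers behind the
# BOM's hobby servos, SparkFun gear-reduced motor, and peristaltic pump.
def set_leg_servos(stance): print(f"leg servos -> {stance}")
def set_drill(speed):       print(f"drill motor -> {speed:.0%}")
def set_pump(on):           print(f"peristaltic pump -> {'on' if on else 'off'}")

def repair_site(drill_seconds=10, dispense_seconds=5):
    """Land, drill, dispense the liquid payload, retract: one repair cycle."""
    set_leg_servos("braced")      # plant the eight legs before drilling
    set_drill(0.8)                # spin up the gear-reduced drill motor
    time.sleep(drill_seconds)
    set_drill(0.0)
    set_pump(True)                # pump pushes the liquid payload downhole
    time.sleep(dispense_seconds)
    set_pump(False)
    set_leg_servos("standing")    # release stance, ready for pickup

repair_site(drill_seconds=2, dispense_seconds=1)
```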

It’s not every day that flying robots deploy drill-wielding spider drones. Keep in mind, though, that the Terra Spider is a performance piece, a hardware-based demonstration of a bigger idea, in this case: remote coverage and sample deployment in a barren wasteland. While this project is still a work in progress, the bill of materials and successful deployment demos both testify to the project’s extensive development.

With the earnest intent of repairing withering environments, perhaps this project has a future as an entry into this year’s Earth-saving Hackaday Prize….

Coming soon to a galaxy near you!

Continue reading “Terra Spider Repairs and Resurfaces New Frontiers”

Santa’s Autonomous Helping Hands Let the Jolly ol’ Fellow Kick Back this Season

For those skeptical about the feasibility of Santa’s annual delivery schedule, here’s an autonomous piece of the puzzle that will bewilder even the most hard-hearted of non-believers.

The folks over at the Center of Excellence Cognitive Interaction Technology (CITEC) in Germany have whipped together a fantastic demo featuring Santa’s extra pair of helping hands. In the two-and-a-half minute video, the robot executes a suite of impressive autonomous stocking-stuffing maneuvers: from recognizing the open hole in the stocking, to grasping specific candies from the cluster of goodies available.

On the hardware side, the arms appear to be a KUKA variant, while on the software side, the visualizations are handled by RViz, a tool from the open-source Robot Operating System (ROS).
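If you'd like to poke at RViz yourself, here is a minimal ROS 1 sketch (assuming rospy and a running roscore) that publishes a detected grasp target as a sphere marker. The topic, frame, and coordinates are illustrative, not CITEC's actual setup:

```python
#!/usr/bin/env python
# Publish a candy-sized grasp target for RViz to render.
import rospy
from visualization_msgs.msg import Marker

rospy.init_node("stocking_target_viz")
pub = rospy.Publisher("visualization_marker", Marker, queue_size=10)

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    m = Marker()
    m.header.frame_id = "base_link"          # frame the target was seen in
    m.header.stamp = rospy.Time.now()
    m.ns, m.id = "candy_targets", 0
    m.type, m.action = Marker.SPHERE, Marker.ADD
    m.pose.position.x, m.pose.position.y, m.pose.position.z = 0.5, 0.0, 0.2
    m.pose.orientation.w = 1.0
    m.scale.x = m.scale.y = m.scale.z = 0.03  # 3 cm candy-sized blob
    m.color.r, m.color.a = 1.0, 1.0           # opaque red
    pub.publish(m)
    rate.sleep()
```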

If some of the props in the video look familiar, you’ll find that the researchers at CITEC have already explored some stellar related research topics in perception, classification, and grasping. Who knew this pair of hands would be so jolly about clocking some overtime this holiday season? The entire video is set to a crisp computer-voiced jingle that serves as a sneaky summary of their approach to this project.

Now, if only we could set these hands off to do our other dirty work….

Continue reading “Santa’s Autonomous Helping Hands Let the Jolly ol’ Fellow Kick Back this Season”

Robot Vision: Detecting Obstacles with FPGAs and Line Lasers

Somewhere down the road, you’ll find that your almighty autonomous robot chassis is going to need some sensor feedback. Otherwise, that next small step down the road may end with a blind leap off the coffee table. The first low-cost sensors we might throw at this problem would be sonars or IR rangefinders, but there’s a problem: those sensors only report distance from the pinpoint spot directly ahead of them.

Rest assured, [Jonathan] wrote in to let us know that he’s got you covered. Combining a line laser, camera, and an FPGA, he’s able to detect obstacles that fall within the field of view of the camera and laser.

If you thought writing algorithms in software is tricky, wait till you try hardware! (We know: division sucks!) [Jonathan] knows no fear, though; he’s performing gradient computation directly on the FPGA to detect the laser in the camera image at a wicked 30 frames per second. Why roll up your sleeves and take the hardware route, you might ask? For a CPU-based approach at the tiny embedded-robot scale, [Jonathan] estimates a mere 10 frames per second. With an FPGA, we’re able to process images about as fast as they’re received.
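To get a feel for the technique without an FPGA on hand, here is a rough CPU-side sketch in Python with OpenCV. The gradient trick mirrors what [Jonathan] does in gates, but the triangulation constants are made-up placeholders, not his calibration:

```python
import numpy as np
import cv2

def laser_rows(frame_bgr):
    """Return, per image column, the row where the laser edge is strongest."""
    red = frame_bgr[:, :, 2].astype(np.int16)  # laser lines pop in the red channel
    grad = np.abs(np.diff(red, axis=0))        # vertical gradient, like the FPGA
    return grad.argmax(axis=0)                 # strongest edge per column

def rows_to_range(rows, f_px=600.0, baseline_m=0.05, horizon_row=240):
    """Simple triangulation: row offset from the optical center -> distance."""
    offset = np.clip(rows - horizon_row, 1, None)  # avoid divide-by-zero
    return f_px * baseline_m / offset              # meters, per column

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    ranges = rows_to_range(laser_rows(frame))
    print("nearest obstacle: %.2f m" % ranges.min())
cap.release()
```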

[Jonathan] is using the Logi Board, a Kickstarter success we’ve visited in the past, and all of his code is up on the Githubs. If you crack it open, you’ll also find that many of his modules are Wishbone compliant, so you can build your own projects from just some of these parts instead of ripping useful features out of a sea of hairy logic.

With computer-vision hardware keeping such a low profile in the hobbyist community, we’re excited to hear more about [Jonathan’s] FPGA-based robotics endeavors.

Continue reading “Robot Vision: Detecting Obstacles with FPGAs and Line Lasers”

Robot Battle for the Big Leagues: Valkyrie and the DARPA Challenge

Even though NASA’s Johnson Space Center’s impressive build for the upcoming DARPA Robotics Challenge is one of many entries, it has to be one of the coolest. The gang at IEEE Spectrum got a sneak peek of the robot dubbed “Valkyrie”, which at 1.9 m and 125 kg boasts 44 degrees of freedom while managing to look like a finished product ready to roll off the shelf. We can expect to see other custom robots at the challenge, but a number of teams will compete with a Boston Dynamics Atlas robot, which we’ve covered a couple of times this year.

A few readers are probably polishing their pitchforks in anticipation of shouting “Not a hack!” but before you do, take a look at the tasks for the robots in this challenge and consider how new this territory is. To that end, the NASA JSC crew seems to have prepared for resolving catastrophes, even if it means throwing together a solution. They’ve designed the limbs for quick removal and even reversibility: the arms are identical, and only slight adjustments are required to turn a left arm into a right. Unlike the Atlas, which requires a tether, Valkyrie is battery-operated, and it can run for around an hour before someone needs to crack open the torso and swap in a new battery, Iron Man film style.

The team was also determined to make Valkyrie seem more human, so they added a soft fabric layer to serve as a kind of clothing. According to IEEE Spectrum, it’s even getting custom-made footwear from DC Shoes. There are some utilitarian compromises, though: Valkyrie has adopted a shortcut taken by time-constrained animators in many a cartoon, choosing three fingers per hand instead of four. Make sure you watch the video after the break for a closer look.

Continue reading “Robot Battle for the Big Leagues: Valkyrie and the DARPA Challenge”

Robotic Manta Ray (Mantabot)

The robotic manta ray codenamed MantaBot, created by the Bio-Inspired Engineering Research Laboratory (BIER Lab), is set to make a splash. The next evolution in underwater robotics is here: we’ve seen the likes of robotic fish and jellyfish, and now joining the school is the MantaBot, designed to mimic the unique swimming motion of the manta ray.

This biologically inspired underwater robot was designed with autonomy as its primary goal, using its onboard electronics to make its own decisions as it navigates its watery domain. BIER Lab has received major funding from the Department of Defense (DoD) Multidisciplinary University Research Initiative (MURI) program. Part of the long-term goal is to reverse engineer the biological systems of such creatures to the point of creating simulated artificial skin and muscle.
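For the curious, here is a toy Python sketch of the kind of traveling-wave fin motion a manta mimic relies on. Every gain and phase value below is invented for illustration and has nothing to do with BIER Lab's actual control code:

```python
# Each fin ray follows a sine wave, phase-shifted along the wing so a
# wave travels from root to tip -- the hallmark of manta-style swimming.
import math

def fin_ray_angles(t, n_rays=5, freq_hz=0.5, amp_deg=30.0, wave_phase=0.6):
    """Angle for each fin ray at time t (root first, tip last)."""
    return [amp_deg * math.sin(2 * math.pi * freq_hz * t - i * wave_phase)
            for i in range(n_rays)]

for step in range(4):                 # print one second of motion
    t = step * 0.25
    angles = fin_ray_angles(t)
    print(f"t={t:.2f}s  " + "  ".join(f"{a:+6.1f}" for a in angles))
```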

[Via dvice.com]

Continue reading “Robotic Manta Ray (Mantabot)”

Build a Kinect bot for 500 bones

[Eric] sent in his tutorial on building a Kinect-based robot for $500, a low-cost solution for a wife who thinks her husband spends too much on robots.

For the base of his build, [Eric] used an iRobot Create, a derivative of the Roomba that is built exclusively for some hardware hackery. For command and control of the robot, an EEE netbook takes data from the Kinect and sends it to the iRobot over a serial connection.
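The serial link in question speaks the Create's published Open Interface. Here is a hedged pyserial sketch of the drive command; the port name is an example, and you should double-check the byte values against the OI manual before trusting them on real hardware:

```python
import struct
import serial  # pyserial

ser = serial.Serial("/dev/ttyUSB0", 57600, timeout=1)  # Create's default baud

ser.write(bytes([128]))        # START: wake up the Open Interface
ser.write(bytes([132]))        # FULL: take complete control of the robot

def drive(velocity_mm_s, radius_mm):
    """OI opcode 137: signed 16-bit velocity (mm/s) and turn radius (mm)."""
    ser.write(bytes([137]) + struct.pack(">hh", velocity_mm_s, radius_mm))

drive(200, 0x7FFF)             # ~0.2 m/s; 0x7FFF is the "drive straight" radius
```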

The build itself is remarkably simple: two pieces of angle aluminum were attached to the iRobot, and a plastic milk crate was installed with zip ties. The Kinect sits on top of the plastic crate and the netbook comfortably fits inside.

A few weeks ago, [Eric] posted a summary of the history and open-source software for the Kinect that covers the development of the Libfreenect driver. [Eric] used this same driver for his robot. Currently, the robot is configured for two modes. The first mode has the robot travel to the furthest point from itself. The second mode instructs the robot to follow the closest thing to itself – walk in front of the robot and it becomes an ankle biter.
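Here is a rough Python sketch of how those two modes might look with the libfreenect bindings; freenect.sync_get_depth() is the real call, but the steering math is our own invention, not [Eric]'s code:

```python
import freenect
import numpy as np

def pick_heading(seek_far=True):
    depth, _ = freenect.sync_get_depth()      # 480x640 array of 11-bit values
    # 2047 means "no reading"; replace it so it can't win the vote
    filled = np.where(depth < 2047, depth, 0 if seek_far else 2047)
    scores = filled.mean(axis=0)              # average range per image column
    target_col = scores.argmax() if seek_far else scores.argmin()
    # Map column 0..639 to a steering value: -1 (hard left) .. +1 (hard right)
    return (target_col - 320) / 320.0

steer = pick_heading(seek_far=True)   # mode one: head for the farthest point
print(f"steer command: {steer:+.2f}")
```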

There is a limitation of the Kinect that [Eric] is trying to work around. Objects closer than 19 inches to the Kinect appear to be very far away. This caused a lot of wall bumping, but he plans on adding a few ultrasonic sensors to fill the gap in the sensor data. Not bad for a very inexpensive autonomous robot.
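A small sketch of how that sonar fallback might work, assuming a hypothetical sonar reading fed in as meters:

```python
# When everything up close reads as invalid or "far", trust the sonar.
def fused_forward_range(depth_row, sonar_m, blind_zone_m=0.5):
    """depth_row: Kinect ranges (m) across the center scanline; 0 = no return."""
    valid = [d for d in depth_row if d > 0]
    kinect_blind = not valid or min(valid) > blind_zone_m
    if kinect_blind and sonar_m < blind_zone_m:
        return sonar_m                        # Kinect can't see this close
    return min(valid) if valid else float("inf")

print(fused_forward_range([0.0, 0.0, 0.0], sonar_m=0.3))  # -> 0.3
```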

Autonomous Cookie Monster

[DJ Sures], who built the autonomous Wall-E, is back with another creation. His new autonomous Cookie Monster is certainly an interesting build. He had the Cookie Monster plush toy already, so the first step was to flay the blue beast and insert a skeleton. He used another robot for that. There are two servos for the wheels, plus one for each arm and one for the neck. There’s a distance sensor in the mouth. He built a custom board for the PIC18F4685 microcontroller, which is running the same 2D mapping code as his previous bot. Check out the video of it in action below.
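As a concept sketch of what 2D mapping on a PIC boils down to, here is a toy Python version where each distance reading stamps an obstacle into a coarse grid. This is purely illustrative, not [DJ Sures]'s firmware:

```python
# Each ping from the mouth-mounted distance sensor marks an obstacle
# cell in a coarse map. Grid size and units are arbitrary.
import math

GRID = 16                                    # 16x16 cell map
grid = [["." for _ in range(GRID)] for _ in range(GRID)]

def mark_obstacle(x, y, heading_rad, dist_cells):
    """Project the sensor reading from the robot pose into the grid."""
    ox = int(round(x + dist_cells * math.cos(heading_rad)))
    oy = int(round(y + dist_cells * math.sin(heading_rad)))
    if 0 <= ox < GRID and 0 <= oy < GRID:
        grid[oy][ox] = "#"                   # remember: something is here

mark_obstacle(8, 8, 0.0, 4)                  # obstacle 4 cells along +x
mark_obstacle(8, 8, math.pi / 2, 3)          # obstacle 3 cells along +y
print("\n".join(" ".join(row) for row in grid))
```

Continue reading “Autonomous Cookie Monster”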