Robotic Farming, Aussie Style

Australian roboticists from the Queensland University of Technology have developed a prototype agricultural robot that uses machine vision to distinguish weeds from crop plants, then either uproots or poisons the weeds and applies fertiliser to the crop.
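The university hasn't published the details of its vision pipeline, but to get a feel for where crop-versus-weed vision usually starts, here's a minimal sketch (assuming OpenCV and NumPy; the filename and threshold are ours) that segments vegetation from soil with the classic excess-green index, the step that would come before any weed/crop classification:

```python
# Illustrative only: the QUT team's actual classifier isn't described here.
# A common first step in crop/weed vision is segmenting vegetation from soil
# with the "excess green" index ExG = 2g - r - b on normalised RGB values.
import cv2
import numpy as np

def vegetation_mask(bgr_image, threshold=0.1):
    """Return a binary mask of likely plant pixels using the ExG index."""
    img = bgr_image.astype(np.float32)
    total = img.sum(axis=2) + 1e-6             # avoid division by zero
    b, g, r = img[..., 0] / total, img[..., 1] / total, img[..., 2] / total
    exg = 2.0 * g - r - b                       # excess-green index
    return (exg > threshold).astype(np.uint8) * 255

frame = cv2.imread("field_row.jpg")             # hypothetical test image
mask = vegetation_mask(frame)
# Individual plants could then be split out with connected components and
# handed to whatever classifier decides weed vs. crop.
num, labels = cv2.connectedComponents(mask)
print(f"{num - 1} plant blobs found")
```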

The machine is a wide platform designed to straddle a strip of the field upon which it is working, with electric wheel motors for propulsion. It is solar-powered, and it is envisaged that a farm could have several of them continuously at work.

At a superficial level there is nothing new in the robot, its propulsion, or even the plant husbandry and weeding equipment. The really clever technology lies in the identification and classification of the plants it will encounter. It is on the success or failure of this in real farm environments that the robot’s future will hinge. The university’s next step will be to take it on-farm, and the ABC report linked above has a wonderfully pithy quote from a farmer on the subject. You can see the machine in action in the video below the break.

Farming robots have a significant following among the hardware hacker community, but it is possible that the machine-vision and plant-identifying abilities of this one would be beyond most hackers. However, it is still an interesting project to watch, marking as it does a determined attempt to take the robot out of the lab and into real farm settings.

Continue reading “Robotic Farming, Aussie Style”

Soft Robot With Microfluidic Logic Circuit

Perhaps our future overlords won’t be made up of electrical circuits after all but will instead be soft-bodied like ourselves. However, their design will have its origins in electrical analogues, as with the Octobot.

The Octobot is the brainchild of a team of Harvard University researchers who recently published an article about it in Nature. Its body is modeled on the octopus and is composed of all soft body parts, made using a combination of 3D printing, molding, and soft lithography. Two sets of arms on either side of the Octobot move, taking turns under the control of a soft oscillator circuit. You can see it in action in the video below.
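The real oscillator is a pneumatic circuit, not code, but as a loose analogy for the behaviour, here's a toy relaxation-oscillator sketch (all constants invented) in which a "valve" alternately feeds two banks of actuators, much as the Octobot's two sets of arms take turns:

```python
# Toy analogy only: the real Octobot uses a microfluidic oscillator, not code.
# This sketch mimics a two-state relaxation oscillator that alternately
# pressurises two banks of soft actuators. All constants are invented.
FILL_RATE = 1.0         # arbitrary pressure units per step
VENT_RATE = 0.5
SWITCH_PRESSURE = 10.0  # the "valve" flips when the active side reaches this

def octobot_cycle(steps=60):
    pressure = [0.0, 0.0]   # bank A, bank B
    active = 0              # which bank the oscillator is currently feeding
    for t in range(steps):
        pressure[active] += FILL_RATE
        pressure[1 - active] = max(0.0, pressure[1 - active] - VENT_RATE)
        if pressure[active] >= SWITCH_PRESSURE:
            active = 1 - active          # flip to the other side
        yield t, list(pressure), active

for t, p, side in octobot_cycle():
    print(f"t={t:02d}  bankA={p[0]:5.1f}  bankB={p[1]:5.1f}  feeding={'A' if side == 0 else 'B'}")
```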

Continue reading “Soft Robot With Microfluidic Logic Circuit”

Tissue-Engineered Soft Robot Swims Like a Stingray

We’re about to enter a new age in robotics. Forget the servos, the microcontrollers, the H-bridges and the steppers. Start thinking in terms of optogenetically engineered myocytes, microfabricated gold endoskeletons, and hydrodynamically optimized elastomeric skins, because all of these have now come together in a tissue-engineered swimming robotic stingray that pushes the boundary between machine and life.

In a paper in Science, [Kevin Kit Parker] and his team at the fantastically named Wyss Institute for Biologically Inspired Engineering describe the achievement. It turns out that the batoid fishes like skates and rays have a pretty good handle on how to propel themselves in water with minimal musculoskeletal and neurological requirements, and so they’re great model organisms for a tissue engineered robot.

The body is a laminate of silicone rubber and a collection of 200,000 rat heart muscle cells. The cardiomyocytes provide the contractile force, and the pattern in which they are applied to the 1/2″ (1.27 cm) body allows for the familiar undulating motion of a stingray’s wings. A gold endoskeleton with enough stiffness to act as a spring is used to counter the contraction of the muscle fibers and reset the system for another wave. Very clever stuff, but perhaps the coolest bit is that the muscle cells are genetically engineered to be photosensitive, making the robofish controllable with pulses of light. Check out the video below to see the robot swimming through an obstacle course.

This is obviously far from a finished product, but the possibilities are limitless with this level of engineering, especially with a system that draws energy from its environment like this one does. Just think about what could be accomplished if a microcontroller could be included in that gold skeleton.

Continue reading “Tissue-Engineered Soft Robot Swims Like a Stingray”

Autonomous Truck Teaches Itself To Powerslide

When you’re a teenager new to the sensations of driving, it seems counterintuitive to “turn into the skid”, but once you’ve got a few winters of driving under your belt, you’re drifting like a pro. We learn by experience, and as it turns out, so does this fully autonomous power-sliding rally truck.

Figuring out how to handle friction-optional roadways is entirely the point of the AutoRally project at Georgia Tech, which puts a seriously teched-up 1/5 scale rally truck through its paces on an outdoor dirt track. Equipped with high-precision IMU, high-resolution GPS, dual front-facing cameras, and Hall-effect sensors on each wheel sampled at 70 Hz, the on-board Quad-core i7 knows exactly where the vehicle is and what the relationship between it and the track is at all times. There’s no external sensing or computing – everything needed to run the track is in the 21 kg truck. The video below shows how the truck navigates the oval track on its own with one simple goal – keep the target speed as close to 8 meters per second as possible. The truck handles the red Georgia clay like a boss, dealing not only with differing surface conditions but also with bright-to-dark lighting transitions. So far the truck only appears to handle an oval track, but our bet is that a more complex track is the next step for the platform.
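The article doesn't spell out the control algorithm, so purely as an illustration of the "hold 8 m/s" goal, here's a much-simplified PI throttle controller sketch that averages the four wheel-speed readings at the 70 Hz sample rate. The gains, names, and crude averaging are ours, not AutoRally's:

```python
# Much-simplified illustration of the "hold a target speed" goal; the real
# AutoRally controller is far more sophisticated. Gains and names are invented.
TARGET_SPEED = 8.0      # metres per second, from the article
DT = 1.0 / 70.0         # wheel-speed sensors are sampled at 70 Hz
KP, KI = 0.08, 0.02     # hypothetical PI gains

class SpeedController:
    def __init__(self):
        self.integral = 0.0

    def update(self, wheel_speeds):
        """wheel_speeds: four Hall-effect speed estimates in m/s."""
        speed = sum(wheel_speeds) / len(wheel_speeds)   # crude fusion: average
        error = TARGET_SPEED - speed
        self.integral += error * DT
        throttle = KP * error + KI * self.integral
        return max(-1.0, min(1.0, throttle))            # clamp to actuator range

controller = SpeedController()
print(controller.update([7.2, 7.4, 7.1, 7.3]))          # below target -> positive throttle
```

The real controller also has to cope with wheels that are sliding rather than gripping, which is presumably where the IMU and GPS data earn their keep.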

While we really liked the ride-on scale of this autonomous chase vehicle, there haven’t been too many other non-corporate self-driving vehicle hacks around here lately. Let’s hope that AutoRally is an indication that the hackers haven’t ceded the field to Google entirely. Why let them have all the fun?

Continue reading “Autonomous Truck Teaches Itself To Powerslide”

Terra Spider Repairs and Resurfaces New Frontiers

Is your landscape congested with toxic waste, parched, or otherwise abandoned? The Terra Spider may be your answer to new life in otherwise barren wastelands.

Bred in the Digital Craft Lab at the California College of the Arts, the current prototype demonstrates the principle of deploying multiple eight-legged drones that can drill into and release a liquid payload intended to “repair or maintain” the landing site.

To deliver their project, students [Manali Chitre], [Anh Vu], and [Mallory Van Ness] designed and assembled a laser-cut octopod chassis, an actuated drilling mechanism, and a liquid deployment system, all from readily available stock components and raw materials. While project details are sparse, the comprehensive bill-of-materials gives us a window into the process of putting together the pieces of a Terra Spider. Leg movement is handled by servos, a SparkFun gear-reduced motor drives the drill, and a peristaltic pump handles the payload deployment.
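The bill-of-materials doesn't say how the payload is metered, but a common trick with a peristaltic pump is simply to run it for volume divided by flow rate. Here's a hypothetical sketch along those lines; the GPIO pin, flow rate, and the assumption of a Raspberry-Pi-style driver are ours, not the students':

```python
# Hypothetical sketch: metering a liquid dose with a peristaltic pump by
# running it for volume / flow_rate seconds. The pin number, flow rate, and
# RPi.GPIO wiring are assumptions, not details from the Terra Spider BOM.
import time
import RPi.GPIO as GPIO

PUMP_PIN = 18            # hypothetical GPIO pin driving the pump MOSFET
FLOW_RATE_ML_S = 1.4     # hypothetical calibrated flow rate, ml per second

def dispense(volume_ml):
    """Run the pump just long enough to deliver volume_ml of payload."""
    run_time = volume_ml / FLOW_RATE_ML_S
    GPIO.output(PUMP_PIN, GPIO.HIGH)
    time.sleep(run_time)
    GPIO.output(PUMP_PIN, GPIO.LOW)

GPIO.setmode(GPIO.BCM)
GPIO.setup(PUMP_PIN, GPIO.OUT, initial=GPIO.LOW)
dispense(25.0)           # e.g. 25 ml of soil treatment at the drill site
GPIO.cleanup()
```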

It’s not every day that flying robots deploy drill-wielding spider drones. Keep in mind, though, that the Terra Spider is a performance piece, a hardware-based demonstration of a bigger idea, in this case remote coverage and sample deployments in a barren wasteland. While this project is still a work-in-progress, the bill-of-materials and successful deployment demos both testify to its extensive development.

With the earnest intent of repairing withering environments, perhaps this project has a future as an entry into this year’s Earth-saving Hackaday Prize….

Coming soon to a galaxy near you!

Continue reading “Terra Spider Repairs and Resurfaces New Frontiers”

Santa’s Autonomous Helping Hands Let the Jolly ol’ Fellow Kick Back this Season

For those skeptical about the feasibility of Santa’s annual delivery schedule, here’s an autonomous piece of the puzzle that will bewilder even the most hard-hearted of non-believers.

The folks over at the Center of Excellence Cognitive Interaction Technology (CITEC) in Germany have whipped together a fantastic demo featuring Santa’s extra pair of helping hands. In the two-and-a-half minute video, the robot executes a suite of impressive autonomous stocking-stuffing maneuvers: from recognizing the open hole in the stocking, to grasping specific candies from the cluster of goodies available.

On the hardware side, the arms appear to be a KUKA variant, while on the software side the visualizations are handled by RViz, the visualization tool of the open source robot framework ROS.
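For the curious, getting a marker to show up in RViz takes surprisingly little code. Here's a minimal rospy sketch along those lines; the topic, frame name, and sphere stand-in for a detected candy are our assumptions, and the CITEC stack is of course far more involved:

```python
# Minimal sketch of publishing a marker that RViz can display.
import rospy
from visualization_msgs.msg import Marker

rospy.init_node("stocking_marker_demo")
pub = rospy.Publisher("visualization_marker", Marker, queue_size=1)

marker = Marker()
marker.header.frame_id = "world"           # hypothetical fixed frame
marker.type = Marker.SPHERE                # stand-in for a detected candy
marker.action = Marker.ADD
marker.pose.position.x = 0.4
marker.pose.position.y = 0.0
marker.pose.position.z = 0.1
marker.pose.orientation.w = 1.0
marker.scale.x = marker.scale.y = marker.scale.z = 0.03
marker.color.r, marker.color.a = 1.0, 1.0  # opaque red

rate = rospy.Rate(1)                       # republish once a second
while not rospy.is_shutdown():
    marker.header.stamp = rospy.Time.now()
    pub.publish(marker)
    rate.sleep()
```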

If some of the props in the video look familiar, you’ll find that the researchers at CITEC have already explored some stellar related work in perception, classification, and grasping. Who knew this pair of hands would be jolly enough to clock some overtime this holiday season? The entire video is set to a crisp computer-voiced jingle that serves as a sneaky summary of their approach to this project.

Now, if only we could set these hands off to do our other dirty work….

Continue reading “Santa’s Autonomous Helping Hands Let the Jolly ol’ Fellow Kick Back this Season”

Robot Vision: Detecting Obstacles with FPGAs and Line Lasers

Somewhere down the road, you’ll find that your almighty autonomous robot chassis is going to need some sensor feedback. Otherwise, that next small step down the road may end with a blind leap off the coffee table. The first low-cost sensors we might throw at this problem would be sonars or IR rangefinders, but there’s a problem: those sensors only really provide distance data back from the pinpoint view directly ahead of them.

Rest assured, [Jonathan] wrote in to let us know that he’s got you covered. Combining a line laser, camera, and an FPGA, he’s able to detect obstacles that fall within the field of view of the camera and laser.

If you thought writing algorithms in software was tricky, wait till you try hardware! (We know: division sucks!) [Jonathan] knows no fear though; he’s performing gradient computation on the FPGA directly to detect the laser in the camera image at a wicked 30 frames per second. Why roll up your sleeves and take the hardware route, you might ask? [Jonathan] estimates that a CPU-based approach at this tiny embedded-robot scale would manage a mere 10 frames per second. With an FPGA, we’re able to process images about as fast as they’re received.
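To get a feel for what the gates are doing, here's a software stand-in (NumPy, with invented camera/laser geometry) for the same idea: find the strongest vertical gradient in each image column to locate the laser line, then triangulate a rough range from its offset:

```python
# Software illustration of the FPGA's job: locate the laser line per column by
# the strongest vertical brightness gradient, then turn its row offset into a
# rough range by triangulation. Geometry constants below are invented.
import numpy as np

FOCAL_PX = 600.0        # hypothetical focal length in pixels
BASELINE_M = 0.05       # hypothetical laser-to-camera offset in metres

def laser_rows(gray):
    """Return, for each column, the row with the strongest vertical gradient."""
    grad = np.abs(np.diff(gray.astype(np.float32), axis=0))
    return grad.argmax(axis=0)                 # one row index per column

def rows_to_range(rows, principal_row):
    """Simple triangulation: offset from the optical centre maps to distance."""
    disparity = np.abs(rows - principal_row).astype(np.float32)
    disparity[disparity < 1] = 1               # avoid divide-by-zero
    return FOCAL_PX * BASELINE_M / disparity

frame = (np.random.rand(120, 160) * 255).astype(np.uint8)  # stand-in camera frame
ranges = rows_to_range(laser_rows(frame), principal_row=60)
print(ranges.shape, ranges.min(), ranges.max())
```

On the FPGA the same per-column search can run as the pixels stream in, which is presumably how it keeps pace with the camera at full frame rate.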

[Jonathan] is using the Logi Board, a Kickstarter success we’ve visited in the past, and all of his code is up on the Githubs. If you crack it open, you’ll also find that many of his modules are Wishbone compliant, so developing your own projects from just some of these parts is much easier than trying to rip useful features out of a sea of hairy logic.

With computer-vision hardware keeping such a low profile in the hobbyist community, we’re excited to hear more about [Jonathan’s] FPGA-based robotics endeavors.

Continue reading “Robot Vision: Detecting Obstacles with FPGAs and Line Lasers”