Tissue-Engineered Soft Robot Swims Like A Stingray

We’re about to enter a new age in robotics. Forget the servos, the microcontrollers, the H-bridges and the steppers. Start thinking in terms of optogenetically engineered myocytes, microfabricated gold endoskeletons, and hydrodynamically optimized elastomeric skins, because all of these have now come together in a tissue-engineered swimming robotic stingray that pushes the boundary between machine and life.

In a paper in Science, [Kevin Kit Parker] and his team at the fantastically named Wyss Institute for Biologically Inspired Engineering describe the achievement. It turns out that batoid fishes like skates and rays have a pretty good handle on how to propel themselves through water with minimal musculoskeletal and neurological requirements, which makes them great model organisms for a tissue-engineered robot.

The body is a laminate of silicone rubber seeded with some 200,000 rat heart muscle cells. The cardiomyocytes provide the contractile force, and the pattern in which they are applied to the 1/2″ (1.27 cm) body produces the familiar undulating motion of a stingray’s wings. A gold endoskeleton, stiff enough to act as a spring, counters the contraction of the muscle cells and resets the system for another wave. Very clever stuff, but perhaps the coolest bit is that the muscle cells are genetically engineered to be photosensitive, making the robofish controllable with pulses of light. Check out the video below to see the robot swimming through an obstacle course.
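The paper has the full details, but as a rough intuition-builder, here’s a toy kinematic sketch – our own illustration, not the team’s model – assuming each fin’s undulation frequency tracks the light pulse frequency it receives, so a left/right mismatch steers the swimmer:

```python
import numpy as np

def fin_wave(x, t, freq, amplitude=1.0, wavelength=1.0):
    """Traveling wave along a fin – the undulation that produces thrust."""
    return amplitude * np.sin(2 * np.pi * (freq * t - x / wavelength))

def turn_rate(freq_left, freq_right, gain=0.5):
    """Toy steering rule: the faster-pulsed fin makes more thrust,
    yawing the body toward the slower side."""
    return gain * (freq_left - freq_right)

t = np.linspace(0, 2, 500)              # two seconds of swimming
x = np.linspace(0, 1, 50)[:, None]      # normalized position along each fin
left = fin_wave(x, t, freq=2.5)         # left fin pulsed at 2.5 Hz
right = fin_wave(x, t, freq=1.5)        # right fin pulsed at 1.5 Hz

print(f"left fin cycles: {2.5 * 2:.0f}, right fin cycles: {1.5 * 2:.0f}")
print(f"resulting yaw rate: {turn_rate(2.5, 1.5):+.2f} (arbitrary units)")
```

The frequencies and gain here are made up; the point is just that two independently pulsed light channels are enough for both propulsion and steering.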

This is obviously far from a finished product, but the possibilities are limitless with this level of engineering, especially with a system that draws energy from its environment like this one does. Just think about what could be accomplished if a microcontroller could be included in that gold skeleton.


Autonomous Truck Teaches Itself To Powerslide

When you’re a teenager new to the sensations of driving, it seems counterintuitive to “turn into the skid”, but once you’ve got a few winters of driving under your belt, you’re drifting like a pro. We learn by experience, and as it turns out, so does this fully autonomous power-sliding rally truck.

Figuring out how to handle friction-optional roadways is entirely the point of the AutoRally project at Georgia Tech, which puts a seriously teched-up 1/5-scale rally truck through its paces on an outdoor dirt track. Equipped with a high-precision IMU, high-resolution GPS, dual front-facing cameras, and Hall-effect sensors on each wheel sampled at 70 Hz, the onboard quad-core i7 knows exactly where the vehicle is and how it sits relative to the track at all times. There’s no external sensing or computing – everything needed to run the track is in the 21 kg truck. The video below shows how the truck navigates the oval track on its own with one simple goal – keep its speed as close to the 8 meters-per-second target as possible. The truck handles the red Georgia clay like a boss, dealing not only with differing surface conditions but also with bright-to-dark lighting transitions. So far the truck only appears to handle an oval track, but our bet is that a more complex course is the next step for the platform.
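We don’t have the team’s controller code, but the stated goal – hold 8 m/s around a slippery track – boils down to a cost function an optimizer can minimize. Here’s a minimal random-shooting sketch in that spirit (our own toy, with a made-up longitudinal model, not the AutoRally software):

```python
import numpy as np

TARGET_SPEED = 8.0                   # m/s – the goal from the video
DT, HORIZON, SAMPLES = 0.05, 20, 256

def rollout_speeds(v0, throttle_seq, gain=4.0, drag=0.1):
    """Toy longitudinal model: speed responds to throttle minus drag."""
    v, speeds = v0, []
    for u in throttle_seq:
        v = v + (gain * u - drag * v) * DT
        speeds.append(v)
    return np.array(speeds)

def pick_throttle(v0):
    """Random-shooting MPC: sample throttle sequences over a short horizon,
    score each by squared deviation from the target speed, keep the best."""
    candidates = np.random.uniform(-1.0, 1.0, size=(SAMPLES, HORIZON))
    costs = [np.sum((rollout_speeds(v0, seq) - TARGET_SPEED) ** 2)
             for seq in candidates]
    return candidates[int(np.argmin(costs))][0]  # apply only the first action

v = 0.0
for _ in range(100):                 # five simulated seconds
    v += (4.0 * pick_throttle(v) - 0.1 * v) * DT
print(f"settled speed: {v:.2f} m/s (target {TARGET_SPEED} m/s)")
```

The real system has to score candidate trajectories against tire slip and track position too, not just speed, but the replan-every-tick structure is the same.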

While we really like the ride-on scale of this autonomous chase vehicle, there haven’t been too many non-corporate self-driving vehicle hacks around here lately. Let’s hope that AutoRally is a sign the hackers haven’t ceded the field to Google entirely. Why let them have all the fun?


Terra Spider Repairs And Resurfaces New Frontiers

Is your landscape congested with toxic waste, parched, or otherwise abandoned? The Terra Spider may be your answer to new life in otherwise barren wastelands.

Bred in the Digital Craft Lab at the California College of the Arts, the project in its current state demonstrates the principle of deploying multiple eight-legged drones that can drill into the landing site and release a liquid payload intended to “repair or maintain” it.

To realize the project, students [Manali Chitre], [Anh Vu], and [Mallory Van Ness] designed and assembled a laser-cut octopod chassis, an actuated drilling mechanism, and a liquid deployment system, all from easily available stock components and raw materials. While project details are sparse, the comprehensive bill of materials gives us a window into how the pieces of a Terra Spider come together. Servos actuate the legs, a SparkFun gear-reduced motor drives the drill, and a peristaltic pump handles the payload deployment.
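Details on the deployment system are sparse, but peristaltic pumps make dosing pleasantly predictable: delivered volume is roughly proportional to rotor revolutions. Here’s a minimal sketch of how the payload drop might be metered – the calibration numbers and GPIO hooks are made up for illustration, not taken from the project:

```python
import time

ML_PER_REV = 0.08   # hypothetical: milliliters moved per rotor revolution
MOTOR_RPM = 60      # hypothetical geared-down motor speed under load

def dispense(volume_ml, pump_on, pump_off):
    """Run the pump just long enough to move the requested volume."""
    revolutions = volume_ml / ML_PER_REV
    runtime_s = revolutions / MOTOR_RPM * 60.0
    pump_on()
    time.sleep(runtime_s)
    pump_off()
    return runtime_s

# Stand-ins for whatever GPIO calls switch the motor on real hardware.
seconds = dispense(0.5, lambda: print("pump on"), lambda: print("pump off"))
print(f"0.5 mL took {seconds:.1f} s of pumping")
```

Calibrate the mL-per-revolution figure for your tubing and the rest falls out of arithmetic.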

It’s not every day that flying robots deploy drill-wielding spider drones. Keep in mind, though, that the Terra Spider is a performance piece, a hardware-based demonstration of a bigger idea – in this case, remote coverage and sample deployment in a barren wasteland. While this project is still a work in progress, the bill of materials and the successful deployment demos both testify to its extensive development.

With the earnest intent of repairing withering environments, perhaps this project has a future as an entry in this year’s Earth-saving Hackaday Prize…

Coming soon to a galaxy near you!


Santa’s Autonomous Helping Hands Let The Jolly Ol’ Fellow Kick Back This Season

For those skeptical about the feasibility of Santa’s annual delivery schedule, here’s an autonomous piece of the puzzle that will bewilder even the most hard-hearted of non-believers.

The folks over at the Center of Excellence Cognitive Interaction Technology (CITEC) in Germany have whipped together a fantastic demo featuring Santa’s extra pair of helping hands. In the two-and-a-half-minute video, the robot executes a suite of impressive autonomous stocking-stuffing maneuvers: from recognizing the opening of the stocking to grasping specific candies from the pile of goodies available.

On the hardware side, the arms appear to be KUKA variants, while on the software side, the visualizations are handled by RViz, a tool from the open source Robot Operating System (ROS).
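CITEC hasn’t published this demo’s code as far as we know, but feeding RViz is straightforward. Here’s a minimal, generic rospy example – not their code – that publishes a sphere marker at a hypothetical grasp target for RViz to render:

```python
#!/usr/bin/env python
# Publish a sphere marker that RViz will render at a hypothetical grasp
# target. Generic ROS boilerplate, not CITEC's software.
import rospy
from visualization_msgs.msg import Marker

rospy.init_node('grasp_target_marker')
pub = rospy.Publisher('visualization_marker', Marker, queue_size=1)

marker = Marker()
marker.header.frame_id = 'base_link'   # assumes a base_link frame exists
marker.type = Marker.SPHERE
marker.action = Marker.ADD
marker.pose.position.x = 0.4           # hypothetical candy position (meters)
marker.pose.position.y = 0.1
marker.pose.position.z = 0.2
marker.pose.orientation.w = 1.0
marker.scale.x = marker.scale.y = marker.scale.z = 0.03  # 3 cm sphere
marker.color.r, marker.color.g, marker.color.a = 1.0, 0.2, 1.0

rate = rospy.Rate(10)                  # republish at 10 Hz so RViz keeps it
while not rospy.is_shutdown():
    marker.header.stamp = rospy.Time.now()
    pub.publish(marker)
    rate.sleep()
```

Point RViz at the `visualization_marker` topic and the target shows up in the 3D view, which is essentially what the video’s overlays are doing.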

If some of the props in the video look familiar, it’s because the researchers at CITEC have already explored perception, classification, and grasping in related research. Who knew this pair of hands would be so jolly about clocking some overtime this holiday season? The entire video is set to a crisp computer-voiced jingle that serves as a sneaky summary of their approach to the project.

Now, if only we could set these hands off to do our other dirty work…


Robot Vision: Detecting Obstacles With FPGAs And Line Lasers

Somewhere down the road, you’ll find that your almighty autonomous robot chassis is going to need some sensor feedback. Otherwise, that next small step down the road may end with a blind leap off the coffee table. The first low-cost sensors we might throw at the problem would be sonar or IR rangefinders, but there’s a catch: those sensors only provide distance data from the pinpoint spot directly ahead of them.

Rest assured, [Jonathan] wrote in to let us know that he’s got you covered. Combining a line laser, camera, and an FPGA, he’s able to detect obstacles that fall within the field of view of the camera and laser.

If you thought writing algorithms in software was tricky, wait till you try hardware! (We know: division sucks!) [Jonathan] knows no fear, though; he’s performing the gradient computation directly on the FPGA to detect the laser in the camera image at a wicked 30 frames per second. Why roll up your sleeves and take the hardware route, you might ask? [Jonathan] estimates that a CPU-based approach at this tiny embedded-robot scale would manage a mere 10 frames per second. With an FPGA, images are processed about as fast as they’re received.
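The gateware is the interesting part, but the underlying algorithm is easy to prototype on a PC first. Here’s a rough NumPy equivalent of the idea – our sketch with made-up rig geometry, not [Jonathan]’s implementation: find the row with the strongest vertical gradient in each column (the laser line), then triangulate range from the camera-to-laser offset:

```python
import numpy as np

# Hypothetical rig geometry: camera looking forward, line laser mounted below.
BASELINE_M = 0.05        # vertical camera-to-laser offset in meters
FOCAL_PX = 500.0         # focal length expressed in pixels
CENTER_ROW = 120         # optical-center row of a 240-row image

def find_laser_rows(frame):
    """For each column, pick the row where brightness changes fastest --
    the same gradient test the FPGA applies as pixels stream in."""
    grad = np.abs(np.diff(frame.astype(np.float32), axis=0))
    return np.argmax(grad, axis=0)

def rows_to_range(rows):
    """Triangulate by similar triangles: the farther the laser line sits
    from the image center, the closer the obstacle."""
    offset_px = rows - CENTER_ROW
    offset_px[offset_px == 0] = 1        # avoid division by zero
    return FOCAL_PX * BASELINE_M / offset_px

# Fake 240x320 frame with a bright 'laser line' across row 180.
frame = np.random.randint(0, 40, (240, 320), dtype=np.uint8)
frame[180, :] = 255
ranges = rows_to_range(find_laser_rows(frame))
print(f"estimated range: {np.median(ranges):.2f} m")
```

The FPGA wins by doing the per-column gradient test in a streaming pipeline as pixels arrive, instead of buffering whole frames the way this prototype does.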

[Jonathan] is using the Logi Board, a Kickstarter success we’ve visited in the past, and all of his code is up on the Githubs. If you crack it open, you’ll also find that many of his modules are Wishbone compliant, so you can pull just the parts you need into your own projects instead of ripping useful features out of a sea of hairy logic.

With computer-vision hardware keeping such a low profile in the hobbyist community, we’re excited to hear more about [Jonathan’s] FPGA-based robotics endeavors.


Robot Battle For The Big Leagues: Valkyrie And The DARPA Challenge


Even though NASA Johnson Space Center’s impressive build for the upcoming DARPA Robotics Challenge is just one of many entries, it has to be one of the coolest. The gang at IEEE Spectrum got a sneak peek of the robot, dubbed “Valkyrie”, which at 1.9 m and 125 kg boasts 44 degrees of freedom while managing to look like a finished product ready to roll off the shelf. We can expect to see other custom robots at the challenge, but a number of teams will compete with a Boston Dynamics Atlas robot, which we’ve covered a couple of times this year.

A few readers are probably polishing their pitchforks in anticipation of shouting “Not a hack!”, but before you do, take a look at the tasks for the robots in this challenge and consider how new this territory is. To that end, the NASA JSC crew seem to have prepared for resolving catastrophes, even if it means throwing together a solution. They’ve designed the limbs for quick removal and even reversibility: the arms are identical, and only slight adjustments are required to turn a left arm into a right one. Unlike the Atlas, which requires a tether, Valkyrie is battery-operated, and it can run for around an hour before someone needs to crack open the torso and swap in a new pack, Iron Man style.

The team was also determined to make Valkyrie seem more human, so they added a soft fabric layer to serve as a kind of clothing. According to IEEE Spectrum, it’s even getting custom-made footwear from DC Shoes. There are some utilitarian compromises, though: Valkyrie has adopted a shortcut taken by time-constrained animators in many a cartoon, choosing three fingers per hand instead of four. Make sure you watch the video after the break for a closer look.


Robotic Manta Ray (Mantabot)

The robotic manta ray codenamed MantaBot, created by the Bio-Inspired Engineering Research Laboratory (BIER Lab), is set to make a splash. The next evolution in underwater robotics is here: we’ve seen the likes of robotic fish and jellyfish, and now joining the school is the MantaBot, designed to mimic the unique swimming motion of the manta ray.

This biologically inspired underwater robot has been designed with a primary goal of autonomy, using its onboard electronics to make its own decisions as it navigates its watery domain. BIER Lab has received major funding from the Department of Defense (DoD) Multidisciplinary University Research Initiative (MURI) program. Part of the long-term goal is to reverse-engineer the biological systems of such creatures, to the point of creating artificial skin and muscle.

[Via dvice.com]
