Transforming Drone Can Be A Square Or A Dragon

When flying drones in and around structures, the size of the drone is generally limited by the openings you want to fit through. Researchers at the University of Tokyo got around this problem by using an articulating structure for the drone frame, allowing the drone to transform from a large square to a narrow, elongated form to fit through smaller gaps.

The drone is called DRAGON, which is somehow an acronym for the tongue-twisting description “Dual-Rotor Embedded Multilink Robot with the Ability of Multi-Degree-of-Freedom Aerial Transformation”. The drone consists of four segments, with a 2-DOF actuated joint between each segment. A pair of ducted fan motors is attached to the middle of each segment on a 2-DOF gimbal that allows it to direct thrust in any direction relative to the segment. For normal flight the segments are arranged in a square shape, with minimal movement between the segments. When a small gap is encountered, as demonstrated in the video after the break, the segments rearrange into a dragon-like shape that can pass through a gap in any plane.
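As a rough mental model (not the team’s published control code), the airframe can be treated as a serial chain: four segments, a 2-DOF pitch/yaw joint between each pair of neighbors, and a 2-DOF thrust gimbal on every segment. A minimal Python sketch of that bookkeeping, with invented names and angles:

```python
import math
from dataclasses import dataclass, field
from typing import List

@dataclass
class Segment:
    """One link of the multilink frame, carrying its own 2-DOF thrust gimbal."""
    gimbal_roll: float = 0.0    # thrust-vectoring angles for the ducted-fan pair (rad)
    gimbal_pitch: float = 0.0

@dataclass
class MultilinkFrame:
    """Serial chain of N segments joined by N-1 two-DOF (pitch/yaw) joints."""
    segments: List[Segment]
    joint_pitch: List[float] = field(default_factory=list)  # one entry per joint
    joint_yaw: List[float] = field(default_factory=list)

    @classmethod
    def build(cls, n_segments: int = 4) -> "MultilinkFrame":
        n_joints = n_segments - 1
        return cls([Segment() for _ in range(n_segments)],
                   [0.0] * n_joints, [0.0] * n_joints)

    def set_square(self) -> None:
        """Fold into the open-square cruise shape: every joint yawed 90 degrees."""
        self.joint_yaw = [math.pi / 2] * len(self.joint_yaw)
        self.joint_pitch = [0.0] * len(self.joint_pitch)

    def set_straight(self) -> None:
        """Stretch out into the elongated shape used to slip through narrow gaps."""
        self.joint_yaw = [0.0] * len(self.joint_yaw)
        self.joint_pitch = [0.0] * len(self.joint_pitch)

if __name__ == "__main__":
    frame = MultilinkFrame.build()
    frame.set_square()
    dof = 2 * len(frame.segments) + 2 * len(frame.joint_yaw)  # gimbals plus joints
    print(f"{len(frame.segments)} segments, {dof} actuated degrees of freedom")
```

Counting it up that way gives fourteen actuated degrees of freedom before the rotors even spin, which hints at why the control software gets complicated.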

Each segment has its own power source and controller, and the control software required to make everything work together is rather complex. The full research paper is unfortunately behind a paywall. The small diameter of the propellers and all the added components would be a severe limiting factor in terms of lifting capacity and flight time, but the concept is definitely interesting.

The idea of shape shifting robots has been around for a while, and can become even more interesting when the different segments can detach and reattach themselves to become modular robots. The 2016 Hackaday Grand Prize winner Dtto is a perfect example of this, although it did lack the ability to fly. Continue reading “Transforming Drone Can Be A Square Or A Dragon”

Dashing Diademata Delivers Second Generation ROS

A simple robot that performs line-following or obstacle avoidance can fit all of its logic inside a single Arduino sketch. But as a robot’s autonomy increases, its corresponding software gets complicated very quickly. It won’t be long before diagnostic monitoring and logging come in handy, along with the desire to encapsulate feature areas and orchestrate how they work together. This is where tools like the Robot Operating System (ROS) come in, so we don’t have to keep reinventing these same wheels. And Open Robotics just released ROS 2 Dashing Diademata for all of us to use.

ROS is an open source project that’s been underway since 2007 and updated regularly, with each release named after a turtle species. What makes this one worthy of extra attention? Dashing marks the first long term support (LTS) release of ROS 2, a refreshed second generation of ROS. All high level concepts stayed the same, meaning almost everything in our ROS orientation guide is still applicable in ROS 2. But there were big changes under the hood reflecting technical advances over the past decade.
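For a sense of how little boilerplate a node needs, here is a minimal ROS 2 publisher written against the rclpy client library, along the lines of the standard tutorials; the node name, topic name, and timer period are arbitrary:

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class HeartbeatPublisher(Node):
    """Tiny ROS 2 node that publishes a status string once per second."""

    def __init__(self):
        super().__init__('heartbeat_publisher')              # node name
        self.pub = self.create_publisher(String, 'heartbeat', 10)
        self.count = 0
        self.create_timer(1.0, self.tick)                    # call tick() every second

    def tick(self):
        msg = String()
        msg.data = f'alive {self.count}'
        self.pub.publish(msg)
        self.count += 1

def main():
    rclpy.init()
    node = HeartbeatPublisher()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()
```

Run it inside a sourced ROS 2 workspace and any other node on the network can listen in, for example with ros2 topic echo /heartbeat.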

ROS was built in an age where a Unix workstation cost thousands of dollars, XML was going to be how we communicate all data online, and an autonomous robot cost more than a high-end luxury car. Now we have a $35 Raspberry Pi running Linux, XML has fallen out of favor due to processing overhead, and some autonomous robots are high-end luxury cars. For these and many other reasons, the people of Open Robotics decided it was time to make a clean break from legacy code.

The break has its detractors, as it meant leaving behind the vast library of freely available robot intelligence modules released by researchers over the years. Popular ones were (or will be) ported to ROS 2, and there is a translation bridge sufficient to work with some, but the rest will be left behind. However, this update also resolved many of the deal-breakers preventing adoption outside of research, making ROS more attractive for commercial investment, which should bring more robots mainstream.

Judging by responses to the release announcement, there are plenty of people eager to put ROS 2 to work, but it is not the only freshly baked robotics framework around. We just saw Nvidia release their Isaac Robot Engine tailored to make the most of their Jetson hardware.

SMORES Robot Finds Its Own Way To The Campfire

Robots that can dynamically reconfigure themselves to adapt to their environments offer a promising advantage over their less dynamic cousins. Researchers have been working through all the challenges of realizing that potential: hardware, software, and all the interactions in between. On the software end of the spectrum, a team at University of Pennsylvania’s ModLab has been working on a robot that can autonomously choose a configuration to best fit its task at hand.

We’ve recently done an overview of modular robots, and we noted that coordination and control are persistent challenges in this area. The robot in this particular demonstration is a hybrid: a fixed core module serving as central command, plus six of the lab’s dynamic SMORES-EP modules. The core module has an RGB+depth camera for awareness of its environment. A separate downward-looking camera watches the SMORES modules, giving the robot awareness of its own configuration.

Combining that data using a mix of open robot research software and new machine-specific code, this team’s creation autonomously navigates an unfamiliar test environment. While it can adapt to specific terrain challenges like a wooden staircase, there are still limitations on the situations it can handle. Kudos to the researchers for honestly showing and explaining how the robot can get stuck on a ground seam, instead of editing that gaffe out to cover it up.

While this robot isn’t the completely decentralized modular robot system some are aiming for, it would be a mistake to dismiss it based on that criticism alone. At the very least, it is an instructive step on the journey, offering a tradeoff that’s useful on its own merits. And perhaps this hybrid approach will find application with a modular robot close to our hearts: Dtto, the winner of our 2016 Hackaday Prize.

[via Science News]

Continue reading “SMORES Robot Finds Its Own Way To The Campfire”

Modular robot legs from Disney

Disney’s New Robot Limbs Trained Using Neural Networks

Disney is working on modular, intelligent robot limbs that snap into place with magnets. The intelligence comes from a reasonably sized neural network that also incorporates some modularity. The robot is their Snapbot, whose base unit can fit up to eight limbs, and so far they’ve trained with up to three together.

The modularity further extends to a choice of three limb types: one with roll and pitch, another with yaw and pitch, and a third with roll, yaw, and pitch. Interestingly, of the three types, the yaw-pitch one seems most effective.

Learning environment for Disney’s modular robot legs

In this age of massive, deep neural networks requiring GPUs or even online services for training in a reasonable amount of time, it’s refreshing to see that this one’s only two layers deep and can be trained in three hours on a single core of a 3.4 GHz Intel i7 processor. Three hours may still seem long, but remember, this isn’t a simulation in a silicon virtual world. This is real life, where the servo motors have to actually move. Of course, they didn’t want to sit around and reset the robot after each attempt to move across the table, so they built in an automatic mechanism to pull it back to the starting position before trying again. To further speed training, they found that once they’d trained for one limb, they could copy the last of the network’s layers to get a head start on the training for two limbs.
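To make that layer-copying trick concrete, here is a rough NumPy sketch (not Disney’s code, and every size below is invented) of a two-layer policy whose trained output layer seeds the matching slice of a larger two-limb network:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_policy(n_obs, n_hidden, n_act):
    """Two-layer network: observations -> hidden layer -> servo commands."""
    return {
        "W1": rng.normal(scale=0.1, size=(n_obs, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(scale=0.1, size=(n_hidden, n_act)),
        "b2": np.zeros(n_act),
    }

def act(policy, obs):
    hidden = np.tanh(obs @ policy["W1"] + policy["b1"])
    return np.tanh(hidden @ policy["W2"] + policy["b2"])

# Policy trained for a single limb: 8 observations in, 2 servo commands out.
one_limb = make_policy(n_obs=8, n_hidden=32, n_act=2)
# ... real training of one_limb on the hardware would happen here ...

# Policy for two limbs: fresh first layer, but the final layer is warm-started
# with the single-limb weights so training gets a head start.
two_limb = make_policy(n_obs=16, n_hidden=32, n_act=4)
two_limb["W2"][:, :2] = one_limb["W2"]   # reuse the learned output weights
two_limb["b2"][:2] = one_limb["b2"]

print(act(two_limb, np.zeros(16)).shape)  # -> (4,) servo commands
```

The point is only the shape of the idea: the final layer carries knowledge that transfers, so it is copied rather than relearned from scratch.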

Why bother with training? After all, we’ve seen pretty awesome multi-limbed robots working with manual coding, an example being this hexapod tank based on one from the movie Ghost in the Shell. They did that too, then compared the results of the manual approach with those of the trained one: the trained robot moved further in the same amount of time. At a minimum, we can learn a trick or two from this modular crawler.

Check out their article for the details and watch it in action in its learning environment below.

Continue reading “Disney’s New Robot Limbs Trained Using Neural Networks”

Modular Robotics Made Easier With ROS

A robot is made up of many hardware components, each of which requires its own software. Even a small robot arm with a handful of servo motors uses a servo motor library.

Add that arm to a wheeled vehicle and you have more motors. Then attach some ultrasonic sensors for collision avoidance or a camera for vision. By that point, you’ve probably split the software into multiple processes: one for the arm, another for mobility, one for vision, and one to act as the brains, interfacing somehow with all the rest. The vision system may be doing object recognition, something which is computationally demanding, and so now you have multiple computers.

Break all this complexity into modules and you have a use case for ROS, the Robot Operating System. As this article shows, ROS can help with designing, building, managing, and even evolving your robot.
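As a small taste of what that modularity looks like in code, here is a hypothetical collision-guard node written against rclpy, ROS 2’s Python client (the same pattern exists with rospy in classic ROS); the topic name and distance threshold are invented for illustration:

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Range

class CollisionGuard(Node):
    """Brains-side node: listens to an ultrasonic range topic published by a
    separate mobility process and warns when an obstacle gets too close."""

    def __init__(self):
        super().__init__('collision_guard')
        self.create_subscription(Range, 'ultrasonic/front', self.on_range, 10)

    def on_range(self, msg: Range):
        if msg.range < 0.2:  # metres
            self.get_logger().warn(f'Obstacle at {msg.range:.2f} m, pausing the arm')

def main():
    rclpy.init()
    rclpy.spin(CollisionGuard())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

The node never needs to know whether the range data comes from a process on the same board or from a different computer entirely; ROS handles the plumbing either way.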

Continue reading “Modular Robotics Made Easier With ROS”

Modular Robotics: When You Want More Robots In Your Robot

While robots have been making our lives easier and our assembly lines more efficient for over half a century now, we haven’t quite cracked a Jetsons-like general purpose robot yet. Sure, Boston Dynamics and MIT have some humanoid robots that are fun to kick and knock over, but they’re far from building a world-ending Terminator automaton.

But not every robot needs to be human-shaped in order to be general purpose. Some of the more interesting designs being researched are modular robots. It’s an approach to robotics which uses smaller units that can combine into assemblies that accomplish a given task.

We’ve been immersing ourselves in topics like this one because the Robotics Module Challenge is the current focus of the Hackaday Prize. We’re looking for any modular designs that make it easier to build robots — motor drivers, sensor arrays, limb designs — your imagination is the limit. But self-contained robot modules that themselves make up larger robots make for a fascinating field that definitely fits in with this challenge. Join me for a look at where modular robots are now, and where we’d like to see them going.

Continue reading “Modular Robotics: When You Want More Robots In Your Robot”

Hackaday Prize Entry: Dtto Modular Robot

A robot to explore the unknown and automate tomorrow’s tasks and the ones after them needs to be extremely versatile. Ideally, it would be capable of taking on any size, any shape, and any functionality: shapeless like water, flexible and smart. For his Hackaday Prize entry, [Alberto] is building such a modular, self-reconfiguring robot: Dtto.

To achieve the highest possible reconfigurability, [Alberto’s] robot is designed to be the building block of a larger, mechanical organism. Inspired by the similar MTRAN III, individual robots feature two actuated hinges that give them flexibility and the ability to move on their own. A coupling mechanism on both ends of the robot allows the little crawlers to self-assemble in various configurations and carry out complex tasks together. They can chain together to form a snake, turn into a wheel, and even become four (or more) legged walkers. With six coupling faces on each robot, each allowing connections in four orientations, virtually any topology is possible.

Each robot contains two strong servos for the hinges and three smaller ones for the coupling mechanism. Alignment magnets help the robots index against each other before a latch locks them in place. The clever mechanism doubles as an ejector, so connections can be undone against the force of the alignment magnets. Most of the electronics, including an Arduino Nano, a Bluetooth module, and an NRF24L01+ radio, are densely mounted inside one end of the robot, while the other end can be used to add additional features, such as a camera module, an accelerometer, and more. The following video shows four Dtto robots in a snake configuration crawling through a tube.

Continue reading “Hackaday Prize Entry: Dtto Modular Robot”