Legged Robots Put On Wheels And Skate Away

We don’t know how much time passed between the invention of the wheel and someone putting wheels on their feet, but we expect that was a great moment of discovery: combining the wheel’s ability to roll along at speed with our legs’ ability to quickly adapt to changing terrain. Now that we have a wide assortment of recreational wheeled footwear, what’s next? How about teaching robots to skate, too? An IEEE Spectrum interview with [Marko Bjelonic] of ETH Zürich describes progress by one of many research teams working on the problem.

For many of us, the first sighting of a robot rolling on powered wheels at the end of actively articulated legs came when footage of the Boston Dynamics ‘Handle’ project surfaced a few years ago. Rolling across a wide variety of terrain and performing an occasional jump, its athleticism caused quite a stir in robotics circles. But when Handle was introduced as a commercial product, its job was… stacking boxes in a warehouse? That was disappointing. Warehouse floors are quite flat, leaving Handle’s agility under-utilized.

Boston Dynamics has typically been pretty tight-lipped about the details of their robotics development, so we may never know the full story behind Handle. But what they have definitely accomplished is getting a lot more people thinking about the control problems involved. Even humans face a nontrivial learning curve here, paved with bruised and occasionally broken body parts, and that’s before we start applying power to the wheels. So there are plenty of problems to solve, generating a steady stream of research papers describing how robots might master this mode of locomotion.

Adding to the excitement is the fact that this is becoming an area where reality is catching up to fiction, as wheeled-legged robots have long been imagined in forms like the Tachikoma of Ghost in the Shell. While those fictional robots have inspired projects ranging from LEGO creations to 28-servo beasts, their wheel and leg motions have not been autonomously coordinated the way they are in this generation of research robots.

As control algorithms mature in robot research labs around the world, we’re confident we’ll see wheeled-legged robots finding applications in other fields. This concept is far too cool to be left stacking boxes in a warehouse.

Continue reading “Legged Robots Put On Wheels And Skate Away”

Robots Learning To Understand Their Surroundings

Today it is pretty easy to build a robot with an onboard camera and have fun manually driving it through that first-person view. But builders with dreams of autonomy quickly learn there is a lot of work between camera installation and autonomously executing a “go to chair” command. Fortunately we can draw upon work such as the View Parsing Network by [Bowen Pan, Jiankai Sun, et al.].

When a camera image comes into a computer, it is merely a large array of numbers representing red, green, and blue color values, and our robot has no idea what that image represents. Over the past few years, computer vision researchers have found pretty good solutions for the problems of image classification (“is there a chair?”) and segmentation (“which pixels correspond to the chair?”). While useful for building an online image search engine, this is not quite enough for robot navigation.

A robot needs to translate those pixel coordinates into a real-world layout, and this is the problem the View Parsing Network offers to solve. Detailed in Cross-view Semantic Segmentation for Sensing Surroundings (DOI 10.1109/LRA.2020.3004325), the system takes in multiple camera views looking all around the robot. The results of image segmentation are then synthesized into a 2D top-down segmented map of the robot’s surroundings. (“Where is the chair located?”)
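
It helps to see the geometry the network has to learn. Here is a deliberately crude sketch: under a flat-floor assumption and a fixed field of view, each camera’s segmentation mask can be splatted into a shared top-down grid. The function name, the 90° FOV, and the row-to-distance mapping are all our illustrative assumptions; the actual View Parsing Network learns this cross-view transform end-to-end rather than hard-coding it.

```python
import numpy as np

def fuse_views_to_topdown(seg_maps, yaws, grid_size=64, max_range=5.0):
    """Splat per-camera segmentation masks into one top-down class grid.

    seg_maps: list of (H, W) integer class masks, one per camera
    yaws:     each camera's heading in radians, relative to robot-forward
    Assumes a flat floor, a 90-degree horizontal FOV, and a crude
    row-to-distance mapping -- stand-ins for the learned transform
    in the View Parsing Network paper.
    """
    topdown = np.zeros((grid_size, grid_size), dtype=np.uint8)
    half = grid_size // 2
    for seg, yaw in zip(seg_maps, yaws):
        h, w = seg.shape
        for v in range(h // 2, h):                 # lower half of image = floor
            dist = max_range * (h - v) / (h / 2)   # flat-floor distance guess
            for u in range(w):
                bearing = yaw + np.radians(90.0) * (u / w - 0.5)
                x = dist * np.cos(bearing)         # forward of robot
                y = dist * np.sin(bearing)         # left/right of robot
                gx = int(half + (x / max_range) * half)
                gy = int(half + (y / max_range) * half)
                if 0 <= gx < grid_size and 0 <= gy < grid_size:
                    topdown[gx, gy] = seg[v, u]
    return topdown
```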

The authors documented how to train a view parsing network in a virtual environment, and described the procedure for transferring the trained network to run on a physical robot. Today this process demands a significantly higher skill level than “download an Arduino sketch,” but we hope such modules will become more plug-and-play in the future for better and smarter robots.

[IROS 2020 Presentation video (duration 10:51) requires free registration, available until at least Nov. 25th 2020. One-minute summary embedded below.]

Continue reading “Robots Learning To Understand Their Surroundings”

Autonomous Sentry Gun Packs A Punch And A Ton Of Build Tips

What has dual compressed-air cannons, 500 roll-on deodorant balls, and a machine-learning brain with a bad attitude? We didn’t know either, until [Leo Fernekes] dropped this video on his autonomous robot sentry gun and saw it in action for ourselves.

Now, we’ve seen tons of sentry guns on these pages before, shooting everything from water to various forms of Nerf. And plenty of those builds have used some form of machine vision to aim the gun onto the target. So while it might appear that [Leo]’s plowing old ground here, this build is chock full of interesting tips and tricks.

It started when [Leo] saw a video on TensorFlow basics from our friend [Edje Electronics], which gave him the boost needed to jump into an AI project. The controller he ended up with looks for humans in the scene and slews the turret onto target, where the air cannons can do their thing. The hefty ammo is propelled by compressed air, which is dumped into the chamber using a solenoid valve with an interesting driver that maximizes the speed at which it opens. Style points go to the bacteriophage T4-inspired design, and to the sequence starting at 1:34 which reminded us of the factory scene from RoboCop.
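
Builds like this generally share one aiming loop: find the detection’s center, compute the pixel error from the crosshair, and nudge the turret until the error falls inside a deadband before firing. Here is a minimal sketch of that logic; the function names, gain, and deadband values are our invention, not [Leo]’s actual code.

```python
def aim_error(box, frame_w, frame_h):
    """Pixel offset of a detection's center from the frame center.
    box is (x1, y1, x2, y2), as a typical TensorFlow object detector
    returns after scaling to pixel coordinates."""
    x1, y1, x2, y2 = box
    return (x1 + x2) / 2 - frame_w / 2, (y1 + y2) / 2 - frame_h / 2

def slew_step(err_x, err_y, gain=0.05, deadband=10):
    """Proportional aiming: convert pixel error into pan/tilt nudges.
    Only report on-target (safe to fire) once both axes sit inside
    the deadband, so the cannon doesn't discharge mid-slew."""
    pan  = -gain * err_x if abs(err_x) > deadband else 0.0
    tilt = -gain * err_y if abs(err_y) > deadband else 0.0
    return pan, tilt, (pan == 0.0 and tilt == 0.0)

# e.g. for a 640x480 frame:
# err_x, err_y = aim_error(best_person_box, 640, 480)
# pan, tilt, fire = slew_step(err_x, err_y)
```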

[Leo] really put a ton of work into this project, and the results show. He is hoping to get an art gallery or museum to show it as an interactive piece to comment on one possible robot-human future, presumably after getting guests to sign a release. Whatever happens to it, the robot looks great and [Leo] learned a lot from it, as did we.

Continue reading “Autonomous Sentry Gun Packs A Punch And A Ton Of Build Tips”

Dropping A Glider From 18,000 Feet

[Tarik and Kemal] have an objective in mind: to drop a homemade autonomous glider from a high-altitude balloon and safely return it home. To motivate them, [Tarik] has decided not to cut his hair until they reach 18,000 feet. Given the ambition of their project, it isn’t surprising that his hair is getting rather long now.

Continue reading “Dropping A Glider From 18,000 Feet”

Drones Can Undertake Excavations Without Human Intervention

Researchers from Denmark’s Aarhus University have developed a method for autonomous drone scanning and measurement of terrains, allowing drones to independently navigate themselves over excavation grounds. The only human input is a starting location and the desired cliff face for scanning.

For researchers studying quarries, capturing data about gravel, walls, and other natural and man-made formations is important for understanding the properties of the terrain. Controlling the drones can be expensive, though, since there’s considerable skill involved in manually flying the drone while keeping its camera steady and perpendicular to the wall it is capturing.

The system is built around a Gaussian model that predicts the wind the drone will encounter near the wall, estimating its strength from the inputs it receives as the drone moves. Its feedback control combines nonlinear model predictive control (NMPC) with a PID controller, which together calculate the values to send to the drone’s motor controller, while a long short-term memory (LSTM) model calculates the predictions. The system has been successfully tested in a chalk quarry in Denmark and will continue to be tested as its algorithms are improved.
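
The full pipeline of LSTM wind prediction feeding NMPC is beyond a blog post, but the inner PID loop that holds the drone steady against those predicted gusts is the classic textbook controller. A minimal single-axis sketch, with invented gains and no relation to the authors’ actual code:

```python
class PID:
    """Single-axis PID loop of the kind used in the drone's feedback
    control. The NMPC outer loop and LSTM wind predictor from the
    paper are not shown, and these gains are made up."""
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_err = None

    def update(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        # Avoid a derivative kick on the very first sample.
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        # Clamp the command to what the motor controller accepts.
        return max(-self.out_limit, min(self.out_limit, out))

# e.g. hold a 2.0 m standoff distance from the quarry wall:
# pid = PID(kp=0.8, ki=0.1, kd=0.2, out_limit=1.0)
# correction = pid.update(2.0, measured_distance, dt=0.02)
```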

Getting a drone to hover and move between GPS waypoints is easy enough, but once it needs to maneuver around obstacles things start getting tricky. Research like this will be invaluable for developing systems that help drones navigate in areas their human operators can’t reach.

[Thanks to Qes for the tip!]

Ask Hackaday: What Good Is A Robot Dog?

It is said that Benjamin Franklin, while watching the first manned flight of a hot air balloon by the Montgolfier brothers in Paris in 1783, responded when questioned as to the practical value of such a thing, “Of what practical use is a new-born baby?” Dr. Franklin certainly had a knack for getting to the heart of an issue.

Much the same can be said for Spot, the extremely videogenic dog-like robot that Boston Dynamics has been teasing for years. It appears that the wait for a production version of the robot is at least partially over, and that Spot (once known as Spot Mini) will soon be available for purchase by “select partners” who “have a compelling use case or a development team that [Boston Dynamics] believe can do something really interesting with the robot,” according to VP of business development Michael Perry.

The qualification of potential purchasers will certainly limit the pool of early adopters, as will the price tag, which is said to be as much as a new car, and a nice one at that. It’s not likely that one will show up in a YouTube teardown video soon, so until the day that [Dave Jones] manages to find one in his magic Australian dumpster, we’ll have to entertain ourselves by trying to answer a simple question: of what practical use is a robotic dog?

Continue reading “Ask Hackaday: What Good Is A Robot Dog?”

Skid Steer Mows Airport Grass Autonomously

Sure, mowing the lawn is a hassle. No one really wants to spend their time and money growing a crop that doesn’t produce food, but we do it anyway. If you’re taking care of a quarter acre in the suburbs it’s not that much of a time sink, but if you’re taking care of as much grass as [Roby], you’d probably build something similar to his autonomous skid-steer mower, too.

This thing isn’t a normal push mower outfitted with some random electronics, either. This is a serious mower, essentially a tractor with blades attached. Since it’s a skid steer, it turns by means of two handles that control the speeds of the left and right drive wheels. Fabricating some linkages to couple those handles to specialized servos takes care of the steering, and the brain is ArduPilot hooked up to a host of radios, a GPS, and a compass, letting it drive all around the runways at the airport without interfering with any aircraft.
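
The mixing behind those two handles is simple arithmetic, which is part of why ArduPilot handles skid-steer vehicles so well: one throttle command and one steering command become two wheel speeds. A rough sketch of the idea follows; the exact scaling inside ArduPilot’s Rover firmware differs, so treat this as our simplification.

```python
def skid_steer_mix(throttle, steering):
    """Mix one throttle and one steering command (-1.0 .. 1.0 each)
    into left and right wheel speeds. On [Roby]'s mower the outputs
    would drive the servos pushing the two control handles."""
    left, right = throttle + steering, throttle - steering
    # If either side exceeds full scale, scale both back proportionally
    # so the commanded turn is preserved even at full throttle.
    m = max(1.0, abs(left), abs(right))
    return left / m, right / m

# Full speed ahead with a gentle right turn:
# left, right = skid_steer_mix(1.0, 0.2)
```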

This is a serious build, and the writeup goes into a lot of detail about how the servos and linkages should behave, how all the software works, and the issues of actually mounting everything to the mower. The entire project is open source too, so even if you don’t have a whole airport runway to mow, you might find something in there to help with your own little patch of grass.

Thanks to [Vincent] for the tip!

Continue reading “Skid Steer Mows Airport Grass Autonomously”