TARS-Like Robot Both Rolls, And Walks

[Aditya Sripada] and [Abhishek Warrier]’s TARS3D robot came from asking what it would take to make a robot with the capabilities of TARS, the robotic character from Interstellar. We couldn’t find a repository of CAD files or code, but the research paper for TARS3D explains the principles, which should be enough to inspire a motivated hacker.

What makes TARS so intriguing is the simple-looking structure combined with distinct and effective gaits. TARS is not a biologically inspired design, yet it can walk and perform a high-speed roll. Making a real-world version required not only some inspired mechanical design, but also some clever machine-learning software.

[Aditya] and [Abhishek] created TARS3D not only as a proof of concept that such locomotion can be made to work, but also to demonstrate that unconventional body and limb designs (many of them sci-fi inspired) can permit gaits that are as effective as they are unusual.

TARS3D is made up of four side-by-side columns that can rotate around a shared central ‘hip’ joint as well as extend and retract. In the movie, TARS is notably flat-footed, but [Aditya] found that this was unsuitable for rolling, so TARS3D has curved foot plates.

The rolling gait is pretty sensitive to terrain variations, but the walking gait proved to be quite robust. All in all, it’s a pretty interesting platform that does more than show that a TARS-like, dual-gait robot can actually be made to work. It also demonstrates the value of reinforcement learning for robot gaits.
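
The paper doesn’t come with a public codebase, but the general recipe for learning a gait with reinforcement learning is well established: wrap the simulated robot in an environment, shape a reward around forward progress and effort, and let an off-the-shelf algorithm grind through rollouts. Here is a minimal sketch of that loop using Gymnasium and Stable-Baselines3 PPO; the environment, spaces, and reward terms are hypothetical stand-ins for illustration, not TARS3D’s actual setup.

```python
# Minimal, hypothetical sketch of reinforcement-learning gait training with
# Gymnasium and Stable-Baselines3 PPO. The environment below is a toy stand-in;
# TARS3D's real simulator, observations, and reward terms are not public code.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO

class ToyGaitEnv(gym.Env):
    """Toy stand-in for a physics-simulated walker: observations are joint states
    plus base velocity, actions are normalized joint position targets."""

    def __init__(self):
        super().__init__()
        self.observation_space = spaces.Box(-np.inf, np.inf, shape=(12,), dtype=np.float32)
        self.action_space = spaces.Box(-1.0, 1.0, shape=(4,), dtype=np.float32)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.steps = 0
        return np.zeros(12, dtype=np.float32), {}

    def step(self, action):
        self.steps += 1
        # A real environment would advance a physics engine here; we fake a reward
        # that favors forward progress and penalizes large actuator effort.
        reward = float(np.mean(action)) - 0.01 * float(np.sum(np.square(action)))
        obs = self.np_random.normal(size=12).astype(np.float32)
        truncated = self.steps >= 200           # keep episodes bounded
        return obs, reward, False, truncated, {}

model = PPO("MlpPolicy", ToyGaitEnv(), verbose=0)
model.learn(total_timesteps=10_000)             # real gait training runs for millions of steps
```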

A brief video is below in which you can see the bipedal walk in action. Not that long ago, walking robots were a real challenge, but with the tools available nowadays, even a robot running a 5k isn’t crazy.

Continue reading “TARS-Like Robot Both Rolls, And Walks”

Disney’s Bipedal, BDX-Series Droid Gets The DIY Treatment

[Antoine Pirrone] and [Grégoire Passault] are making a DIY miniature re-imagining of Disney’s BDX droid design, and while it’s still early, there is definitely a lot of progress to see. Known as the Open Duck Mini v2 and coming in at a little over 40 cm tall, the project is expected to have a total cost of around 400 USD.

The inner workings of Open Duck Mini use a Raspberry Pi Zero 2W, hobby servos, and an absolute-orientation IMU.

Bipedal robots are uncommon, and back in the day they were downright rare. One reason is that the state of controlled falling that makes up a walking gait isn’t exactly a plug-and-play feature.

Walking robots are much more common now, but gait control for legged robots is still a big design hurdle. This goes double for bipeds. That brings us to one of the interesting things about the Open Duck Mini v2: computer simulation of the design is playing a big role in bringing the project into reality.

It’s a work in progress, but the repository collects all the design details and resources you could want, including CAD files, code, a current bill of materials, and links to a Discord community. Hardware-wise, the main work is being done with very accessible parts: a Raspberry Pi Zero 2W, fairly ordinary hobby servos, and a BNO055-based absolute-orientation IMU.
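
If you want to experiment with the same sensing approach, getting fused orientation out of a BNO055 from a Pi takes only a few lines. The sketch below assumes the Adafruit CircuitPython driver (adafruit-circuitpython-bno055) with Blinka installed and the sensor wired to the Pi’s default I2C bus; it isn’t the Open Duck Mini’s own code, just a quick sanity check that the IMU is talking.

```python
# Minimal sketch: read fused orientation from a BNO055 on a Raspberry Pi.
# Assumes the Adafruit CircuitPython driver (adafruit-circuitpython-bno055)
# and Blinka are installed; this is not the Open Duck Mini's own firmware.
import time
import board
import adafruit_bno055

i2c = board.I2C()                      # default SCL/SDA pins on the Pi header
imu = adafruit_bno055.BNO055_I2C(i2c)

while True:
    # The BNO055 fuses accel/gyro/mag on-chip and returns Euler angles in degrees.
    # Readings can be None for a moment after power-up, so print the raw tuple.
    print("euler (heading, roll, pitch):", imu.euler)
    time.sleep(0.02)                   # ~50 Hz is plenty for a balance loop
```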

So, how far along is the project? Open Duck Mini v2 is already waddling nicely and can remain impressively stable when shoved! (A “testing purposes” shove, anyway. Not a “kid being kinda mean to your robot” shove.)

Check out the videos to see it in action, and if you end up making your own, we want to hear about it, so remember to send us a tip!

Open-Source Robot Transforms

Besides Pokémon, there might have been no greater media franchise for a child of the 90s than the Transformers: mysterious robots fighting an intergalactic war that can inexplicably change into various Earth-based objects like trucks and airplanes. It led to a number of toys which can likewise change shape from fighting robots into ordinary objects. And, perhaps in a case of life imitating art, plenty of real-life robots have features one might think were inspired by this franchise, like this transforming quadruped robot.

Called the CYOBot, the robot has four articulating arms with a wheel at the end of each. The arms can be placed in a wide array of positions for different operating characteristics, allowing the robot to move in remarkably diverse ways. It’s based on a previous version called the CYOCrawler, which used similar articulating arms but no wheels. The build centers around an ESP32-S3 microcontroller, giving it plenty of compute power for things like machine learning, as well as wireless capabilities for control or access to more computing power.
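
To give a sense of how little firmware it takes to reposition one of those arm joints, here is a hedged sketch of driving a hobby servo with 50 Hz PWM from MicroPython on an ESP32-S3. The GPIO pin and pulse widths are assumptions for illustration; the CYOBot’s actual firmware and wiring will differ.

```python
# Hypothetical sketch: position one hobby-servo joint from MicroPython on an
# ESP32-S3 using 50 Hz PWM. Pin number and pulse widths are illustrative only.
from machine import Pin, PWM
import time

servo = PWM(Pin(4), freq=50)           # assumed GPIO for one arm joint

def set_angle(deg):
    # Map 0-180 degrees onto a 0.5-2.5 ms pulse within the 20 ms (50 Hz) frame.
    pulse_ms = 0.5 + 2.0 * (deg / 180.0)
    servo.duty_u16(int(pulse_ms / 20.0 * 65535))

# Sweep the joint back and forth, roughly how an arm could be repositioned
# between a crawling pose and a wheels-down driving pose.
while True:
    for angle in (0, 90, 180, 90):
        set_angle(angle)
        time.sleep(0.5)
```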

Both robots are open source and modular, allowing a range of people to use and add on to the platform. Another perk is that most parts are common or 3D printed, making for a fairly low barrier to entry to a platform with so many different configurations and options for expansion and development. If you prefer robots without wheels, though, we’d always recommend looking at Strandbeests for inspiration.

Mobile Coffee Table Uses Legs To Get Around

For getting around on most surfaces, it’s hard to beat the utility of the wheel. Its versatility, low cost, and the wide range of materials it can be made from have made it a cornerstone technology for the past ten thousand years or so. But with that much history, it can seem a little bit played out. To change up the locomotion game, you might want to consider using robotic legs instead. That’s what [Giliam] designed into this mobile coffee table, which uses custom linkages to move its legs and get itself from place to place around the living room.

Continue reading “Mobile Coffee Table Uses Legs To Get Around”

Why Walking Tanks Never Became A Thing

The walking tank concept has always captured imaginations. Whether you’re talking about the AT-AT walkers of Star Wars, or the Dreadnoughts from Warhammer 40,000, they are often portrayed in fiction as mighty and capable foes on the battlefield. These legged behemoths ideally combine the firepower and defense of traditional tanks with the versatility of a legged walking frame.

Despite their futuristic allure, walking tanks never found a practical military application. Let’s take a look at why tracks still rule, and why walking combat machines are going to remain firmly in the realm of fiction for the foreseeable future.

Continue reading “Why Walking Tanks Never Became A Thing”

AI Learns To Walk In 3D Training Grounds

AI agents are learning to do all kinds of interesting jobs, even the creative ones we’d quite prefer to handle ourselves. Nevertheless, technology marches on. Working in this area is YouTuber [AI Warehouse], who has been teaching an AI to walk in a simulated environment.

Albert needed some specific guidance to learn how to walk upright, something that humans tend to figure out innately.

The AI controls a vaguely humanoid creature, albeit with a heavily simplified body and limbs. It “lives” in a 3D environment created in the Unity engine, which provides the necessary physics engine for the work. Meanwhile, the ML-Agents package is used to provide the brain for Albert, the AI charged with learning to walk.

The video steps through a variety of “deep reinforcement learning” tasks. In these, the AI is rewarded for completing goals which are designed to teach it how to walk. Albert is given control of his limbs, and simply charged with reaching a button some distance away on the floor. After many trials, he learns to do the worm, and achieves his goal.

Getting Albert to walk upright took altogether more training. Lumpy ground and walls between him and his goal were used to up the challenge, as were rewards for alternating his feet and keeping an upright posture. Over time, he was able to progress through skipping to something approximating a proper walk cycle.
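
The rewards in the video are implemented inside Unity with ML-Agents, but the shaping idea is simple enough to sketch in plain Python. Everything below (the weights, thresholds, and signal names) is an illustrative guess at the kind of terms described above, not [AI Warehouse]’s actual code.

```python
# Illustrative reward shaping for a "learn to walk" agent. All weights and
# thresholds are made up; the real project defines its rewards in Unity ML-Agents.
import numpy as np

WORLD_UP = np.array([0.0, 1.0, 0.0])

def walking_reward(torso_up, torso_height, base_velocity, goal_dir,
                   left_foot_down, right_foot_down, last_foot):
    reward = 0.0

    # Progress toward the button: velocity projected onto the goal direction.
    reward += 1.0 * float(np.dot(base_velocity, goal_dir))

    # Stay upright: alignment of the torso's up vector with world up, plus a
    # small bonus for keeping the hips above "doing the worm" height.
    reward += 0.5 * float(np.dot(torso_up, WORLD_UP))
    if torso_height > 0.8:
        reward += 0.2

    # Encourage alternating feet rather than shuffling along on one leg.
    if left_foot_down and not right_foot_down and last_foot == "right":
        reward += 0.1
    if right_foot_down and not left_foot_down and last_foot == "left":
        reward += 0.1

    return reward
```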

One may argue that the teaching method required a lot of specific guidance, but it’s a neat feat nonetheless. It’s altogether more complex than learning to play Trackmania, we’d say, and that was impressive enough in itself. Video after the break.

Continue reading “AI Learns To Walk In 3D Training Grounds”

Virtual Reality Experiment Tricks Your Feet Into Walking While Sitting Down

The whole idea behind virtual reality is that you don’t really know what’s going on in the world around you. You only know what your senses tell you is there. If you can fake out your vision, for example, then your brain won’t realize you are floating in a tank providing power for the robot hordes. However, scientists in Japan think you can even fool your feet into thinking they are walking when they aren’t. In a recent paper, they describe an experiment that combined audio cues with buzzing on different parts of the feet to simulate the feel of walking.

The trick requires only four transducers, two on each foot. The researchers tested several different visual presentations in the participants’ virtual reality headsets. A third-person view didn’t lead test subjects to associate the foot vibrations with walking, but a first-person perspective produced sensations of walking, with a full-body avatar working best compared to showing just hands and feet or no avatar at all.
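
As a purely illustrative sketch of the stimulus timing idea (buzz one part of the foot, then another, on alternating feet in step with footstep audio), here it is as a Python loop. The pulse() helper, the heel/toe placement, and every timing value are hypothetical; the paper’s actual stimulus design is more involved.

```python
# Hypothetical sketch only: pulse a heel transducer, then a toe transducer, on
# alternating feet, roughly in time with footstep audio. pulse() stands in for
# whatever driver actually energizes a transducer; all timings are made up.
import time

STEP_PERIOD = 0.6                      # seconds per simulated footstep (made up)

def pulse(foot, part, duration):
    """Stand-in for a real transducer driver call."""
    print("buzz", foot, part, "for", int(duration * 1000), "ms")
    time.sleep(duration)

while True:
    for foot in ("left", "right"):
        pulse(foot, "heel", 0.08)                 # heel strike
        time.sleep(STEP_PERIOD / 2 - 0.08)
        pulse(foot, "toe", 0.08)                  # toe-off
        time.sleep(STEP_PERIOD / 2 - 0.08)
```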

Making people think they are walking in VR can be tricky, but it does explain how they fit all that stuff into a little holodeck. Of course, it is nice if you can also sense walking and use it to move your avatar, but that’s another problem.