3D Printed Research Robotics Platform Runs Remotely

The Open Dynamic Robot Initiative is a collaboration between five robotics-oriented research groups, based in three countries, with the aim of building an open source robotics platform based around torque control. By leveraging 3D printing, a few custom PCBs, and off-the-shelf parts, it offers a low barrier to entry and a much lower cost than comparable research robots.

The eagle-eyed will note that this is only a development platform, and all of the higher-level control is off-machine, hosted by a separate PC. What’s interesting here is just how low-level the robot actually is. The motion hardware is purely a few BLDC motors driven by field-oriented control (FOC) driver units, a wireless controller, and some batteries. FOC commutates the motors very efficiently, wringing out maximum torque, and a delve into the maths of how it operates will be an eye-opener for the uninitiated. Optical encoders attached to the motors give positional feedback for the control loop.
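For the uninitiated, here’s a minimal, hedged sketch of the two coordinate transforms at the heart of FOC, the Clarke and Park transforms, assuming a balanced three-phase motor and an encoder-supplied electrical angle; the numbers are illustrative only and have nothing to do with the project’s actual firmware:

```python
import numpy as np

def clarke(i_a, i_b):
    """Clarke transform: balanced three-phase currents to a two-axis
    stationary frame (i_c is implied, since i_a + i_b + i_c = 0)."""
    i_alpha = i_a
    i_beta = (i_a + 2.0 * i_b) / np.sqrt(3.0)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    """Park transform: rotate the stationary frame into the rotor-aligned
    d/q frame using the electrical angle theta from the encoder."""
    i_d = i_alpha * np.cos(theta) + i_beta * np.sin(theta)
    i_q = -i_alpha * np.sin(theta) + i_beta * np.cos(theta)
    return i_d, i_q

# i_q is the torque-producing component: the driver regulates i_d to zero
# and i_q to whatever torque the (remote) control loop is asking for.
i_d, i_q = park(*clarke(1.0, -0.5), theta=np.deg2rad(30.0))
print(f"i_d = {i_d:+.3f}, i_q = {i_q:+.3f}")
```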

It is this control loop that’s kinda weird, in that it operates over Wi-Fi! Normally one would do all the position, torque, and speed sensing locally within the leg unit, with local control loops, as well as running all the limb kinematics and motion planning on board. That would need considerable local processing grunt, which can make development more difficult.

This project side-steps all that by leveraging the ESP-NOW protocol, originally aimed at the ESP8266 and friends. By patching Ubuntu Linux for fully preemptive, real-time scheduling and carefully selecting Wi-Fi drivers, it was possible to get raw packets out to the robot in about 1 ms, enabling control loop bandwidths of around 1 kHz. That was fast enough to run at least sixteen motors in parallel.
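For a rough feel of the 1 kHz side of things, here’s a hedged Python sketch of a fixed-rate loop asking Linux for the SCHED_FIFO real-time policy; it only illustrates the scheduling idea and is nothing like the project’s actual control stack, which leans on a patched kernel and driver to hit its timing:

```python
import os
import time

PERIOD_S = 0.001  # 1 kHz control loop target

def main():
    # Request the SCHED_FIFO real-time policy (Linux only; needs root or
    # CAP_SYS_NICE, and benefits greatly from a PREEMPT_RT patched kernel).
    try:
        os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(80))
    except PermissionError:
        print("No real-time privileges; running with the default scheduler")

    next_deadline = time.monotonic()
    worst_overrun = 0.0
    for _ in range(5000):
        # ... read encoder feedback packet, compute torques, send command ...
        next_deadline += PERIOD_S
        sleep_for = next_deadline - time.monotonic()
        if sleep_for > 0:
            time.sleep(sleep_for)
        else:
            worst_overrun = max(worst_overrun, -sleep_for)

    print(f"worst overrun: {worst_overrun * 1e6:.0f} µs")

if __name__ == "__main__":
    main()
```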

Continue reading “3D Printed Research Robotics Platform Runs Remotely”

A tiny solar-powered robot that even works indoors

Tiny BEAM Robot Smiles Big At The Sun

What have you been working on during the Great Chip Shortage? [NanoRobotGeek] has been living up to their handle and building BEAM robots that are smaller than any we’ve seen before. What are BEAM robots, you say? Technically it stands for Biology, Electronics, Aesthetics, and Mechanics, but basically the idea is to mimic the movement of bugs, usually with found components, and often with solar power. Here’s a bunch of tutorials to get you started.

The underbelly of what might be the world's smallest BEAM robot.
This was before the large, flat storage capacitor came and covered everything up.

This here is an example of a photovore or photopopper: it moves toward light using simple analog logic, charging up a capacitor and employing a voltage monitor to decide when there’s enough stored energy to fire the two tiny vibration motors that make up its legs and feet.
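As a purely behavioural illustration of that charge-and-fire cycle (the real robot does it with a handful of analog parts and no code at all), here’s a tiny simulation with made-up component values:

```python
# Behavioural sketch of a photopopper "solar engine": trickle-charge a storage
# capacitor from a small solar cell, then dump it into the motors once a
# voltage monitor sees enough headroom. All values are illustrative.

CHARGE_CURRENT_A = 0.004      # assumed panel output under indoor light
CAPACITANCE_F = 0.1           # storage capacitor
FIRE_VOLTS = 2.7              # voltage monitor trip point
CUTOFF_VOLTS = 1.8            # stop driving the motors below this
MOTOR_CURRENT_A = 0.08        # two small vibration motors
DT_S = 0.01

volts = 0.0
firing = False
time_s = 0.0

for _ in range(20000):
    current = -MOTOR_CURRENT_A if firing else CHARGE_CURRENT_A
    volts = max(0.0, volts + current * DT_S / CAPACITANCE_F)
    if not firing and volts >= FIRE_VOLTS:
        firing = True
        print(f"t={time_s:6.1f} s  pop! motors fire at {volts:.2f} V")
    elif firing and volts <= CUTOFF_VOLTS:
        firing = False
    time_s += DT_S
```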

[NanoRobotGeek] started in a great place when they found these 25% efficient monocrystalline solar panels. They will even make the bot move indoors! If you want to build one of these, you can’t beat [NanoRobotGeek]’s guide. Be sure to watch it toddle around in the demo video after the break.

We love to see people work at all different scales. Last time we checked in with [NanoRobotGeek], they had built this solar-powered ball-flinging delight.

Continue reading “Tiny BEAM Robot Smiles Big At The Sun”

Self Balancing Robot Needs A Little Work

A self-balancing robot isn’t a new idea, but we liked the aesthetics of [Maker ATOM’s] build. The use of a breadboard and a printed bracket looks good, as you can see in the video below.

Like most first-time projects, though, there were some lessons learned. The power supply needs a little work and the range of balance compliance didn’t meet expectations. But those problems are soluble and, as usual, you often learn more from working through issues like these.

Continue reading “Self Balancing Robot Needs A Little Work”

Line Following Robot Uses PID For Speed

While a line-following robot may not be the newest project idea in the book, this one from [Edison Science] is a clean build using modern components, and it gets up a good turn of speed thanks to PID feedback control instead of the more traditional bang-bang control you see in low-end robots.

Of course, PID loops need tuning, and that seems to be the weak link here; you’ll have to experiment with the settings. The sensors also require calibration, but we bet both of those issues could be sorted out pretty easily.
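If you’ve only ever done bang-bang, the core of a PID line follower is just a few lines; this is a hedged sketch with placeholder gains and hypothetical read_line_position() / set_motors() helpers, not code from [Edison Science]’s build:

```python
# Minimal PID line-follower loop. read_line_position() and set_motors() are
# stand-ins for your own sensor array and motor driver code.

KP, KI, KD = 0.8, 0.01, 0.15   # placeholder gains; tune on your own robot
BASE_SPEED = 0.6               # nominal forward speed, 0..1
DT = 0.01                      # loop period in seconds

def pid_step(error, state):
    """One PID update; state carries the integral and previous error."""
    integral, prev_error = state
    integral += error * DT
    derivative = (error - prev_error) / DT
    correction = KP * error + KI * integral + KD * derivative
    return correction, (integral, error)

def follow_line(read_line_position, set_motors, steps=1000):
    state = (0.0, 0.0)
    for _ in range(steps):
        error = read_line_position()   # 0 when centred, +/- when off the line
        correction, state = pid_step(error, state)
        set_motors(BASE_SPEED - correction, BASE_SPEED + correction)

if __name__ == "__main__":
    # Smoke test with a fake sensor that slowly re-centres, instead of hardware.
    errors = iter([0.5, 0.4, 0.3, 0.2, 0.1, 0.0, -0.1, 0.0, 0.0, 0.0])
    follow_line(lambda: next(errors),
                lambda left, right: print(f"L={left:.2f}  R={right:.2f}"),
                steps=10)
```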

Continue reading “Line Following Robot Uses PID For Speed”

3D printed GLaDOS home assistant

GLaDOS Voice Assistant Passive-Aggressively Automates Home

With modern voice assistants we can tell a computer to play our favorite music, check the weather, or turn on a light. Like many of us, [nerdaxic] gave in to the convenience and perceived simplicity of various home automation products made by Google and Amazon. Also like many of us, he found it a bit difficult to accept the privacy implications that surround such cloud connected devices. But after selling his Home and Echo, [nerdaxic] missed the ability to control his smart home by voice command. Instead of giving in and buying back into the closed ecosystems he’d left behind, [nerdaxic] decided to open his home to a murderous, passive aggressive, sarcastic, slightly insane AI: GLaDOS, which you can see in action after the break.

Using open source designs from fellow YouTube creator [Mr. Volt], [nerdaxic] 3D printed as much of the GLaDOS animatronic model as he was able to, and implemented much of the same hardware to make it work. [nerdaxic] put more open source software to use and has created a functional, if somewhat limited, home AI that can manage his home automation, give the weather, and tell jokes, among other things. GLaDOS doesn’t fail to deliver some great one-liners inspired by the original Portal games while heeding [nerdaxic]’s commands, either.

A ReSpeaker from Seeed Studio cleans up the audio sent to a Raspberry Pi 4 and allows for future expansion, such as having GLaDOS turn to look in the direction of the person speaking to it. With its IR-capable camera, another planned enhancement will let GLaDOS stare at people as they walk around. That’s not creepy at all, right? [nerdaxic] also plans to bring speech-to-text processing in-house instead of relying on the Google Cloud Speech-to-Text API used in the current iteration, and he’s made everything available on GitHub so that you too can have a villainous AI hanging on your every word.
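For the curious, the cloud step [nerdaxic] wants to replace looks roughly like this with the google-cloud-speech Python client (assuming a recent 2.x release of the library, a 16 kHz mono WAV clip, and credentials already configured); treat it as a sketch rather than his actual code:

```python
from google.cloud import speech

def transcribe(path="command.wav"):
    """Send one short WAV clip to Google Cloud Speech-to-Text and print the text."""
    client = speech.SpeechClient()  # uses GOOGLE_APPLICATION_CREDENTIALS

    with open(path, "rb") as f:
        audio = speech.RecognitionAudio(content=f.read())

    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code="en-US",
    )

    response = client.recognize(config=config, audio=audio)
    for result in response.results:
        print(result.alternatives[0].transcript)

if __name__ == "__main__":
    transcribe()
```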

Of course, if having GLaDOS looming isn’t enough, you could always build a functional life-size Portal turret or listen to the radio on your very own Portal Radio.

Continue reading “GLaDOS Voice Assistant Passive-Aggressively Automates Home”

Flamethrower weedkiller mounted on a robot arm riding a tank-tracked base

Don’t Sleep On The Lawn, There’s An AI-Powered, Flamethrower-Wielding Robot About

You know how it goes: you’re just hanging out in the yard, there aren’t enough hours in the day, and weeding the lawn is such a drag. Then an idea pops into your head. How about we attach a gas-powered flamethrower to a robot arm, drive it around on a tank-tracked robotic base, and have it operate autonomously with an AI brain? Yes, that sounds like a good idea. Let’s do that. And so, [Dave Niewinski] did exactly that with his Ultimate Weed Killing Robot.

And you thought the robot overlords might take a more subtle approach and take over the world one coffee machine at a time? No, straight for the fully-autonomous flamethrower it is then.

This build uses a Kinova Robots Gen 3 six-axis arm, mounted to an AgileX Robotics Bunker base. Control is via a Connect Tech Rudi-NX box, which contains an Nvidia Jetson Xavier NX edge AI computing engine. Wow, that was a mouthful!

Connectivity from the controller to the base is via CAN bus, but sadly there’s no mention of how the robot arm itself is hooked up. At least this particular arm sports an effector-mounted camera system, which can feed straight into the Jetson, simplifying the build somewhat.
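If you’ve never driven a CAN-connected base from Linux, it looks something like this python-can sketch over SocketCAN; the interface name, arbitration ID, and payload are placeholders, not the Bunker’s real protocol:

```python
import can  # pip install python-can

# Open a SocketCAN interface (e.g. a USB-CAN adapter brought up as can0).
# The ID and payload below are made-up placeholders, not AgileX's real frames.
with can.interface.Bus(channel="can0", bustype="socketcan") as bus:
    msg = can.Message(
        arbitration_id=0x111,
        data=[0x00, 0x64, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00],
        is_extended_id=False,
    )
    bus.send(msg)
    reply = bus.recv(timeout=1.0)  # wait up to a second for a status frame
    print(reply)
```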

To start the software side of things, [Dave] took a video with his mobile phone while walking his lawn. Next he used Roboflow to label image stills containing weeds, which were in turn used to help train a vision AI system. The actual training was written in Python using Google Colaboratory, which is itself based on the awesome Jupyter Notebook (see also JupyterLab on the main Jupyter site; if you do any data science at all and haven’t tried it yet, you’ll kick yourself for not doing so). Colab wouldn’t be all that useful for this by itself, except that it gives you direct, free GPU access via the cloud, so you can run AI workloads without needing fancy (and currently hard-to-get) GPU hardware on your desk.
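To give a flavour of that Colab training step, here’s a heavily simplified, hedged sketch that fine-tunes a stock MobileNet classifier on a hypothetical weed/lawn image folder with PyTorch (assuming torchvision 0.13 or newer); the real project trains a detector on Roboflow-labelled data rather than a two-class classifier:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical folder layout: dataset/weed/*.jpg and dataset/lawn/*.jpg,
# e.g. stills pulled from a phone video and labelled by hand.
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("dataset", transform=tfm)
loader = DataLoader(train_set, batch_size=16, shuffle=True)

device = "cuda" if torch.cuda.is_available() else "cpu"  # Colab's free GPU
model = models.mobilenet_v2(weights="DEFAULT")
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, 2)
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")

torch.save(model.state_dict(), "weed_classifier.pt")
```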

Details of the hardware may be a little sparse, but at least the software required can be found on the WeedBot GitHub. It’s not like most of us have this exact hardware lying around anyway. For a more complete description of this terrifying contraption, check out the video after the break.

Continue reading “Don’t Sleep On The Lawn, There’s An AI-Powered, Flamethrower-Wielding Robot About”

Tiny ESP32 Strider Walks The Walk

Wheels might be the simplest method of locomotion for robots, but walkers are infinitely more satisfying to watch. This is certainly the case for [Chen Liang’s] tiny Strider walker, controlled by an ESP32 camera board.

The Strider mechanism might look similar to Strandbeest walkers, but it lifts its feet higher, allowing it to traverse rougher terrain. [Chen]’s little 3D printed version is driven by a pair of geared N20 motors, with three legs on each side. The ESP32 camera board provides control and an FPV video feed over Wi-Fi, with power coming from a 14500 LiFePO4 battery. The width required by the motors, leg mechanisms, and bearings means the robot ends up quite wide, to the point that it could get stuck on something outside the camera’s field of view. [Chen] is working to make it narrower by using continuous-rotation servos and a wire drive shaft.
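Steering a two-motor walker like this comes down to the same skid-steer mixing you’d use for any tank-style bot; here’s a small, hedged sketch of that mapping in Python, with illustrative names and ranges rather than anything from [Chen]’s ESP32 firmware:

```python
def mix_skid_steer(throttle, steer):
    """Map joystick throttle/steer in [-1, 1] to left/right motor commands.

    The three legs on each side are geared to one N20 motor, so steering is
    just a speed difference between the two sides, tank style.
    """
    left = throttle + steer
    right = throttle - steer
    # Scale both sides down together if either would exceed full speed,
    # so the intended turn ratio is preserved.
    peak = max(1.0, abs(left), abs(right))
    return left / peak, right / peak

if __name__ == "__main__":
    for throttle, steer in [(1.0, 0.0), (0.7, 0.4), (0.0, 1.0), (-0.5, 0.2)]:
        left, right = mix_skid_steer(throttle, steer)
        print(f"throttle={throttle:+.1f} steer={steer:+.1f} -> L={left:+.2f} R={right:+.2f}")
```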

We’ve seen no shortage of riffs on many-legged walkers, like the TrotBot and Strider mechanisms developed by [Wade] and [Ben Vagle], whose website is an excellent resource for prospective builders.

Continue reading “Tiny ESP32 Strider Walks The Walk”