Moteus Open Source BLDC Controller Gets Major Upgrade

[Josh Pieper] of mjbots Robotic Systems just released a major revision to his moteus open-source brushless DC (BLDC) electric motor controller. The update adds a flexible I/O subsystem which significantly expands the kinds of feedback encoders and peripherals the controller can accept. In the video below the break, [Josh] walks through eleven different example configurations. If you prefer, these examples are also presented in article form on his blog.

The moteus controller originally came about when [Josh] was developing the quad A0, an open source dynamic quadruped robot, along the lines of the MIT Mini Cheetah or Boston Dynamics robotic dogs, and wasn’t satisfied that existing controllers could do the trick. It’s a compact 50 mm square board based on an STM32G4, has an integrated magnetic encoder, and accepts external sensor connections. Interfacing with the board is via CAN-FD using a register-based scheme. A Python GUI tool provides name-based register access via a logical tree structure as well as real-time telemetry plotting capabilities for diagnostic and configuration tasks.
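
If you'd rather poke at that register scheme from a script than from the GUI, the same interface is exposed through the moteus Python library. The minimal sketch below is adapted from memory of the library's getting-started examples; exact call signatures and register names may differ slightly in the current release, so treat it as a starting point rather than the definitive API.

```python
import asyncio
import math

import moteus  # pip install moteus

async def main():
    c = moteus.Controller()      # defaults to CAN ID 1 on the attached transport
    await c.set_stop()           # clear any latched faults before commanding

    while True:
        # position=math.nan leaves the position target untouched; query=True
        # asks the controller to report its registers back over CAN-FD.
        state = await c.set_position(position=math.nan, query=True)
        print("pos:", state.values[moteus.Register.POSITION],
              "vel:", state.values[moteus.Register.VELOCITY])
        await asyncio.sleep(0.02)

asyncio.run(main())
```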

If you are using BLDC motors in your projects, definitely check this out. Even if you're not using a moteus controller, [Josh]'s demonstrations of the various encoder feedback technologies are very interesting and educational. The entire project is open source, and both the hardware and software design files can be found on the project's GitHub repository. For some users, this may be a major factor, considering that the latest ODrive BLDC controller offering has become closed source.

We wrote about the mjbots quad A0 in 2019, and you can follow the moteus project over on Hackaday.io. We also found this interesting video by [Skyentific] comparing three popular open source BLDC controllers including the moteus (second video below the break). There’s also the SimpleFOC project we covered last year if you want to dig in and learn more about field-oriented control of BLDC motors. Thanks to [Androiddrew] for the tip.

Continue reading “Moteus Open Source BLDC Controller Gets Major Upgrade”

Hackaday Prize 2022: MasterPi Is A Capable Robot With Fancy Wheels

When it comes to building a mobile robot, often maneuverability is more important than outright speed. The MasterPi robot demonstrates this well, using fancy wheels to help it slide and skate in any direction needed.

Four DC gear-motors are fitted to a metal chassis, each one driving a mecanum wheel. These are special wheels with rollers fitted around their circumference at an angle that allows the robot to move in all directions and rotate in various ways depending on how each wheel is driven.
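
The math behind that trick is refreshingly simple: each wheel's speed is just a signed sum of the desired forward, strafe, and rotation commands. Since the MasterPi runs its drive code in Python anyway, here's a generic sketch of the usual mecanum mixing. The sign pattern depends on roller orientation and motor wiring, so this illustrates the principle rather than the kit's actual routine.

```python
def mecanum_mix(vx, vy, omega):
    """Mix desired body motion into four wheel speeds.

    vx: forward speed, vy: leftward strafe speed, omega: counter-clockwise
    rotation rate, all in the range -1..1.  Returns [front_left, front_right,
    rear_left, rear_right].  The sign pattern assumes the common 'X' roller
    arrangement; flip signs to suit your wheels and wiring.
    """
    wheels = [
        vx - vy - omega,  # front left
        vx + vy + omega,  # front right
        vx + vy - omega,  # rear left
        vx - vy + omega,  # rear right
    ]
    # Scale down if any wheel command would exceed full scale.
    scale = max(1.0, max(abs(w) for w in wheels))
    return [w / scale for w in wheels]

# Pure leftward strafe: each diagonal pair spins together, opposite the other.
print(mecanum_mix(0.0, 1.0, 0.0))   # [-1.0, 1.0, 1.0, -1.0]
# Gentle forward arc to the right.
print(mecanum_mix(0.8, 0.0, -0.2))
```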

On top of this highly maneuverable chassis sits a 5-degree-of-freedom robotic arm. The robot also gets an ultrasonic sensor for avoiding obstacles, as well as a camera for line-following duties. The camera also lets the robot identify blocks by color, pick them up, and sort them into boxes. It’s all powered by a Raspberry Pi, running a bunch of Python code to make everything happen.

It’s a neat project that shows off just how capable a robot can be with some smart design choices and modern computing hardware on board. We’ve seen some other smart block sorters before, too. Continue reading “Hackaday Prize 2022: MasterPi Is A Capable Robot With Fancy Wheels”

Teardown: KC Bearifone Could Talk Circles Around Teddy Ruxpin

At the risk of dating myself, I will tell you that I grew up in the 80s — that decade of excess that was half drab and half brightly colored, depending on where you looked, and how much money you had for stuff like Memphis design. Technology seemed to move quickly in almost every aspect of life as the people of the Me decade demanded convenience, variety, and style in everything from their toilet paper (remember the colors?) to their telephones. Even though long distance cost a fortune back then, we were encouraged to ‘reach out and touch someone’.

A Healthy Fear of Bears

Looking back, it’s easy to see how all that advanced technology and excess filtered down to children. I may be biased, but the 80s were a pretty awesome time for toys, and for children’s entertainment in general. The toys were mostly still well-made, even the ones that came out of quarter machines, and many of them were technologically amazing.

Take Teddy Ruxpin, which debuted in 1985. Teddy was the world’s first animatronic children’s toy, a bear that read stories aloud from special cassette tapes, moving his eyes and mouth along with the words. One track contained the audio, and the other controlled the three servos in his face.

I remember watching the commercials and imagining Teddy suddenly switching from some boring bedtime story over to a rockin’ musical number à la the animatronic Rock-afire Explosion band at ShowBiz Pizza (a Chuck E. Cheese competitor). That’s the kind of night I wanted to be having.

The current lineup of the Rock-afire Explosion. Image via Servo Magazine

Although I went to ShowBiz a fair number of times to play Skee-Ball and stare at the Rock-afire Explosion animals and their cool set pieces, I never did have a Teddy Ruxpin. I remember being torn between wanting one and thinking they were kind of scary, which in turn made me a bit tangentially afraid of the Snuggle bear. When it came down to it, Teddy simply cost too much — $69.99 for the bear alone, and another $20 for a single cassette with storybook. And that’s 1985 dollars — according to my favorite inflation calculator, that’s $250 in today’s money for a talking bear and one lousy story.

Which brings us to KC Bearifone, an animatronic teddy bear telephone. Honestly, part of the reason I bought the Bearifone was some sort of false nostalgia for Teddy. The main reason is that I wanted to own a Teleconcepts unit of some kind, and this one seemed like the most fun to mess around with. A robot teddy bear that only does speakerphone? Yes, please.

Continue reading “Teardown: KC Bearifone Could Talk Circles Around Teddy Ruxpin”

A 3D-Printed Nixie Clock Powered By An Arduino Runs This Robot

While it is hard to tell with a photo, this robot looks more like a model of an old-fashioned clock than anything resembling a Nixie tube. It’s the kind of project that could have been created by anyone with a little bit of Arduino tinkering experience. In this case, the 3D printer used by the Nixie clock project is a Prusa i3 (which is the same printer used to make the original Nixie tubes).

The Nixie clock project was started by a couple of students from the University of Washington who were bored one day and decided to have a go at creating their own timepiece. After a few prototypes and tinkering around with the code, they came up with a design for the clock that was more functional than ornate.

The result is a great example of how one can create a functional and aesthetically pleasing project with a little bit of free time.

Confused yet? You should be.

If you’ve read this far then you’re probably scratching your head and wondering what has come over Hackaday. Should you not have already guessed, the paragraphs above were generated by an AI — in this case Transformer — while the header image came courtesy of the popular DALL-E Mini, now rebranded as Craiyon. Both of them were given the most Hackaday title we could think of, “A 3D-Printed Nixie Clock Powered By An Arduino Runs This Robot“, and told to get on with it. This exercise was sparked by curiosity following the viral success of AI generators, and posed the question of whether an AI could make a passable stab at a Hackaday piece. Transformer works from a prompt and offers the operator a choice of several sentence fragments at each step, so the text reflects those choices, but any of the offered fragments could equally well have been picked.

The text is both reassuring as a Hackaday writer because it doesn’t manage to convey anything useful, and also slightly shocking because from just that single prompt it’s created meaningful and clear sentences which on another day might have flowed from a Hackaday keyboard as part of a real article. It’s likely that we’ve found our way into whatever corpus trained its model and it’s also likely that subject matter so Hackaday-targeted would cause it to zero in on that part of its source material, but despite that it’s unnerving to realise that a computer somewhere might just have your number. For now though, Hackaday remains safe at the keyboards of a group of meatbags.

We’ve considered the potential for AI garbage before, when we looked at GitHub Copilot.

An Interesting Circular Stewart Platform

Stewart platforms are pretty neat, and not seen in the wild all that often, perhaps because there aren’t a vast number of hacker-friendly applications that need quite this many degrees of freedom within such a restricted movement range. Anyway, here’s an interesting implementation from the curiously named [Circular-Base-Stewart-Platform] YouTube channel (no, we can’t find the designer’s actual name), with a series of videos from a few years ago showing the construction and operation of such a beast. This is a very neat mechanism, comprising six geared motors on the ends of arms, engaging with a large internal gear. The common end of each arm rides on the central shaft, each with its own bearing. With the addition of the usual six linkages, twelve ball joints, and a few brackets, a complete platform is realised.

This circular arrangement is so simple that we can’t believe we haven’t come across it before. One interesting deviation from the usual Stewart platform arrangement is the use of a central slip-ring connector to provide power, allowing the whole assembly to rotate continuously, in addition to the usual six degrees of freedom the mechanism allows. Control is courtesy of an Arduino Pro Mini, which drives the motors using a handful of Pololu TB6612 (PDF) dual H-bridge driver modules. Obviously, the sketch running on the Arduino will give the thing a fixed motion, but add in an additional data link over that central slip-ring setup (or maybe a wireless link), and it will be much more useful.
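
However the legs are actuated, whether by the usual linear actuators or by arms riding on a ring gear as here, the starting point for commanding a pose is the textbook Stewart platform inverse kinematics: rotate and translate the platform's six anchor points, then measure from each base anchor to its corresponding platform anchor. The numpy sketch below shows that textbook calculation with made-up anchor coordinates; this circular design would need a further step of converting each leg vector into an arm angle on the gear, which we haven't attempted here.

```python
import numpy as np

def leg_lengths(base_pts, plat_pts, translation, rpy):
    """Classic Stewart platform inverse kinematics.

    base_pts, plat_pts: (6, 3) arrays of anchor points in the base and
    platform frames.  translation: desired platform origin in the base
    frame.  rpy: desired roll, pitch, yaw in radians.
    Returns the six required leg lengths.
    """
    r, p, y = rpy
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    # Z-Y-X (yaw, pitch, roll) rotation matrix.
    R = np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ])
    # Rotate and translate the platform anchors, then take base-to-platform vectors.
    legs = translation + plat_pts @ R.T - base_pts
    return np.linalg.norm(legs, axis=1)

# Hypothetical, evenly spaced anchor points just to exercise the function.
ang = np.radians([0, 60, 120, 180, 240, 300])
base = np.c_[np.cos(ang), np.sin(ang), np.zeros(6)] * 0.20
plat = np.c_[np.cos(ang + 0.3), np.sin(ang + 0.3), np.zeros(6)] * 0.12
print(leg_lengths(base, plat, np.array([0.0, 0.0, 0.15]), (0.05, 0.0, 0.1)))
```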

We recently saw another 6-DOF actuator design using flexures, as well as yet another ball-balancing hack, but if you want an actually useful Stewart platform application, check out this pool-playing robot!

Continue reading “An Interesting Circular Stewart Platform”

The Unique Challenges Of Aerial Robotics

When we think of robotics, the first thing that usually comes to mind for many of us is some sort of industrial arm that’s bolted to the floor, or perhaps a semi-autonomous rover trudging its way across the dusty Martian landscape. While these two environments are about as different as can be, the basic “rules” are pretty much the same. Being on firm ground gives the robot a clear understanding of its position and orientation, which greatly simplifies tasks such as avoiding collisions or interacting with nearby objects.

But what happens when that reference point goes away? How does a robot navigate when it’s flying through open space or hovering in mid-air? That’s just one of the problems that fascinates Nick Rehm, who stopped by to host this week’s Aerial Robotics Hack Chat to talk about his passion for flying robots. He’s currently an aerospace engineer at Johns Hopkins Applied Physics Laboratory, where he works on the unique challenges faced by autonomous flying vehicles such as the detection and avoidance of mid-air collisions, as well as the development of vertical take-off and landing (VTOL) systems. But before he had his Master’s in Aerospace Engineering and Rotorcraft, he got started the same way many of us did, by playing around with DIY projects.

In fact, regular Hackaday readers will likely recall seeing some of his impressive builds. His autonomous ekranoplan designed to follow a target using computer vision graced the front page in April. Back in 2020, we took a look at his recreation of SpaceX’s Starship prototype, which used a realistic arrangement of control surfaces and vectored thrust to perform the spacecraft’s signature “Belly Flop” maneuver — albeit with RC motors and propellers instead of rocket engines. But even before that, Nick recalls asking his mother for permission to pull apart a Wii controller so he could use its inertial measurement unit (IMU) in a wooden-framed tricopter he was working on.

Discussing some of these hobby builds leads the Chat towards Nick’s dRehmFlight project, a GPLv3 licensed flight control package that can run on relatively low-cost hardware, namely a Teensy 4.0 microcontroller paired with the GY-521 MPU6050 IMU. The project is designed to let hobbyists easily experiment with VTOL craft, specifically those that transition between vertical and horizontal flight profiles, and has powered the bulk of Nick’s own flying craft.
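
dRehmFlight itself is Arduino C++ and ships with its own sensor fusion, but the core job of any such flight controller is the same: blend the MPU6050's fast-but-drifting gyro with its noisy-but-absolute accelerometer into a usable attitude estimate. As a language-agnostic illustration of that idea (and emphatically not dRehmFlight's actual code), here's a single step of a basic complementary filter sketched in Python.

```python
import math

def complementary_update(roll, pitch, gyro, accel, dt, alpha=0.98):
    """One step of a basic complementary filter for roll and pitch.

    roll, pitch: previous estimates in radians.  gyro: (p, q, r) body rates
    in rad/s.  accel: (ax, ay, az), any consistent units (only the direction
    of gravity matters).  Gyro integration tracks fast motion while the
    accelerometer's gravity vector slowly pulls the drift back; alpha sets
    the blend between the two.
    """
    p, q, _ = gyro
    ax, ay, az = accel

    # Integrate the gyro rates (small-angle approximation, ignoring yaw coupling).
    roll_gyro = roll + p * dt
    pitch_gyro = pitch + q * dt

    # Attitude implied by the gravity vector alone.
    roll_acc = math.atan2(ay, az)
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))

    roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    return roll, pitch
```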

Moving onto more technical questions, Nick says one of the most difficult aspects when designing an autonomous flying vehicle is getting your constraints nailed down. What he means by that is having a clear goal of what the craft needs to do, and critically, how long it needs to do it. How far does the craft need to be able to fly? How fast? Does it need to loiter at the target location, and if so, for how long? The answers to these questions will largely dictate the form of the final vehicle, and are key to determining if it’s worth implementing the complexity of transitioning from VTOL to fixed-wing horizontal flight.

But according to Nick, the biggest challenge in aerial robotics is onboard state estimation. That is, the ability for the craft to know its position and orientation relative to the ground. While high-performance computers have gotten lighter and sensors have improved, he says there’s still no substitute for having a ground-based tracking system. He mentions that those fancy demonstrations you’ve seen with drones flying in formation and working collaboratively towards a task will almost certainly have an array of motion capture cameras tucked off to the side. This makes for an impressive show, but greatly limits the practical application of these drone swarms.

Nick’s custom Raspberry Pi 4-powered quadcopter lets him test autonomous flight techniques.

So what does the future of aerial robotics look like? Nick says open source projects like ArduPilot and PX4 are still great choices for hobbyists, but sees promise in newer platforms which pair the traditional autopilot with more onboard computing power, such as Auterion’s Skynode. More powerful flight controllers can enable techniques such as simultaneous localization and mapping (SLAM), which uses 3D scans of the environment to help the robot orient itself. He’s also very interested in technologies that enable autonomous flight in GPS-denied environments, which is critical for robotic craft that need to operate indoors or in situations where satellite navigation is unavailable or unreliable. In light of the incredible success of NASA’s Ingenuity helicopter, we imagine these techniques will also play an invaluable role in the future airborne exploration of Mars.

We want to thank Nick for hosting this week’s Aerial Robotics Hack Chat, which turned out to be one of the fastest hours in recent memory. His experience as both an avid hobbyist and a professional in the field provided exactly the sort of insight the Hackaday community looks for, and his gracious offer to keep in touch with several of those who attended the Chat to further discuss their projects speaks to how passionate he is about this topic. We expect to see great things from Nick going forward, and would love to have him join us again in the future to see what he’s been up to.


The Hack Chat is a weekly online chat session hosted by leading experts from all corners of the hardware hacking universe. It’s a great way for hackers to connect in a fun and informal way, but if you can’t make it live, these overview posts as well as the transcripts posted to Hackaday.io make sure you don’t miss out.

Aerial Robotics Hack Chat

Join us on Wednesday, June 8 at noon Pacific for the Aerial Robotics Hack Chat with Nick Rehm!

When it comes to robots, especially ones that need to achieve some degree of autonomy, the more constrained the environment they work in, the easier it is for them to deal with the world. An industrial arm tethered next to a production line, for example, only has to worry about positioning its tool within its work envelope. The problems mount up for something like an autonomous car, though, which needs to deal with the world in two — or perhaps two and a half — dimensions.

But what about adding a third dimension? That’s the realm that aerial robots have to live and work in, and it’s where the problems get really interesting. Not only are there hardly any constraints to movement, but you’ve also got to deal with the problems of aerodynamic forces, navigation in space, and control systems that need to respond to the slightest of perturbations without overcompensating.

The atmosphere is a tough place to make a living, and dealing with the problems of aerial robotics has kept Nick Rehm occupied for many years as a hobbyist, and more recently as an aerospace engineer at Johns Hopkins Applied Physics Laboratory. Nick has spent his time away from the office solving the problems of autonomous flight, including detection and avoidance of mid-air collisions, development of vertical take-off and landing (VTOL) and fixed-wing aircraft, and even ground-effect aircraft. He’ll drop by the Hack Chat to discuss the problems of aerial robots and the challenges of unconventional aviation, and help us figure out how to deal with the third dimension.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, June 8 at 12:00 PM Pacific time. If time zones have you tied up, we have a handy time zone converter.

Continue reading “Aerial Robotics Hack Chat”