Dual Brushed Motor Controller Doesn’t Care How It Receives Commands

The simple DC brushed motor is at the heart of many a robotics project. For making little toy bots that zip around the house, you can’t beat the price and simplicity of a pair of brushed motors. They’re also easy to control; you could roll your own H-bridge out of discrete transistors, or pick up one of the commonly used ICs like the L298N or L9110S.

But what if you want an all-in-one solution? Something that will deliver enough current for most applications, drive dual motors, and deal with a wide range of input voltages. Most importantly, something that will talk to any kind of input source. For his Hackaday Prize entry, [Praveen Kumar] is creating a dual brushed motor controller which can handle a multitude of input types. Whether you’re using an IR remote, a Pi communicating over I2C, an analog output, or a Bluetooth receiver, this driver can handle them all and will automatically select the correct input source.

The board has an ATmega328p brain, so Arduino compatibility is there for easy reprogramming if needed. The mounting holes and header locations are also positioned to allow easy stacking with a Pi, and there’s a status LED too. It’s a great module that could easily find a place in a lot of builds.
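Since a Pi talking I2C is one of the supported sources, commanding the board from Python could be as simple as a few register writes. Here’s a minimal sketch of what that might look like; the bus address and register map below are our own assumptions for illustration, not the project’s documented protocol:

```python
# Hypothetical sketch: driving the controller from a Pi over I2C.
# The address (0x30) and register layout are assumptions for
# illustration -- check the project's firmware for the real protocol.
from smbus2 import SMBus

I2C_BUS = 1          # /dev/i2c-1 on most Pi models
CTRL_ADDR = 0x30     # hypothetical controller address

def set_motors(left: int, right: int):
    """Send signed speeds (-127..127) for each motor."""
    with SMBus(I2C_BUS) as bus:
        # Registers 0x00/0x01 for left/right speed: an assumption.
        bus.write_byte_data(CTRL_ADDR, 0x00, left & 0xFF)
        bus.write_byte_data(CTRL_ADDR, 0x01, right & 0xFF)

set_motors(100, 100)   # forward
set_motors(60, -60)    # spin in place
```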

If you need even more control over your brushed motor, you can soup up its capabilities by adding a PID loop for extra smarts.
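As a refresher, the core of such a loop fits in a few lines. Here’s a minimal PID sketch in Python, with the gains and loop timing as placeholders to tune for your own motor:

```python
# Minimal PID speed-loop sketch; gains and dt are placeholders.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        """Return a motor command from the speed error."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=1.2, ki=0.5, kd=0.05)
pwm = pid.update(setpoint=120.0, measured=95.0, dt=0.01)  # e.g. RPM from an encoder
```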

Emotional Hazards That Lurk Far From The Uncanny Valley

A web search for “Uncanny Valley” will retrieve a lot of information about that discomfort we feel when an artificial creation is eerily lifelike. The syndrome tells us a lot about both human psychology and the design challenges ahead. What about the opposite, when machines are clearly machines? Are we in the clear? It turns out the answer is “No”, as [Christine Sunu] explained at a Hackaday Los Angeles meetup. (Video also embedded below.)

When we build a robot, we know what’s inside the enclosure. But people who don’t know tend to extrapolate too much based only on the simple behavior they can see. As [Christine] says, people “anthropomorphize at the drop of a hat,” projecting emotions onto machines and feeling emotions in return. This happens even when machines are deliberately designed to be utilitarian. iRobot was surprised by how many Roomba owners gave their robot vacuums names and treated them as family members. A similar eruption of human empathy occurred with Boston Dynamics’ video footage demonstrating their robot staying upright despite being pushed around.

In the case of a Roomba, this kind of emotional power is relatively harmless. In the case of robots doing dangerous work in place of human beings, such attachment may hinder robots from doing the job they were designed for. And even more worrisome, the fact that this power exists means there’s potential for abuse. To illustrate one such potential, [Christine] brought up the Amazon Echo. The cylindrical puck is clearly a machine and serves as a point-of-sale terminal, yet people have started treating Alexa as their trusted home advisor. If Amazon were to start monetizing this trust, would users realize what’s happening? Would they care?

Continue reading “Emotional Hazards That Lurk Far From The Uncanny Valley”

Watney: A Fully 3D Printed Rover Platform

We’re getting to the point that seeing 3D printed parts in a project or hack isn’t as exciting as it was just a few years ago. The proliferation of low-cost desktop 3D printers means that finding a printer to squirt out a few parts for your build isn’t the adventure it once was. Gone are the days of heading to a local hackerspace or college hoping their janky Mendel felt like working that day. But all that really means is that hackers and makers can now lean on 3D printing even more. Forget printing one or two parts of your design; just print the whole thing.

That’s exactly what [Nik Ivanov] did with Watney, his fully 3D printed rover project. After lamenting that many so-called 3D printed rovers were anything but, he set out to design one that was not only made primarily of printed parts, but was robust enough to put some real work in. Over the course of several design iterations, he built a very capable all-wheel drive platform that needs only some electronics and a handful of M3 screws to leap into action.

As long as you’ve got a 3D printer big enough to handle the roughly 120mm x 190mm dimensions of this bot’s body, you’re well on the way to owning your very own video rover. [Nik] recommends printing everything in PETG, no doubt for its increased strength when it comes to things like the drive gears. Plus it’s low warp, which is really going to help when printing the top and bottom sections of the body. TPU is advised for the tires, but if you don’t have any (or your printer chokes on flexible filaments) you can just wrap the wheels with wide rubber bands.

[Nik] is using a Raspberry Pi Zero W as the brains of the operation, but the beauty of an open platform like this is that you could easily swap out the controls for something else to meet your needs. In addition to the Pi, there’s an L298N H-bridge motor controller to interface with the dual geared motors, as well as a servo to provide tilt for the SainSmart camera module.
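To give a flavor of the wiring, here’s a hedged sketch of driving one L298N channel from the Pi in Python; the pin assignments are our assumptions, not necessarily Watney’s actual hookup:

```python
# Sketch of driving one channel of an L298N from a Pi Zero W.
# Pin numbers are assumptions; Watney's actual wiring may differ.
import RPi.GPIO as GPIO

IN1, IN2, EN = 17, 27, 22   # hypothetical BCM pin assignments

GPIO.setmode(GPIO.BCM)
GPIO.setup([IN1, IN2, EN], GPIO.OUT)
pwm = GPIO.PWM(EN, 1000)    # 1 kHz PWM on the enable pin
pwm.start(0)

def drive(speed):
    """speed: -100..100; sign sets direction, magnitude sets duty."""
    GPIO.output(IN1, speed > 0)
    GPIO.output(IN2, speed < 0)
    pwm.ChangeDutyCycle(min(abs(speed), 100))

drive(75)    # forward at 75% duty
drive(-40)   # reverse at 40%
```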

We’ve often been surprised at just how expensive commercial robotics platforms can be, so we’re keenly interested in seeing if the availability of designs like this spurs on DIY rover development. Though if you’re looking for something a little more rough and tumble, we’ve seen a 3D printed rover that looks combat-ready.

Continue reading “Watney: A Fully 3D Printed Rover Platform”

Robot Maps Rooms With Help From IPhone

The Unity engine has been around since Apple started using Intel chips, and has made quite a splash in the gaming world. Unity allows developers to create 2D and 3D games, but there are some other interesting applications of this gaming engine as well. For example, [matthewhallberg] used it to build a robot that can map rooms in 3D.

The impetus for this project was a robotics company that used a series of robots around their business. The robots navigate using computer vision, but couldn’t map the rooms from scratch. They hired [matthewhallberg] to tackle this problem, and this robot is a preliminary result. Using the Unity engine and an iPhone, the robot can operate in one of three modes. The first is a user-controlled mode, the second is object following, and the third is 3D mapping.

The robot seems fairly easy to construct and only carries an iPhone, a NodeMCU, some motors, and a battery. Most of the computational work is done remotely, with the robot simply receiving its movement commands from another computer. There’s a lot going on here, software-wise, with a lot of toolkits and software packages to install and get communicating with one another, but the video below does a good job of showing what you’ll need and how it all works together. If that’s all too much, there are other robots that can get you started in the world of computer vision and mapping.
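The write-up doesn’t spell out the wire protocol between the computer and the NodeMCU, but this kind of remote-brain setup often boils down to a few bytes over the network. As a purely illustrative sketch, here’s what the computer side might look like if it used single-character commands over UDP (the transport, port, and command scheme are all assumptions):

```python
# Hedged sketch of the "remote brain" sending drive commands to the
# robot's NodeMCU over Wi-Fi. The UDP transport, port, and one-letter
# command scheme are assumptions, not the project's actual protocol.
import socket

ROBOT_ADDR = ("192.168.1.42", 4210)   # hypothetical robot IP and port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_command(cmd: str):
    """Send a single-character command, e.g. F/B/L/R/S."""
    sock.sendto(cmd.encode("ascii"), ROBOT_ADDR)

send_command("F")   # drive forward
send_command("S")   # stop
```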

Continue reading “Robot Maps Rooms With Help From IPhone”

Hold The Salt And Butter, This Popcorn Is For A Robot

Popcorn! Light and fluffy, it is a fantastically flexible snack. We can have it plain, make a savory snack with some salt and butter, or cover it with caramel if you have a sweet tooth. Now Cornell University has shown us one more way to enjoy popcorn: using its popping action as the mechanical force in a robot actuator.

It may be unorthodox at first glance, but it makes a lot of sense. We pop corn by heating its water until it turns into steam, triggering a rapid expansion of volume. It is not terribly different from our engines burning an air-fuel mixture to create a rapid expansion of volume, or using heat energy to boil water and trigger its expansion to steam. So a kernel of popcorn can be used as a small, simple, self-contained engine for turning heat energy into mechanical power.
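A quick back-of-the-envelope estimate shows why the pop packs a punch: at atmospheric pressure, a mole of liquid water occupies about 18 mL, while the same water as steam at 100 °C fills roughly 30 L. A few lines of Python make the roughly 1700× expansion concrete (this ignores the higher pressure inside an intact hull, so treat it as an illustration):

```python
# Back-of-the-envelope: volume expansion of water flashing to steam
# at 100 degrees C and 1 atm, using the ideal gas law (PV = nRT).
R = 8.314          # J/(mol*K)
T = 373.15         # K, boiling point at 1 atm
P = 101325.0       # Pa, atmospheric pressure

v_steam = R * T / P          # molar volume of steam, m^3/mol (~0.0306)
v_liquid = 18.0e-6           # molar volume of liquid water, m^3/mol

print(f"expansion ratio: {v_steam / v_liquid:.0f}x")   # roughly 1700x
```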

Obviously it would be a single-use mechanism, but that’s perfectly palatable for the right niche. Single-use is a lot easier to swallow when popcorn is so cheap, and it’s also biodegradable, leaving minimal residue. The research paper demonstrated three recipes to harness popping corn’s mechanical energy, but that is hardly an exhaustive list. There’s an open invitation to brainstorm other creations to add to the menu.

Of course, if you prefer candy over popcorn, you could build a robot actuator out of licorice instead.

Either way, the robot uprising will be delicious.

[via IEEE Spectrum]

Continue reading “Hold The Salt And Butter, This Popcorn Is For A Robot”

Cheetah 3 Is Learning To Move Blindly Before Learning To See

Stand up right now and walk around for a minute. We’re pretty sure you didn’t see everywhere you stepped nor did you plan each step meticulously according to visual input. So why should robots do the same? Wouldn’t your robot be more versatile if it could use its vision to plan a path, but leave most of the walking to the legs with the help of various sensors and knowledge of joint positions?

That’s the approach [Sangbae Kim] and a team of researchers at MIT are taking with their Cheetah 3. They’ve given it cameras but aren’t using them yet. Instead, they’re making sure it can move around blind first. So far they have it walking, running, jumping and even going up stairs cluttered with loose blocks and rolls of tape.

Cheetah 3 jumping 30 inches onto a desk

Two algorithms are at the heart of its ability to move around blind.

The first is a contact detection algorithm which decides whether each leg should transition between swinging and stepping, based on knowledge of the joint positions and data from gyroscopes and accelerometers. If the robot tilts unexpectedly after stepping on a loose block, this is the algorithm that decides what the legs should do.
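The team’s actual estimator is more sophisticated, but the shape of the decision can be sketched in a few lines. This toy Python version, with made-up thresholds, is only meant to show the flavor of it:

```python
# Toy sketch of a contact-detection decision: not the MIT algorithm,
# just the flavor of it. Thresholds and sensor fusion are made up.
def leg_state(expected_phase, joint_torque, body_tilt_rate):
    """Decide whether a leg should be treated as in stance or swing.

    expected_phase: 'stance' or 'swing' from the gait clock
    joint_torque:   measured load on the leg (Nm)
    body_tilt_rate: gyro reading (rad/s)
    """
    TORQUE_CONTACT = 5.0    # placeholder: torque implying ground contact
    TILT_UPSET = 0.8        # placeholder: tilt rate implying a slip

    if expected_phase == 'swing' and joint_torque > TORQUE_CONTACT:
        return 'stance'     # early touchdown (e.g. hit a loose block)
    if expected_phase == 'stance' and body_tilt_rate > TILT_UPSET:
        return 'swing'      # ground gave way; recover with a new step
    return expected_phase
```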

The second is a model-predictive algorithm. This predicts what force a leg should apply once the decision has been made to take a step. It does this by calculating the multiplicative positions of the robot’s body and legs a half second into the future, and these calculations are done 20 times a second. They’re what help it handle situations such as when someone shoves it or tugs it on a leash, letting it regain its balance or continue in the direction it was headed.
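In code, that structure is a classic receding-horizon loop: predict half a second ahead, solve for forces, apply only the first step, then replan. Here’s a schematic Python sketch with the predictor and optimizer reduced to placeholders (nothing here is the MIT implementation):

```python
# Schematic of the 20 Hz receding-horizon loop described above.
# The optimizer below is a trivial placeholder, not the MIT
# implementation -- it only shows the control structure.
import time

HORIZON_S = 0.5   # predict the body and legs half a second ahead
RATE_HZ = 20      # re-solve twenty times per second

def predict_and_optimize(state, horizon):
    """Placeholder: return a sequence of per-leg force commands that
    the model predicts will keep the body on its desired trajectory."""
    steps = int(horizon * RATE_HZ)
    return [[0.0, 0.0, 0.0, 0.0]] * steps   # dummy plan

def control_loop(read_sensors, apply_leg_forces):
    while True:
        state = read_sensors()
        plan = predict_and_optimize(state, HORIZON_S)
        apply_leg_forces(plan[0])   # apply only the first step, then replan
        time.sleep(1.0 / RATE_HZ)
```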

There are a number of other awesome features of this quadruped robot which we haven’t seen in others such as Boston Dynamics’ SpotMini, like invertible knee joints and walking on three legs. Check out those features and more in the video below.

Of course, SpotMini has a whole set of neat features of its own. Let’s just say that while they look very similar, they’re on two different evolutionary paths. And the Cheetah certainly has evolved since we last looked at it a few years ago.

Continue reading “Cheetah 3 Is Learning To Move Blindly Before Learning To See”