Robot Joints Go Modular With This Actuator Project

[John Lauer] has been hard at work re-thinking robot arms. His project to create modular, open source actuators that can be connected to one another to form an arm is inspiring, and boasts an impressively low parts cost as well. The actuators are each self-contained, with an ESP32 and a design that takes advantage of the form factors of inexpensive modules and parts from vendors like AliExpress.

Flex spline in action, for reducing backlash

Each module has 3D printed gears (with an anti-backlash flex spline), an RGB LED for feedback, integrated homing, active cooling, a slip ring made from copper tape, and a touch sensor dial on the back for jogging and training input. The result is a low backlash, low cost actuator that keeps external wiring to an absolute minimum.
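That touch dial is a neat detail, and the ESP32's built-in capacitive touch inputs make this kind of interface cheap to implement. Here's a purely illustrative sketch of the idea; the pad choices and threshold are assumptions, not details from [John]'s firmware.

```cpp
// Purely illustrative: reading an ESP32's built-in capacitive touch pads as a
// simple jog/select input. touchRead() returns a smaller value when a pad is
// touched. Pad choices and the threshold are assumptions, not [John]'s design.
const int kTouchThreshold = 30;

void setup() {
  Serial.begin(115200);
}

void loop() {
  bool left   = touchRead(T0) < kTouchThreshold;  // T0 = GPIO4
  bool middle = touchRead(T3) < kTouchThreshold;  // T3 = GPIO15
  bool right  = touchRead(T4) < kTouchThreshold;  // T4 = GPIO13

  if (left && !right)      Serial.println("JOG-");    // jog the joint one way
  else if (right && !left) Serial.println("JOG+");    // jog it the other way
  else if (middle)         Serial.println("RECORD");  // e.g. store a pose
  delay(50);
}
```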

Originally inspired by a design named WE-R2.4, [John] has added his own twist in numerous ways, which are best summarized in the video embedded below. That video is number three in a series, and covers the most interesting developments and design changes while giving an excellent overview of the parts and operation. (The video for part one is a basic overview, and part two shows the prototyping process, during which [John] 3D printed the structural parts and gears and milled out a custom PCB.)

Continue reading “Robot Joints Go Modular With This Actuator Project”

Navigating The Dark Side: Controlling Robots With Zero Radio Communication

While autonomous robots have been the subject of some projects in the past, this particular project takes a swing at building a robot that can teach children about controls and robotics.

The idea is to mimic a space mission on the dark side of the moon, where radio contact is nearly impossible. The students learn to program and debug embedded devices and sensors, even before some of them have learned the alphabet!

The challenge materials let the microcontroller be programmed with a simple Arduino sketch (Blink) as well as in lower-level languages like C++ or Java. The main hardware is an Arduino Uno R3-based rover controlled over WiFi by an ESP8266. Sensor data is gathered from an ultrasonic distance sensor and a camera, as well as a SIM7000E GSM+GPS module. Commands are sent from a web page through a REST interface and polled from the server by the rover.
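The article doesn't document the server API, but the polling side could look something like this minimal ESP8266 sketch; the endpoint URL, command strings, credentials and baud rate are all assumptions.

```cpp
// Hypothetical sketch of the ESP8266 side: poll a REST endpoint for the next
// command and forward it to the Arduino Uno over the serial link.
// The endpoint URL, command strings, credentials and baud rate are assumptions.
#include <ESP8266WiFi.h>
#include <ESP8266HTTPClient.h>

const char* kSsid       = "mission-control";   // hypothetical credentials
const char* kPassword   = "far-side";
const char* kCommandUrl = "http://192.168.4.1/api/rover/next-command";

void setup() {
  Serial.begin(115200);                        // link to the Uno
  WiFi.begin(kSsid, kPassword);
  while (WiFi.status() != WL_CONNECTED) delay(250);
}

void loop() {
  WiFiClient client;
  HTTPClient http;
  if (http.begin(client, kCommandUrl)) {
    int status = http.GET();
    if (status == HTTP_CODE_OK) {
      String command = http.getString();       // e.g. "FORWARD", "LEFT", "PHOTO"
      command.trim();
      if (command.length() > 0) {
        Serial.println(command);               // the Uno parses and executes it
      }
    }
    http.end();
  }
  delay(1000);                                 // poll roughly once a second
}
```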

The rover responds to direction commands, takes pictures, and reports distance scans remotely. Custom libraries were written for the serial communication and camera to cope with spotty communication. The latest challenge expansion is a probe that pays attention to battery life and power consumption, challenging students to account for power usage over the robot’s lifetime.
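The write-up doesn't spell out the protocol those libraries use, but a send-and-acknowledge loop with retries is a common way to cope with a flaky link. Below is a rough sketch of that idea; the "ACK" convention, retry count and timeout are assumptions, not the project's actual protocol.

```cpp
// Illustrative only: tolerate dropped bytes on the ESP8266-to-Uno link by
// sending a command and waiting for an acknowledgement, retrying if needed.
// The "ACK" convention and the timings are assumptions.
bool sendWithRetry(Stream& link, const String& command,
                   uint8_t maxRetries, uint32_t timeoutMs) {
  for (uint8_t attempt = 0; attempt < maxRetries; attempt++) {
    link.println(command);                  // transmit the command
    uint32_t start = millis();
    String reply;
    while (millis() - start < timeoutMs) {  // wait for an acknowledgement
      if (link.available()) {
        char c = link.read();
        if (c == '\n') break;
        reply += c;
      }
    }
    reply.trim();
    if (reply == "ACK") return true;        // receiver confirmed the command
  }
  return false;                             // give up after maxRetries attempts
}

// Example use: sendWithRetry(Serial, "PHOTO", 3, 500);
```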

Since the project’s conception, the rovers have already been used in schools, and we’re excited to see a new approach for younger students to learn controls and programming.

Making A Robot Cleaner Even Smarter

Some electric cleaners are effective and some hardly even seem to make a difference. The ILIFE V7s may be a robot cleaner, but even with its cleaning modes and anti-collision system, it still requires IR signals to complete any tasks. Tired of having to be physically in the same place as his robot cleaner, [pimuzzo] decided to take matters into his own hands and build a RESTful remote control to send IR signals from afar.

The program uses the ESP8266WebServer and IRremoteESP8266 libraries for handling HTTP requests and sending and receiving infrared signals. The remote also responds to Actions on Google, so the robot can be commanded through Google Assistant.
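For a flavor of how those two libraries fit together, here is a minimal sketch of the approach, not [pimuzzo]'s actual code; the GPIO pin, WiFi credentials, routes and NEC codes are placeholders, not the real ILIFE V7s values.

```cpp
// Minimal sketch of the idea: an ESP8266 web server whose routes fire IR
// commands at the vacuum. Pin, credentials and NEC codes are placeholders.
#include <ESP8266WiFi.h>
#include <ESP8266WebServer.h>
#include <IRremoteESP8266.h>
#include <IRsend.h>

const uint16_t kIrLedPin = 4;        // IR LED on GPIO4 (assumption)
ESP8266WebServer server(80);
IRsend irsend(kIrLedPin);

void setup() {
  WiFi.begin("home-wifi", "password");          // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(250);
  irsend.begin();

  server.on("/clean", []() {                    // e.g. GET /clean starts a cycle
    irsend.sendNEC(0x12345678, 32);             // placeholder 32-bit NEC code
    server.send(200, "text/plain", "cleaning");
  });
  server.on("/dock", []() {
    irsend.sendNEC(0x87654321, 32);             // placeholder "return home" code
    server.send(200, "text/plain", "docking");
  });
  server.begin();
}

void loop() {
  server.handleClient();                        // serve REST requests
}
```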

The IR signals are a bit funky – as one user highlighted, finding the IR protocol is a nontrivial task that can be accomplished by recording the IR signals from the original remote with an IR receiver and matching the marks, spaces, and carrier frequency against known protocols. [Oitzu] was able to match the timing to the NEC 32-bit protocol and find the exact codes on an oscilloscope, which simplified translating the codes for the remote.
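If you'd rather skip the oscilloscope, IRremoteESP8266 can also do the matching for you: point the original remote at a cheap IR receiver module and print whatever the library decodes. A minimal capture sketch might look like this (the receiver pin is an assumption).

```cpp
// One way to recover the codes without an oscilloscope: let IRremoteESP8266
// name the protocol and value of each button press from the original remote.
// The receiver pin is an assumption.
#include <IRremoteESP8266.h>
#include <IRrecv.h>
#include <IRutils.h>

const uint16_t kRecvPin = 14;        // IR receiver data pin (assumption)
IRrecv irrecv(kRecvPin);
decode_results results;

void setup() {
  Serial.begin(115200);
  irrecv.enableIRIn();               // start the receiver
}

void loop() {
  if (irrecv.decode(&results)) {
    // Prints something like "Protocol : NEC, Code : 0x..., Bits : 32"
    Serial.println(resultToHumanReadableBasic(&results));
    irrecv.resume();                 // ready for the next button press
  }
}
```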

Sometimes when life gives you a robot cleaner, it’s your job to make it smarter.

Humanoid Robot Has Joints That Inspire

One of the challenges with humanoid robots, besides keeping them upright, is finding compact combinations of actuators and joint mechanisms that allow a good range of smooth motion while still having good strength. To achieve that, researchers from the IRIM Lab at Korea University of Technology and Education developed the LIMS2-AMBIDEX robotic humanoid upper body, which uses a combination of brushless motors, pulleys and some very interesting joint mechanisms. (Video, embedded below.)

The wrist mechanism. Anyone willing to tackle a 3D printed version?

From shoulder to fingers, each arm has seven degrees of freedom which allows the robot to achieve some spectacularly smooth and realistic upper body motion. Except for the wrist rotation actuator, all the actuators are housed in the shoulders, and motion is transferred to the required joint through an array of cables and pulleys. This keeps the arm light and its inertia low, allowing the arms to move rapidly without breaking anything or toppling the entire robot.

The wrist and elbow mechanisms are especially interesting. The wrist emulates rolling contact between two spheres with only revolute joints. It also allows a drive shaft to pass down the centre of the mechanism and transfer rotating motion from one end to the other. The elbow is a rolling double jointed affair that allows true 180 degrees of rotation.

We have no idea why this took two years to end up in our YouTube feed, but we’re sure glad it finally did. Check out some of the demo videos after the break. Continue reading “Humanoid Robot Has Joints That Inspire”

MIT Mini Cheetah Made And Improved In China

We nearly passed over this tip from [xoxu], which was just a few links to some AliExpress pages. However, when we dug a bit into the pages we found something pretty surprising. Somewhere out there in the wild, wild east of China there’s a company not only reverse engineering the Mini Cheetah, but improving it too.

We cover a lot of Mini Cheetah projects; it’s a small robot that can do a back-flip, after all. When compared to the servo quadrupeds of not so many years ago, it’s definitely exciting magic. Many of the projects go into detail about the control boards and motor modifications required to build a Mini Cheetah of your own. So we were especially interested to discover that this AliExpress seller has gone through the trouble of not just reverse engineering the design, but also improving on it, claiming their motors are thinner and more dust resistant than what they’ve seen from MIT.

To be honest, we’re not sure what we’re looking at. It’s kind of cool that we live in a world where a video of a research project and some papers can turn into a $12k robot you can buy right now. Let us know what you think after the break.

Robot Allows Remote Colleagues To Enjoy Office Shenanigans

[Esther Rietmann] and colleagues built a Telepresence Robot to give work-from-home teammates a virtual, but physical, presence in the office. A telepresence robot is like a tablet mounted on a Roomba, providing motion capability in addition to an audio/video connection. Built during a 48 hour hackathon, it is a bit crude under the hood and misses out on some features, such as a bidirectional video feed, but overall it pretty much does what is expected from such a device.

The main structure is built from cheap aluminium profiles and sheets. A Raspberry Pi is at the heart of the electronics, with a servo-mounted Pi camera and a speaker-microphone pair taking care of video and audio. The two DC motors are driven by H-bridges controlled from the Pi, and an idle swivel caster is attached as the third wheel. The whole thing is powered by a power bank. The one important thing missing is an HDMI display that can show a video feed from the remote laptop’s camera. That may have been due to time constraints, but this feature should not be too difficult to add as a future upgrade; it’s important for both sides to be able to see each other.

The software is built around the WebRTC protocol, with the WebRTC Extension from UV4L doing most of the heavy lifting. The UV4L Streaming Server not only provides its own built-in set of web applications and services, but also embeds a general-purpose web server on another port, allowing the user to run and deploy their own custom web apps. This let [Esther Rietmann]’s team build a basic but functional front-end for driving the robot from the remote side. On the robot, a Python control script runs as a system service to control the drive motors and camera servo.
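The drive side of that control script boils down to flipping direction pins and setting PWM duty cycles on the H-bridges. The project does this in Python; purely as an illustration, here is an equivalent C++ sketch using the pigpio library, assuming a DIR+PWM style motor driver and made-up pin numbers.

```cpp
// Rough C++ illustration of the H-bridge drive logic (the project itself uses
// a Python script run as a system service). Assumes a DIR+PWM style motor
// driver; all GPIO pin numbers are made up. Build: g++ drive.cpp -lpigpio -lrt -pthread
#include <pigpio.h>

const unsigned kLeftDir = 17, kLeftPwm = 18;   // hypothetical pins
const unsigned kRightDir = 22, kRightPwm = 23;

void drive(int left, int right) {              // speeds in -255..255
  gpioWrite(kLeftDir,  left  >= 0 ? 1 : 0);    // set each motor's direction
  gpioWrite(kRightDir, right >= 0 ? 1 : 0);
  gpioPWM(kLeftPwm,  left  >= 0 ? left  : -left);   // magnitude as PWM duty
  gpioPWM(kRightPwm, right >= 0 ? right : -right);
}

int main() {
  if (gpioInitialise() < 0) return 1;          // pigpio must start first
  gpioSetMode(kLeftDir, PI_OUTPUT);
  gpioSetMode(kRightDir, PI_OUTPUT);

  drive(200, 200);                             // forward
  time_sleep(2.0);
  drive(150, -150);                            // spin in place
  time_sleep(1.0);
  drive(0, 0);                                 // stop

  gpioTerminate();
  return 0;
}
```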

The team also played with adding basic object, gesture and action recognition features. This was done with PoseNet, a machine learning model for real-time human pose estimation in the browser using TensorFlow.js, which let them demonstrate some pose detection capability. This could be useful as a “follow me” feature for the robot.

Another missing feature, which most other commercial telepresence robots have, is a sensor suite for collision avoidance, object detection and awareness, such as micro switches, IR or ultrasonic detectors, time-of-flight cameras or LiDARs. It would be relatively easy to add one or several such sensors to the robot.

If you’d like to build one for yourself, check out their code repository on GitHub and the videos below.

Continue reading “Robot Allows Remote Colleagues To Enjoy Office Shenanigans”

Open-Source Arm Puts Robotics Within Reach

In November 2017, we showed you [Chris Annin]’s open-source 6-DOF robot arm. Since then he’s been improving the arm and making it more accessible for anyone who doesn’t get to play with industrial robots all day at work. The biggest improvement is that AR2 had an open-loop control system, while AR3 is closed-loop. If something bumps the arm or it crashes, the bot will recover its previous position automatically. It also calibrates itself automatically using limit switches.

AR3 is designed to be milled from aluminium or entirely 3D printed. The motors and encoders are controlled with a Teensy 3.5, while an Arduino Mega handles I/O, the grippers, and the servos. In the demo video after the break, [Chris] shows off AR3’s impressive control after a brief robotic ballet in which two AR3s move in hypnotizing unison.
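[Chris]’s actual firmware is linked from his site; just to illustrate why the closed-loop change matters, here is a toy sketch of the idea on a Teensy: make a move, then compare the encoder’s count against the target and step out any difference, for example after the arm has been bumped. The pins, scaling and tolerance are placeholders, not values from AR3.

```cpp
// Not [Chris]'s firmware, just a toy illustration of closed-loop correction:
// command a move, then compare the encoder's report with the target and step
// out any difference (e.g. lost steps or a bump). Pins, the steps-per-count
// scaling and the tolerance are placeholders.
#include <AccelStepper.h>
#include <Encoder.h>

AccelStepper joint(AccelStepper::DRIVER, 2, 3); // STEP on pin 2, DIR on pin 3
Encoder jointEncoder(4, 5);                     // quadrature A/B on pins 4, 5

const float kStepsPerCount = 1.0f;              // placeholder scaling
const long  kTolerance     = 2;                 // acceptable error, in counts

void setup() {
  joint.setMaxSpeed(2000);
  joint.setAcceleration(1000);
}

void moveJointTo(long targetCounts) {
  joint.moveTo(lround(targetCounts * kStepsPerCount));
  while (joint.distanceToGo() != 0) joint.run();   // open-loop move

  long error = targetCounts - jointEncoder.read(); // what the encoder saw
  if (labs(error) > kTolerance) {                  // lost steps or got bumped?
    joint.move(lround(error * kStepsPerCount));    // step out the difference
    while (joint.distanceToGo() != 0) joint.run();
  }
}

void loop() {
  moveJointTo(4000);
  delay(1000);
  moveJointTo(0);
  delay(1000);
}
```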

[Chris] set up a site with the code, his control software, and all the STL files. He also has tutorial videos for programming and calibrating, and wrote an extremely detailed assembly manual. Between the site and the community already in place from AR2, anyone with enough time, money and determination could probably build one. Check out [Chris]’ playlist of AR2 builds — people are using them for photography, welding, and serving ice cream. Did you build an AR2? The good news is that AR3 is completely backward-compatible.

The AR3’s grippers work well, as you’ll see in the video. If you need a softer touch, try emulating an octopus tentacle.

Continue reading “Open-Source Arm Puts Robotics Within Reach”