Gripper Uses Belts To Pinch And Grasp

For all the work done since the dawn of robotics, there is still no match for the human hand in terms of its dexterity and adaptability. Researchers at the IRIM Lab at Koreatech are a step closer with their ingenious BLT gripper, which can pinch with precision or grasp a larger object with evenly distributed force. (Video embedded below.)

The three-fingered gripper is technically called a “belt and link actuated transformable adaptive gripper with active transition capability”. Each finger is an interesting combination of a rigid “fingertip”, an actuation link, and a belt as the grasping surface. The actuation link has a small gearbox at its base to open and close the hand, and the hinge at the “fingertip” is spring-loaded to the open position. A flexible belt stretches between the fingertip and the base of the gripper; it can be tensioned to actuate the fingertip for pinching, or left to provide even force across the inside of the gripper for grasping. Two of the fingers can also rotate at the base, giving various gripper configurations. This allows the gripper to be used in a variety of ways, including smoothly shifting between pinching and grasping without dropping an object.
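
To make that transition concrete, here’s a minimal Arduino-style sketch of how the two motions could be sequenced with a pair of hobby servos (one on the actuation link, one tensioning the belt). The pins and angles are made-up placeholders, not values from the actual gripper:

```cpp
#include <Servo.h>

Servo linkServo;  // opens/closes the finger via the actuation link
Servo beltServo;  // winds the belt to tension or slacken it

// Hypothetical servo angles -- real values depend on the build's geometry.
const int LINK_OPEN = 20, LINK_CLOSED = 110;
const int BELT_SLACK = 0, BELT_TENSIONED = 140;

void setup() {
  linkServo.attach(9);
  beltServo.attach(10);
  linkServo.write(LINK_OPEN);
  beltServo.write(BELT_SLACK);
}

// Pinch: tension the belt first so the spring-loaded fingertip
// rotates inward, then close the link for a precise two-point grip.
void pinch() {
  beltServo.write(BELT_TENSIONED);
  delay(300);
  linkServo.write(LINK_CLOSED);
}

// Grasp: close the link with the belt slack, so the belt wraps the
// object and spreads contact force along the whole finger.
void grasp() {
  linkServo.write(LINK_CLOSED);
  beltServo.write(BELT_SLACK);
}

void loop() {
  pinch();
  delay(2000);
  grasp();   // shift modes without reopening -- the belt just relaxes
  delay(2000);
}
```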

We love the relative simplicity of the mechanism, and can see it being used for general robotics and prosthetic hands, especially if force sensing is integrated. The mechanism should be fairly easy to replicate using 3D printed components, a piece of toothed belt, and two cheap servos, so get cracking! Continue reading “Gripper Uses Belts To Pinch And Grasp”

Sonic The Self-Balancing Robot: Face-Plants And The Challenges Of Sensor Integration

Watching a child learn to run is a joyous, if sometimes painful, experience. It seems the same is true for [James Bruton]’s impressive Sonic the Self-Balancing robot, even with bendable knees and force-sensitive legs.

We covered the mechanical side of the project recently, and now [James] has added the electronics to turn it into a truly impressive working robot (videos after the break). Getting it to this point was not without challenges, but fortunately he is sharing the experience with us, wipe-outs and all. The knees of this robot are actuated by a pair of motors with ball screws, which are not backdrivable. This means external sensors are needed to allow the motors to actively respond to inputs, which in this case are load cells in the legs and an MPU6050 IMU for balancing. The main control board is a Teensy 3.6, with an NRF24 module providing remote control.
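
For a taste of what the balancing firmware has to do, here is a bare-bones complementary filter for the MPU6050. The register addresses and scale factors come from the datasheet, but the wiring, axis choice, and filter blend are our own illustrative assumptions, not [James]’s code:

```cpp
#include <Wire.h>

const int MPU = 0x68;        // MPU6050 I2C address
float angle = 0;             // fused lean angle, in degrees
unsigned long lastUs;

int16_t read16(int reg) {    // read one big-endian 16-bit register pair
  Wire.beginTransmission(MPU);
  Wire.write(reg);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU, 2);
  return (Wire.read() << 8) | Wire.read();
}

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU);
  Wire.write(0x6B);          // PWR_MGMT_1: wake the device
  Wire.write(0);
  Wire.endTransmission();
  lastUs = micros();
}

void loop() {
  // The accelerometer gives an absolute but noisy tilt estimate...
  float ax = read16(0x3B) / 16384.0;      // ACCEL_XOUT_H, +/-2g scale
  float az = read16(0x3F) / 16384.0;      // ACCEL_ZOUT_H
  float accAngle = atan2(ax, az) * 57.2958;

  // ...while the gyro gives a clean rate that drifts when integrated.
  float gyroRate = read16(0x45) / 131.0;  // GYRO_YOUT_H, deg/s at +/-250dps

  float dt = (micros() - lastUs) * 1e-6;
  lastUs = micros();

  // Complementary filter: trust the gyro short-term, the accel long-term.
  angle = 0.98 * (angle + gyroRate * dt) + 0.02 * accAngle;
  // "angle" now feeds whatever balance controller drives the wheels.
}
```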

[James] wanted the robot to be able to lean into turns and handle uneven surfaces (small ramps) without tipping or falling over. The leaning part was fairly simple (for him), but the sensor integration for uneven surfaces turned out to be a real challenge and required multiple iterations to get working. The first approach was to move the robot in the direction of the tipping motion to absorb it, and then return to level. However, this could still cause it to tip over on slightly larger ramps. When he instead tried to keep the robot level while going over a ramp with one leg, it went into wild side-to-side oscillations as it dropped back to level ground. This was finally corrected by using the load cells to dampen the motion.
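
One way to picture that load cell fix is as a damping term in the balance output. A sketch along these lines, assuming an HX711-style load cell amplifier and entirely made-up gains, shows the idea:

```cpp
#include "HX711.h"   // common Arduino load cell amplifier library

HX711 legCell;
float prevLoad = 0;

// Illustrative gains -- tuning these is the hard part.
const float KP = 12.0, KD = 0.6, K_DAMP = 0.02;

void setup() {
  legCell.begin(4, 5);   // DOUT, SCK pins (hypothetical wiring)
}

// angle/rate would come from the IMU fusion shown earlier.
float balanceOutput(float angle, float rate, float dt) {
  float load = legCell.get_units();       // current force on the leg
  float loadRate = (load - prevLoad) / dt;
  prevLoad = load;

  // PD control on the lean angle, minus a term that resists rapid
  // load changes -- this is what kills the side-to-side oscillation
  // when a leg drops off a ramp edge.
  return KP * angle + KD * rate - K_DAMP * loadRate;
}

// Stubs standing in for the IMU fusion from the previous sketch.
float currentAngle() { return 0; }
float currentRate()  { return 0; }

void loop() {
  float motorCmd = balanceOutput(currentAngle(), currentRate(), 0.01);
  (void)motorCmd;  // would be sent to the knee/wheel motors here
}
```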

Continue reading “Sonic The Self-Balancing Robot: Face-Plants And The Challenges Of Sensor Integration”

DARPA Subterranean Challenge Urban Circuit Now Livestreaming

Currently underway is the DARPA Subterranean Challenge (SubT) systems competition for the urban circuit, streamed live on YouTube now through Wednesday, February 26th.

The DARPA Grand Challenge of 2004 kicked research and development of autonomous vehicles into high gear. Many components on today’s self-driving vehicles can be traced back to systems developed for that competition. Hoping to spur further development, DARPA has since held several more challenges focused on moving the state of the art in autonomous robotics forward.

To succeed in this challenge, robots must handle terrain that would confuse today’s self-driving cars. Cluttered environments, uneven surfaces of different materials, even the occasional flooded section are all fair game. These robots also lose access to some of the tools previously available, such as GPS. The “systems track” denotes teams building physical robot systems, versus a separate “virtual track” for simulated robots. The “urban circuit” is the second of four phases in this competition; its environments focus on man-made underground structures. (Think subway station.) For more details on this competition as well as a description of its various phases, see our introductory post or the competition site.

Those who would rather not watch robots tentatively exploring unknown territory (and occasionally failing) may choose to wait for the summaries published after competition rounds are complete. The first phase (tunnel circuit), which ran from August to October 2019, was summarized by IEEE Spectrum here. Or you can go straight to DARPA for details on the systems track and virtual track, with overall results posted on the competition site.

Continue reading “DARPA Subterranean Challenge Urban Circuit Now Livestreaming”

Lil’ ESP32 Bot Does Remote Surveillance, And It’s Easy

Digital cameras have been around for a long time, as have small remote control robotics platforms. However, combining the two has really only come into its own in the last decade or so, as more bandwidth has become available to the home tinkerer. This ESP32-CAM surveillance bot is a great example of what was once hard becoming trivially easy.

It’s a case of standing on the shoulders of giants. The ESP32-CAM is a device that allows one to stream live video images over a network using existing example code. In this case, it’s combined with an L298N DC motor driver which allows the Adafruit robot platform to be steered like a tank via its two wheels. A pair of SG90 servos then serve as a pan/tilt mechanism to further improve the robot’s field of view.
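
As a rough sketch of how those parts could hang together, the drive side on an ESP32 might look like the following. The pin assignments, PWM setup (using the older 2.x ESP32 Arduino core’s LEDC calls), and the tank-steer mixing are our assumptions, not the project’s firmware:

```cpp
#include <ESP32Servo.h>

// L298N inputs (hypothetical ESP32 pin choices)
const int IN1 = 12, IN2 = 13, IN3 = 14, IN4 = 15;
const int ENA = 2, ENB = 4;          // PWM speed pins
Servo panServo, tiltServo;           // the two SG90s

void setup() {
  pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
  ledcSetup(0, 1000, 8); ledcAttachPin(ENA, 0);   // 8-bit PWM @ 1 kHz
  ledcSetup(1, 1000, 8); ledcAttachPin(ENB, 1);
  panServo.attach(25);
  tiltServo.attach(26);
}

// Tank steering: mix a drive and a turn command into per-side speeds.
void drive(int speed, int turn) {            // each in -255..255
  int left  = constrain(speed + turn, -255, 255);
  int right = constrain(speed - turn, -255, 255);
  digitalWrite(IN1, left  >= 0); digitalWrite(IN2, left  < 0);
  digitalWrite(IN3, right >= 0); digitalWrite(IN4, right < 0);
  ledcWrite(0, abs(left));
  ledcWrite(1, abs(right));
}

void loop() {
  drive(180, 60);              // gentle right-hand arc
  panServo.write(90);          // aim the camera straight ahead
  tiltServo.write(100);
  delay(20);
}
```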

If you had attempted this back in 2010, you’d have spent six months figuring out how to get a microcontroller to talk to a small camera module. Only then could you consider solving the multitude of other problems presented by getting the video feed off the bot to somewhere useful. These days, you can order a bunch of parts online and have it up and running in a couple of hours. This project from 2013 serves as an example of how much things have changed in the intervening years. Video after the break.

Continue reading “Lil’ ESP32 Bot Does Remote Surveillance, And It’s Easy”

Simple 3D Printed Robotic Arm Uses Compliant Mechanism

Learning through play is effective for humans of all ages, and since 2016 [slantconcepts] has been designing STEM kits that help teach kids to build their future overlords. They are launching version 3 of their LittleArm robotic arm, and the progression from version 1 is an interesting study in simplification and parts count reduction without sacrificing functionality.

In all of the LittleArm versions the main mechanical components are 3D printed and driven by three servos for motion, plus one additional servo to run the gripper. These kits are specifically intended to be built and disassembled repeatedly, and classrooms are a great place for small screws to easily disappear, so reducing the number of screws was a big goal for v3. The gripper/forearm shows the most dramatic improvement from the previous versions, being simplified from 8 separate components to a single 3D printed part by using a compliant mechanism — that squiggly pattern that allows the gripper to flex into place. The gripper tips also feature a simple “cutout” that allows them to more easily grasp horizontal objects.

An Arduino Nano-based expansion board is used to control the arm, with an HC-06 Bluetooth module allowing it to be controlled from a smartphone app. Various sensors can also be added to expand the kit’s capabilities. Unfortunately the mechanical design is not open source, but it can still be a source of inspiration for your own design projects.
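
Since the HC-06 just presents itself as a serial port, parsing joint commands on a Nano can be very simple. This sketch assumes an invented “joint:angle” command format, not the kit’s actual app protocol:

```cpp
#include <Servo.h>
#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);   // HC-06 TX->10, RX->11 (hypothetical pins)
Servo joints[4];             // base, shoulder, elbow, gripper
const int pins[4] = {3, 5, 6, 9};

void setup() {
  bt.begin(9600);            // HC-06 default baud rate
  for (int i = 0; i < 4; i++) joints[i].attach(pins[i]);
}

void loop() {
  // Hypothetical protocol: "2:90\n" sets joint 2 to 90 degrees.
  if (bt.available()) {
    int joint = bt.parseInt();   // parseInt skips the ':' separator
    int angle = bt.parseInt();
    if (joint >= 0 && joint < 4)
      joints[joint].write(constrain(angle, 0, 180));
  }
}
```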

Hopefully this kit will inspire some future hackers to build a more advanced 3D printed version, or even a giant hydraulic powered arm.

Rollbot Crams Ten Arms Onto One Wheel

It’s not every day that we see someone trying something new with robot locomotion, but [kong]’s robot Rollyboi was made to do exactly that by mixing up the usual robot-wheel-motor layout. Instead of the robot using motors to drive wheels, Rollyboi is itself the wheel, and uses multiple simple arms (legs?) attached to hobby servo motors to propel itself. The idea is that the arms swivel out one at a time to roll the robot along as needed.

It’s a novel idea, but how well does it work in practice? The first version was blind and mechanically unstable, with no idea which way was up and therefore no way to effectively control which arm needed to be extended, but was nevertheless able to roll along. The next version implemented a simple control system: buttons installed along the outside rim let the robot know how it is moving and which arm to extend next. With two sets of arms (one on each side) the robot becomes capable of executing simple turns by extending one arm more than the other.
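
That control scheme is simple enough to sketch out. The following is our guess at the idea rather than [kong]’s actual code: each rim button flags the segment currently touching the ground, and the arm just behind the contact point extends to push the wheel over:

```cpp
#include <Servo.h>

const int N = 5;                       // arms per side
Servo arms[N];
const int servoPins[N]  = {3, 5, 6, 9, 10};   // hypothetical wiring
const int buttonPins[N] = {2, 4, 7, 8, 12};
const int RETRACTED = 0, EXTENDED = 70;       // illustrative angles

void setup() {
  for (int i = 0; i < N; i++) {
    arms[i].attach(servoPins[i]);
    arms[i].write(RETRACTED);
    pinMode(buttonPins[i], INPUT_PULLUP);  // button closes to ground
  }
}

void loop() {
  for (int i = 0; i < N; i++) {
    if (digitalRead(buttonPins[i]) == LOW) {   // this rim segment is down
      int next = (i + 1) % N;      // the arm just behind the contact point
      arms[next].write(EXTENDED);  // push off to keep the wheel rolling
      delay(250);
      arms[next].write(RETRACTED);
    }
  }
}
```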

In the end, Rollyboi could move but still lacks a means to perceive and navigate its environment. This is made more challenging by the fact that the robot’s body (and therefore any sensors mounted to it) would be in constant motion as the robot moves. Still, it’s interesting to see how far the idea went using only simple hardware, and its motion gives off a certain radial solenoid engine vibe. You can watch a brief video below.

Continue reading “Rollbot Crams Ten Arms Onto One Wheel”

Companion Bots Definitely Are The Droids You’re Looking For

Companion robots are a breed that, heretofore, we’ve primarily seen in cinema. Free from the limits of real-world technology, they manage to be charismatic, cute, and capable in ways that endear them to audiences the world over. Jorvon Moss and Alex Glow decided that this charming technology shouldn’t just live on the silver screen, and have been developing their own companion bots to explore this field. Lucky for us, they came down to Hackaday Superconference to tell us all about it!

The duo use a variety of techniques to build their ‘bots, infusing them with plenty of personality along the way. Jorvon favors the Arduino as the basis of his builds, while Alex has experimented with the Google AIY Vision Kit and BBC Micro:bit, among other platforms. Through clever design and careful planning, the two combine common maker techniques to create their unique builds. Using standard servos, 3D printed body parts, and plenty of LEDs, it’s all stuff that’s readily accessible to the home gamer.

[Alex]’s companion bot, Archimedes, has been through many upgrades to improve functionality. Plus, he’s got a cute hat!
Across the many robots they have built, the companions vary in how they interact. Alex’s robot owl, Archimedes, uses machine vision to find people and tries to figure out if they’re happy or sad. If they’re excited enough, it will give the person a small gift. Archimedes mounts on a special harness Alex built out of armature wire, allowing the avian to perch on her shoulder when out and about. Similarly, Jorvon’s Dexter, modeled after a monkey, lurks on his back. Featuring an LED matrix for emotive facial expressions and a touch sensor for high fives, Dexter packs plenty of character into his 3D printed chassis.
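
That kind of expressiveness takes surprisingly little code. Here’s a rough sketch of a Dexter-style high-five response, assuming a MAX7219-driven 8×8 matrix and a digital touch module (the wiring and face bitmaps are invented for illustration):

```cpp
#include <LedControl.h>

LedControl matrix(12, 11, 10, 1);   // DIN, CLK, CS (hypothetical wiring)
const int TOUCH_PIN = 2;            // e.g. a TTP223-style touch module

// 8x8 bitmaps for two made-up expressions: eyes plus a mouth.
const byte NEUTRAL[8] = {0x00, 0x66, 0x66, 0x00, 0x00, 0x7E, 0x00, 0x00};
const byte HAPPY[8]   = {0x00, 0x66, 0x66, 0x00, 0x42, 0x3C, 0x00, 0x00};

void show(const byte face[8]) {
  for (int row = 0; row < 8; row++) matrix.setRow(0, row, face[row]);
}

void setup() {
  matrix.shutdown(0, false);   // wake the MAX7219 from power-save
  matrix.setIntensity(0, 8);
  pinMode(TOUCH_PIN, INPUT);
  show(NEUTRAL);
}

void loop() {
  if (digitalRead(TOUCH_PIN) == HIGH) {  // high five detected
    show(HAPPY);
    delay(1500);
    show(NEUTRAL);
  }
}
```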

Alex and Jorvon also talk about some of the pitfalls and challenges they’ve faced through the development of their respective companion bots. Jorvon defines a companion robot as “any robot that you can take with you, on any type of adventure”. Being out in the real world and getting knocked around means breakages are common, with both of the duo picking up handfuls of smashed plastic and bundles of wires at times. Thankfully, with 3D printing being the tool of the trade, it’s easy to iteratively design new components to better withstand the rough and tumble of daily life out and about. This also feeds into the rest of the design process, with Jorvon giving the example of Dexter’s last-minute LED upgrades that were built and fitted while at Supercon.

Development of companion bots is never really finished. Future work involves integrating Chirp.io data-over-sound communications to allow the bots to talk. There have been some headaches on the software side, but we look forward to seeing these ‘bots chatting away in their own droid language. While homebrew companion bots don’t yet have the artificial intelligence to match the wisecracking droids seen in movies, designing lifelike bodies for our digital creations is a big step in that direction. With people like Alex and Jorvon on the case, we’re sure it won’t be long before we’re all walking around with digital pals on our shoulders — and it promises to be fun!

Continue reading “Companion Bots Definitely Are The Droids You’re Looking For”