There’s something fascinating about humanoid robotic hands, if only because of how closely they approximate our own. One could almost picture them with tendons and skin covering them. Sadly, making your own is usually prohibitive: not only are they complex bits of machinery, these marvels of engineering also tend to be rather expensive.
[Gray Eldritch]’s Humanoid Robot Arm project seeks to fix both points by providing a ready-to-print design. All it takes is about a kilogram of PLA filament, some TPU filament, five MG996R servos (or equivalent), an SG90 servo or similar, an Arduino Uno board and a few other bits and pieces. The result is a robotic arm and hand like the Mark 3 version shown in the video embedded after the break.
Motion planning is important, as it makes working with the robotic arm much easier. Rather than manually specifying the rotation of every joint for each desired movement, the math is left to software: you move the end effector to where you want it, and the planner works out the joint motions needed to get it there. This functionality is baked into the Robot Operating System (ROS) and proves useful to this project.
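ROS handles the general case, but the underlying idea can be shown with a toy example. Below is a minimal two-link planar inverse-kinematics sketch in Python; the link lengths and target point are made up for illustration, and this is neither the project's code nor ROS's planner.

```python
import math

# Minimal two-link planar inverse kinematics sketch (illustrative only).
# Link lengths are placeholder values in millimetres.
L1, L2 = 100.0, 80.0

def inverse_kinematics(x, y):
    """Return (shoulder, elbow) angles in radians that place the end
    effector at (x, y), using the standard two-link closed-form solution."""
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    cos_elbow = (r2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("Target is out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

# Ask for a point and get joint angles back, instead of guessing angles by hand.
print([round(math.degrees(a), 1) for a in inverse_kinematics(120.0, 60.0)])
```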
The construction of this particular arm is impressive in its simplicity, too. It has 7 degrees of freedom, which is plenty to play with. The arm is built out of LEGO Technic components, which are attached to the servos with the addition of some 3D printed components. It’s a smart and simple way to integrate the servos into the LEGO world, and we’re surprised we don’t see this more often.
If you’ve ever tried to program a robotic arm, or almost any robotic mechanism with more than three degrees of freedom, you know that a big part of the work goes into programming the movements themselves. What if you could build a robot, connect the motors and joints however you like, and have it become aware of how it is physically built, with no prior knowledge of itself?
That is what Columbia Engineering researchers have done, creating a robot arm that learns how it is put together with zero prior knowledge of physics, geometry, or motor dynamics. At first, the robot has no idea what its shape is, how its motors work, or how they affect its movement. After one day of exercising its outputs in a pretty much random fashion and getting feedback on its actions, the robot builds an accurate internal self-simulation using deep-learning techniques.
The arm used in this study by Lipson and his PhD student Robert Kwiatkowski is a four-degree-of-freedom articulated robot. Its first self-models were inaccurate, as the robot did not know how its joints were connected, but after about 35 hours of training the self-model became consistent with the physical robot to within four centimeters. Using that self-model, the robot then performed a pick-and-place task, recalibrating its position between each step along the trajectory based entirely on the internal model.
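The paper's actual network architecture and training setup aren't reproduced here, but the general recipe (random motor babbling, then fitting a model to the recorded outcomes) can be sketched in a few lines of Python. Everything below, including the planar stand-in arm, layer sizes, and sample counts, is illustrative rather than the researchers' setup.

```python
import math
import torch
import torch.nn as nn

# Placeholder link lengths (metres) for a planar 4-DOF stand-in arm that
# plays the role of the real hardware the self-model has to discover.
LINKS = torch.tensor([0.3, 0.25, 0.2, 0.15])

def physical_arm(q):
    """Ground-truth forward kinematics of the stand-in arm: joint angles in,
    end-effector (x, y) out."""
    angles = torch.cumsum(q, dim=-1)
    x = (LINKS * torch.cos(angles)).sum(dim=-1)
    y = (LINKS * torch.sin(angles)).sum(dim=-1)
    return torch.stack([x, y], dim=-1)

# "Babbling" phase: random joint commands and the observed outcomes.
q_samples = (torch.rand(5000, 4) - 0.5) * math.pi
xy_samples = physical_arm(q_samples)

# The self-model: a small network with no built-in knowledge of the geometry.
self_model = nn.Sequential(nn.Linear(4, 64), nn.ReLU(),
                           nn.Linear(64, 64), nn.ReLU(),
                           nn.Linear(64, 2))
opt = torch.optim.Adam(self_model.parameters(), lr=1e-3)

# Fit the self-model to the recorded (command, outcome) pairs.
for epoch in range(2000):
    pred = self_model(q_samples)
    loss = nn.functional.mse_loss(pred, xy_samples)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"mean prediction error: {loss.item() ** 0.5:.3f} m")
```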
To test whether the self-model could detect damage, the researchers 3D-printed a deformed part to simulate it; the robot noticed the change and re-trained its self-model. The new self-model enabled the robot to resume its pick-and-place tasks with little loss of performance.
Since the internal representation is not static, this not only helps the robot improve its performance over time, but also allows it to adapt to damage and changes in its own structure. That could help robots keep functioning reliably as their parts start to wear out, or when replacement parts are not exactly the same size or shape.
Of course, it will be a long time before this arm achieves precision anywhere near that of Dexter, the 2018 Hackaday Prize winner, but it is still pretty cool to see the video of this research:
Robotic arms are fascinating devices, capable of immense speed and precision when carrying out their tasks. They’re also capable of carrying great loads, and a full-sized industrial robot in operation at maximum pace is a sight to behold. However, while it’s simple to design grippers to move sturdy metal objects, picking up delicate or soft objects can be much harder. A team at MIT CSAIL has been working on a solution to this problem, which they call the Origami gripper.
The gripper is highly capable of lifting objects with complex shapes.
The gripper consists of a flexible, folding skeleton surrounded by an airtight skin. When vacuum is applied, the skeleton contracts around the object to be picked up. The gripper is capable of grasping objects sized up to 70% of its diameter, and over 100 times its weight.
Fabrication of the device involved creating 3D printed molds to produce the silicone rubber skeleton. Combined with precise lasercutting and advanced layering techniques, this yielded a part that folds itself into shape under the right conditions. The structure was inspired by a “magic ball” origami design. The outer skin is remarkably simple in comparison, consisting of a regular latex balloon.
A robotic arm is an excellent idea if you’re looking to get started with electromechanical projects. There are linkages to design and motors to drive, but there’s also the matter of control. This is referred to as “kinematics”, and it can be considered in both the forward and inverse sense. [aerdronix]’s robotic arm build works both ways.
The brains of the build is an Arduino Yun, which receives commands over its USB interface. Control is realized through Blynk, an app platform that makes it easy to build smartphone apps for IoT projects and publish them to the usual platforms.
The arm’s position is controlled in two fashions. When configured to use inverse kinematics, the user commands an end effector position, and the arm figures out the necessary position of the linkages to make it happen. However, the arm can also be used in a forward kinematics mode, where the individual joint positions are commanded, which then determine the end effector’s final position.
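To make the distinction concrete, here is a minimal forward-kinematics sketch for a two-joint planar arm. It is purely illustrative, not [aerdronix]’s code, and the link lengths are made up; the inverse mode amounts to solving these same equations backwards for the joint angles.

```python
import math

# Placeholder link lengths in millimetres (illustrative only).
L1, L2 = 100.0, 80.0

def forward_kinematics(shoulder, elbow):
    """Given joint angles in radians, return the (x, y) of the end effector."""
    x = L1 * math.cos(shoulder) + L2 * math.cos(shoulder + elbow)
    y = L1 * math.sin(shoulder) + L2 * math.sin(shoulder + elbow)
    return x, y

# Forward mode: command the joints, then read off where the gripper ends up.
print(forward_kinematics(math.radians(30), math.radians(45)))
```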
Overall, it’s a well-documented build that lays out everything from the basic mechanical design to the software and source code required to control the system. It’s an excellent learning resource for the newcomer, and such an arm could readily be used in more complex projects.
There was an unbelievable amount of stuff on display at the 2018 World Maker Faire in New York. Seriously, an unreal amount of fantastically cool creations from all corners of the hacker and maker world: from purely artistic creations to the sort of cutting edge hardware that won’t even be on the rest of the world’s radar for a year or so, and everything in between. If you’ve got a creative bone in your body, this is the place for you.
But if there was one type of creation that stood out amongst all others, a general “theme” of Maker Faire if you will, it was robotics. Little robots, big robots, flying robots, battling robots, even musical robots. Robots to delight children of all ages, and robots to stalk the darkest corners of their nightmares. There were robots for all occasions. Probably not overly surprising for an event that has a big red robot as its mascot, but still.
There were far too many robots to cover them all, but the following is a collection of a few of the more interesting robotic creations we saw on display at the event. If you’re the creator of one of the robots we didn’t get a chance to get up close and personal with in our whirlwind tour through the Flushing Meadows Corona Park, we only ask that you please don’t send it here to exact your revenge. We’re very sorry. (Just kidding, if you have a robot to show off drop a link in the comments!)
[igarrido] has shared a project that’s been in the works for a long time now: a wooden desktop robotic arm named Virk I. The wood is Australian Blackwood and looks gorgeous. [igarrido] is clear that it is a side project, but has decided to produce a small run of eight units to gauge interest in the design, and has been busy cutting parts and assembling them in his spare time.
Besides the beautifully finished wood, some of the interesting elements include hollow rotary joints, which mean less cable clutter and a much tidier assembly. 3D printer controllers are a common go-to for CNC designs, and the Virk I is no different: the prototype is driven by a RAMPS 1.4 board.

However, [igarrido] explains that while this does the job of moving the joints, it’s not ideal. To be truly useful, the firmware would need SCARA kinematic support, which he says no open source 3D printer firmware offers, to his knowledge. Without it, the software has no concept of how the joints physically relate to one another, which is needed to make unified, coherent movements; users must control motors and joints individually instead of directing the arm as a whole to move to specific coordinates. Still, Virk I might be what’s needed to get that development going. A video of some test movements is embedded below, showing how everything works so far.
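For a rough sense of what such a kinematics layer has to do, here is a minimal Python sketch of how a SCARA-aware controller might turn a Cartesian target into joint-space step counts. This is not Virk I’s firmware or any existing driver’s code; the link lengths, microstepping and gear ratio are made-up placeholders.

```python
import math

# Placeholder arm geometry and drivetrain figures (illustrative only).
L1, L2 = 150.0, 120.0                 # link lengths, mm
STEPS_PER_JOINT_REV = 200 * 16 * 5    # motor steps * microstepping * gear ratio

def scara_target_to_steps(x, y):
    """Convert an (x, y) target in mm into absolute step counts for the
    shoulder and elbow motors, using the planar two-link solution."""
    cos_elbow = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("Target outside the arm's reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))

    def to_steps(angle):
        # One full joint revolution corresponds to STEPS_PER_JOINT_REV steps.
        return round(angle / (2 * math.pi) * STEPS_PER_JOINT_REV)

    return to_steps(shoulder), to_steps(elbow)

# Without this layer, the controller only understands raw joint moves; with
# it, "go to (200, 50)" becomes coordinated step targets for both joints.
print(scara_target_to_steps(200.0, 50.0))
```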