Boston Dynamics has always built robots with agility few others could match. While that agility makes for attention-getting demos, from outside the company it hasn't been clear how those acrobatic skills will translate into revenue. Now we're getting a peek at the plan in an IEEE Spectrum interview about their new robot Stretch.
Most Boston Dynamics robots have been research projects, too expensive and not designed for mass production. The closest thing to a product so far was Spot, which was offered for sale and picked up a few high-profile jobs like inspecting SpaceX test sites. But Spot was still pretty experimental, without an explicit application. In contrast, Stretch has a laser-sharp focus made clear by its official product page: this robot will be looking for warehouse jobs. Specifically, Stretch is designed to handle boxes up to 50 lbs (23 kg), loading and unloading them to and from pallets, conveyor belts, trucks, or shipping containers. These jobs are repetitive, tedious, back-breaking work with a high injury rate, a perfect opportunity for robots.
But warehouse logistics aren't as tightly structured as factory automation, demanding more adaptability than typical industrial robots can offer. It's a niche Boston Dynamics learned it could fill after releasing an earlier demo video showing their research robot Atlas moving some boxes around: they started receiving inquiries about how much that would cost. Atlas is not a product, but wheels were set in motion, leading to their Handle robot. Learning from what Handle did well (and not so well) in a warehouse environment, the design evolved into today's Stretch. The ostrich-like Handle prototype is now relegated to further research into wheeled-legged robots and the occasional fun dance video.
The Stretch preproduction prototypes visible in these videos lack the acrobatic flair of their predecessors, but they still have the perception and planning smarts that made those robots possible. Those skills are just being applied to a narrower problem scope. Once production models are on the job, we look forward to reading some work performance reviews.
For all those who have complained about Rubik’s Cube solving robots in the past by dismissing purpose-built rigs that hold the cube in a non-anthropomorphic manner: checkmate.
The video below shows not only that a robot can solve the classic puzzle with mechanical hands, but it can also do it with just one of them – and that with only three fingers. The [Yamakawa] lab at the University of Tokyo built the high-speed manipulator to explore the kinds of fine motions that humans perform without even thinking about them. Their hand, guided by a 500-fps machine vision system, uses two opposing fingers to grip the lower part of the cube while using the other finger to flick the top face of the cube counterclockwise. The entire cube can also be rotated on the vertical axis, or flipped 90° at a time. Piecing these moves together lets the hand solve the cube with impressive speed; extra points for the little, “How’s that, human?” flick at the end.
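Those three primitives, flicking the top face, rotating the cube about its vertical axis, and flipping it 90°, are enough to reach any face turn a solver might ask for. Here's a minimal sketch of how a solve sequence could be compiled into those hand motions; the primitive names and reorientation recipes are our own illustration, not the lab's actual software, and a real implementation would also track how each flip changes the cube's orientation:

```python
# Hypothetical names for the three hand motions described above.
PRIMITIVES = ("flick_top_ccw", "rotate_yaw_90", "flip_pitch_90")

# Illustrative recipes for bringing each face to the top so it can be
# flicked. The exact sequences depend on which axis the hand flips about.
BRING_TO_TOP = {
    "U": [],
    "F": ["flip_pitch_90"],
    "D": ["flip_pitch_90"] * 2,
    "B": ["flip_pitch_90"] * 3,
    "L": ["rotate_yaw_90", "flip_pitch_90"],
    "R": ["rotate_yaw_90"] * 3 + ["flip_pitch_90"],
}

def compile_move(face: str, quarter_turns_ccw: int) -> list:
    """Expand one solver face turn into primitive hand motions:
    reorient the cube so the target face is on top, then flick."""
    seq = list(BRING_TO_TOP[face])
    seq += ["flick_top_ccw"] * (quarter_turns_ccw % 4)
    return seq
```

Stringing `compile_move` calls together over a solver's output would give the hand its full choreography, with the 500-fps vision system closing the loop on each motion.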
Unless you are in the fields of robotics or prosthetics, you likely take for granted the fine motor skills our hands have. Picking up and using a pen is no small feat for a robot which doesn’t have a dedicated pen-grabbing apparatus. Holding a mobile phone with the same gripper is equally daunting, not to mention moving that phone around once it has been grasped. Part of the wonder of our hands is the shape and texture which allows pens and phones to slide around at one moment, and hold fast the next moment. Yale’s Grab Lab has built a gripper which starts to solve that problem by changing the friction of the manipulators.
A spring-loaded set of slats with a low-friction surface allows a held object to move freely, but when the robot exerts more pressure, the slats retract and a high-friction surface contacts the object. This is similar to our fingers with their rounded surfaces: when we brush our hands over something lightly, they graze the surface, but when we squeeze, our soft flesh conforms to the object and we can hold it fast. The Grab Lab is doing a great job demonstrating the solution and taking steps toward more capable robots. All hail Skynet.
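The behavior described above amounts to a friction coefficient that switches with grip force. A toy model, with entirely made-up numbers standing in for the real spring preload and surface properties, might look like this:

```python
def effective_friction(grip_force_n: float,
                       spring_preload_n: float = 2.0,
                       mu_slats: float = 0.1,
                       mu_pads: float = 0.8) -> float:
    """Friction coefficient the gripper presents to a held object.

    Below the spring preload only the low-friction slats touch the
    object; past it the slats retract and the high-friction surface
    makes contact. All values are illustrative placeholders, not
    measurements from the Grab Lab's hardware.
    """
    return mu_slats if grip_force_n < spring_preload_n else mu_pads
```

With a light 0.5 N touch the object slides on the slats; squeeze past the 2 N preload and the grip coefficient jumps, letting the same fingers both reposition and hold an object.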
If [Nixie]'s setup looks familiar, it might be because we featured his plasma experiments a few days ago. He was a little cagey then about his goal, but he's come clean with his desire to make his own FETs (a project that is his 2018 Hackaday Prize entry). Doing so will require not only creating stable plasmas, but also the ability to move substrates around inside the vacuum chamber. Taking inspiration from the slender and maneuverable instruments surgeons use for laparoscopic procedures, [Nixie] is working on a miniature arm that will work inside his vacuum chamber. The video below shows a 3D-printed proof-of-concept model in action, demonstrating how the arm's segments will be controlled by cables. What's really interesting is that the control cables will not penetrate the vacuum chamber — they'll be moved right through the glass wall using magnets.
[David Brown]'s entry for The Hackaday Prize is a design for a tool that normally exists only as an expensive piece of industrial equipment, out of the reach of normal experimenters. That tool is a 6-axis micro manipulator: essentially a small robotic actuator capable of very small, very precise movements. It uses 3D-printed parts and low-cost components.
The manipulator consists of six identical actuators, each built from a single piece of SLS 3D-printed nylon with a custom PCB to control a motor and read positional feedback. The motor moves the central pivot point of the 3D-printed assembly, which in turn deflects the entire piece by a small amount. By anchoring one end and attaching the load to the other, a small amount of highly controllable movement can be achieved. Six actuators in total form a Gough-Stewart platform for moving the toolhead.
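The nice thing about a Gough-Stewart platform is that its inverse kinematics are closed-form: given a desired pose of the toolhead, each actuator's required length is just the distance between its base anchor and its (transformed) platform anchor. A minimal sketch, with anchor layouts and axis conventions assumed rather than taken from [David]'s design:

```python
import numpy as np

def leg_lengths(base_pts, plat_pts, translation, rpy):
    """Inverse kinematics of a Gough-Stewart platform.

    base_pts, plat_pts: 6x3 arrays of anchor coordinates, each in its
    own frame. translation: desired platform position. rpy: desired
    roll/pitch/yaw in radians (ZYX convention assumed here).
    Returns the six actuator lengths needed to reach that pose.
    """
    roll, pitch, yaw = rpy
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Rotation matrix for ZYX (yaw-pitch-roll) Euler angles.
    R = np.array([
        [cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
        [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
        [-sp,   cp*sr,            cp*cr           ],
    ])
    # Platform anchors expressed in the base frame, then leg vectors.
    world_plat = (R @ np.asarray(plat_pts).T).T + np.asarray(translation)
    return np.linalg.norm(world_plat - np.asarray(base_pts), axis=1)
```

For a micro manipulator the control loop would invert this: command each actuator toward its computed length using the positional feedback from the custom PCBs.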
Interestingly, this 6-Axis Micro Manipulator is a sort of side project. [David] is interested in creating his own digital UV exposer, which requires using UV laser diodes with fiber optic pigtails attached. In an industrial setting these are created by empirically determining the optimal position of a fiber optic relative to the laser diode by manipulating it with a micro manipulator, then holding it steady while it is cemented in place. Seeing a distinct lack of micro manipulators anywhere outside of lab or industrial settings, and recognizing that there would be applications beyond his own needs, [David] resolved to build one.
For those skeptical about the feasibility of Santa’s annual delivery schedule, here’s an autonomous piece of the puzzle that will bewilder even the most hard-hearted of non-believers.
The folks over at the Center of Excellence Cognitive Interaction Technology (CITEC) in Germany have whipped together a fantastic demo featuring Santa’s extra pair of helping hands. In the two-and-a-half minute video, the robot executes a suite of impressive autonomous stocking-stuffing maneuvers: from recognizing the open hole in the stocking, to grasping specific candies from the cluster of goodies available.
On the hardware side, the arms appear to be a KUKA variant, while on the software side, the visualizations are being handled by RViz, a tool from the open source robot software framework ROS.
If some of the props in the video look familiar, you'll find that the researchers at CITEC have already explored related research topics with some stellar work on perception, classification, and grasping. Who knew this pair of hands would be so jolly as to clock some overtime this holiday season? The entire video is set to a crisp computer-voiced jingle that serves as a sneaky summary of their approach to this project.
The operator pictured above is using a controller which is a scale model of the manipulator arm, with two cameras giving feedback. One of those monitors shows a feed from the arm itself, providing a view of the gripper. The other feed is a wide shot of the working area from the body of the robot. The arm has six degrees of freedom actuated by servo motors. The controller is a replica of the arm laser-cut from acrylic. At each joint there's a potentiometer whose value is read to set the position of the corresponding joint on the arm.
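The control scheme boils down to reading each potentiometer and mapping its value onto the matching servo's command range. A minimal sketch of that mapping, assuming a 10-bit ADC and the standard 1–2 ms hobby-servo pulse range (neither detail is confirmed by the build):

```python
def pot_to_servo_us(adc_reading: int,
                    adc_max: int = 1023,
                    pulse_min_us: int = 1000,
                    pulse_max_us: int = 2000) -> int:
    """Map one joint potentiometer reading from the replica controller
    to a servo pulse width on the arm. The 10-bit ADC range and 1-2 ms
    pulse range are assumptions for illustration."""
    # Clamp so a noisy or out-of-range reading can't over-drive a joint.
    adc_reading = max(0, min(adc_reading, adc_max))
    span = pulse_max_us - pulse_min_us
    return pulse_min_us + adc_reading * span // adc_max
```

Run once per joint in a tight loop and the arm shadows the replica; adding a little smoothing or rate limiting on the output would keep hand tremor from reaching the gripper.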
At first we thought that this would be more fatiguing and less convenient than using a gaming controller. But as we look at the dexterity of the arm it becomes obvious that joysticks and buttons would just make things more difficult.