Ask Hackaday: What’s The Deal With Humanoid Robots?

When the term ‘robot’ gets tossed around, our minds usually race to the image of a humanoid machine. These robots are a fixture in pop culture, and often held up as some sort of ideal form.

Yet, one might ask, why the fixation? While we are naturally obsessed with recreating robots in our own image, are these bipedal machines the perfect solution we imagine them to be?

Continue reading “Ask Hackaday: What’s The Deal With Humanoid Robots?”

Did You Meet Pepper?

Earlier this week it was widely reported that Softbank’s friendly-faced almost-humanoid Pepper robot was not long for this world, as the Japanese company’s subsidiary in France that had been responsible for the robotic darling of the last decade was being downsized, and that production had paused. Had it gone the way of Sony’s Aibo robotic puppy or Honda’s crouching-astronaut ASIMO? It seems not, because the company soon rolled back a little and was at pains to communicate that reports of Pepper’s untimely death had been greatly exaggerated. It wasn’t so long ago that Pepper was the face of future home robotics, so has the golden future become a little tarnished? Perhaps it’s time to revisit our plastic friend.

A Product Still Looking For A Function

A Pepper earning an honest crust as a tourist guide at the Heijo Palace museum. Tokumeigakarinoaoshima, CC BY-SA 4.0.

Pepper made its debut back in 2014, a diminutive and child-like robot with basic speech recognition and conversation skills, the ability to recognize some facial expressions, and a voice to match those big manga-style eyes. It was a robot built for personal interaction rather than work, as those soft tactile hands are better suited to a handshake than holding a tool. It found its way into Softbank stores as well as a variety of other retail environments, was used in experiments to assess whether it could work as a companion robot in medical settings, and even made an appearance as part of a cheerleading squad. It didn’t matter that it was found to be riddled with insecurities; it very soon became a favourite with media tech pundits, but it remained at heart a product seeking a purpose rather than one ready-made to fit a particular function.

I first encountered a Pepper in 2016, at the UK’s National Museum of Computing. It was simply an exhibit under the watchful eye of a museum volunteer rather than being used to perform a job, and it shared an extremely busy gallery with an exhibit of Acorn classroom computers from the 1980s and early ’90s. It was an odd mix of the unexpected and the frustrating, as it definitely saw me and let me shake its hand but stubbornly refused to engage in conversation. Perhaps it was taking its performance as a human child seriously and being shy, but the overwhelming impression was of something that wasn’t ready for anything more than experimental interaction except via its touch screen. As a striking contrast, in 2016 the UK saw the first release of the Amazon Echo, a disembodied voice assistant that might not have had a cute face but which could immediately have meaningful interactions with its owner.

How Can A Humanoid Robot Compete With A Disembodied Voice?

In comparing the Pepper with an Amazon Echo it’s possible that we’ve arrived at the root of the problem. Something that looks cool is all very well, but without immediate functionality, it will never capture the hearts of customers. Alexa brought with it the immense power of Amazon’s cloud computing infrastructure, while Pepper had to make do with whatever it had on board. It didn’t matter to potential customers that a cloud-connected microphone presents a huge privacy issue; for them, a much cheaper device the size of a hockey puck would always win the day if it could unfailingly tell them the evening’s TV schedule or remind them about Aunty’s birthday.

Over the next decade we will see the arrival of affordable and compact processing power that can do more of the work for which Amazon currently use the cloud. Maybe Pepper will never fully receive that particular upgrade, but it’s certain that if Softbank don’t do it then somebody else will. Meanwhile there’s a reminder from another French company that being first and being cute in the home assistant market is hardly a guarantee of success: who remembers the Nabaztag?

Header: Tokumeigakarinoaoshima, CC0.

Reachy The Open Source Robot Says Bonjour

Humanoid robots always attract attention, but anyone who tries to build one quickly learns respect for a form factor we take for granted because we were born with it. Pollen Robotics wants to help move the field forward with Reachy: a robot platform available both as a product and as a wealth of information shared online.

This French team has released open source robots before. We’ve looked at their Poppy robot and see a strong family resemblance with Reachy. Poppy was a very ambitious design with both arms and legs, but it could only ever walk with assistance. In contrast Reachy focuses on just the upper body. One of the most interesting innovations is found in Reachy’s neck, a cleverly designed 3-DOF mechanism they call Orbita. Combined with two moving antennae at the top of the head, Reachy can emote a wide range of expressions despite not having much of a face. The remainder of Reachy’s joints are articulated with Dynamixel serial bus servos, though we see an optional Orbita-based hand attachment in the demo video (embedded below).

Reachy’s €19,990 price tag may be affordable relative to industrial robots, but it’s pretty steep for the home hacker. No need to fret: those of us with smaller bank accounts can still join the fun, because Pollen Robotics has open sourced a lot of Reachy details. Digging into this information, we see Reachy has a Google Coral for accelerating TensorFlow and a Raspberry Pi 4 for general computation. Mechanical designs are released via web-based Onshape CAD. Reachy’s software suite on GitHub is primarily focused on Python, which allows us to experiment within a Jupyter notebook. Simulation can be done within the Unity 3D game engine, which can optionally be compiled to run in a browser like the simulation playground. Academic robotics researchers are not excluded from the fun either, as ROS1 integration is also available, though ROS2 support is still on the to-do list.

Reachy might not be as sophisticated as some humanoid designs we’ve seen, and without a lower body there’s no way for it to dance. But we are very appreciative of a company willing to share knowledge with the world. May it spark new ideas for the future.

[via Engadget]

Continue reading “Reachy The Open Source Robot Says Bonjour”

Robotic Biped Walks On Inverse Kinematics

Robotics projects are always a favorite for hackers. Being able to almost literally bring your project to life evokes a special kind of joy that really drives our wildest imaginations. We imagine this is one of the inspirations for the boom in interactive technologies that are flooding the market these days. Well, [Technovation] had the same thought and decided to build a fully articulated robotic biped.

Each leg has pivot points at the foot, knee, and hip, mimicking the articulation of the human leg. To control the robot’s movements, [Technovation] uses inverse kinematics, a method of calculating joint movements rather than explicitly programming them. The user inputs the end coordinates of each foot, as opposed to each individual joint angle, and a special function outputs the joint angles necessary to reach each end coordinate. This part of the software is well commented and worth your time to dig into.
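The math is friendlier than it sounds: for a planar two-link leg, the joint angles fall straight out of the law of cosines. [Technovation]’s actual firmware isn’t reproduced here, so the function below is only a minimal sketch with made-up link lengths, but it shows the shape of the calculation:

```python
import math

def leg_ik(x, y, femur=1.0, tibia=1.0):
    """Two-link planar inverse kinematics: given a foot position (x, y)
    relative to the hip, return (hip_angle, knee_angle) in radians.

    hip_angle is measured from the +x axis to the femur; knee_angle is
    the interior knee angle (pi means the leg is fully straight).
    Link lengths here are illustrative, not the project's dimensions."""
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > femur + tibia or d < abs(femur - tibia):
        raise ValueError("target out of reach")
    # Interior knee angle from the law of cosines on the hip-knee-foot triangle
    knee = math.acos((femur**2 + tibia**2 - d2) / (2 * femur * tibia))
    # Hip angle: direction to the foot, plus the offset from the bent knee
    alpha = math.acos((femur**2 + d2 - tibia**2) / (2 * femur * d))
    hip = math.atan2(y, x) + alpha
    return hip, knee
```

With unit-length links, asking for the foot at (1, −1) gives a right-angle knee and a level femur, and a foot straight down at (0, −2) gives a fully extended leg, which are easy cases to sanity-check by hand.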

In case you want to change the height of the robot or its stride length, [Technovation] provides a few global constants in the firmware that automatically adjust the calculations to fit the new robot’s dimensions. Of all the various aspects of this project, the detailed write-up impressed us the most. The robot was designed in Fusion 360 and the parts were 3D printed, allowing maximum design flexibility for the next hacker.

Maybe [Technovation’s] biped will help resurrect the social robot craze. Until then, happy hacking.

Continue reading “Robotic Biped Walks On Inverse Kinematics”

Humanoid Robot Has Joints That Inspire

One of the challenges with humanoid robots, besides keeping them upright, is finding compact combinations of actuators and joint mechanisms that allow a good range of smooth motion while still providing good strength. To achieve this, researchers from the IRIM Lab at Korea University of Technology and Education developed the LIMS2-AMBIDEX robotic humanoid upper body, which uses a combination of brushless motors, pulleys, and some very interesting joint mechanisms. (Video, embedded below.)

The wrist mechanism. Anyone willing to tackle a 3D printed version?

From shoulder to fingers, each arm has seven degrees of freedom which allows the robot to achieve some spectacularly smooth and realistic upper body motion. Except for the wrist rotation actuator, all the actuators are housed in the shoulders, and motion is transferred to the required joint through an array of cables and pulleys. This keeps the arm light and its inertia low, allowing the arms to move rapidly without breaking anything or toppling the entire robot.

The wrist and elbow mechanisms are especially interesting. The wrist emulates rolling contact between two spheres with only revolute joints. It also allows a drive shaft to pass down the centre of the mechanism and transfer rotating motion from one end to the other. The elbow is a rolling double jointed affair that allows true 180 degrees of rotation.
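The double-jointed trick is what makes that 180 degrees possible: each of the two stacked revolute joints only has to turn through half of the total bend angle. As a quick sanity check of the geometry (with purely illustrative link lengths, not the actual LIMS2-AMBIDEX dimensions), here is a sketch of the planar forward kinematics:

```python
import math

def rolling_elbow_tip(bend, upper=0.30, coupler=0.05, lower=0.30):
    """Planar forward kinematics of a rolling double-jointed elbow.

    The total bend is split equally across two revolute joints separated
    by a short coupler link, so a full 180-degree fold needs only
    90 degrees at each joint. Returns the (x, y) forearm tip position
    relative to the shoulder; the arm lies along +x when bend == 0.
    All lengths are illustrative placeholders."""
    half = bend / 2.0
    x = upper + coupler * math.cos(half) + lower * math.cos(bend)
    y = coupler * math.sin(half) + lower * math.sin(bend)
    return x, y
```

At bend = π the forearm comes all the way back parallel to the upper arm, offset only by the short coupler, something a single-pivot elbow can’t manage without the links colliding.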

We have no idea why this took two years to end up in our YouTube feed, but we’re sure glad it finally did. Check out some of the demo videos after the break.

Continue reading “Humanoid Robot Has Joints That Inspire”

Wave Goodbye To Honda Asimo, A Robot That Would Wave Back

Fans of technology will recall a number of years when Honda’s humanoid robot Asimo seemed to be everywhere. In addition to its day job in a research lab, Asimo had a public relations side gig showing everyone that Honda is about more than cars and motorcycles. From trade shows to television programs, even amusement parks and concert halls, Asimo worked a busy publicity schedule. Now a retirement party may be in order, since the research project has reportedly been halted.

Asimo’s activity has tapered off in recent years, so this is not a huge surprise. Honda’s official Asimo site hasn’t been updated in over a year. Recent humanoid robots in the media are more likely to appear in the context of events like the DARPA Robotics Challenge or from companies like Boston Dynamics. Plus, the required technology has become accessible enough for us to build our own two-legged robots. So its torch has been passed on, but Asimo will be remembered as the robot that pioneered a lot of thinking about how humanoid robots interact with flesh-and-blood humans. It was one of the first robots that could recognize a human wave as a gesture, and wave back in return.

Many concepts developed for Asimo will live on as Honda’s research team shifts focus to less humanoid form factors. We can see Honda’s new ambitions in their concept video released during CES 2018 (embedded below). These robots are still designed to live and work alongside people, but now they are specialized for different domains and they travel on wheels. Which is actually a step closer to the Jetsons’ future, because Rosie rolls on wheels!

Continue reading “Wave Goodbye To Honda Asimo, A Robot That Would Wave Back”

Our Reactions To The Treatment Of Robots

Most of us have seen employees of Boston Dynamics kicking their robots, and many of us instinctively react with horror. More recently I’ve watched my own robots being petted, applauded for their achievements, and yes, even kicked.

Why do people react the way they do when mechanical creations are treated as if they were people, pets, or worse? There are some very interesting things to learn about ourselves when considering the treatment of robots as subhuman. But it’s equally interesting to consider the ramifications of treating them as human.

The Boston Dynamics Syndrome

Shown here are two snapshots of Boston Dynamics robots taken from their videos about Spot and Atlas. Why do scenes like this create the empathic reactions they do? Two possible reasons come to mind. One is that we anthropomorphize the human-shaped one, meaning we think of it as human. That’s easy to do, since not only is it human-shaped but the video shows it carrying a box using human-like movements. The second snapshot perhaps evokes the strongest reactions in anyone who owns a dog, though its similarity to any four-legged animal will usually do.

Is it wrong for Boston Dynamics, or anyone else, to treat robots in this way? Being an electronic and mechanical wizard, you might have an emotional reaction and then catch yourself with the reminder that these machines aren’t conscious and don’t feel emotional pain. But it may be wrong for one very good reason.

Continue reading “Our Reactions To The Treatment Of Robots”