Legged Robots Put On Wheels And Skate Away

We don’t know how much time passed between the invention of the wheel and someone strapping wheels to their feet, but we expect it was a great moment of discovery: combining the ability to roll off at speed with our legs’ ability to quickly adapt to changing terrain. Now that we have a wide assortment of recreational wheeled footwear, what’s next? How about teaching robots to skate, too? An IEEE Spectrum interview with [Marko Bjelonic] of ETH Zürich describes progress by one of the many research teams working on the problem.

For many of us, the first robot we saw rolling on powered wheels at the ends of actively articulated legs was in footage of the Boston Dynamics ‘Handle’ project that surfaced a few years ago. Rolling up and down a wide variety of terrain and performing the occasional jump, its athleticism caused quite a stir in robotics circles. But when Handle was introduced as a commercial product, its job was… stacking boxes in a warehouse? That was disappointing. Warehouse floors are quite flat, leaving Handle’s agility under-utilized.

Boston Dynamics has typically been pretty tight-lipped about the details of its robotics development, so we may never know the full story behind Handle. But what the company has definitely accomplished is getting a lot more people thinking about the control problems involved. Even humans face a nontrivial learning curve paved with bruised and occasionally broken body parts, and that’s before we start applying power to the wheels. So there are plenty of problems to solve, generating a steady stream of research papers describing how robots might master this mode of locomotion.

Adding to the excitement is the fact that this is an area where reality is catching up with fiction: wheeled-legged robots have long been imagined in forms like the Tachikoma of Ghost in the Shell. While those fictional robots have inspired projects ranging from LEGO creations to 28-servo beasts, their wheel and leg motions have not been autonomously coordinated the way they are in this generation of research robots.

As control algorithms mature in robot research labs around the world, we’re confident we’ll see wheeled-legged robots finding applications in other fields. This concept is far too cool to be left stacking boxes in a warehouse.

Continue reading “Legged Robots Put On Wheels And Skate Away”

Robotic Skin Sees When (and How) You’re Touching It

Cameras are getting less and less conspicuous. Now they’re hiding under the skin of robots.

A team of researchers from ETH Zurich in Switzerland has recently created a multi-camera optical tactile sensor that monitors the space around it based on contact force distribution. The sensor uses a stack-up of a camera, LEDs, and three layers of silicone to optically detect any disturbance of the skin.

The scheme is modular; this example uses four cameras but can be scaled up from there. During manufacture, the camera and LED circuit boards are placed and a layer of firm silicone is poured over them to a thickness of about 5 mm. Next comes a 2 mm layer doped with spherical particles, followed by a final 1.5 mm layer of black silicone. The cameras track the particles as they move and use that information to infer the deformation of the material and the force applied to it, reconstructing the full contact force distribution. The demo uses fairly inexpensive hardware — Raspberry Pi cameras monitored by an NVIDIA Jetson Nano Developer Kit — that provides about 65,000 pixels of resolution in total.
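
The paper has the full pipeline, but the core idea (watch the embedded particles move, then turn their displacements into a force estimate) is easy to sketch. Here’s a rough toy version in Python using OpenCV; the blob-detector settings and the linear STIFFNESS constant are stand-ins of our own, not anything from the ETH implementation:

```python
# Toy sketch: track embedded particles between a reference frame and a
# "pressed" frame, then turn their displacement field into a crude force map.
# Blob-detector settings and the linear STIFFNESS constant are illustrative only.
import cv2
import numpy as np

STIFFNESS = 0.8  # assumed newtons per pixel of particle displacement (made up)

def detect_particles(gray):
    """Return an (N, 2) array of particle centers found by simple blob detection."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 5
    params.maxArea = 200
    detector = cv2.SimpleBlobDetector_create(params)
    return np.array([kp.pt for kp in detector.detect(gray)], dtype=np.float32)

def displacement_field(ref_pts, cur_pts):
    """Match each reference particle to its nearest neighbor in the current frame."""
    pairs = []
    for p in ref_pts:
        j = int(np.argmin(np.linalg.norm(cur_pts - p, axis=1)))
        pairs.append((p, cur_pts[j] - p))
    return pairs

def force_map(pairs, shape, cell=32):
    """Accumulate a very rough per-cell force estimate from particle displacements."""
    fmap = np.zeros((shape[0] // cell + 1, shape[1] // cell + 1))
    for p, d in pairs:
        fmap[int(p[1]) // cell, int(p[0]) // cell] += STIFFNESS * float(np.hypot(d[0], d[1]))
    return fmap

ref = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)   # skin at rest
cur = cv2.imread("pressed.png", cv2.IMREAD_GRAYSCALE)     # skin being poked
forces = force_map(displacement_field(detect_particles(ref), detect_particles(cur)), ref.shape)
print("peak per-cell force estimate:", forces.max())
```

The real sensor is cleverer than this: rather than a hand-tuned stiffness constant, it hands the camera images to the convolutional neural network mentioned below, which outputs the full contact force distribution.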

Apart from just providing more information about the forces applied to a surface, the sensor also has a larger contact surface and is thinner than other camera-based systems since it doesn’t require the use of reflective components. It regularly recalibrates itself based on a convolutional neural network pre-trained with data from three cameras and updated with data from all four cameras. Possible future applications include soft robotics and improved touch-based sensing with the aid of computer vision algorithms.

While self-aware robotic skins may not be on the market any time soon, this certainly opens up the possibility of robots that can detect when too much force is being applied to their structures — the machine equivalent of pain.

Continue reading “Robotic Skin Sees When (and How) You’re Touching It”

Safety Systems For Stopping An Uncontrolled Drone Crash

We spend a lot of time here at Hackaday talking about drone incidents, and today we’re looking into the hazard of operating in areas where people are present. Accidents happen, and whether it’s a catastrophic failure or just a dead battery pack, the chance of a multi-rotor aircraft crashing down onto the people below is a real and persistent hazard. For amateur fliers, operating over crowds of people is simply banned, but there are cases where professionally piloted drones fly near crowds, and other safety measures need to be considered.

We saw a skier narrowly missed by a falling camera drone in 2015, and a couple of weeks back there was news of a postal drone trial in Switzerland being halted after a parachute system failed. A multirotor that fails in flight is a multi-kilogram widow-maker equipped with spinning blades, so how does it make it to the ground in as safe a manner as possible? Does it fall in uncontrolled flight, or does it activate a failsafe and retain some form of control as it descends?
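
For a rough feel of the software side, the simplest possible failsafe just watches the IMU for free fall or tumbling, debounces for a few samples, and then cuts the motors and fires the parachute. The sketch below is purely illustrative: the thresholds and hardware hooks are made up, and a real system also has to worry about altitude floors, airspeed, pyro safing, and much more.

```python
# Hypothetical "uncontrolled descent" failsafe check; thresholds, the sampling
# loop, and the hardware calls are invented for illustration only.
import math

FREEFALL_ACCEL = 2.0   # m/s^2: total acceleration this low suggests free fall
MAX_TILT_DEG = 75.0    # beyond this tilt the vehicle is probably tumbling
TRIGGER_COUNT = 10     # consecutive bad samples before acting (debounce)

def failsafe_step(accel_xyz, roll_deg, pitch_deg, state):
    """Return True once the failsafe decides to cut the motors and pop the chute."""
    accel_mag = math.sqrt(sum(a * a for a in accel_xyz))
    falling = accel_mag < FREEFALL_ACCEL
    tumbling = abs(roll_deg) > MAX_TILT_DEG or abs(pitch_deg) > MAX_TILT_DEG

    if falling or tumbling:
        state["bad_samples"] = state.get("bad_samples", 0) + 1
    else:
        state["bad_samples"] = 0

    if state["bad_samples"] >= TRIGGER_COUNT:
        # cut_motors(); deploy_parachute()  # hypothetical hardware hooks
        return True
    return False

# Example: ten IMU samples of near-zero acceleration while nearly upside down.
state = {}
for _ in range(10):
    triggered = failsafe_step((0.1, 0.0, 0.5), 170.0, 5.0, state)
print("parachute triggered:", triggered)
```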

Continue reading “Safety Systems For Stopping An Uncontrolled Drone Crash”


A Flying, Fetching, Helping-Hand Omnicopter

Wouldn’t it be nice to have a flying machine that could maneuver in any direction while rotating around any axis, all while maintaining both thrust and torque? Attach a robot arm and the machine could position itself anywhere and move objects around as needed. [Dario Brescianini] and [Raffaello D’Andrea] of the Institute for Dynamic Systems and Control at ETH Zurich have come up with their Omnicopter, which does just that using eight rotors arranged to give it six degrees of freedom. Oh, and it plays fetch, as shown in the first video below.

Omnicopter propeller orientations

Each propeller is reversible to provide thrust in either direction. On the vehicle itself are a PX4FMU Pixhawk flight computer, eight motors and motor controllers, a four-cell 1800 mAh LiPo battery, and communication radios. The radios are necessary because the position and attitude control calculations are done on a desktop computer, which sends the desired force and angular rates back to the vehicle. The desktop computer knows the vehicle’s position and orientation because it flies in the Flying Machine Arena, a large room at ETH Zurich equipped with an infrared motion-capture system.
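
A neat property of a fully actuated eight-rotor vehicle is that, to a first approximation, turning a desired force and torque into individual rotor thrusts is just a linear least-squares problem. Below is a hypothetical Python sketch of that allocation step; the rotor positions and axes are invented for illustration and are not the actual Omnicopter geometry:

```python
# Toy wrench-to-thrust allocation for a fully actuated eight-rotor vehicle.
# The rotor positions and axes below are invented for illustration.
import numpy as np

# Body-frame rotor positions r_i (m) and thrust axes a_i (unit vectors).
# Four "lifting" rotors point along +z; four "lateral" rotors point along x or y.
positions = np.array([
    [ 0.3,  0.3, 0.0], [ 0.3, -0.3, 0.0], [-0.3,  0.3, 0.0], [-0.3, -0.3, 0.0],
    [ 0.3,  0.0, 0.0], [-0.3,  0.0, 0.0], [ 0.0,  0.3, 0.0], [ 0.0, -0.3, 0.0],
])
axes = np.array([
    [0, 0, 1], [0, 0, 1], [0, 0, 1], [0, 0, 1],
    [0, 1, 0], [0, 1, 0], [1, 0, 0], [1, 0, 0],
], dtype=float)

# 6x8 allocation matrix: each column is the force and torque one unit of thrust produces.
A = np.zeros((6, 8))
for i in range(8):
    A[:3, i] = axes[i]                           # force contribution
    A[3:, i] = np.cross(positions[i], axes[i])   # torque contribution (prop drag torque ignored)

def allocate(force_des, torque_des):
    """Least-squares rotor thrusts; negative values are fine since the props are reversible."""
    wrench = np.concatenate([force_des, torque_des])
    thrusts, *_ = np.linalg.lstsq(A, wrench, rcond=None)
    return thrusts

# Example: ask for 20 N straight up in the body frame plus a small yaw torque.
print(allocate(np.array([0.0, 0.0, 20.0]), np.array([0.0, 0.0, 0.5])))
```

Because there are eight thrusts and only six wrench components, two degrees of freedom are left over; a real controller can exploit that null space (for example, to keep propellers away from zero speed), while the least-squares call above simply returns the minimum-norm solution.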

The result is a bit eerie to watch, as if gravity doesn’t apply to the Omnicopter. It can also be just plain playful, as you can see in the first video below, where it plays fetch by catching a ball in an attached net. When returning the ball, it even rotates the net to dump the ball back into the thrower’s hand. But you can see that in the video.

Continue reading “A Flying, Fetching, Helping-Hand Omnicopter”

Modest Motor Has Revolutionary Applications

Satellites make many of our everyday activities possible, and the technology continues to improve by leaps and bounds. A prototype, recently completed by [Arda Tüysüz]’s team at ETH Zürich’s Power Electronics Systems Lab in collaboration with its Celeroton spinoff, aims to improve satellite attitude positioning with a high-speed, magnetically levitated motor.

Beginning as doctoral thesis work led by [Tüysüz], the motor builds on existing technologies but arranges them for a new application — to great effect. Currently, the maneuvering motors on board satellites are operated at low rpm to reduce wear, must be sealed in a low-nitrogen environment to prevent rusting of the components, and the microvibrations induced by their ball bearings reduce positioning accuracy. In one fell swoop, this new prototype motor overcomes all of those problems.

Continue reading “Modest Motor Has Revolutionary Applications”

Creating Full Color Images On Thermoformed Parts

In the race to produce the cheapest and most efficient full-color 3D objects, we think Disney Research (along with ETH Zurich and the Interactive Geometry Lab) may have found the key: combining hydrographic printing techniques with plastic thermoforming.

You might remember our article last year on creating photorealistic images on 3D objects using a technique called hydrographic printing, where you essentially print a flattened 3D image on special paper with a regular printer and then transfer it to a 3D object in a bath of water. This is basically the same idea, but instead of transferring the image in a water bath, they’ve combined the flattened-image transfer with the thermoforming step itself — which seems like an obvious solution!
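
The clever bit in both approaches is that the artwork has to be printed pre-distorted, so that once the sheet stretches over the mold every feature ends up in the right place at the right size. Here is a toy Python sketch of that resampling step; the Gaussian "bulge" used here is a stand-in for the real deformation map, which, as we understand it, comes out of a simulation of the thermoforming process in the actual work:

```python
# Toy texture pre-distortion: given a mapping from flat-sheet pixels to the
# artwork coordinates they should end up showing on the finished part,
# backward-warp the artwork into the image you would actually print.
import numpy as np
from scipy.ndimage import map_coordinates

# Synthetic "artwork": a checkerboard standing in for the texture we want on the part.
H, W = 512, 512
ys, xs = np.mgrid[0:H, 0:W].astype(float)
artwork = (((xs // 32) + (ys // 32)) % 2) * 255.0

# Fake deformation: the centre of the sheet stretches most as it drapes over a dome,
# so the print there must be a compressed copy of the artwork. We model that by having
# sheet pixels near the centre sample the artwork from proportionally farther out.
cx, cy = W / 2.0, H / 2.0
r2 = ((xs - cx) ** 2 + (ys - cy) ** 2) / (0.25 * (H ** 2 + W ** 2))
scale = 1.0 + 0.5 * np.exp(-4.0 * r2)     # > 1 near the centre, close to 1 at the edges
u = cx + (xs - cx) * scale                # artwork column each sheet pixel should sample
v = cy + (ys - cy) * scale                # artwork row each sheet pixel should sample

# Backward-warp: this is the distorted image that would be sent to the printer.
predistorted = map_coordinates(artwork, [v, u], order=1, mode="nearest")
print(predistorted.shape)  # same size as the flat sheet
```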

Continue reading “Creating Full Color Images On Thermoformed Parts”