Start Your New Career In Robot Dance Choreography

Boston Dynamics loves showing off their robots with dance videos. Every time they put one out, it ignites a discussion among robot enthusiasts debating what’s real versus merely implied by the exhibition. We always want to see the tooling behind the scenes, and fortunately we get a peek with a Spot dance choreography session posted by [Adam Savage]’s Tested team. (YouTube video, also embedded below.)

For about a year, the Tested team has been among those exploring a Spot’s potential. Most of what we’ve seen has been controlled from a custom tablet that looked like a handheld video game console. In contrast, this video shows a computer application for sequencing Spot actions on a music-focused timeline. The timeline is specified in beats per minute, grouped eight to a bar. The high-level task is no different from choreographing human dancers: design something that can be performed to music and delights your audience, all while staying within the boundaries of what your dancers can physically do with their bodies. Then, trust your dancers to perform!

That computer application is Boston Dynamics’ Choreographer, part of the Spot Choreography SDK: a reference available to anyone willing to Read The Fine Manual, even those of us without a Spot of our own. As of this writing, the Choreography SDK covers everything we saw Spot do in the earlier Uptown Funk dance video, but it looks like it has yet to receive some of the more advanced moves from the recent Do You Love Me? video. There is a reference chart of moves illustrated with animated GIFs, documented with customizable parameters and other important notes.

Lowers the robutt down and back up once. Lasts for one beat (4 slices). Author’s note: I’m sorry.
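For a sense of how that beats-and-slices timeline works out in practice, here is a minimal Python sketch of laying out a sequence of moves. The move names and data structure are our own illustration of the timing math, not the Choreography SDK’s actual protobuf API.

```python
# Illustrative sketch only: move names and fields are hypothetical, modeled on
# the beats-per-minute / slices timing described in the Choreography SDK docs.
from dataclasses import dataclass

SLICES_PER_BEAT = 4  # each beat is subdivided into 4 slices


@dataclass
class Move:
    name: str             # e.g. "twerk" lowers the robutt down and back up once
    start_slice: int      # where on the timeline the move begins
    duration_slices: int  # how long it lasts


def beats(n: int) -> int:
    """Convert a count of beats into timeline slices."""
    return n * SLICES_PER_BEAT


bpm = 120
sequence = [
    Move("step",    start_slice=beats(0), duration_slices=beats(1)),
    Move("twerk",   start_slice=beats(1), duration_slices=beats(1)),
    Move("bourree", start_slice=beats(2), duration_slices=beats(2)),
]

# Total running time of the routine, in seconds
total_slices = max(m.start_slice + m.duration_slices for m in sequence)
seconds = total_slices / SLICES_PER_BEAT / bpm * 60
print(f"{total_slices} slices is about {seconds:.1f} s at {bpm} BPM")
```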

We’ve seen a lot of hackers take on the challenge of building their own quadruped robots on these pages, each full of clever mechanical design solutions that can match Spot’s kinematics. And while not all of them can match Spot’s control systems, we’re sure it’s only a matter of time before counterparts to the Choreographer application show up on GitHub. (If they already exist, please link in the comments.) Will we love robots once they can all dance? The jury is still out.

Continue reading “Start Your New Career In Robot Dance Choreography”

Robotic Fish Swarm Together Using Cameras And LEDs

Robotics has advanced in leaps and bounds over the past few decades, but when it comes to decentralized coordination, robot swarms still lag far behind biological swarms. Researchers from Harvard University’s Wyss Institute are working to close the gap, and have developed Blueswarm, a school of robotic fish that can exhibit swarm behavior without external centralized control.

In real fish schools, the movement of an individual fish depends on those around it. To allow each robotic fish to estimate the position of its neighbors, each is equipped with a set of three blue LEDs and a camera on each side of the body. Four oscillating fins, inspired by reef fish, provide 3D control. The actuator for each fin is simply a pivoting magnet inside a coil fed with an alternating current. The onboard computer of each fish is a Raspberry Pi Zero W, and the cameras are Raspberry Pi Camera modules with wide-angle lenses. Using the position information calculated from the cameras, the school can coordinate its movements to spread out, group together, swim in a circle, or find an object and then converge on it. The full academic article is available for free if you are interested in the details.
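As a rough sketch of the sensing idea, something like the following OpenCV snippet could pull neighbor bearings out of a camera frame. The color thresholds and field of view are made-up example values, and the real Blueswarm pipeline is considerably more refined.

```python
# Rough illustration: find bright blue LED blobs in a camera frame and convert
# their pixel positions into horizontal bearing angles. Thresholds and FOV are
# example values, not the Blueswarm team's actual parameters.
import cv2
import numpy as np

HORIZONTAL_FOV_DEG = 160.0  # wide-angle lens, example value


def led_bearings(frame_bgr: np.ndarray):
    """Return approximate horizontal bearings (degrees) of blue LED blobs."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Bright, saturated blue; tune for your LEDs and water clarity
    mask = cv2.inRange(hsv, (100, 150, 150), (130, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    h, w = mask.shape
    bearings = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] < 1:  # ignore single-pixel noise
            continue
        cx = m["m10"] / m["m00"]
        # Map the blob's pixel column to an angle left/right of the optical axis
        bearings.append((cx / w - 0.5) * HORIZONTAL_FOV_DEG)
    return bearings
```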

Communication with light is dependent on the clarity of the medium it’s traveling through, in this case, water — and conditions can quickly become a limiting factor. Submarines have faced the same challenge for a long time. Two current alternative solutions are ELF radio and sound, which are both covered in [Lewin Day]’s excellent article on underwater communications.

Continue reading “Robotic Fish Swarm Together Using Cameras And LEDs”

Reachy The Open Source Robot Says Bonjour

Humanoid robots always attract attention, but anyone who tries to build one quickly learns respect for a form factor we take for granted because we were born with it. Pollen Robotics wants to help move the field forward with Reachy: a robot platform available both as a product and as a wealth of information shared online.

This French team has released open source robots before. We’ve looked at their Poppy robot and see a strong family resemblance with Reachy. Poppy was a very ambitious design with both arms and legs, but it could only ever walk with assistance. In contrast, Reachy focuses on just the upper body. One of the most interesting innovations is found in Reachy’s neck, a cleverly designed 3-DOF mechanism they call Orbita. Combined with two moving antennae at the top of the head, Reachy can emote a wide range of expressions despite not having much of a face. The remainder of Reachy’s joints are articulated with Dynamixel serial bus servos, though we see an optional Orbita-based hand attachment in the demo video (embedded below).
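Since those joints are standard Dynamixel serial bus servos, you can get a taste of joint-level control with ROBOTIS’s dynamixel_sdk Python package. Here’s a minimal sketch; the servo ID, register addresses, and port are example values for an X-series servo speaking Protocol 2.0, so adjust for your own hardware.

```python
# Minimal sketch using ROBOTIS's dynamixel_sdk; IDs, addresses, and port are
# example values for an X-series servo (Protocol 2.0), not Reachy's actual setup.
from dynamixel_sdk import PortHandler, PacketHandler

DEVICE = "/dev/ttyUSB0"    # U2D2 or similar serial bus adapter
BAUD = 1_000_000
DXL_ID = 1                 # servo ID on the bus
ADDR_TORQUE_ENABLE = 64    # X-series control table: torque enable (1 byte)
ADDR_GOAL_POSITION = 116   # X-series control table: goal position (4 bytes)

port = PortHandler(DEVICE)
packet = PacketHandler(2.0)  # Protocol 2.0

port.openPort()
port.setBaudRate(BAUD)

# Enable torque, then command the servo to its mid position (~180 degrees)
packet.write1ByteTxRx(port, DXL_ID, ADDR_TORQUE_ENABLE, 1)
packet.write4ByteTxRx(port, DXL_ID, ADDR_GOAL_POSITION, 2048)

port.closePort()
```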

Reachy’s €19,990 price tag may be affordable relative to industrial robots, but it’s pretty steep for the home hacker. No need to fret: those of us with smaller bank accounts can still join the fun, because Pollen Robotics has open sourced a lot of Reachy’s details. Digging into this information, we see Reachy has a Google Coral for accelerating TensorFlow and a Raspberry Pi 4 for general computation. Mechanical designs are released via web-based Onshape CAD. Reachy’s software suite on GitHub is primarily focused on Python, which allows us to experiment within a Jupyter notebook. Simulation can be done within the Unity 3D game engine, which can optionally be compiled to run in a browser like the simulation playground. Academic robotics researchers are not excluded from the fun either, as ROS1 integration is also available, though ROS2 support is still on the to-do list.

Reachy might not be as sophisticated as some humanoid designs we’ve seen, and without a lower body there’s no way for it to dance. But we are very appreciative of a company willing to share knowledge with the world. May it spark new ideas for the future.

[via Engadget]

Continue reading “Reachy The Open Source Robot Says Bonjour”

Baby Yoda Becomes Personable Robot

Baby Yoda has been a hit character in Disney’s The Mandalorian, but does not actually exist in real life as far as we know. Instead, [Manuel Ahumada] set about building a robotic replica, complete with artificial intelligence.  (Video, embedded below.)

The first step was to build a basic robotic simulacrum of Baby Yoda, which [Manuel] achieved by outfitting a toy with servos, motors, and a Raspberry Pi. With everything hooked up, Baby Yoda was able to move his head and arms, and scoot around on wheels, all under the control of a Bluetooth gamepad. With that sorted, [Manuel] added brains in the form of a smartphone running Intel’s OpenBot machine learning platform. This allows Baby Yoda to track and follow people it sees through its smartphone camera, and potentially even navigate real-world spaces with future upgrades.
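As a rough idea of what the gamepad-to-servo plumbing on the Pi could look like, here’s a short sketch using pygame for the controller and gpiozero for the motors. The pin numbers and axis mappings are made-up example values, not [Manuel]’s actual wiring.

```python
# Hypothetical wiring: pin numbers and gamepad axis mappings are example
# values only, not the actual build.
import pygame
from gpiozero import Robot, Servo

drive = Robot(left=(4, 14), right=(17, 27))  # wheel motor driver pins, example
head_pan = Servo(22)                         # head servo on GPIO 22, example

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)            # first connected Bluetooth gamepad
pad.init()

while True:
    pygame.event.pump()                      # refresh joystick state
    throttle = -pad.get_axis(1)              # left stick Y: forward/back
    steer = pad.get_axis(0)                  # left stick X: turn in place
    head_pan.value = pad.get_axis(3)         # right stick X: look around

    if abs(throttle) < 0.1 and abs(steer) < 0.1:
        drive.stop()                         # dead zone: stay put
    elif abs(steer) > abs(throttle):
        if steer > 0:
            drive.right(abs(steer))
        else:
            drive.left(abs(steer))
    elif throttle > 0:
        drive.forward(throttle)
    else:
        drive.backward(-throttle)
```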

It’s a fun build, and we’d love to see the bot let loose at a convention to explore and make friends. We’ve covered OpenBot before, and look forward to seeing it used in more builds. Video after the break.

Continue reading “Baby Yoda Becomes Personable Robot”

Powered Exoskeletons In Rough Terrain: An Interesting Aspect Of The Chang’e 5 Recovery Mission

At this point, one would be hard pressed to find anyone who is not at least aware of some of the uses of exoskeletons worn by humans: from supporting people during rehabilitation, to keeping workers in industrial and warehouse settings from overexerting themselves, all while preventing injuries and increasing the wearer’s ability to carry heavy loads without tiring.

During the recovery mission of the Chang’e 5 sample container in the rough terrain of Inner Mongolia, the crew tasked with setting up the communications center, electrical supply systems, and other essential services in the area wore exoskeletons. Developed by a relatively new Chinese company called ULS Robotics (see the embedded promotional video after the break), the powered exoskeletons allowed the crew to carry 50 kg loads at a time for a hundred meters across the rough, snowy terrain.

The obvious benefit of an exoskeleton here is that while humans are pretty good at navigating rough terrain, this ability quickly degrades the moment a heavy load is involved, as anyone who has done serious mountain trekking can probably attest. By having the exoskeleton bear most of the load, the wearer can focus on staying upright and reaching the destination quickly and safely.

With the growing interest in exoskeletons from various industries, the military, and the elderly, it probably won’t be too long before we see more of them in daily life in the coming years.

(Thanks, Qes)

Continue reading “Powered Exoskeletons In Rough Terrain: An Interesting Aspect Of The Chang’e 5 Recovery Mission”

Transforming Drone Can Be A Square Or A Dragon

When flying drones in and around structures, the size of the drone is generally limited by the openings you want to fit through. Researchers at the University of Tokyo got around this problem by using an articulating structure for the drone frame, allowing the drone to transform from a large square to a narrow, elongated form to fit through smaller gaps.

The drone is called DRAGON, which is somehow an acronym for the tongue-twisting description “Dual-Rotor Embedded Multilink Robot with the Ability of Multi-Degree-of-Freedom Aerial Transformation”. The drone consists of four segments, with a 2-DOF actuated joint between each segment. A pair of ducted fan motors is attached to the middle of each segment on a 2-DOF gimbal that allows it to direct thrust in any direction relative to the segment. For normal flight the segments are arranged in a square shape, with minimal movement between the segments. When a small gap is encountered, as demonstrated in the video after the break, the segments rearrange into a dragon-like shape that can pass through a gap in any plane.
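To picture how a chain of actuated joints folds a square into a narrow line, here’s a toy planar forward-kinematics sketch. It ignores the second joint axis and the rotor gimbals entirely, so it only illustrates the geometry, not the team’s actual controller.

```python
# Toy 2D forward kinematics for a four-link chain: each joint angle is the
# relative bend between neighboring segments. Three 90-degree bends trace a
# square outline; zero bends give the narrow, elongated form.
import math

LINK_LEN = 0.3  # metres per segment, example value


def link_endpoints(joint_angles):
    """Return the (x, y) endpoint of each link for given relative joint angles."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for bend in [0.0] + list(joint_angles):  # first link sets the base heading
        heading += bend
        x += LINK_LEN * math.cos(heading)
        y += LINK_LEN * math.sin(heading)
        points.append((x, y))
    return points


square = link_endpoints([math.pi / 2] * 3)  # three 90-degree bends: a square
line = link_endpoints([0.0] * 3)            # zero bends: straight and narrow
print(square)
print(line)
```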

Each segment has its own power source and controller, and the control software required to make everything work together is rather complex. The full research paper is unfortunately behind a paywall. The small diameter of the propellers and all the added components would be a severe limiting factor in terms of lifting capacity and flight time, but the concept is definitely interesting.

The idea of shape-shifting robots has been around for a while, and becomes even more interesting when the different segments can detach and reattach themselves to become modular robots. The 2016 Hackaday Grand Prize winner DTTO is a perfect example of this, although it did lack the ability to fly. Continue reading “Transforming Drone Can Be A Square Or A Dragon”

Boston Dynamics’ Dancing Bots Beg For Your Love A La Napoleon Dynamite

How do you get people to love you and sidestep existential fear of robots eclipsing humans as the solar system’s most advanced thinking machines? You put on a dance routine to the music of Berry Gordy.

The video published by Boston Dynamics shows off a range of their advanced robots moving as if they were humans, greyhounds, and ostriches made of actual flesh. But of course they aren’t, which explains the safety barriers surrounding the dance floor and the lack of actual audio from the scene. After picking our jaws up off the floor, we began to wonder what it sounds like in the room, as the whine of motors must certainly be quite impressive — check out the Handle video from 2017 for an earful of that. We also wonder how long a dance-off of this magnitude can be maintained between battery swaps.

Anthropomorphism (or would it be canine-pomorphism?) is trending this year. We saw the Spot robot as part of a dance routine in an empty baseball stadium back in July. It’s a great marketing move, and this most recent volley from BD shows off some insane stunts like the en pointe work from the dog robot while the Atlas humanoids indulge in some one-footed yoga poses. Seeing this, it’s easy to forget that these machines lack the innate compassion and empathy that save humans from injury when bumping into one another. While our robotic future looks bright, we’re not in a rush to share the dance floor anytime soon.

Still, it’s an incredible tribute to the state of the art in robotics — congratulations to the roboticists who have brought us here. Looking back eleven and a half years to the first time we covered these robots here on Hackaday, this seems more like CGI movie footage than real life. What’s more amazing? Hobby builds that are keeping up with this level of accomplishment.

Continue reading “Boston Dynamics’ Dancing Bots Beg For Your Love A La Napoleon Dynamite”