Obstacle Climbing Rover Built With The Power Of Lego

When we want to prototype a rover, we’ve developed a tendency to immediately reach for the 3D printer and Arduino or Raspberry Pi. It’s easy to forget the prototyping tool many of us grew up using: LEGO. The [Brick Experiment Channel] has not forgotten, and in the video after the break demonstrates how he used Lego Technic components to prototype an impressive little obstacle climbing robot.

The little Lego machine starts as a simple four-wheeled rover trying to climb on top of a book. Swap in a four-wheel-drive gearbox and grippy tires, and it clears the first obstacle. Adding a few books to the stack causes the break-over angle to become an issue, so the rover gets an inverted-V chassis. As the obstacle height increases, batteries are moved around for better weight distribution, but the real improvement comes when an actuating middle joint is added, turning it into a wheeled inchworm. Clearing overhangs, suspended beams, and gaps is all just a matter of finding the right technique.
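Break-over angle is just geometry: for a rigid, flat-bottomed chassis it works out to roughly 2·atan(2·clearance / wheelbase), so a long, low rover high-centers on a tall stack of books, while an articulated or inverted-V chassis effectively shortens the flat span. A quick sketch in Python, with made-up Lego-scale dimensions:

```python
import math

def breakover_angle_deg(wheelbase_mm: float, clearance_mm: float) -> float:
    """Approximate break-over angle for a rigid, flat-bottomed chassis.

    The sharpest ridge the rover can crest without high-centering
    subtends about 2 * atan(2 * clearance / wheelbase).
    """
    return math.degrees(2 * math.atan(2 * clearance_mm / wheelbase_mm))

# Made-up dimensions, just to show the trend:
print(breakover_angle_deg(wheelbase_mm=160, clearance_mm=20))  # long flat chassis: ~28 deg
print(breakover_angle_deg(wheelbase_mm=80, clearance_mm=20))   # half the span: ~53 deg
```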

Thanks to Lego’s modularity, all this is possible in an hour or two where a 3D printer and CAD might have stretched it into days. This robot does have the limitation of not being able to turn. Conventional car steering or Mecanum wheels are two options, but how would you do it?

The [Brick Experiment Channel] knows a thing or two about building Lego robots, even for stealing keys. Continue reading “Obstacle Climbing Rover Built With The Power Of Lego”

3D Printed Gesture-Controlled Robot Arm Is A Ton Of Tutorials

Ever wanted your own gesture-controlled robot arm? [EbenKouao]’s DIY Arduino Robot Arm project covers all the bases involved, but even if a robot arm isn’t your jam, his project has plenty to learn from. Every part is carefully explained, complete with source code and a list of required hardware. This approach to documenting a project is great because it not only makes it easy to replicate the results, but also makes it simple to remix, modify, and reuse separate pieces as a reference for other work.

[EbenKouao] uses a 3D-printable robotic gripper, base, and arm design as the foundation of his build. Hobby servos and a single NEMA 17 stepper take care of the moving, and the wiring and motor driving are all carefully explained. Gesture control is done by wearing an articulated glove upon which are mounted flex sensors and MPU6050 accelerometer/gyro modules. These sensors detect the wearer’s movements and turn them into motion commands, which in turn get sent wirelessly from the glove to the robotic arm via HC-05 Bluetooth modules. We really dig [EbenKouao]’s idea of mounting the glove sensors to this slick 3D-printed articulated gauntlet frame, but using a regular glove would work, too. The latest version of the Arduino code can be found on the project’s GitHub repository.
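The project’s own firmware is Arduino C++ (linked above), but the glove-to-arm idea boils down to reading a flex sensor and an IMU angle, mapping them to joint angles, and pushing a small command over the HC-05 serial link. Here is a minimal Python sketch of that mapping; the value ranges, command format, and serial device are assumptions for illustration, not [EbenKouao]’s actual protocol:

```python
# Minimal sketch of the glove-to-arm idea, not [EbenKouao]'s Arduino code:
# map a flex-sensor reading and an IMU pitch angle to servo angles and
# ship them over a serial port wired to an HC-05 Bluetooth module.
# Value ranges, the command format, and the port name are assumptions.
import serial  # pyserial

def scale(value, in_min, in_max, out_min, out_max):
    """Linear map, clamped to the output range (like Arduino's map())."""
    value = max(in_min, min(in_max, value))
    span = (value - in_min) / (in_max - in_min)
    return out_min + span * (out_max - out_min)

def glove_to_command(flex_raw: int, pitch_deg: float) -> bytes:
    grip_angle = scale(flex_raw, 300, 700, 0, 90)        # finger bend -> gripper
    shoulder_angle = scale(pitch_deg, -90, 90, 0, 180)   # wrist tilt -> shoulder
    return f"G{grip_angle:.0f},S{shoulder_angle:.0f}\n".encode()

with serial.Serial("/dev/rfcomm0", 9600, timeout=1) as link:  # HC-05 default baud
    link.write(glove_to_command(flex_raw=520, pitch_deg=30.0))
```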

Most of the parts can be 3D printed, how every part works together is carefully explained, and all of the hardware is easily sourced online, making this a very accessible project. Check out the full tutorial video and demonstration, embedded below.

Continue reading “3D Printed Gesture-Controlled Robot Arm Is A Ton Of Tutorials”

Start Your New Career In Robot Dance Choreography

Boston Dynamics loves showing off their robots with dance videos. Every time they put one out, it ignites a discussion among robot enthusiasts debating what’s real versus merely implied by the exhibition. We really want to see the tooling behind the scenes, and fortunately we get a peek with a Spot dance choreography session posted by [Adam Savage]’s Tested team. (YouTube video, also embedded below.)

For about a year, the Tested team has been among those exploring a Spot’s potential. Most of what we’ve seen has been controlled from a custom tablet that looked like a handheld video game console. In contrast, this video shows a computer application for sequencing Spot actions on a music-focused timeline. The tempo is specified in beats per minute, with beats grouped eight to a bar. The high-level task is no different from choreographing human dancers: design something that can be performed to the music and delights your audience, all while staying within the boundaries of what your dancers can physically do with their bodies. Then, trust your dancers to perform!

That computer application is Boston Dynamics’ Choreographer, part of the Spot Choreography SDK, a reference available to anyone willing to Read The Fine Manual, even those of us without a Spot of our own. As of this writing, the Choreography SDK covers everything we saw Spot do in an earlier Uptown Funk dance video, but it looks like it has yet to receive some of the more advanced moves from the recent Do You Love Me? video. There is a reference chart of moves, each illustrated with an animated GIF and documented with customizable parameters along with other important notes.

Lowers the robutt down and back up once. Lasts for one beat (4 slices). Author’s note: I’m sorry.
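The numbers are easy to reason about: the timeline runs at a tempo in beats per minute, beats are grouped eight to a bar, and moves are quantized to quarter-beat slices, so a one-beat move like the one above spans four slices. A back-of-the-envelope check in Python, assuming an example tempo of 120 BPM:

```python
# Back-of-the-envelope timing for a Choreographer-style timeline:
# tempo in BPM, eight beats to a bar, four slices per beat (as in the
# move reference above). The tempo value is just an example.
BPM = 120
SLICES_PER_BEAT = 4
BEATS_PER_BAR = 8

beat_s = 60.0 / BPM                   # 0.5 s per beat at 120 BPM
slice_s = beat_s / SLICES_PER_BEAT    # 0.125 s per slice
bar_s = beat_s * BEATS_PER_BAR        # 4.0 s per bar

print(f"beat: {beat_s:.3f} s  slice: {slice_s:.3f} s  bar: {bar_s:.1f} s")
# A one-beat move like the dip above occupies 4 slices, i.e. 0.5 s at this tempo.
```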

We’ve seen a lot of hackers take on the challenge of building their own quadruped robots on these pages, each full of clever mechanical design solutions that can match Spot’s kinematics. And while not all of them can match Spot’s control systems, we’re sure it’s only a matter of time before counterparts to the Choreographer application show up on GitHub. (If they already exist, please link in the comments.) Will we love robots once they can all dance? The jury is still out.

Continue reading “Start Your New Career In Robot Dance Choreography”

Robotic Fish Swarm Together Using Cameras And LEDs

Robotics has advanced in leaps and bounds over the past few decades, but when it comes to decentralized coordination, robot swarms still lag far behind their biological counterparts. Researchers from Harvard University’s Wyss Institute are working to close the gap, and have developed Blueswarm, a school of robotic fish that can exhibit swarm behavior without external centralized control.

In real fish schools, the movement of an individual fish depends on those around it. To allow each robotic fish to estimate the position of its neighbors, they are equipped with a set of three blue LEDs and a camera on each side of the body. Four oscillating fins, inspired by reef fish, provide 3D control. The actuator for each fin is simply a pivoting magnet inside a coil fed an alternating current. The onboard computer of each fish is a Raspberry Pi Zero W, and the cameras are Raspberry Pi Camera modules with wide-angle lenses. Using the position information calculated from the cameras, the school can coordinate its movements to spread out, group together, swim in a circle, or find an object and then converge on it. The full academic article is available for free if you are interested in the details.
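The paper details the actual vision pipeline, but the core idea of spotting a neighbor’s blue LEDs in a wide-angle camera frame and turning each blob’s position into a bearing can be sketched in a few lines of OpenCV. The threshold values and field of view below are assumptions, not Blueswarm’s real parameters:

```python
# Rough sketch of the LED-spotting idea: threshold for bright blue blobs
# and turn each blob's pixel offset into a bearing. Threshold values and
# the lens field of view are assumptions, not Blueswarm's actual numbers.
import cv2
import numpy as np

FOV_DEG = 160.0  # assumed horizontal field of view of the wide-angle lens

def led_bearings(frame_bgr: np.ndarray) -> list[float]:
    """Return approximate horizontal bearings (degrees) of blue LED blobs."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Keep only bright, saturated blue pixels.
    mask = cv2.inRange(hsv, (100, 150, 200), (130, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    width = frame_bgr.shape[1]
    bearings = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            cx = m["m10"] / m["m00"]                      # blob centroid, pixels
            bearings.append((cx / width - 0.5) * FOV_DEG)  # pixel offset -> angle
    return bearings
```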

Communication with light depends on the clarity of the medium it travels through, in this case water, and conditions can quickly become a limiting factor. Submarines have faced the same challenge for a long time. Two current alternative solutions are ELF radio and sound, both of which are covered in [Lewin Day]’s excellent article on underwater communications.

Continue reading “Robotic Fish Swarm Together Using Cameras And LEDs”

Reachy The Open Source Robot Says Bonjour

Humanoid robots always attract attention, but anyone who tries to build one quickly learns respect for a form factor we take for granted because we were born with it. Pollen Robotics wants to help move the field forward with Reachy: a robot platform available both as a product and as a wealth of information shared online.

This French team has released open source robots before. We’ve looked at their Poppy robot, and Reachy bears a strong family resemblance to it. Poppy was a very ambitious design with both arms and legs, but it could only ever walk with assistance. In contrast, Reachy focuses on just the upper body. One of the most interesting innovations is found in Reachy’s neck, a cleverly designed 3-DOF mechanism they call Orbita. Combined with two moving antennae at the top of the head, Reachy can emote a wide range of expressions despite not having much of a face. The remainder of Reachy’s joints are articulated with Dynamixel serial bus servos, though we see an optional Orbita-based hand attachment in the demo video (embedded below).

Reachy’s €19,990 price tag may be affordable relative to industrial robots, but it’s pretty steep for the home hacker. No need to fret: those of us with smaller bank accounts can still join the fun, because Pollen Robotics has open-sourced a lot of Reachy’s details. Digging into this information, we see Reachy has a Google Coral for accelerating TensorFlow and a Raspberry Pi 4 for general computation. Mechanical designs are released via web-based Onshape CAD. Reachy’s software suite on GitHub is primarily focused on Python, which lets us experiment within a Jupyter notebook. Simulation can be done within the Unity 3D game engine, which can optionally be compiled to run in a browser like the simulation playground. And academic robotics researchers are not excluded from the fun, as ROS1 integration is also available, though ROS2 support is still on the to-do list.
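Since most of Reachy’s joints are Dynamixel serial bus servos, the lowest layer of that Python suite is ultimately doing something along these lines with the stock dynamixel_sdk package. The port, servo ID, and control-table addresses here (X-series, Protocol 2.0) are assumptions and may not match the servo models Reachy actually ships with:

```python
# Talking to a single Dynamixel bus servo with the stock dynamixel_sdk
# package -- roughly the kind of thing Reachy's Python stack wraps per joint.
# Port, servo ID, and control-table addresses (X-series, Protocol 2.0) are
# assumptions and may not match Reachy's actual servo models.
from dynamixel_sdk import PortHandler, PacketHandler

PORT, BAUD, DXL_ID = "/dev/ttyUSB0", 1_000_000, 1
ADDR_TORQUE_ENABLE, ADDR_GOAL_POSITION = 64, 116  # X-series control table

port = PortHandler(PORT)
packet = PacketHandler(2.0)
port.openPort()
port.setBaudRate(BAUD)

packet.write1ByteTxRx(port, DXL_ID, ADDR_TORQUE_ENABLE, 1)     # torque on
packet.write4ByteTxRx(port, DXL_ID, ADDR_GOAL_POSITION, 2048)  # mid position (0-4095)
port.closePort()
```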

Reachy might not be as sophisticated as some humanoid designs we’ve seen, and without a lower body there’s no way for it to dance. But we are very appreciative of a company willing to share knowledge with the world. May it spark new ideas for the future.

[via Engadget]

Continue reading “Reachy The Open Source Robot Says Bonjour”

Baby Yoda Becomes Personable Robot

Baby Yoda has been a hit character in Disney’s The Mandalorian, but does not actually exist in real life as far as we know. Instead, [Manuel Ahumada] set about building a robotic replica, complete with artificial intelligence.  (Video, embedded below.)

The first step was to build a basic robotic simulacrum of Baby Yoda, which [Manuel] achieved by outfitting a toy with servos, motors, and a Raspberry Pi. With everything hooked up, Baby Yoda was able to move his head and arms and scoot around on wheels, all under the control of a Bluetooth gamepad. With that sorted, [Manuel] added brains in the form of a smartphone running Intel’s OpenBot machine learning platform. This allows Baby Yoda to track and follow people it sees through its smartphone camera, and potentially even navigate real-world spaces with future upgrades.
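OpenBot’s person following works by running a TensorFlow Lite person detector on the phone and steering to keep the detected bounding box centered and at a comfortable size. The Python below is only a conceptual sketch of that control loop, with made-up gains; the real thing is an Android app, not this code:

```python
# Conceptual sketch of a person-following loop like OpenBot's: keep the
# detected person's bounding box centered and stop when it looks close.
# The real OpenBot code is an Android app with a TF Lite detector; the
# gains and thresholds here are made up for illustration.
def follow_command(box_center_x: float, box_width: float,
                   frame_width: float = 640.0) -> tuple[float, float]:
    """Return (left, right) wheel commands in the range [-1, 1]."""
    error = (box_center_x - frame_width / 2) / (frame_width / 2)   # -1 .. 1
    steer = 0.8 * error                                            # turn toward the person
    forward = 0.5 if box_width < 0.4 * frame_width else 0.0        # stop when close
    left = max(-1.0, min(1.0, forward + steer))
    right = max(-1.0, min(1.0, forward - steer))
    return left, right

print(follow_command(box_center_x=480, box_width=150))  # person right of center -> veer right
```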

It’s a fun build, and we’d love to see the bot let loose at a convention to explore and make friends. We’ve covered OpenBot before, and look forward to seeing it used in more builds. Video after the break.

Continue reading “Baby Yoda Becomes Personable Robot”

Powered Exoskeletons In Rough Terrain: An Interesting Aspect Of The Chang’e 5 Recovery Mission

At this point, one would be hard-pressed to find anyone who isn’t at least aware of some of the ways exoskeletons are used by humans: from supporting people during rehabilitation, to ensuring that people working in industrial and warehouse settings do not overexert themselves, all while preventing injuries and increasing their ability to carry heavy loads without tiring.

During the recovery mission of the Chang’e 5 sample container in the rough terrain of Inner Mongolia, the crew tasked with setting up the communications center, electrical supply systems, and other essential services in the area wore exoskeletons. Developed by a relatively new Chinese company called ULS Robotics (see the embedded promotional video after the break), the powered exoskeletons allowed the crew to carry 50 kg loads at a time for a hundred meters across the rough, snowy terrain.

The obvious benefit of an exoskeleton here is that while humans are pretty good at navigating rough terrain, this ability quickly degrades the moment a heavy load is involved, as anyone who has done serious mountain trekking can probably attest. By having the exoskeleton bear most of the load, the wearer can focus on staying upright and reaching the destination quickly and safely.

With the growing interest in exoskeletons from various industries, the military, and the elderly, it probably won’t be too long before we see more of them in daily life in the coming years.

(Thanks, Qes)

Continue reading “Powered Exoskeletons In Rough Terrain: An Interesting Aspect Of The Chang’e 5 Recovery Mission”