Automate The Freight: Amazon’s Robotic Packaging Lines

In the “Automate the Freight” series, I’ve concentrated on stories that reflect my premise that the killer app for self-driving vehicles will not be private passenger cars, but the mundane yet necessary task of toting things from place to place. The economics of replacing thousands of salary-drawing, benefits-requiring humans in the logistics chain are far more compelling than the profits to be made by selling individuals a convenient and safe commuting experience. Advances made in automating deliveries will eventually trickle down to the consumer market, but it’ll be the freight carriers that drive innovation.

While I’ve concentrated on self-driving freight vehicles, there are other aspects of automating the supply chain that I’ve touched on in this series, from UAV-delivered blood and medical supplies to the potential for automating the last hundred feet of home delivery with curb-to-door robots. But automation at the other end of the supply chain holds a lot of promise too, both for advancing technology and for disrupting the entire logistics field. This time around: automated packaging lines, or how the stuff you buy online gets picked and wrapped for shipping without ever being touched by human hands.

Continue reading “Automate The Freight: Amazon’s Robotic Packaging Lines”

Robotic Cheetah Teaches A Motors Class

It seems like modern roboticists have decided to have a competition to see which group can develop the most terrifying robot ever invented. As of this writing, the leading candidate seems to be the robot that can fuel itself by “eating” organic matter. We can only hope that the engineers involved will decide not to flesh that one out completely. Anyway, if we can get past the horrifying and/or uncanny-valley situations we find ourselves in when looking at these robots, it turns out they have a lot to teach us about the theory behind complicated electric motors.

This research paper (gigantic PDF warning) focuses on the construction methods behind MIT’s cheetah robot. It has twelve degrees of freedom, driven by exceptionally low-cost modular actuators that control its four legs. Compared to other robots of this type, these actuators clear the major hurdle of cost while still retaining an impressive amount of mobility and control. The team managed to integrate a brushless motor, a smart ESC system with feedback, and a planetary gearbox into the actuator itself. That alone is worth the price of admission!
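The appeal of this quasi-direct-drive approach is that with a low gear ratio, output torque stays roughly proportional to motor current, so each leg joint can be torque-controlled without a dedicated force sensor. As a rough sketch of the idea (not the paper’s code, and with made-up constants), a joint-level PD controller reduces to a few lines:

```python
# Toy joint torque controller for a geared brushless actuator.
# All constants are illustrative, not the MIT cheetah's actual values.

KT = 0.068   # motor torque constant, N*m per amp (assumed)
GEAR = 6.0   # planetary gear reduction (assumed)
KP = 20.0    # proportional gain on joint angle error, N*m/rad
KD = 0.5     # derivative gain on joint velocity error, N*m/(rad/s)

def joint_current_command(q_des, q, qd_des, qd, tau_ff=0.0):
    """Return a motor current (A) for a desired joint position/velocity.

    PD control produces a desired joint torque; the gearbox and motor
    torque constant convert that to a motor current command, which a
    smart ESC with current feedback can regulate directly.
    """
    tau_joint = KP * (q_des - q) + KD * (qd_des - qd) + tau_ff
    tau_motor = tau_joint / GEAR    # the gearbox multiplies torque by GEAR
    return tau_motor / KT           # current sets torque in a BLDC motor

# Example: leg slightly behind its target, commanding a corrective current.
print(joint_current_command(q_des=0.3, q=0.25, qd_des=0.0, qd=0.1))
```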

The details of how they did it are well documented in the 102-page academic document, and the source code is available on GitHub if you need a motor like this for any other sort of project. If you’re here just for the cheetah doing backflips, you can also keep up with the build progress at the project’s blog page. We featured this build earlier in its history as well.

Humanizing Industrial Robots By Sticking A Jibo On Top

A great many robots exist in our modern world, and the vast majority of them are highly specialized machines. They do a job, and they do it well, but they don’t have much of a personality. [Guilherme Martins] was working on a fun project to build a robot arm that could create chocolate artworks, but it needed something to humanize it a bit more. Thankfully, Jibo was there to lend a hand.

For the uninitiated, Jibo was a companion robot produced by a startup company that later folded. Relying on the cloud meant that when the money ran out and the servers switched off, Jibo was essentially dead. [Guilherme] managed to salvage one of these units, however, and gave it a new life.

With the defunct company unable to provide an SDK, the entire brains of the robot were replaced with a LattePanda, a Windows 10 single-board computer with an integrated Arduino microcontroller. This was combined with a series of Phidgets motor drivers to control all of Jibo’s joints, and some Unity software to put the charming expressions back on the original screen.
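We don’t know exactly which Phidgets boards [Guilherme] used or how their channels map onto Jibo’s joints, but the Phidget22 Python library gives a flavor of how approachable this kind of retrofit is. A minimal sketch, with the channel assignment assumed:

```python
# Minimal sketch: spinning one of Jibo's joints with a Phidgets DC motor
# controller via the Phidget22 library (pip install Phidget22).
# Channel number and speed are assumptions, not [Guilherme]'s mapping.
import time

from Phidget22.Devices.DCMotor import DCMotor

base = DCMotor()
base.setChannel(0)                  # assumed: channel 0 = base rotation
base.openWaitForAttachment(5000)    # wait up to 5 s for the board

base.setTargetVelocity(0.3)         # duty cycle, -1.0 .. 1.0
time.sleep(1.0)                     # let the body turn for a moment
base.setTargetVelocity(0.0)         # stop

base.close()
```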

With the Jibo body mounted upon the robot arm, a simple chocolate-decorating robot now has a personality. The robot can wave to humans, and emote as it goes about its day. It’s an interesting feature to add to a project, and one that certainly makes it more fun. We’ve seen projects tackle similar subject matter before, attempting to build friendly robot pets as companions. Video after the break.

Continue reading “Humanizing Industrial Robots By Sticking A Jibo On Top”

Little Lamp To Learn Longer Leaps

Reinforcement learning is a subset of machine learning in which the machine is scored on its performance by a reward function. Over the course of a training session, behavior that improves the final score is positively reinforced, gradually building toward an optimal solution. [Dheera Venkatraman] thought it would be fun to use reinforcement learning to make a little robot lamp move. But before that can happen, he had to build the hardware and prove its basic functionality with a manual test script.
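As a toy illustration of the concept (emphatically not [Dheera]’s code), the reinforcement loop can be boiled down to a bandit problem: try hop strengths at random at first, score each attempt, and gradually favor whatever scores best:

```python
# Toy reinforcement learning: learn which hop strength travels farthest.
# The "environment" is fake physics; the real lamp would be scored on
# motion actually measured by its camera or other sensors.
import random

ACTIONS = [0.2, 0.4, 0.6, 0.8, 1.0]           # candidate hop strengths

def hop_reward(strength):
    """Fake physics: mid-strength hops go far; extremes stumble."""
    distance = strength * (1.5 - strength)     # peaks near strength 0.75
    return distance + random.gauss(0, 0.02)    # noisy measurement

value = {a: 0.0 for a in ACTIONS}              # running score estimates
counts = {a: 0 for a in ACTIONS}

for episode in range(2000):
    if random.random() < 0.1:                  # explore 10% of the time
        action = random.choice(ACTIONS)
    else:                                      # otherwise exploit the best
        action = max(ACTIONS, key=value.get)
    r = hop_reward(action)
    counts[action] += 1
    value[action] += (r - value[action]) / counts[action]  # running mean

print("learned best hop strength:", max(ACTIONS, key=value.get))
```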

The project takes its inspiration from the hopping lamp of the Pixar Animation Studios logo, and this particular form of locomotion has a few counterparts in the natural world. But hoppers of the natural world don’t take the shape of a Luxo lamp, which makes this project an interesting challenge. [Dheera] published all of his OpenSCAD files for this 3D-printed lamp so others could join in the fun. Inside the lamp head is an LED ring to illuminate where we expect a light bulb, while also leaving room in the center for a camera. The articulation servos are driven by a PCA9685 I2C PWM driver board, and he has written and released code to interface such boards with the Robot Operating System (ROS), which orchestrates the lamp’s features. This completes the underlying hardware components and associated software foundations for this robot lamp.
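If you want to wiggle the servos before any ROS plumbing is in place, a PCA9685 is easy to exercise directly. Here’s a minimal sketch using Adafruit’s CircuitPython ServoKit library, with channel numbers and pulse ranges assumed rather than taken from [Dheera]’s wiring:

```python
# Minimal sketch: sweeping two lamp joints on a PCA9685 PWM board
# (pip install adafruit-circuitpython-servokit). Channel numbers and
# pulse ranges are assumptions, not [Dheera]'s actual configuration.
import time

from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)            # a PCA9685 has 16 PWM channels

HIP, KNEE = 0, 1                       # assumed channel assignments
for ch in (HIP, KNEE):
    kit.servo[ch].set_pulse_width_range(500, 2500)  # typical hobby servo

# Crouch, then spring upward: a crude preview of a hop.
kit.servo[HIP].angle = 120
kit.servo[KNEE].angle = 40
time.sleep(0.5)
kit.servo[HIP].angle = 60
kit.servo[KNEE].angle = 140
```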

Once all the parts had been printed, the electronics wired, and everything assembled, [Dheera] hacked together a simple “Hello World” script to verify that his mechanical design was good enough to get started. The video embedded after the break was taken at OSH Park’s Bring-A-Hack afterparty for Maker Faire Bay Area 2019. This motion sequence was frantically hand-coded in 15 minutes, but these tentative baby hops will serve as a great baseline. The future hopping performance of control algorithms trained by reinforcement learning will show how far this lamp has come from its humble “Hello World” hop.

[Dheera] previously created the shadow clock and is no stranger to ROS, having written the ROS topic text visualization tool for debugging. We will be watching to see how robot Luxo evolves; hopefully it doesn’t find a way to cheat! Want to play with reinforcement learning, but prefer wheeled robots? Here are a few options.

Continue reading “Little Lamp To Learn Longer Leaps”

One-Legged Robot Does The Hop

At first, we thought this robot was like a rabbit until we realized rabbits have a 300% bonus in the leg department. SALTO, a robot from [Justin Yim], [Eric Wang], and [Ronald Fearing], has only one leg but gets around quite well, hopping from place to place. If you can’t picture it, the video below will make it very obvious.

According to the paper about SALTO, existing hopping robots require external sensors and are often tethered. SALTO is self-contained. The robot weighs a tenth of a kilogram and takes its name from the word saltatorial (adapted for leaping), which itself comes from the Latin saltare, meaning to jump or leap.

Continue reading “One-Legged Robot Does The Hop”

Use Movie Tools To Make Your Robot Move Like Movie Robots

Robots of the entertainment industry are given life by character animation, where the goal is to emotionally connect with the audience to tell a story. In comparison, real-world robot movement design focuses more on managing physical limitations like sensor accuracy and power management. Tools for robot control are thus more likely to resemble engineering control consoles than artistic character animation tools. When the goal is to build expressive physical robots, we’ll need tools like the ROBiTS project to bridge the two worlds.

As an exhibitor at Maker Faire Bay Area 2019, this group showed off their first demo: a plugin for Autodesk Maya that translates joint movements into the digital pulses controlling standard RC servos. Maya can import the same STL files fed to 3D printers, easily creating a digital representation of a robot. Animators skilled in Maya can then use all the tools they are familiar with, working in the full context of a robot’s structure in the digital world. This is a far more productive workflow for animation artists than manipulating a long, flat list of unintuitive slider controls or writing code by hand.
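The heart of that translation is simple enough to sketch: sample each joint’s angle once per frame and convert it to an RC servo pulse width. Here’s a rough illustration, assuming a Pololu Maestro-style serial servo controller and made-up joint limits; the actual ROBiTS plugin runs inside Maya, and we can’t vouch for its hardware details:

```python
# Rough sketch of the animation-to-servo translation: map a joint angle
# to an RC servo pulse and send it to a serial servo controller.
# The controller (a Pololu Maestro here), device path, and joint limits
# are assumptions, not details of the ROBiTS plugin.
import serial  # pip install pyserial

def angle_to_pulse_us(angle, lo=-90.0, hi=90.0):
    """Linearly map a joint angle (degrees) to a 500-2500 us servo pulse."""
    angle = max(lo, min(hi, angle))            # clamp to the joint's range
    frac = (angle - lo) / (hi - lo)
    return int(500 + frac * 2000)

def set_servo(port, channel, pulse_us):
    """Maestro 'set target' command: units are quarter-microseconds."""
    target = pulse_us * 4
    port.write(bytes([0x84, channel, target & 0x7F, (target >> 7) & 0x7F]))

with serial.Serial("/dev/ttyACM0", 9600) as port:  # assumed device path
    # One animation frame: joint 0 at +30 degrees, joint 1 at -10 degrees.
    set_servo(port, 0, angle_to_pulse_us(30.0))
    set_servo(port, 1, angle_to_pulse_us(-10.0))
```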

Of course, a virtual world offers some freedoms that are not available in the physical world. Real parts are not allowed to intersect, for one, and then there are other pesky physical limitations like momentum and center of gravity. Forgetting to account for them results in a robot that falls over! One of the follow-up projects on their to-do list is a bridge in the other direction: bringing physical-world sensors, like an IMU, into digital representations in Maya.

We look forward to seeing more results on their YouTube channel. They join the ranks of other animated robots at Maker Faire, and they make a promising addition to the robot animation toolbox, which spans from Disney Research’s kinetic wires to Billy Whiskers, which linked servos to Adobe Animate.

Continue reading “Use Movie Tools To Make Your Robot Move Like Movie Robots”

Bringing Battle Bots Into The Modern Classroom

With the wide array of digital entertainment that’s available to young students, it can be difficult for educators to capture their imagination. In decades past, a “volcano” made with baking soda and vinegar would’ve been enough to put a class of 5th graders on the edge of their seats, but those projects don’t pack quite the same punch for students who may have prefaced their school day with a battle royale match. Today’s educators are tasked with inspiring kids who already have the world at their fingertips.

Hoping to rise to that challenge with her entry into the 2019 Hackaday Prize, [Misty Lackie] is putting together a kit which would allow elementary and middle school students to build their very own fighting robots. Thanks to the use of modular components, younger students don’t have to get bogged down with soldering or the intricacies of how all the hardware actually works. On the other hand, older kids will be able to extend the basic platform without having to start from scratch.

The electronics for the bot consist primarily of an Arduino Uno with a Sensor Shield, a dual H-bridge motor controller, and a wireless receiver for a PS2 controller. This allows the students to control the bot’s dual drive motors with an input scheme that’s likely very familiar to them already. By mapping the controller’s face buttons to digital pins on the Arduino, additional functions, such as the spinner seen in the bot after the break, can easily be activated.
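The kit’s firmware isn’t shown here, but the core of that control loop, reading the sticks and mixing them into motor commands, is easy to sketch. Here it is in Python for clarity, with the stick range and dead zone assumed; on the real bot this logic would live in the Arduino sketch reading the PS2 receiver:

```python
# Rough sketch of tank-drive mixing for a PS2-controlled battle bot.
# Stick values, dead zone, and output range are assumptions; the kit's
# actual logic runs on an Arduino Uno reading a PS2 wireless receiver.

DEAD_ZONE = 10  # ignore small stick wobble around center

def stick_to_speed(raw):
    """Map a PS2 analog stick value (0..255, 128 = center) to -255..255."""
    offset = raw - 128
    if abs(offset) < DEAD_ZONE:
        return 0
    return max(-255, min(255, offset * 2))

def mix_tank_drive(left_stick_y, right_stick_y):
    """Each stick drives one side; an H-bridge sees sign plus PWM magnitude."""
    left = stick_to_speed(left_stick_y)
    right = stick_to_speed(right_stick_y)
    return left, right

# Example: left stick pushed forward, right stick centered -> gentle turn.
print(mix_tank_drive(200, 128))   # -> (144, 0)
```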

[Misty] has already done some test runs with an early version of the kit, and so far it’s been a huge success. Students were free to design their own bodies and add-ons for the remote-controlled platform, and it’s fascinating to see how unique the final results turned out to be. We’ve seen in the past how excited students can be when tasked with customizing their own robots, so any entry into that field is a positive development in our book.

Continue reading “Bringing Battle Bots Into The Modern Classroom”