Universal Robots Vision-Based LEGO Stacker

[Thomas Kølbæk Jespersen] and his classmates at Aalborg University’s Robot Vision course used MATLAB code and URscript to program a Universal Robots UR5 to stack up Duplo bricks. The bricks are stacked into low-fi Simpsons characters — yellow for Homer’s head, white for his shirt, and blue for his pants, for example.

The bricks are scattered randomly on a nearby table, while a camera mounted above the table scans the bricks and assists in determining the location, color, and orientation of the elements. This involves blob analysis, which helps the computer decide which pixels are part of a brick and which aren’t. Running a recursive grassfire algorithm with 4-connectivity, the computer gives each foreground pixel a label and assigns it to a blob.
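
The grassfire pass is simple enough to sketch. Here’s a minimal Python version of the idea (the course code itself is MATLAB; see the GitHub link below), with the recursion swapped for an explicit stack since a big blob can easily blow past recursion depth limits:

```python
import numpy as np

def label_blobs(mask):
    """Grassfire blob labeling with 4-connectivity.

    mask: 2D boolean array, True where a pixel looks like brick.
    Returns an integer array of the same shape: 0 is background,
    1..N are blob labels.
    """
    labels = np.zeros(mask.shape, dtype=int)
    h, w = mask.shape
    current = 0

    for r in range(h):
        for c in range(w):
            if mask[r, c] and labels[r, c] == 0:
                current += 1
                # "Set fire" to the seed pixel and let it spread to
                # its 4-connected neighbours (up, down, left, right).
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x] \
                            and labels[y, x] == 0:
                        labels[y, x] = current
                        stack += [(y - 1, x), (y + 1, x),
                                  (y, x - 1), (y, x + 1)]
    return labels
```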

To determine the orientation (the bricks are all assumed to be stud-side up and not overlapping), the blob is divided into quadrants, and within each quadrant the distance between the center of the blob and its farthest pixel is measured. This technique is not likely to work as well with a brick that isn’t square. Each brick’s location in pixels is then translated into Cartesian coordinates, making it a cinch for the robot to pick it up. See [Thomas]’s GitHub for the MATLAB and URscript code.
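
In code, the quadrant trick might look something like the Python sketch below (again an illustration of the idea, not the team’s MATLAB): the farthest pixel from the centroid in each quadrant should be a corner of the square brick, and since a square’s corner angles repeat every 90 degrees, a circular mean recovers the rotation.

```python
import numpy as np

def brick_orientation(labels, blob_id):
    """Estimate a square brick's rotation, in degrees, from its blob."""
    ys, xs = np.nonzero(labels == blob_id)
    cy, cx = ys.mean(), xs.mean()          # blob centroid
    dy, dx = ys - cy, xs - cx
    dist = np.hypot(dy, dx)
    ang = np.arctan2(dy, dx)               # per-pixel angle, -pi..pi

    # Farthest pixel in each of the four quadrants around the
    # centroid; for a square blob these should be the four corners.
    corners = []
    for q in range(4):
        lo = -np.pi + q * np.pi / 2
        in_q = (ang >= lo) & (ang < lo + np.pi / 2)
        if in_q.any():
            corners.append(ang[np.argmax(np.where(in_q, dist, -1.0))])

    # Corner angles repeat every 90 degrees, so average them with a
    # circular mean of period pi/2 to dodge the wraparound problem.
    z = np.exp(4j * np.array(corners)).mean()
    return np.degrees(np.angle(z) / 4) % 90
```

Getting from the centroid in pixels to robot coordinates is then a calibrated transform — for a fixed camera looking straight down at a flat table, little more than a scale and an offset.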

Looking for more UR5 projects? Check out the Sewbo garment-making robot we published last year.

Continue reading “Universal Robots Vision-Based LEGO Stacker”

The Internet Of Interactive Cats

[Tuco] is a cat who shares the space of [Micah Elizabeth Scott]. He is a large tabby tomcat, and he is polydactyl, which is to say he has a congenital excess of toes. He is an extremely active and engaging creature and enjoys playing and interacting with her. We covet [Tuco].

Sadly for the rest of us who love cats, of course, unless we know [Micah] personally we’ll never have the opportunity to play with [Tuco]. She appreciates the cat-shaped void that this will leave in our lives, and to help us she’s building a telepresence robot to allow the rest of us to interact with him in real time.

Her idea is to make a flying robot equipped with a camera on a gimbal, but because mounting it on a multirotor platform would be a hazard, she’s instead making something closer to the aerial cameras you might be familiar with from sporting fixtures: a motorised platform suspended from the corners of her roof space on a set of nylon ropes, which can move at will by adjusting the length of each tether. It is suggested that one day the device will be able to launch plastic bolts for [Tuco] to chase, and will incorporate other interactive features to allow online users to engage with him.
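
The geometry of such a cable robot is pleasingly simple: to park the camera at a given point, each winch just pays out rope equal to the straight-line distance from its roof anchor to that point. A Python sketch of that inverse kinematics, with made-up anchor coordinates (this isn’t [Micah]’s code):

```python
import numpy as np

# Hypothetical anchor positions: the four roof corners, in meters.
ANCHORS = np.array([
    [0.0, 0.0, 3.0],
    [6.0, 0.0, 3.0],
    [6.0, 4.0, 3.0],
    [0.0, 4.0, 3.0],
])

def tether_lengths(target):
    """Rope length for each winch: the Euclidean distance from its
    roof anchor to the desired camera position."""
    target = np.asarray(target, dtype=float)
    return np.linalg.norm(ANCHORS - target, axis=1)

# Move the camera to the middle of the room, 1.5 m off the floor.
print(tether_lengths([3.0, 2.0, 1.5]))
```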

The video introducing the project, which we’ve placed below the break, shows the progress so far: she has completed a prototype windlass mechanism and has worked on reverse engineering the gimbal mechanism for serial control. We’ll probably never meet [Tuco] in person, but we can’t wait to interact with him online.

Continue reading “The Internet Of Interactive Cats”

Sony Unveils Swarm Robots For Kids

Sony recently unveiled Toio, an educational robotics toy for young programmers. We all know Sony as an electronics giant, but they do dabble in robotics from time to time. The AIBO dog family is probably their most famous creation, though there is also QRIO, a bipedal humanoid, and on the stranger side, the Rolly.

Toio consists of two small cube robots which roll around the desktop. You can control them with handheld rings, or run programs on them. The robots are charged by a base station, which also has a cartridge slot. Sony is marketing this as an ecosystem that can be expanded by buying packs which consist of accessories and a software cartridge. It looks like the cartridge is yet another proprietary memory card format. Is Sony ever going to learn?

There isn’t much hard information on Toio yet. We know it will be released in Japan on December 1st and will cost around ¥20,000, or about 200 USD. No word yet on a worldwide release.

The striking thing about this kit is how well the two robots know each other’s position. Tape on a paper pair of pants, and they “walk” like two feet. Attach a paper linkage between them, and they turn in perfect sync, like two gears. Add some paper strips, and the two robots work together to form a gripper. We can only guess that Sony is using cameras on the bottom of each robot to determine position — possibly with the aid of an encoded work surface, similar to Anoto paper. Whatever technology it is, here’s hoping Sony puts out an SDK for researchers and hackers to get in on the fun with these little robots.
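
If the encoded-surface guess is right, the usual trick is a pattern in which any small window of marks is globally unique, so a single camera glimpse yields an absolute position. Here’s a one-dimensional toy version of that principle in Python — purely our speculation, nothing Sony has confirmed — built on a De Bruijn sequence:

```python
def de_bruijn(k, n):
    """Standard De Bruijn sequence: every length-n word over an
    alphabet of size k appears exactly once (cyclically)."""
    a = [0] * k * n
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

code = de_bruijn(2, 4)           # 16 bits; every 4-bit window is unique
windows = {tuple(code[i:i + 4]): i for i in range(len(code) - 3)}

def locate(reading):
    """Absolute position from a single 4-cell 'camera' reading."""
    return windows[tuple(reading)]

print(locate(code[5:9]))         # -> 5
```

Anoto-style paper plays the same game in two dimensions, with a grid of slightly offset dots instead of a bit string.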

Continue reading “Sony Unveils Swarm Robots For Kids”

Robotic Arms Controlled By Your….. Feet?

The days of the third hand’s dominance of workshops the world over are coming to an end. For those moments when even a third hand is not enough, a fourth is there to save the day.

Dubbed MetaLimbs and developed by a team from the [Inami Hiyama Laboratory] at the University of Tokyo and the [Graduate School of Media Design] at Keio University, the device is designed to be worn while sitting — strapped to your back like a knapsack — but use while standing is possible, if perhaps a little unintuitive. Basic motion is controlled by the position of the leg — specifically, by sensors attached to the foot and knee — and flexing one’s toes actuates the robotic hand’s fingers. There’s even some haptic feedback built in to assist anyone who isn’t used to using their legs as arms.
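
The published material doesn’t include control code, but the mapping described above is easy to imagine. Here’s a purely hypothetical Python sketch (sensor names, ranges, and gains are all invented for illustration) of leg pose driving the arm joints and toe pressure driving the grip:

```python
MAX_TOE_PRESSURE = 1023.0   # e.g. a 10-bit force-sensitive resistor

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def leg_to_arm(knee_angle_deg, foot_pitch_deg, toe_pressure):
    """Map leg-tracker readings onto robot-arm setpoints.

    Returns (shoulder_deg, elbow_deg, grip), with grip in 0..1.
    """
    shoulder = clamp(90 + 0.8 * knee_angle_deg, 0, 180)   # knee -> shoulder
    elbow = clamp(90 + 1.2 * foot_pitch_deg, 0, 180)      # foot -> elbow
    grip = clamp(toe_pressure / MAX_TOE_PRESSURE, 0, 1)   # toes -> fingers
    return shoulder, elbow, grip
```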

The team touts the option of customizable hands, though a soldering iron attachment may not be as precise as needed at this stage. Still, it would be nice to be able to chug your coffee without interrupting your work.

Continue reading “Robotic Arms Controlled By Your….. Feet?”

Self-Driving RC Cars With TensorFlow; Raspberry Pi Or MacBook Onboard

You might think that you do not have what it takes to build a self-driving car, but you’re wrong. The mistake you’ve made is assuming that you’ll be controlling a two-ton death machine. Instead, you can give it a shot without the danger and on a relatively light budget. [Otavio] and [Will] got into self-driving vehicles using radio controlled (RC) cars.

[Otavio] slapped a MacBook Pro on an RC car to do the heavy lifting and called it carputer. The computer reads Hall effect sensor data from the motor to establish distance traveled (which can also be used to calculate speed) and watches the stream from a webcam perched on the chassis. These two sources are fed into a neural network using TensorFlow. You train the system by driving the vehicle manually through the course a few times, then let it drive itself.
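
As a rough sketch of that pipeline (not [Otavio]’s actual carputer code), a behavioral-cloning model in Keras/TensorFlow can be surprisingly small: camera frames in, a steering value out, trained on the human demonstration laps.

```python
import tensorflow as tf

# Frames in, steering out: a small convolutional regression network.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=(120, 160, 3)),
    tf.keras.layers.Conv2D(24, 5, strides=2, activation="relu"),
    tf.keras.layers.Conv2D(36, 5, strides=2, activation="relu"),
    tf.keras.layers.Conv2D(48, 3, strides=2, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="tanh"),   # steering: -1..1
])
model.compile(optimizer="adam", loss="mse")

# frames: webcam images recorded during the manual laps; steering:
# the human driver's inputs at those moments.  (Placeholder names.)
# model.fit(frames, steering, epochs=10)
```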

In the video interview below, you get a look at the car and [Otavio] gives commentary on how the system works as we watch playback of a few races, including the Sparkfun 2016 Autonomous Vehicle Competition. I apologize for the poor audio: they lost the booth lottery and were next door to an incredibly noisy robot band (video proof), so we were basically shouting at each other. But I think you’ll agree it’s worth it to get a look at the races. Continue reading “Self-Driving RC Cars With TensorFlow; Raspberry Pi Or MacBook Onboard”

“You Had One Job”, Bot

Only a Human would understand the pithy sarcasm in “You had one job”. When [tterev3]’s RopeBot the Robot became sentient and asked “What is my purpose?”, [tterev3] had to lay it out for him quite bluntly – “You cut the rope”. He designed RopeBot (YouTube video embedded below) for one job only – single mission, single use.

A couple of years back, [tterev3] had put up some thick ropes for a low ropes course in his backyard. Over time the trees grew, and the ropes became embedded in the tree trunks. Instead of risking life and limb trying to cut them down himself, he designed RopeBot to do the job for him.

The bot is built from scavenged electronics and custom 3D printed parts. A geared motor driving a large cogged pulley, helped by two smaller idler wheels, lets the bot scurry up and down the rope. A second geared motor drives a cam reciprocating mechanism, similar to those in industrial metal cutting saws, with a common utility knife as the business end that slices through the rope. A radio receiver and controller form the brains of the bot, driving the two motors through a motor driver board. The remote controller, assembled on a piece of foam, has three switches: Up, Down, and Cut. Everything is held together on the 3D printed frame with a generous helping of zip ties, and rubber bands provide spring tension where needed. When the rope has been cut, RopeBot comes down for a smashing end. It might not look fancy, but it gets the job done. We spy some real ball bearings on the three pulleys, meaning [tterev3] didn’t skimp on good design just because it’s a disposable robot; he obviously spent a fair amount of time and effort designing RopeBot.

Once the job is done, most of the electronics and hardware can be recovered and used again, while the 3D printed parts can be recycled, making this a really cost-effective way of handling the problem. Like the Disposable Drones we covered earlier, these kinds of “use and discard” robots not only make life easier for Humans, but also keep the economic and ecological impact low.

Continue reading ““You Had One Job”, Bot”

Soft-legged Robot Handles Rough Terrain With Ease

Whether it’s wheels, tracks, feet, or even a roly-poly body like BB-8, most robots have to deal with an essential problem: dirt and grit can get into the moving bits and cause problems. Some researchers from UCSD have come up with a clever way around this: pneumatically actuated soft-legged robots that adapt to rough terrain.

At a top speed of 20 mm per second, [Michael Tolley]’s squishy little robot won’t set any land speed records. But for applications like search and rescue or placing sensors in inhospitable or inaccessible locations, slow and steady might just win the race. The quadrupedal robot’s running gear can be completely 3D-printed on any commercial printer capable of using a soft filament. The legs each contain three parallel air chambers within a bellowed outer skin; alternating how the chambers are inflated controls how they move. The soft legs adapt to unstructured terrain and are completely sealed, eliminating intrusion problems. The video below shows how the bot gets around just fine over rocks and sand.
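
The control details live behind the link, but the basic scheme described, alternating which of the three chambers in each leg is pressurized to carry it through a step cycle, can be sketched like this (the valve interface and phase offsets are invented for illustration):

```python
import itertools
import time

# One step cycle for a single leg: which of its three chambers to
# pressurize at each phase (1 = inflated, 0 = vented).
GAIT = [
    (1, 0, 0),   # bend the leg forward
    (0, 1, 0),   # push down and back
    (0, 0, 1),   # lift and swing to recover
]

def walk(set_valves, period_s=1.0):
    """Drive four legs through the cycle, with one diagonal pair
    offset in phase so the robot always has support on the ground.

    set_valves(leg, pattern) is the caller's valve-driver callback.
    """
    for step in itertools.cycle(range(len(GAIT))):
        for leg in range(4):
            offset = 0 if leg in (0, 3) else len(GAIT) // 2
            set_valves(leg, GAIT[(step + offset) % len(GAIT)])
        time.sleep(period_s / len(GAIT))
```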

The legs remind us a little of our own [Joshua Vazquez]’s tentacle mechanism, but with fewer parts. Right now the soft robot is tethered to its air supply, but the team is working on a miniaturized pump to make the whole thing mobile. At which point, we bet it’ll even be able to swim.

Continue reading “Soft-legged Robot Handles Rough Terrain With Ease”