DIY Personal Assistant Robot Hears And Sees All

Who wouldn’t want a robot that can fetch them a glass of water? [Saral Tayal] didn’t just think that, he jumped right in and built his own personal assistant robot. This isn’t just some remote-controlled rover though. The robot actually listens to his voice and recognizes his face.

The body of the robot is the common “Rover 5” platform, to which [Saral] added a number of 3D printed parts. A forklift-like sled gives the robot the ability to pick things up. Some of the parts are more about form than function – [Saral] loves NASA’s Spirit and Opportunity Mars rovers, so he added some simulated solar cells and other greebles.

The Logitech webcam up front is very functional — images are fed to machine learning models, while audio is processed to listen for commands. This robot can find and pick up 90 unique objects.

The robot’s brain is a Raspberry Pi, which runs TensorFlow for object recognition. Some of the models [Saral] is using are pretty large – so big that the Pi could only manage a couple of frames per second at 100% CPU utilization. Offloading inference to a Google Coral coprocessor sped things up considerably, while using only about 30% of the Pi’s processor.
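
The write-up doesn’t reproduce the code, but the general shape of a Coral-accelerated detection loop on a Pi is easy to sketch. A rough, minimal example, assuming the tflite_runtime package, a quantized SSD-style detector compiled for the Edge TPU, and OpenCV for grabbing webcam frames (file names and thresholds here are placeholders, not [Saral]’s actual ones):

```python
# Sketch of Coral-accelerated object detection on a Raspberry Pi.
# Assumes a quantized SSD-style model compiled for the Edge TPU; the model
# path and threshold are placeholders, not [Saral]'s actual files.
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="detector_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],  # the Coral accelerator
)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
outs = interpreter.get_output_details()
_, height, width, _ = inp["shape"]

cap = cv2.VideoCapture(0)  # the webcam up front
while True:
    ok, frame = cap.read()
    if not ok:
        break
    resized = cv2.resize(frame, (width, height))
    interpreter.set_tensor(inp["index"], np.expand_dims(resized, axis=0))
    interpreter.invoke()
    # Post-processed SSD outputs: boxes, class ids, scores (order varies by model).
    boxes = interpreter.get_tensor(outs[0]["index"])[0]
    classes = interpreter.get_tensor(outs[1]["index"])[0]
    scores = interpreter.get_tensor(outs[2]["index"])[0]
    for box, cls, score in zip(boxes, classes, scores):
        if score > 0.5:
            print(f"object {int(cls)} at {box}, score {score:.2f}")
```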

It takes several motors to drive the robot’s tracks and sled. These are handled by two RoboClaw motor controllers, which are themselves commanded by the Pi.
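
The motor control code isn’t shown in the write-up, but talking to a RoboClaw from a Pi can be as simple as writing bytes to a serial port. A hypothetical sketch, assuming the controllers are set to their standard (simple) serial mode at 38400 baud, where one byte sets each channel’s speed; check the RoboClaw manual for the exact values your unit expects:

```python
# Hypothetical sketch of driving one RoboClaw channel pair over simple serial.
# Assumes standard serial mode at 38400 baud; the byte mapping (64 = M1 stop,
# 192 = M2 stop) follows the RoboClaw manual, but verify it for your firmware.
import time
import serial

rc = serial.Serial("/dev/serial0", 38400, timeout=0.1)

def drive(m1, m2):
    """m1/m2 are speeds in -1.0 .. 1.0 for the two motor channels."""
    m1 = max(-1.0, min(1.0, m1))
    m2 = max(-1.0, min(1.0, m2))
    rc.write(bytes([int(64 + 63 * m1), int(192 + 63 * m2)]))

drive(0.5, 0.5)    # both tracks forward at half speed
time.sleep(1.0)
drive(0.0, 0.0)    # stop
rc.close()
```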

We’ve seen quite a few mobile robot rovers over the years, but [Saral’s] ‘bot is one of the most functional designs out there. Even better is the fact that it is completely open source. You can find the code and 3D models on his GitHub repo.

Check out a video of the personal assistant rover in action after the break.

Continue reading “DIY Personal Assistant Robot Hears And Sees All”

Etch-A-Selfie

Taking a selfie before the modern smartphone era was a true endeavor. Flip phones didn’t have forward-facing cameras, and if you wanted to go really far back to the days of film cameras, you needed to set a timer on your camera and hope, or use a physical remote shutter. You could also try to create a self-portrait on an Etch a Sketch, but that would take a lot of time and artistic skill. Luckily, in the modern world we can bring some of this old technology into the future and add a robot to create interesting retro selfies – without needing to be an artist.

The device from [im-pro] attaches two servos to the Etch a Sketch knobs. This isn’t really a new idea in itself, but the device also includes a front-facing camera, taking advantage of particularly inexpensive ESP32 Camera modules. Combining the camera features with [Bart Dring]’s ESP32 Grbl port is a winner. Check the code in [im-pro]’s GitHub.

Once the picture is taken, the ESP32 at the heart of the build handles the image processing and then draws the image on the Etch a Sketch. The robot needs a black-and-white image to draw, plus an algorithm to trace it without “lifting” the drawing tool, and these tasks stretch the capabilities of such a small processor. It takes some time to work, but in the end the results speak for themselves.
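
Neither the algorithm nor [im-pro]’s firmware is reproduced here, but the core problem, turning a photo into one continuous line the knobs can trace, is easy to prototype on a desktop. A rough sketch, assuming OpenCV for edge detection, a greedy nearest-neighbour ordering of the edge pixels, and Grbl-style G-code output for the two knob axes:

```python
# Desktop prototype of the Etch a Sketch drawing problem: find edge pixels,
# then visit them greedily so the "pen" never lifts. An illustration only,
# not [im-pro]'s ESP32 implementation.
import cv2
import numpy as np

img = cv2.imread("selfie.jpg", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(img, 100, 200)                 # black-and-white line image
points = np.column_stack(np.nonzero(edges))      # (row, col) of every edge pixel
points = points[:: max(1, len(points) // 2000)]  # thin the point cloud

# Greedy nearest-neighbour ordering: always move to the closest unvisited point.
path = [points[0].astype(float)]
remaining = points[1:].astype(float)
while len(remaining):
    dists = np.linalg.norm(remaining - path[-1], axis=1)
    i = int(np.argmin(dists))
    path.append(remaining[i])
    remaining = np.delete(remaining, i, axis=0)

# Emit simple G-code for the two Grbl-driven knob axes (scaling is arbitrary).
with open("selfie.gcode", "w") as f:
    f.write("G21 G90\n")                         # millimetres, absolute moves
    for r, c in path:
        f.write(f"G1 X{c / 10:.2f} Y{r / 10:.2f} F1000\n")
```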

The final project is definitely worth a look, if not for the interesting ESP32-controlled robot then for the image processing algorithm implementation. The ESP32 is a truly versatile platform, though, and is useful for building almost anything.

Continue reading “Etch-A-Selfie”

Punch The World With A Raspberry Pi

Robots have certainly made the world a better place. Virtually everything from automobile assembly to food production uses a robot at some point in the process, not to mention those robots that can clean your house or make your morning coffee. But not every robot needs such a productive purpose. This one allows you to punch the world, which while not producing as much physical value as a welding robot in an assembly line might, certainly seems to have some therapeutic effects at least.

The IoT Planet Puncher comes to us from [8BitsAndAByte], who build lots of different things of equally dubious function. This one allows us to release our frustration on the world by punching it (or rather, a small model of it). A small painted sphere sits in front of a 3D-printed boxing glove mounted on a linear actuator, which is driven by a Raspberry Pi. The Pi’s job doesn’t end there, though, as the project also uses a Pi camera to take video of the globe and serve it on a webpage through which anyone can control the punching glove.
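
The write-up doesn’t go into the web plumbing, but the control side of a build like this can be tiny. A minimal sketch, assuming a Flask server and a gpiozero Motor object driving the actuator through a motor driver on hypothetical GPIO pins (the camera stream is left out):

```python
# Minimal sketch of an internet-facing punch button: a Flask endpoint that
# extends and retracts a linear actuator via a motor driver. Pin numbers are
# placeholders, not those of the original build.
from time import sleep
from flask import Flask
from gpiozero import Motor

actuator = Motor(forward=17, backward=27)  # hypothetical driver pins
app = Flask(__name__)

@app.route("/punch", methods=["POST"])
def punch():
    actuator.forward()   # extend the boxing glove
    sleep(0.5)
    actuator.backward()  # pull it back
    sleep(0.5)
    actuator.stop()
    return "POW!\n"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```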

While it’s not immediately useful, we certainly had fun punching it a few times, and at one point a mysterious hand entered the shot to make adjustments to the system. Projects like this are good fun, and sometimes you just need to build something, even if it’s goofy, because the urge strikes you.

Continue reading “Punch The World With A Raspberry Pi”

A 3D-Printable Mecanum Wheeled Robot Platform

If your interest lies in robotics, there is a multitude of different platforms for you to build. [Teemu Laurila] was frustrated with what was on offer, so he designed his own, with four-wheel double-wishbone suspension and mecanum wheels for maximum flexibility.

It’s a design that has been through multiple revisions since its first iteration in 2015, and it’s clear some thought has gone into it along the way. The double-wishbone suspension is angled for high ground clearance and fully sprung, and drive comes from a small motor/gearbox at each axle. The chassis, meanwhile, has plenty of space for a single-board computer, and has been designed specifically with the BeagleBone Black in mind.
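
The project page is the place for the details, but every mecanum platform shares the same wheel-speed mixing: each wheel’s angular velocity is a signed combination of the desired forward, sideways, and rotational velocities. A quick sketch of the standard inverse kinematics, with the caveat that the signs depend on how the rollers are oriented on your particular wheels:

```python
# Standard mecanum inverse kinematics: body velocities in, four wheel angular
# velocities out. Sign conventions vary with roller orientation, so treat the
# signs below as a starting point, not gospel. Dimensions are placeholders.

def mecanum_wheel_speeds(vx, vy, wz, lx=0.10, ly=0.12, r=0.03):
    """vx: forward m/s, vy: leftward m/s, wz: yaw rad/s.
    lx/ly: half the wheelbase/track width in metres, r: wheel radius."""
    k = lx + ly
    front_left  = (vx - vy - k * wz) / r
    front_right = (vx + vy + k * wz) / r
    rear_left   = (vx + vy - k * wz) / r
    rear_right  = (vx - vy + k * wz) / r
    return front_left, front_right, rear_left, rear_right

# Pure sideways translation: the two diagonal wheel pairs spin against each other.
print(mecanum_wheel_speeds(0.0, 0.2, 0.0))
```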

This build isn’t fully DIY, as the mecanum wheels appear to be off-the-shelf items, but the rest of the project makes up for this. If you need to make your own, it’s hardly as though there aren’t any projects from which you can borrow components.

Continue reading “A 3D-Printable Mecanum Wheeled Robot Platform”

Robot Harvesting Machine Is Tip Of The Agri-Tech Iceberg

Harvesting delicate fruit and vegetables with robots is hard, and increasingly we humans no longer want to do these jobs ourselves. The pressure to find engineering solutions is intense, and more and more machines of different shapes and sizes have been emerging in an attempt to alleviate the problem. Additionally, crops often differ considerably from one another, so a strawberry-picking machine, for example, cannot be used for harvesting lettuce.

A team from Cambridge University in the UK recently published the details of their lettuce-picking machine, written in a nice, easy-to-read style and packed full of useful practical information. Well worth a read!

The machine uses YOLOv3 detection and classification networks to get localisation coordinates for each lettuce and then check whether it’s ready for harvest or diseased. A standard UR10 robotic arm then positions the harvesting mechanism over the plant, using force feedback through the arm joints to detect when it hits the ground. A pneumatically actuated cutting blade then attempts to cut the lettuce at exactly the right height below the head in order to satisfy the very exacting requirements of the supermarkets.
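
The paper has all the detail, but if you want to experiment yourself, running a YOLOv3 network over a camera frame is straightforward with OpenCV’s DNN module. A rough sketch, with placeholder config, weights, and test image standing in for the team’s trained model:

```python
# Rough sketch of YOLOv3 inference with OpenCV's DNN module. The config,
# weights, and image are placeholders, not the Cambridge team's model.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
out_layers = net.getUnconnectedOutLayersNames()

frame = cv2.imread("field.jpg")          # a saved frame standing in for a live camera
h, w = frame.shape[:2]
blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(out_layers)

# Each detection row is [cx, cy, width, height, objectness, class scores...].
for out in outputs:
    for det in out:
        scores = det[5:]
        class_id = int(np.argmax(scores))
        confidence = float(scores[class_id])
        if confidence > 0.5:
            cx, cy = int(det[0] * w), int(det[1] * h)
            print(f"class {class_id} at pixel ({cx}, {cy}), confidence {confidence:.2f}")
```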

Rather strangely, the main control hardware is just a standard laptop, which handles two consumer-grade USB cameras with a combined detection and classification time of about 0.212 seconds. The software runs on ROS (Robot Operating System), with custom nodes written in Python by members of the team.
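
For a feel of how that hangs together, a ROS node is just a Python script that registers with the master and publishes or subscribes to topics. A bare-bones illustration (not the team’s actual code) that publishes a detected crop position for another node, such as the arm controller, to consume:

```python
#!/usr/bin/env python
# Bare-bones ROS node sketch: publish a detected lettuce position on a topic.
# The topic name and message contents are illustrative, not the team's stack.
import rospy
from geometry_msgs.msg import PointStamped

def main():
    rospy.init_node("lettuce_detector")
    pub = rospy.Publisher("/lettuce/position", PointStamped, queue_size=10)
    rate = rospy.Rate(5)  # roughly matching the ~0.2 s detection time above
    while not rospy.is_shutdown():
        msg = PointStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "camera"
        msg.point.x, msg.point.y, msg.point.z = 0.42, -0.10, 0.55  # placeholder
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    main()
```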

Although the machine is slow and underpowered, we were impressed by how well it seems to work. This particular project has been ongoing for several years now, and the machine has been rebuilt 16 times! These types of machines are currently (2019) very much in their infancy, and we can expect to see many more attempts at cracking these difficult engineering tasks over the next few years.

We’ve covered some solutions before, including: Weedinator, an autonomous farming ‘bot, MoAgriS, an indoor farming rig, a laser-firing fish-lice remover, an Aussie farming robot, and of course the latest and greatest from FarmBot.

Video after the break:

Continue reading “Robot Harvesting Machine Is Tip Of The Agri-Tech Iceberg”

Fish Hooks Embedded In Robot Toes Make Them Climb Like Cockroaches

Take a dozen or so fish hooks, progressively embed them in plastic with a 3D printer, attach them to the feet of your hexapod, and you’ve got a giant cockroach!

Fish hooks embedded in 3D-printed robot feet

A team of researchers at Carnegie Mellon University came up with this ingenious hack, which can easily be copied by anybody with a hexapod and a 3D printer. Here you can see the hooks embedded into the ends of a leg. This ‘microspine’ technology enables their T-RHex robot to climb walls at a slightly underwhelming 55 degrees, but also grants it the ability to cling to severe overhangs.

Our interpretation of these results is that the robot needs to release and place each foot in a much more controlled manner to stop it from falling backwards. The researchers do have plans to improve on that behavior in the near future, though:

Sensing and Closed Loop Control: As of now, T-RHex moves with an entirely open-loop, scripted gait. We believe that performance can be improved by adding torque sensing to the leg and tail actuators, which would allow the robot to adapt to large-scale surface irregularities in the wall, detect leg slip before catastrophic detachment, and automatically use the tail to balance during wall climbs. This design path would require a platform overhaul, but offers a promising controls-based solution to the shortcomings of our gait design.

No doubt we will all now want to build cockroaches that will outperform the T-RHex. Embedding the fish hooks into the plastic is done one at a time: during fabrication, the printer is paused and a hook is carefully laid down by hand, then the printer is started up again and another layer of plastic is laid down to fully encapsulate the hook. Repeat again and again!
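
If you’d rather not babysit the printer and pause it by hand, most firmwares will park on an M0 command, so the pauses can be baked into the sliced file ahead of time. A small sketch, assuming Cura-style “;LAYER:” comments in the G-code and a firmware that honours M0 and M117:

```python
# Sketch: insert a pause (M0) before chosen layers of a sliced G-code file so
# a fish hook can be laid in by hand before resuming. Assumes the slicer emits
# Cura-style ";LAYER:<n>" comments and the firmware honours M0 and M117.
PAUSE_LAYERS = {12, 18, 24}   # layers where hooks get embedded (placeholders)

with open("foot.gcode") as src, open("foot_with_pauses.gcode", "w") as dst:
    for line in src:
        if line.startswith(";LAYER:") and int(line.split(":")[1]) in PAUSE_LAYERS:
            dst.write("M117 Place hook now\n")   # message on the printer's LCD
            dst.write("M0\n")                    # wait for the user to resume
        dst.write(line)
```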

Your robot would need the aforementioned sensing and closed-loop control, as well as the ‘normal’ array of sensors and cameras for autonomy and the ability to assess the terrain ahead. Good luck, and don’t forget to post about your projects (check out Hackaday.io if you need somewhere to do this) and tip us off! We’ve seen plenty of sometimes-terrifying hexapod projects, but watch out that the project budget doesn’t get totally out of control (more to be said about this in the future).

Continue reading “Fish Hooks Embedded In Robot Toes Make Them Climb Like Cockroaches”

A Better Motor For Chickenwalkers

The last decade or so has seen remarkable advances in motor technology for robotics and hobby applications. We’re no longer stuck with crappy brushed motors, and now we have fancy (and cheap!) stepper motors, brushless motors for drones, and servo motors. This has led to some incredible achievements; drones are only barely possible with brushed motors, and you can’t build a robot without encoders.

For his entry into the Hackaday Prize, [Gabrael Levine] is taking on one of the hardest robotics challenges around: the bipedal robot. It’s a chickenwalker, or an AT-ST; either way, you need a lot of power in a very small space, and that’s where the OpenTorque Actuator comes in. It’s a quasi-direct-drive actuator of the kind originally pioneered by the MIT Biomimetics Lab.

The key feature of the OpenTorque Actuator is its combination of a big brushless motor, a rotation encoder, and a small 8:1 planetary gear set. The low gear ratio keeps the actuator backdrivable and capable of force sensing and open-loop control, and because the actuator is 3D printed, it’s really cheap to produce.
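
The appeal of the quasi-direct-drive approach is easy to put numbers on: output torque scales with the gear ratio, but the reflected rotor inertia (and so how hard the joint fights being backdriven) scales with the ratio squared. A back-of-the-envelope sketch with made-up motor figures, not the actual OpenTorque specifications:

```python
# Back-of-the-envelope quasi-direct-drive numbers. All motor parameters are
# illustrative placeholders, not the OpenTorque Actuator's real specification.
gear_ratio    = 8.0      # the small planetary stage
kt            = 0.07     # motor torque constant, N*m per amp (assumed)
peak_current  = 30.0     # amps the driver can deliver (assumed)
rotor_inertia = 1.2e-5   # rotor inertia in kg*m^2 (assumed)
efficiency    = 0.9      # planetary gear efficiency (assumed)

output_torque     = kt * peak_current * gear_ratio * efficiency   # ~15 N*m
reflected_inertia = rotor_inertia * gear_ratio ** 2               # ~7.7e-4 kg*m^2

print(f"peak output torque ~ {output_torque:.1f} N*m")
print(f"reflected inertia  ~ {reflected_inertia:.2e} kg*m^2")
# Torque grows with N while reflected inertia grows with N^2, so keeping the
# ratio small (8:1 here) is what keeps the joint backdrivable and able to
# estimate forces from motor current instead of a dedicated force sensor.
```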

But a motor without a chassis is nothing, and that’s where the Blackbird Bipedal Robot comes in. In keeping with best practice in robot design, the kinematics are being tested in simulation first, with the mechanical build happening in parallel. That means there are some great videos of this chickenwalker strutting around (available below), and so far everything looks great. The robot can walk, turn, and yaw, and work continues on getting this bird-legged bot to stand still.