Flexible PCBs Make The Fins Of This Robotic Fish

We love a little outside-the-box thinking around here, and anytime we see robots that don’t use wheels and motors to do the moving, we take notice. So when a project touting robotic fish using soft-actuator fins crossed the tip line, we had to take a look.

It turns out that this robofish comes from the fertile mind of [Carl Bugeja], whose PCB motors and flexible actuators have been covered here before. The basic concept of these fish fins is derived from the latter project, which uses coils printed onto both sides of a flexible Kapton substrate. Positioned near a magnet, the actuators bend when a current runs through them. The video below shows two prototype robofish, each with four fins. The first is a scrap of foam with a magnet embedded; the fins did flap but the whole thing just weighed too much. Version two was much lighter and almost worked, but the tether to the driver is just too stiff to allow it to really flex its fins.
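The physics behind those flapping fins boils down to the Lorentz force on the coil traces. Here's a back-of-the-envelope sketch; every number in it is an illustrative assumption, not a measurement from [Carl]'s build:

```python
# A planar coil near a magnet behaves roughly like N turns of trace carrying
# current I through field B, giving a force of about F = N * I * L * B on the
# active trace length L. All values below are made-up ballpark figures.

def coil_force(turns, current_a, trace_len_m, field_t):
    """Approximate Lorentz force (newtons) on a planar coil's active traces."""
    return turns * current_a * trace_len_m * field_t

# Assumed: 20 turns, 0.25 A drive current, 10 mm of active trace per turn,
# sitting in the ~0.1 T fringe field of a small neodymium magnet.
force_n = coil_force(turns=20, current_a=0.25, trace_len_m=0.010, field_t=0.1)
print(f"{force_n * 1000:.1f} mN")
```

A few millinewtons is not much, which is consistent with version one's problem: the foam-and-magnet body simply weighed too much for the fins to move it.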

It looks like it has promise, though, and we’re excited to see where [Carl] takes this. Perhaps schools of tiny robofish patrolling for pollution?

Continue reading “Flexible PCBs Make The Fins Of This Robotic Fish”

Wood Shines In This SCARA Robotic Arm Project

[igarrido] has shared a project that’s been in the works for a long time now: a wooden desktop robotic arm, named Virk I. The wood is Australian Blackwood and looks gorgeous. [igarrido] is clear that it is a side project, but he has decided to try producing a small run of eight units to gauge interest in the design. He has been busy cutting the parts and assembling them in his spare time.

Besides the beautifully finished wood, some of the interesting elements include hollow rotary joints, which mean less cable clutter and a much tidier assembly. 3D printer electronics are a common go-to for CNC designs, and the Virk I is no different. The prototype is driven by a RAMPS 1.4 board, but [igarrido] explains that while this does the job for moving the joints, it’s not ideal. To be truly useful, a driver would need SCARA kinematic support, which, to his knowledge, no open source 3D printer firmware offers. Without such support, the software has no concept of how the joints physically relate to one another, which is needed to make unified, coherent movements. As a result, users must control motors and joints individually, instead of being able to direct the arm as a whole to move to specific coordinates. Still, Virk I might be what’s needed to get that development going. A video of some test movements is embedded below, showing how everything works so far.

Continue reading “Wood Shines In This SCARA Robotic Arm Project”

Shoelace-Tying Robot With Only Two Motors

Many things that humans do are very difficult for machines. Case in point: tying shoelaces. Think of the intricate dance of fingers crossing over fingers that it takes to pass off a lace from one hand to the other. So when a team of five students from UC Davis got together and built a machine that got the job done with two hooks, some very clever gears, and two motors, we have to say that we’re impressed. Watch it in action on YouTube (also embedded below).

The two-motor constraint would seem at first to be a show-stopper, but now that we’ve watched the video about a hundred times, we’re pretty convinced that a sufficiently clever mechanical engineer could do virtually anything with two motors and enough gears. You see, the secret is that one motor is dedicated to moving a drive gear back and forth to multiple destinations, and the other motor provides the power.

This being Hackaday, I’m sure that some of you are saying “I could do that with one motor!” Consider that a challenge.

Meanwhile, if you need to see more gear-porn, check out this hummingbird automaton. Or for the miracles of cam-driven machines, check out [Fran Blanche]’s work with the Maillardet Automaton.

Continue reading “Shoelace-Tying Robot With Only Two Motors”

ESP8266 Powered Tank With Voice Control

The high availability of (relatively) low cost modular components has made building hardware easier than ever. Depending on what you want to do, the hardware side of a project might be the hacker equivalent of building with LEGO. In fact, we wouldn’t be surprised if it literally involved building with LEGO. In any event, easy and quick hardware builds leave more time for developing creative software to run the show. The end result is that we’re starting to see very complex systems broken down into easy-to-replicate DIY builds that would have been nearly impossible just a few years ago.

[igorfonseca83] writes in to share with us his modular tank platform that uses the ESP8266 and a handful of software hacks to allow for voice control from the user’s mobile device. Presented as a step-by-step guide on Hackaday.io, this project is perfect for getting started in Internet-controlled robotics. Whether you just want to experiment with Google Assistant integration or use this as a blank slate to bootstrap a remotely controlled rover, this project has a lot to offer.

The chassis itself is a commercially available kit, and [igorfonseca83] uses an L298N dual-channel H-bridge module to control its two geared motors. A Wemos D1 serves as the brains of the operation, and three 18650 3.7 V batteries provide the juice to keep everything running. There’s plenty of expansion capability to add sensors and other gear, but for this project, getting it rolling was the only concern.
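For a skid-steer tank like this, each L298N channel takes two direction inputs plus an enable, and steering is just a matter of which motor reverses. Here's a hardware-free sketch of that logic; the pin mapping and command names are our assumptions, not [igorfonseca83]'s actual firmware:

```python
# (IN1, IN2, IN3, IN4) input levels for the left and right motor channels
# of an L298N. On the Wemos D1 these would be written out as GPIO levels,
# with the EN pins driven by PWM for speed control.
DRIVE_TABLE = {
    "forward":  (1, 0, 1, 0),   # both motors forward
    "backward": (0, 1, 0, 1),   # both motors reverse
    "left":     (0, 1, 1, 0),   # left reverses, right forward: pivot left
    "right":    (1, 0, 0, 1),   # right reverses, left forward: pivot right
    "stop":     (0, 0, 0, 0),   # both channels coast
}

def drive(command):
    """Return the H-bridge input levels for a movement command."""
    return DRIVE_TABLE[command]
```

Driving both inputs of a channel low lets the motor coast; driving both high would brake it, which the table above doesn't use.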

Software-wise, there are a number of pieces that work together to provide the Google Assistant control demonstrated in the video after the break. It starts by interfacing the ESP8266 board with Adafruit.IO, which connects to IFTTT, and then finally Google Assistant. By setting up a few two-variable phrases in IFTTT that get triggered by voice commands in Google Assistant, you can push commands back down to the ESP8266 through Adafruit.IO. It’s a somewhat convoluted setup, admittedly, but the fact that it involves very little programming makes it an interesting solution for anyone who doesn’t want to get bogged down in the minutiae of developing their own Internet control stack.
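Those two-variable phrases arrive at the ESP8266 as short feed messages, something like "forward 2" for an action plus a duration. A sketch of how the receiving end could split that back apart (the exact payload format is an assumption on our part):

```python
def parse_command(payload, valid=frozenset({"forward", "backward", "left", "right"})):
    """Split an 'action seconds' feed message into (action, seconds)."""
    action, _, amount = payload.strip().lower().partition(" ")
    if action not in valid:
        raise ValueError(f"unknown command: {action!r}")
    # Default to a one-second run when no duration was spoken.
    return action, float(amount) if amount else 1.0

print(parse_command("Forward 2"))   # ('forward', 2.0)
```

Normalizing case and tolerating a missing number keeps the voice pipeline forgiving, since what IFTTT passes along is whatever Google Assistant heard.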

[igorfonseca83] is no stranger to building remotely controlled rovers. Last year we covered another of his creations which was commanded through a web browser and carried an Android phone to stream video of its adventures.

Continue reading “ESP8266 Powered Tank With Voice Control”

Robots Invade Your Personal Space

If you have ever had to complete a task such as building a LEGO model over a remote connection, you will know that the challenges are like an absurd grade school group project. The person giving directions often has trouble describing what they are thinking, and the person doing the work has trouble interpreting what the instructor wants. “Turn the blue block over. No, only half way. Go back. Now turn it. No, the other way. NO! Not clockwise, downward. That’s Upward! Geez. Are you even listening‽” Good times.

While you may not be in this situation every day, researchers at Keio University in Japan have come up with an intuitive way for an instructor to physically interact with an instructee through a Moore/Swayze experience. The instructor has a camera in typical pirate-parrot placement over the shoulder. Two arms are controlled by the instructor, who sees through stereoscopic cameras to get a first-person view from across the globe. This natural way to interact with the user’s environment allows muscle memory to pass from the instructor to the wearer.

For some of the other styles of telepresence, see this deep-sea bot and a cylindrical screen that looks like someone is beaming up directly from the holodeck.

Continue reading “Robots Invade Your Personal Space”

A Servo Powered Robotic Arm, But Like You’ve Never Seen Before

We’ve written about a lot of DIY robotic arms. Some of them are high-performance, some are inexpensive, and some are just uniquely fun. This one certainly falls into the last category; whilst watching an episode of Black Mirror, [Gear Down For What] was struck by inspiration for a thin robotic limb. After some iterations he has a final prototype, and it’s quite something to see in action.

To make a robotic arm as slender as possible, the actuators can’t be mounted on the arm itself but must instead drive the arm remotely. There are a number of ways of doing this, and though [Gear Down For What] considered using pneumatics or hydraulics, he opted to keep it simple with RC servos which produced a nifty solution that we really like.

The arm is made out of a series of 3D printed ball joints, allowing rotation in any direction. The tricky bit is transferring the force from the servos to each joint. Initially bare fishing line was considered, but this made the remote joints difficult to control when lower joints were moving. The solution was to use the fishing line inside of tubing, similar to the way that bike brakes operate. This allows the force to be carried to the appropriate joint regardless of lower movement. Each joint needs an x and y tension to allow it to rotate in any direction, which means an army of sixteen servos is needed to operate the eight segment arm.
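With two cables set 90° apart on each joint, commanding a bend is a matter of decomposing the desired direction onto those two axes. A small-angle sketch of that decomposition; the cable-travel constant here is made up, not measured from the build:

```python
import math

def cable_pull(theta_deg, phi_deg, k_mm_per_deg=0.05):
    """Return (x_pull_mm, y_pull_mm) of cable travel to bend one ball joint
    by theta degrees toward compass direction phi (0 = along the x cable)."""
    x = k_mm_per_deg * theta_deg * math.cos(math.radians(phi_deg))
    y = k_mm_per_deg * theta_deg * math.sin(math.radians(phi_deg))
    # Negative values mean paying cable out on that axis instead of pulling.
    return x, y

# Bending 20 degrees straight along the x axis uses only the x cable:
print(cable_pull(20, 0))
```

This is also why the Bowden-style tubing matters: without it, motion in the lower joints changes the free length of every line running past them, and this clean per-joint decomposition falls apart.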

Robotic arms are always fun to build and we’ve seen some pretty neat uses for them, such as mapping magnetic fields in 3D, or teaching sign language.

Continue reading “A Servo Powered Robotic Arm, But Like You’ve Never Seen Before”

Real Or Fake? Robot Uses AI To Find Waldo

The last few weeks have seen a number of tech sites reporting on a robot which can find and point out Waldo in those “Where’s Waldo” books. Designed and built by Redpepper, an ad agency, the robot arm is a UARM Metal, with a Raspberry Pi controlling the show.

A Logitech c525 webcam captures images, which are processed by the Pi with OpenCV, then sent to Google’s cloud-based AutoML Vision service. AutoML is trained with numerous images of Waldo, which are used to attempt a pattern match. If a pattern is found, the coordinates are fed to PYUARM, and the UARM will literally point Waldo out.
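One step glossed over in most of the coverage: AutoML returns a match in pixel coordinates, but the arm needs millimeters in its own workspace. A simple linear-calibration sketch of that conversion; the image size and workspace corners below are illustrative assumptions, not Redpepper's numbers:

```python
def pixel_to_arm(px, py, img_w=1280, img_h=720,
                 arm_x=(150.0, 350.0), arm_y=(-125.0, 125.0)):
    """Map an image pixel to (x, y) millimeters in the arm's frame,
    assuming the camera view was calibrated to a rectangular patch."""
    x = arm_x[0] + (px / img_w) * (arm_x[1] - arm_x[0])
    y = arm_y[0] + (py / img_h) * (arm_y[1] - arm_y[0])
    return x, y

# The image center should land in the middle of the calibrated patch:
print(pixel_to_arm(640, 360))
```

A linear map like this only holds if the camera looks straight down; with any tilt you would need a proper homography instead.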

While this is a totally plausible project, we have to admit a few things caught our jaundiced eye. The Logitech c525 has a field of view (FOV) of 69°. While we don’t have dimensions of the UARM Metal, it looks like the camera is less than a foot in the air. Amazon states that “Where’s Waldo Deluxe Edition” measures 10″ x 0.2″ x 12.5″. That means the open book will be 10″ x 25″. The robot is going to have a hard time imaging a surface that large in a single image. What’s more, the c525 is a 720p camera, so there isn’t a whole lot of pixel density to pattern match. Finally, there’s the rubber hand the robot uses to point out Waldo. Wouldn’t that hand block at least some of the camera’s view to the left?
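The coverage claim above is easy to check with a little trigonometry: a camera at height h with horizontal field of view FOV sees a strip 2·h·tan(FOV/2) wide.

```python
import math

def coverage_in(height_in, fov_deg=69.0):
    """Width (inches) of the surface strip visible at a given camera height."""
    return 2 * height_in * math.tan(math.radians(fov_deg / 2))

print(round(coverage_in(12), 1))  # even a full foot up covers only ~16.5 inches
```

So even at twelve inches, the 69° lens covers roughly 16.5 inches across, well short of an open book's long dimension, which is what makes a single-shot capture hard to believe.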

We’re not going to jump out and call this one fake just yet — it is entirely possible that the robot took a mosaic of images and used that to pattern match. Redpepper may have used a bit of movie magic to make the process more interesting. What do you think? Let us know down in the comments!