The world is full of educational robots for STEAM education, but we haven’t seen one as small or as cute as the Skoobot, an entry in this year’s Hackaday Prize. It’s barely bigger than an inch cubed, but it’s still packed with motors, a battery, sensors, and a microcontroller powerful enough to become a pocket-sized sumo robot.
The hardware inside each Skoobot is small but powerful. The main microcontroller is a Nordic nRF52832, giving this robot an ARM Cortex-M4F brain and Bluetooth. The sensors include a VL6180X time-of-flight sensor with a range of about 100mm. The Skoobot also includes a light sensor for all your robotic photovoring needs. Other than that, the Skoobot is just about what you would expect, with a serial port, a buzzer, and some tiny wheels mounted in a plastic frame.
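Since chasing light is one of the headline tricks, the obvious classroom exercise is a photovore behavior. The sketch below is our own toy illustration, not Skoobot firmware; it even assumes two light readings, while the Skoobot carries a single light sensor and would have to sweep or spin to compare directions:

```python
def photovore_step(light_left, light_right, threshold=0.05):
    """Pick a steering action from two normalized light readings (0..1).

    Hypothetical control logic for illustration only: the real Skoobot
    has one light sensor and takes movement commands over Bluetooth.
    """
    diff = light_right - light_left
    if abs(diff) < threshold:
        return "forward"          # roughly balanced: drive toward the light
    return "right" if diff > 0 else "left"
```

The same three-way decision maps naturally onto whatever move/turn commands the Bluetooth link exposes.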
The idea behind the Skoobot is to bring robotics to the classroom, introducing kids to fighting/sumo robots, while still being small, cheap, and cute. To that end, the Skoobot is completely controllable via Bluetooth so anyone with a phone, a Pi, or any other hardware can make this robot move, turn, chase after light, or sync multiple Skoobots together for a choreographed dance.
In the depths of Etsy and Pinterest is a fascinating, if tedious, art form: string art, the process of nailing pins into a board and wrapping thread around the perimeter to create shapes and shading. The most popular project in this vein is something like putting the outline of a heart, in string, in the shape of your home state. Something like that, at least.
While this art form involves about as much effort as pallet wood furniture, there is an interesting computational aspect to it: you can create images with string art, and doing so is a very, very hard problem to solve with an algorithm. Researchers at TU Wien have brought out the best that string art has to offer. They’ve programmed an industrial robot to create portraits out of string.
The experimental setup for this is about as simple as it gets. It’s a circular frame studded with 256 hooks around the perimeter. An industrial robot arm takes a few kilometers of thread, winds it around one of these hooks, then travels to another hook. Repeat that thousands and thousands of times, and you get a portrait of Ada Lovelace or Albert Einstein.
The real trick here is the algorithm that takes an image and translates it into the paths the string will take. This is an NP-hard problem, but it is a surprisingly well-studied one. The first autorouters — the things you should never trust to route traces between the packages on your PCB — were created for wire-wrapped computers. Here, computers would find the shortest path between whatever pins had to be connected together. There were, of course, limitations: pins could only have so many connections on them thanks to the nature of wire wrapping, and you couldn’t have one gigantic mass of wires for a parallel bus. The first autorouters were string art algorithms, only in reverse.
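For the curious, the hobbyist version of this problem is usually attacked with a greedy heuristic: repeatedly pick the chord that crosses the most remaining darkness in the target image, then fade the pixels under it. Below is a minimal, pure-Python sketch of that idea. To be clear, this is our own toy illustration, not the TU Wien team's published algorithm, which is considerably more sophisticated:

```python
import math

def bresenham(p0, p1):
    """Integer points on the segment from p0 to p1 (Bresenham's line)."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x0 < x1 else -1), (1 if y0 < y1 else -1)
    err = dx - dy
    pts = []
    while True:
        pts.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy; x0 += sx
        if e2 < dx:
            err += dx; y0 += sy
    return pts

def greedy_string_art(target, n_hooks=16, n_lines=40, fade=0.5):
    """Greedy hook sequence approximating a grayscale image (1.0 = black).

    `target` is a square 2D list of darkness values. Each step picks the
    chord that crosses the most remaining darkness, then fades the pixels
    under it -- a toy stand-in for the NP-hard optimization the real
    string-art robot solves.
    """
    size = len(target)
    img = [row[:] for row in target]          # working copy we darken/fade
    c = (size - 1) / 2.0
    hooks = [(round(c + c * math.cos(2 * math.pi * k / n_hooks)),
              round(c + c * math.sin(2 * math.pi * k / n_hooks)))
             for k in range(n_hooks)]         # hooks on a circle, like the frame
    path = [0]
    for _ in range(n_lines):
        cur = path[-1]
        best, best_score = None, 0.0
        for nxt in range(n_hooks):
            if nxt == cur:
                continue
            score = sum(img[y][x] for x, y in bresenham(hooks[cur], hooks[nxt]))
            if score > best_score:
                best, best_score = nxt, score
        if best is None:                      # nothing dark left to cover
            break
        for x, y in bresenham(hooks[cur], hooks[best]):
            img[y][x] *= fade                 # account for the thread we just laid
        path.append(best)
    return path
```

Run against a 16×16 target with a dark band across the middle, the path keeps bouncing between hooks on opposite sides of the band, exactly the behavior you see in time-lapse videos of these machines.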
It turns out that this robofish comes from the fertile mind of [Carl Bugeja], whose PCB motors and flexible actuators have been covered here before. The basic concept of these fish fins is derived from the latter project, which uses coils printed onto both sides of a flexible Kapton substrate. Positioned near a magnet, the actuators bend when a current runs through them. The video below shows two prototype robofish, each with four fins. The first is a scrap of foam with a magnet embedded; the fins did flap but the whole thing just weighed too much. Version two was much lighter and almost worked, but the tether to the driver is just too stiff to allow it to really flex its fins.
It looks like it has promise though, and we’re excited to see where [Carl] takes this. Perhaps schools of tiny robofish patrolling for pollution?
[igarrido] has shared a project that’s been in the works for a long time now: a wooden desktop robotic arm, named Virk I. The wood is Australian Blackwood and looks gorgeous. [igarrido] is clear that it is a side project, but he has decided to produce a small run of eight units to gauge interest in the design. He has been busy cutting the parts and assembling them in his spare time.
Besides the beautifully finished wood, some of the interesting elements include hollow rotary joints, which mean less cable clutter and a much tidier assembly. 3D printer drivers are a common go-to for CNC designs, and the Virk I is no different. The prototype is driven by a RAMPS 1.4 board, but [igarrido] explains that while this does the job for moving the joints, it’s not ideal. To be truly useful, a driver would need SCARA kinematic support, which, to his knowledge, no open source 3D printer driver offers. Without such a driver, the software has no concept of how the joints physically relate to one another, which is needed to make unified and coherent movements. As a result, users must control motors and joints individually, instead of directing the arm as a whole to move to specific coordinates. Still, Virk I might be what’s needed to get that development going. A video of some test movements is embedded below, showing how everything works so far.
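To make the missing piece concrete: for a planar two-link arm, SCARA support boils down to converting a target (x, y) into the two joint angles and back. This is a generic textbook sketch with made-up link lengths, not the Virk I's actual geometry or firmware:

```python
import math

def scara_ik(x, y, l1, l2, elbow_up=True):
    """Inverse kinematics for a planar two-link (SCARA-style) arm.

    Returns joint angles (theta1, theta2) in radians that place the end
    effector at (x, y). Link lengths l1 and l2 are illustrative; the
    Virk I's real dimensions and joint limits are not published here.
    """
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)   # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    if not elbow_up:
        theta2 = -theta2                             # mirror elbow solution
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def scara_fk(theta1, theta2, l1, l2):
    """Forward kinematics: joint angles back to end-effector (x, y)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

A driver with this baked in could accept G-code coordinates and figure out the joint motions itself, which is exactly the gap [igarrido] describes.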
Many things that humans do are very difficult for machines. Case in point: tying shoelaces. Think of the intricate dance of fingers crossing over fingers that it takes to pass off a lace from one hand to the other. So when a team of five students from UC Davis got together and built a machine that got the job done with two hooks, some very clever gears, and two motors, we have to say that we’re impressed. Watch it in action on YouTube (also embedded below).
The two-motor constraint would seem at first to be a show-stopper, but now that we’ve watched the video about a hundred times, we’re pretty convinced that a sufficiently clever mechanical engineer could do virtually anything with two motors and enough gears. You see, the secret is that one motor is dedicated to moving a drive gear back and forth to multiple destinations, and the other motor provides the power.
This being Hackaday, I’m sure that some of you are saying “I could do that with one motor!” Consider that a challenge.
The high availability of (relatively) low cost modular components has made building hardware easier than ever. Depending on what you want to do, the hardware side of a project might be the hacker equivalent of building with LEGO. In fact, we wouldn’t be surprised if it literally involved building with LEGO. In any event, easy and quick hardware builds leave more time for developing creative software to run the show. The end result is that we’re starting to see very complex systems broken down into easy-to-replicate DIY builds that would have been nearly impossible just a few years ago.
[igorfonseca83] writes in to share with us his modular tank platform that uses the ESP8266 and a handful of software hacks to allow for voice control from the user’s mobile device. Presented as a step-by-step guide on Hackaday.io, this project is perfect for getting started in Internet-controlled robotics. Whether you just want to experiment with Google Assistant integration or use this as a blank slate to bootstrap a remotely controlled rover, this project has a lot to offer.
The chassis itself is a commercially available kit, and [igorfonseca83] uses an L298N dual-channel H-bridge module to control its two geared motors. A Wemos D1 serves as the brains of the operation, and three 18650 3.7V batteries provide the juice to keep everything running. There’s plenty of expansion capability to add sensors and other gear, but for this project getting it rolling was the only concern.
Software wise, there are a number of pieces that work together to provide the Google Assistant control demonstrated in the video after the break. It starts by interfacing the ESP8266 board with Adafruit.IO, which connects to IFTTT, and then finally Google Assistant. By setting up a few two-variable phrases in IFTTT that get triggered by voice commands in Google Assistant, you can push commands back down to the ESP8266 through Adafruit.IO. It’s a somewhat convoluted setup, admittedly, but the fact that it involves very little programming makes it an interesting solution for anyone who doesn’t want to get bogged down with all the minutiae of developing their own Internet control stack.
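As a rough illustration of the last hop in that chain, the payload that eventually lands in the Adafruit.IO feed is just text, which the ESP8266 has to map onto H-bridge pin states. The command words and pin tuples below are our own assumptions, not [igorfonseca83]'s exact values:

```python
# Hypothetical mapping from a feed payload (the text IFTTT forwards from a
# Google Assistant phrase) to L298N input-pin states, one H-bridge channel
# per track: (IN1, IN2, IN3, IN4).
L298N_STATES = {
    "forward":  (1, 0, 1, 0),
    "backward": (0, 1, 0, 1),
    "left":     (0, 1, 1, 0),   # tracks spin opposite ways to pivot in place
    "right":    (1, 0, 0, 1),
    "stop":     (0, 0, 0, 0),
}

def payload_to_pins(payload):
    """Normalize a feed payload like ' Forward ' into pin states.

    Anything unrecognized falls back to stop, so a garbled voice command
    leaves the tank parked rather than wandering off.
    """
    return L298N_STATES.get(payload.strip().lower(), L298N_STATES["stop"])
```

On the real hardware the same lookup would live in the Wemos D1 firmware, writing each tuple out to the four L298N input pins.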
If you have ever had to complete a task such as building a LEGO model over a remote connection, you will know that the challenges are like an absurd grade school group project. The person giving directions often has trouble describing what they are thinking, and the person doing the work has trouble interpreting what the instructor wants. “Turn the blue block over. No, only half way. Go back. Now turn it. No, the other way. NO! Not clockwise, downward. That’s upward! Geez. Are you even listening‽” Good times.
While you may not be in this situation every day, Keio University in Japan has developed an intuitive way for an instructor to physically interact with an instructee through a Moore/Swayze experience. The wearer has a camera in typical pirate-parrot placement over the shoulder. Two arms are controlled by the instructor, who can see through stereoscopic cameras to have a first-person view from across the globe. This natural way to interact with the user’s environment allows muscle memory to pass from the instructor to the wearer.