[Eric Dirgahayu] wanted to explore underwater with some sensors and cameras. First, he needed a platform to carry them. That led to his Arduino-controlled swimming fish. The fish is made from PVC and some waterproof servos. From the video (see below) it isn’t clear how much control the fish has, but it does swim with an undulating motion like a real fish.
If you’re like us, you had no idea that there even was a World Robotic Sailing Championship. But we’re glad that we do now! And congratulations to the team of A-Tirma G2, the winning boat. (Link in Spanish, difficult to translate — if you can figure out how, post in the comments?)
The Championship has apparently been going on for nine years now, and moves to a different location around the world each year. The contests for 2016 (PDF) are by no means trivial. Besides a simple there-and-back regatta, the robot boats have to hold position, scan a prescribed area, and avoid a big obstacle and return quickly back to their lane. All of this with wind power, of course.
The winning boat used solid sails, which act essentially as vertical wings, and was designed for rough weather. This paid off in the area-scanning test; the winds were so strong that the organizers considered calling it off, but team A-Tirma's boat navigated flawlessly, giving them enough points to win the event even though a camera malfunction kept them from completing the obstacle avoidance.
Unless you've sailed, it's hard to appreciate how difficult these challenges are for an autonomous vehicle. Planning far ahead is incredibly hard because the boat's motive power source, the wind, isn't constant. Yet the boat has, relatively speaking, a lot of inertia and no brakes, so the robot has no choice but to plan well in advance anyway. That any of the 2-4 meter long boats could stay inside a 20-meter circle is impressive. Oh, and did we mention that A-Tirma did all of this calculating and reacting on solar power?
Because the wind is so fickle, drone sailboats are much less popular than drone motorboats — at least by the Hackaday Blogpost Metric™. The hackerboat project is trying out sails, but they're still mostly working on powered propulsion. We do have an entry in the 2016 Hackaday Prize, but it's looking like the development process is in the doldrums. Still, sailing is the best way to go in the end, because wind power is essentially free on the open ocean, which means less work for the solar panels.
As far as role models go, you've basically got the entrants in the World Robotic Sailing Championships. So kudos to the A-Tirma team, and thanks [Nikito] for the tip!
You never know what you’ll find when you open the projects feed on Hackaday.io. Most weeks, The Hacklet follows a theme of some sort. Sometimes I find projects that just look so cool that I have to get the word out about them.
Such is the case with this week's first project, Mr. Runner, created by [Alex Martin]. Mr. Runner is a quadruped robot that really looks the part. In fact, I'd say it looks like it's ready to jump off the bench top. Like many of us, [Alex] has been inspired by Boston Dynamics, specifically their Wildcat robot. Wildcat had [Alex] searching the net for walking robot designs. He found something he liked in the work of [Dr. Fumiya Iida] and [Dr. Rolf Pfeifer]. In the mid-2000s, the pair worked out of the University of Zurich. Mr. Runner is based upon their work, with plenty of design tweaks from [Alex].
The basic design is a quadruped with two servos per leg. The servos are at the body and the upper half of the leg. The knee and lower leg are connected by levers and a spring, forming something of a four-bar linkage. The spring acts as a tendon, absorbing shock and allowing energy from the servo to be stored and released while the robot runs. [Alex] is experimenting with gaits, controlled by a PC.
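As a rough illustration of what PC-side gait experimentation can look like, here's a minimal sketch in Python. The trot-style phasing, joint layout, and amplitudes are made-up values for the sake of the example, not [Alex]'s actual parameters:

```python
import math

def trot_gait(t, period=0.6, hip_amp=25.0, knee_amp=20.0):
    """Return (hip, knee) servo angles in degrees for each of 4 legs
    at time t. Diagonal leg pairs move in phase (a trot); the knee
    lags the hip by a quarter cycle so the foot traces a loop."""
    angles = []
    for leg in range(4):
        # legs 0 and 3 form one diagonal pair, legs 1 and 2 the other
        phase = 0.0 if leg in (0, 3) else math.pi
        w = 2 * math.pi * t / period + phase
        hip = hip_amp * math.sin(w)
        knee = knee_amp * math.sin(w - math.pi / 2)
        angles.append((hip, knee))
    return angles

# At t = 0 the two diagonal pairs mirror each other:
print(trot_gait(0.0))
```

Sweeping `t` and streaming the resulting angles to the servo controller is all a basic open-loop gait needs; the fun starts when you tune period and amplitudes against the spring's natural bounce.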
Mr. Runner wouldn't be doing much running without a way to control those 8 servos. [Alex] started with an Arduino and a Lynxmotion serial servo controller. This pairing served him well for the first generation of Mr. Runner. For the new version of the robot, he's rolling his own board based on Lynxmotion's BotBoarduino. The Gerber files have been sent off to OSH Park, and in about a week, Mr. Runner will be off to the races.
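For flavor, here's how group moves for a serial servo controller can be composed in the style of Lynxmotion's SSC-32 command syntax ('#<channel> P<pulse>' pairs plus an optional shared move time). The channel numbers and pulse widths are illustrative, and this is a sketch of the command format rather than [Alex]'s firmware:

```python
def ssc32_move(channels_us, time_ms=None):
    """Build a group-move command in the style of Lynxmotion's SSC-32
    serial protocol: '#<ch> P<pulse_us>' per servo, an optional
    'T<ms>' total move time, terminated by a carriage return.
    Servos in one group command reach their targets together."""
    parts = [f"#{ch} P{us}" for ch, us in sorted(channels_us.items())]
    if time_ms is not None:
        parts.append(f"T{time_ms}")
    return " ".join(parts) + "\r"

# Center servos 0-7 (1500 us is roughly mid-travel) over one second:
cmd = ssc32_move({ch: 1500 for ch in range(8)}, time_ms=1000)
print(cmd)
```

In practice you'd write that string out over a serial port; the controller handles the per-servo interpolation so all 8 joints arrive in sync.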
Another great recently updated project is Arcade Claw Game Claw Build by [Alex Anderson]. I spent way too many hours of my youth in arcades, and more than a few quarters went into claw games. Sure, they're usually rigged, but who hasn't been pulled in by the chance to test your skill and win a prize? A friend asked [Alex] to design an arcade-style claw for a game. A seasoned CNC and 3D printing master, [Alex] grabbed his notebook and started sketching. Rack and pinion designs would work well, but didn't fit within the constraints of the game. A leadscrew-based design would also work, but would be too expensive. Finally, [Alex] settled on a design and fired up his CAD software. He started with a two-jaw system to prove out the basic mechanism. Once that was complete, [Alex] moved to a four-jaw setup.
Much like the arcade games, the claw is actuated by a central plunger. The plunger drives linkages which move the four claw jaws. Everything looks good on paper, but when the CAD drawings meet the real world, things get complicated quickly. The initial design relied on a 3D printed part which connected the plunger to the jaw linkages. Any slop in this part would be magnified through the rest of the mechanical system. 3D printers aren't perfect, and there was enough slop that the parts would pinch and bind up while moving.
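A quick back-of-envelope shows why a little slop hurts so much: the linkage is a lever, so play at the plunger gets multiplied out at the jaw tips. The dimensions below are hypothetical, not taken from [Alex]'s design:

```python
def jaw_tip_error(plunger_slop_mm, link_arm_mm, jaw_length_mm):
    """Approximate how play at the plunger shows up at a jaw tip.
    The linkage acts as a lever: angular play at the pivot is roughly
    slop / link_arm (small-angle approximation), and the tip sweeps
    that angle over the full jaw length."""
    angle_rad = plunger_slop_mm / link_arm_mm
    return angle_rad * jaw_length_mm

# 0.3 mm of print slop on a 10 mm linkage arm, with 60 mm jaws:
print(round(jaw_tip_error(0.3, 10, 60), 2))  # 1.8 mm at the tip
```

A sixfold magnification of print tolerance is more than enough to make jaws pinch and bind, which is why the revised design focuses on that one connecting part.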
[Alex] already has a revised design in mind. This is very much a work in progress. That’s the beauty of well documented projects on Hackaday.io — you get to see what works, as well as all the trials and tribulations it took to get to a final working project. Keep at it [Alex], you’re almost there!
That's it for this week's Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!
[Jack Qiao] wanted an autonomous robot that could be handy around an ever-changing shop. He didn't want a robot he'd have to babysit. If he said 'bring me the 100 ohm resistors', it should go find them and bring them back to him.
He iterated a bit, and ended up building quite a nice robot platform for under a thousand dollars. It's got a RealSense camera and a rangefinder from a Neato robotic vacuum. In addition to a microphone, it has a whole suite of additional sensors in its base, which is a stripped-down robotic vacuum from a Korean manufacturer. A few more components come together to give it an arm and a gripper.
The thinking is done on an Nvidia Jetson TK1 board. The cores of its integrated GPU are used to accelerate the computer vision calculations. The software is all ROS-based.
As can be seen in the video after the break, the robot uses SLAM techniques to successfully navigate and complete tasks such as fetching resistors, getting water, and more. [Jack Qiao] is happy with his robot, and we would be too.
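SLAM itself is a deep topic, but the navigation half (planning a route through the occupancy grid the mapper builds) can be sketched in a few lines. The grid below is a toy example, not [Jack]'s actual map or planner:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over an occupancy grid (0 = free cell,
    1 = obstacle). Returns a list of (row, col) cells from start to
    goal, or None if the goal is unreachable. ROS planners are far
    more sophisticated, but the idea is the same: build the map
    first, then search it."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(plan_path(grid, (0, 0), (2, 0)))  # routes around the wall
```

Swap the toy grid for one produced by the rangefinder and RealSense data and you have the skeleton of "go fetch the resistors."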
Ever since the Roomba was invented, humanity has been one step closer to a Jetsons-style future with robots performing all of our tedious tasks for us. The platform is so ubiquitous and popular with the hardware hacking community that almost anything that could be put on a Roomba has been done already, with one major exception: a Roomba with heat vision. Thanks to [marcelvarallo], though, there’s now a Roomba with almost all of the capabilities of the Predator.
The Roomba isn’t just sporting an infrared camera, though. This Roomba comes fully equipped with a Raspberry Pi for wireless connectivity, audio in and out, video streaming from a webcam (and the FLiR infrared camera), and control over the motors. Everything is wired to the internal battery which allows for automatic recharging, but the impressive part of this build is that it’s all done in a non-destructive way so that the Roomba can be reverted back to a normal vacuum cleaner if the need arises.
If it sweeps at just the right time, the heat camera might be the key to the messy problem we discussed on Wednesday.
The only thing stopping this from hunting humans is the addition of some sort of weapons. Perhaps this sentry gun or maybe some exploding rope. And, if you don’t want your vacuum cleaner to turn into a weapon of mass destruction, maybe you could just turn yours into a DJ.
The HTC Vive is a virtual reality system designed to work with Steam VR. The system seeks to go beyond just a headset in order to make an entire room a virtual reality environment by using two base stations that track the headset and controller in space. The hardware is very exciting because of the potential to expand gaming and other VR experiences, but it's already showing significant potential for hackers as well: in this case, robot localization and navigation.
Autonomous robots generally take one of two basic approaches to locating themselves: onboard sensors and mapping to see the world around them (like how you'd get your bearings while hiking), or sensors in the room which tell the robot where it is (similar to your GPS telling you where you are in the city). Each method has its strengths and weaknesses, of course. Onboard sensors are traditionally expensive if you need very accurate position data, and GPS location data is far too inaccurate to be of use on a smaller scale than city streets.
[Limor] immediately saw the potential in the HTC Vive to solve this problem, at least for indoor applications. Using the Vive Lighthouse base stations, he’s able to locate the system’s controller in 3D space to within 0.3mm. He’s then able to use this data on a Linux system and integrate it into ROS (Robot Operating System). [Limor] hasn’t yet built a robot to utilize this approach, but the significant cost savings ($800 for a complete Vive, but only the Lighthouses and controller are needed) is sure to make this a desirable option for a lot of robot builders. And, as we’ve seen, integrating the Vive hardware with DIY electronics should be entirely possible.
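The underlying math is pleasantly simple: each Lighthouse sweep time encodes an angle, and angles from two known base stations define rays whose (near-)intersection is the tracked point. Here's a hedged sketch, assuming the roughly 60 Hz rotor rate of the first-generation Lighthouse; the positions and helper names are invented for illustration:

```python
import numpy as np

def sweep_angle(t_sweep_s, rotor_hz=60.0):
    """A Lighthouse rotor spinning at ~60 Hz sweeps a laser plane
    across the room; the delay between the sync flash and the laser
    hitting a sensor encodes an angle: theta = 2*pi * t * rotor_hz."""
    return 2 * np.pi * t_sweep_s * rotor_hz

def triangulate(p1, d1, p2, d2):
    """Closest point between two rays (base-station position p, unit
    direction d). With perfect data the rays intersect at the tracked
    sensor; with noise this returns the midpoint of the shortest
    segment between them."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # zero only if the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return (p1 + s * d1 + p2 + t * d2) / 2
```

Feed the recovered 3D point into a ROS pose topic and any off-the-shelf navigation stack can consume it, which is exactly what makes the Lighthouse approach so attractive for robot builders.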
It’s no secret that a lot of time, money, and effort goes into photographing and filming all that delicious food you see in advertisements. Mashed potatoes in place of ice cream, carefully arranged ingredients on subs, and perfectly golden french fries are all things you’ve seen so often that they’re taken for granted. But, those are static shots – the food is almost always just sitting on a plate. At most, you might see a chef turning a steak or searing a fillet in a commercial for a restaurant. What takes real skill – both artistic and technical – is assembling a hamburger in mid-air and getting it all in stunning 4k video.
That's what [Steve Giralt] set out to do, and to accomplish it he had to get creative. Each component of the hamburger was suspended by rubber bands, and a servo system, timed and controlled by an Arduino, cut each rubber band just before its ingredient entered the frame. There's even a 3D printed dual-catapult system to fling the condiments, timed so they collide mid-air and land in place on the burger.
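The cut timing boils down to free-fall kinematics: an ingredient dropped from height h takes sqrt(2h/g) seconds to reach the frame, so each band must be cut that long before the target instant. The heights and target time below are invented for illustration, not [Steve]'s numbers:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def cut_time(drop_height_m, arrival_time_s):
    """When to cut an ingredient's rubber band so it reaches the
    frame at arrival_time_s: fall time from rest is sqrt(2h/g),
    so cut that long before the target instant."""
    fall = math.sqrt(2 * drop_height_m / G)
    return arrival_time_s - fall

# Hypothetical stack-up: every layer meets the bun at t = 2.0 s
for name, h in [("patty", 0.30), ("cheese", 0.25), ("tomato", 0.20)]:
    print(f"cut {name} at t = {cut_time(h, 2.0):.3f} s")
```

The higher an ingredient hangs, the earlier its band must go, which is why the Arduino fires the cuts in sequence rather than all at once.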