A Beverage Cooler That Comes To You!

Feel like taking a long walk, but can’t be bothered to carry your drinks? Have no fear, this “Follow Me” Cooler Bot is here!

Really just a mobile platform with a cooler on top, the robot connects to a smartphone via Bluetooth and follows it using GPS. Building the platform takes a little woodworking skill, and aluminium hubs with 3D-printed adapters connect the motors to a pair of 6″ rubber wheels, with a swivel caster mounted at the rear. A pocket in the platform’s base houses the electronics.

The Arduino Uno — via an L298N motor driver — controls two 12 V brushed, geared DC motors mounted on 3D-printed brackets, while a Parallax PAM-7Q GPS module working with an HMC5883L compass helps the robot keep its bearing. Two separate batteries power the motors and the electronics, keeping the motors’ electrical noise from glitching the logic.
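The navigation boils down to comparing two angles: the bearing from the robot’s GPS fix to the phone’s, and the compass heading. Here is a minimal Python sketch of that math; the function names, gain, and speed range are our own illustration, not the project’s Arduino firmware.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Forward azimuth from point 1 to point 2, in degrees clockwise from north."""
    lat1, lat2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def drive_speeds(heading_deg, target_deg, base=150, gain=1.5):
    """Map heading error onto left/right PWM duty values (0-255) for the L298N."""
    error = (target_deg - heading_deg + 540.0) % 360.0 - 180.0  # wrap to [-180, 180)
    turn = max(-100.0, min(100.0, gain * error))
    left = max(0, min(255, int(base + turn)))    # positive error: steer right,
    right = max(0, min(255, int(base - turn)))   # so speed up the left wheel
    return left, right

# Example: phone a few metres northeast of the robot, robot facing due north.
target = bearing_deg(59.9139, 10.7522, 59.9141, 10.7526)
print(drive_speeds(heading_deg=0.0, target_deg=target))   # left wheel runs faster
```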

Continue reading “A Beverage Cooler That Comes To You!”

Ambitious Hackerboat Project Still Aiming High

Last year we wrote about Hackerbot Labs’ autonomous boat, which project members hope will someday circumnavigate the globe. Now called Project Ladon, the effort continues apace, with a recent ocean test of their modified 18’ kayak, the TSV Disputed Right of Way. The kayak’s internal spaces hold a pair of lead-acid truck batteries and a home-brewed control system that uses relays to switch the craft’s trolling motor, with a BeagleBone and Arduino Mega under the hood.
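We don’t know Project Ladon’s exact relay topology, but switching a brushed trolling motor with relays typically means one relay to apply power and one to flip polarity, with a little dead time so the reversing contacts never switch under load. A toy sketch of that interlock (our guess at the scheme, not the project’s actual code):

```python
import time

class RelayMotor:
    def __init__(self, set_relay):
        self.set_relay = set_relay      # callback: set_relay(name, on) drives one relay
        self.direction_fwd = True

    def stop(self):
        self.set_relay("power", False)

    def drive(self, forward=True):
        if forward != self.direction_fwd:
            self.stop()
            time.sleep(0.2)             # dead time: never flip the reversing relay hot
            self.set_relay("reverse", not forward)
            self.direction_fwd = forward
        self.set_relay("power", True)

# Example with a stand-in relay driver that just prints.
motor = RelayMotor(lambda name, on: print(f"relay {name} -> {'ON' if on else 'off'}"))
motor.drive(forward=True)
motor.drive(forward=False)
motor.stop()
```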

The test was not exactly a success, with the boat actually avoiding the waypoints rather than homing in on them. Fortunately the team was aboard a chase boat, so they were able to keep tabs on the craft. Unlike a quadcopter, which just falls down when it fails, a watercraft that borks may never be seen again.

Entered into the 2016 Hackaday Prize, the project has continued to gather steam, with presentations at both Toorcamp and Maker Faire Bay Area. In addition, they’re maintaining their Hackaday.io project site as well as a Patreon page.

Check out a couple of videos after the break! The test video is 360 degrees, so you can drag the POV around.

Continue reading “Ambitious Hackerboat Project Still Aiming High”

Submersible Robots Hunt Lice With Lasers

De-lousing is a trying agricultural process, and it becomes a major problem in pens holding the hundreds of thousands of salmon farmed in Norway — the world’s largest salmon exporter — an environment that allows the parasite to flourish. To tackle the problem, [Stingray Marine Solutions] developed the Stingray, an autonomous drone capable of destroying lice with a laser at a rate of tens of thousands per day.

Introduced in Norway back in 2014 — and in some areas of Scotland in 2016 — the Stingray floats in the salmon pen, alert and waiting. If the lice-recognition software (never thought you’d hear that term, huh?) detects a parasite for more than two frames of the video feed, it immediately annihilates it with a 100 millisecond pulse of 530 nm laser light from up to two metres away. Don’t worry — the salmon’s scales are reflective enough to leave the fish unharmed, while the pest is fried to a crisp. In action, it’s reminiscent of a point-defense laser on a spaceship.
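Requiring a detection to persist across several frames is a cheap way to reject single-frame false positives before committing a laser pulse. A toy version of that gate (our illustration, not Stingray Marine Solutions’ software):

```python
def louse_hunter(frames, detect, fire_laser, frames_required=3):
    """frames: iterable of video frames; detect(frame) -> target or None."""
    streak = 0
    for frame in frames:
        target = detect(frame)
        if target is None:
            streak = 0                        # lost it: start confirmation over
            continue
        streak += 1
        if streak >= frames_required:
            fire_laser(target, pulse_ms=100)  # 530 nm pulse; scales reflect it, lice don't
            streak = 0                        # demand fresh confirmation before the next shot

# Stand-in demo: the "frames" are pre-baked detections, so detect() is identity.
frames = [None, "louse@(12,40)", "louse@(12,41)", "louse@(12,41)", None]
louse_hunter(frames, detect=lambda f: f,
             fire_laser=lambda t, pulse_ms: print("zap", t, pulse_ms))
```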

Continue reading “Submersible Robots Hunt Lice With Lasers”

Turbine-Driven Robot To Navigate Inside Space Station

It may look more like a Companion Cube than R2-D2, but the ISS is getting an astromech droid of sorts.

According to [Trey Smith] of the NASA Ames Research Center, Astrobee is an autonomous robot that will be able to maneuver inside the ISS in three dimensions using vectored thrust from a pair of turbines. The floating droid will navigate visually, using a camera to pick out landmarks aboard the station, including docking ports that let it interface with power and data. A simple arm allows Astrobee to grab onto any of the hand rails inside the ISS to provide a stable point for viewing astronaut activities or helping out with the science.
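The post doesn’t detail Astrobee’s localization pipeline, but matching known 3D landmark positions to where they appear in a camera frame is the classic Perspective-n-Point problem. A generic OpenCV sketch of that idea follows; the landmark coordinates and camera intrinsics are made up for illustration.

```python
import numpy as np
import cv2

# Hypothetical landmark corner positions in the station frame (metres)...
landmarks_3d = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                         [0.5, 0.5, 0.0], [0.0, 0.5, 0.0]], dtype=np.float64)
# ...and the pixel locations where a detector found them in the current frame.
landmarks_2d = np.array([[320, 240], [420, 238],
                         [422, 338], [318, 342]], dtype=np.float64)

fx = fy = 600.0                                   # assumed focal length in pixels
K = np.array([[fx, 0.0, 320.0], [0.0, fy, 240.0], [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(landmarks_3d, landmarks_2d, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)                    # rotation vector -> matrix
    camera_pos = (-R.T @ tvec).ravel()            # camera position in the station frame
    print("camera at", camera_pos)
```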

As cool as Astrobee is, we’re intrigued by how the team at Ames is testing it. The droid is mounted on a stand that floats over an enormous and perfectly flat granite slab using low-friction CO₂ gas bearings, giving it freedom to move in two dimensions. We can’t help but wonder why they didn’t suspend the Astrobee from a gantry using a counterweight to add that third dimension in. Maybe that’s next.

From the sound of it, Astrobee is slated to be flight ready by the end of 2017, so we’ll be watching to see how it does. But if they find themselves with a little free time in the schedule, perhaps adding a few 3D-printed cosmetics would allow them to enter the Hackaday Sci-Fi Contest.

Canary Island Team Wins World Robotic Sailing 2016

If you’re like us, you had no idea that there even was a World Robotic Sailing Championship. But we’re glad that we do now! And congratulations to the team behind A-Tirma G2, the winning boat. (Link in Spanish, difficult to translate — if you can figure out how, post in the comments!)

The Championship has apparently been going on for nine years now, moving to a different location around the world each year. The contests for 2016 (PDF) are by no means trivial. Besides a simple there-and-back regatta, the robot boats have to hold position, scan a prescribed area, and avoid a large obstacle before returning quickly to their lane. All of this under wind power, of course.

The winning boat used solid sails, which act essentially as vertical wings, and was designed for rough weather. This paid off in the area-scanning test: the winds were so strong that the organizers considered calling it off, but team A-Tirma’s boat navigated flawlessly, racking up enough points to win the event even though a camera malfunction kept them from completing the obstacle-avoidance test.

Unless you’ve sailed, it’s hard to appreciate how difficult these challenges are for an autonomous vehicle. Planning far ahead is incredibly hard because the boat’s motive power source, the wind, isn’t constant. But the boat has, relatively speaking, a lot of inertia and no brakes, so the robot has to plan fairly far in advance anyway. That any of the 2-4 meter boats could stay inside a 20-meter circle is impressive. Oh, and did we mention that A-Tirma did all of this calculating and reacting on solar power?
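To make the “no brakes” point concrete, here is a toy lookahead check in that spirit: dead-reckon the boat a few seconds ahead and only come about when the projected position leaves the circle. This is our illustration, not A-Tirma’s controller, and a real one would also have to respect the upwind no-go zone.

```python
import math

def station_keep(pos, vel, centre, radius=20.0, lookahead_s=8.0):
    """pos, vel, centre as (east, north) in metres / m/s; returns a compass course or None."""
    px = pos[0] + vel[0] * lookahead_s               # dead-reckoned position a few
    py = pos[1] + vel[1] * lookahead_s               # seconds from now
    dx, dy = centre[0] - px, centre[1] - py
    if math.hypot(dx, dy) < radius:
        return None                                  # projected to stay inside: hold course
    return math.degrees(math.atan2(dx, dy)) % 360.0  # course back toward the centre

# Boat 5 m east of centre, being pushed east at 2.5 m/s: time to come about.
print(station_keep(pos=(5.0, 0.0), vel=(2.5, 0.0), centre=(0.0, 0.0)))  # 270.0, due west
```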

Because the wind is so fickle, drone sailboats are much less popular than drone motorboats — at least by the Hackaday Blogpost Metric™. The Hackerboat project is trying out sails, but they’re still mostly working on powered propulsion. We do have an entry in the 2016 Hackaday Prize, but it looks like development is in the doldrums. Still, sailing is the way to go in the end, because wind power is essentially free on the open ocean, which means less work for the solar panels.

As far as role models go, you’ve basically got the entrants in the World Robotic Sailing Championship. So kudos to the A-Tirma team, and thanks to [Nikito] for the tip!

Grand Theft Auto V Used To Teach Self-Driving AI

For all the complexity involved in driving, responding to pedestrians, environmental conditions, and even the basic rules of the road eventually becomes second nature. When it comes to AI, teaching machine learning algorithms how to drive in a virtual world makes sense when the real one is packed full of squishy humans and other potential catastrophes. So, why not use the wildly successful virtual world of Grand Theft Auto V to teach machine learning programs to operate a vehicle?

The hard problem with this approach is gathering a large enough sample for the machine learning to be viable. The idea is this: the virtual world supplies data to these programs far more efficiently than the time-consuming task of annotating objects in real-world images. In addition to scaling up the amount of data, researchers can manipulate weather, traffic, pedestrians, and more to create complex conditions with which to train an AI.

It’s pretty easy to teach the “rules of the road” — we do it with 16-year-olds all the time. But those earliest drivers have already spent a lifetime observing the real world and watching their parents drive. The virtual world inside GTA V is fantastically realistic: humans are great pattern recognizers, and fickle gamers would cry foul at anything that doesn’t mirror real life. What we’re left with is a near-perfect source of test cases for machine learning to be applied to the hard part of self-driving: understanding the vastly variable world every vehicle encounters.

A team of researchers from Intel Labs and Darmstadt University in Germany created a program that automatically annotates the virtual world, producing useful data for a machine learning program to consume. This isn’t a complete substitute for real-world experience, mind you, but the freedom to make a few mistakes before putting an AI behind the wheel of a real vehicle has the potential to speed up development of autonomous cars. Read the paper the team published, Playing for Data: Ground Truth from Video Games.
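The practical payoff of synthetic data is that every frame arrives with pixel-perfect labels already attached. Below is a minimal sketch of consuming such image/label pairs for semantic segmentation; the directory layout and label encoding are assumptions for illustration, not the paper’s exact release format.

```python
from collections import Counter
from pathlib import Path

import numpy as np
from PIL import Image

def load_pairs(root):
    """Yield (frame, label_map) arrays from root/images and root/labels."""
    root = Path(root)
    for img_path in sorted((root / "images").glob("*.png")):
        lbl_path = root / "labels" / img_path.name    # same filename: per-pixel class IDs
        frame = np.asarray(Image.open(img_path).convert("RGB"))
        labels = np.asarray(Image.open(lbl_path))     # HxW array of class indices
        yield frame, labels

# Example: tally class frequencies, handy for weighting a segmentation loss.
counts = Counter()
for frame, labels in load_pairs("gta5_dataset"):
    for cls, n in zip(*np.unique(labels, return_counts=True)):
        counts[int(cls)] += int(n)
print(counts.most_common(5))
```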

Continue reading “Grand Theft Auto V Used To Teach Self-Driving AI”

HTC Vive Gives Autonomous Robots Direction

The HTC Vive is a virtual reality system designed to work with SteamVR. It goes beyond just a headset: two base stations track the headset and controller in space, turning an entire room into a virtual reality environment. The hardware is exciting for its potential to expand gaming and other VR experiences, but it’s already showing significant potential for hackers as well — in this case, for robot localization and navigation.

Autonomous robots generally use one of two basic approaches to locate themselves: onboard sensors and mapping to see the world around them (like getting your bearings while hiking), or external sensors that tell the robot where it is (like GPS telling you where you are in a city). Each method has its strengths and weaknesses, of course. Onboard sensors are traditionally expensive if you need very accurate position data, and GPS location data is far too inaccurate to be useful at any scale smaller than city streets.

[Limor] immediately saw the Vive’s potential to solve this problem, at least for indoor applications. Using the Vive Lighthouse base stations, he’s able to locate the system’s controller in 3D space to within 0.3 mm. He then pulls this data into a Linux system and integrates it into ROS (Robot Operating System). [Limor] hasn’t yet built a robot to use this approach, but the significant cost savings ($800 for a complete Vive, and only the Lighthouses and controller are needed) are sure to make it a desirable option for a lot of robot builders. And, as we’ve seen, integrating Vive hardware with DIY electronics should be entirely possible.
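On the ROS side, feeding Lighthouse-derived positions to a robot amounts to publishing poses. Here is a sketch of that plumbing; `get_controller_pose()` is a placeholder for whatever tracking library actually reads the Lighthouse data, since [Limor]’s code isn’t reproduced here, while the rospy calls are standard.

```python
import rospy
from geometry_msgs.msg import PoseStamped

def get_controller_pose():
    """Hypothetical hook into the Lighthouse tracker: returns (x, y, z) in metres."""
    return (1.0, 0.5, 0.2)

rospy.init_node("vive_tracker")
pub = rospy.Publisher("vive/controller_pose", PoseStamped, queue_size=10)
rate = rospy.Rate(60)                    # Lighthouse updates arrive fast; 60 Hz is plenty
while not rospy.is_shutdown():
    msg = PoseStamped()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = "lighthouse"   # fixed frame defined by the base stations
    msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = get_controller_pose()
    msg.pose.orientation.w = 1.0         # orientation left as identity in this sketch
    pub.publish(msg)
    rate.sleep()
```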

Continue reading “HTC Vive Gives Autonomous Robots Direction”