Turbine-driven Robot To Navigate Inside Space Station

It may look more like a Companion Cube than R2-D2, but the ISS is getting an astromech droid of sorts.

According to [Trey Smith] of the NASA Ames Research Center, Astrobee is an autonomous robot that will be able to maneuver inside the ISS in three dimensions using vectored thrust from a pair of turbines. The floating droid will navigate visually, using a camera to pick out landmarks aboard the station, including docking ports that let it interface with power and data. A simple arm allows Astrobee to grab onto any of the hand rails inside the ISS to provide a stable point for viewing astronaut activities or helping out with the science.

As cool as Astrobee is, we’re intrigued by how the team at Ames is testing it. The droid is mounted on a stand that floats over an enormous, perfectly flat granite slab on low-friction CO₂ gas bearings, giving it freedom to move in two dimensions. We can’t help but wonder why they didn’t suspend Astrobee from a counterweighted gantry to add the third dimension. Maybe that’s next.

From the sound of it, Astrobee is slated to be flight-ready by the end of 2017, so we’ll be watching to see how it does. But if they find themselves with a little free time in the schedule, perhaps adding a few 3D-printed cosmetics would allow them to enter the Hackaday Sci-Fi Contest.

Canary Island Team Wins World Robotic Sailing 2016

If you’re like us, you had no idea that there even was a World Robotic Sailing Championship. But we’re glad that we do now! And congratulations to the team behind A-Tirma G2, the winning boat. (The link is in Spanish and resists machine translation; if you figure out a good way to read it, post in the comments.)

The Championship has apparently been going on for nine years now, moving to a different location around the world each year. The contests for 2016 (PDF) are by no means trivial. Besides a simple there-and-back regatta, the robot boats have to hold position, scan a prescribed area, and dodge a large obstacle before returning quickly to their lane. All of this under wind power, of course.

The winning boat used solid sails, which act essentially as vertical wings, and was designed for rough weather. This paid off in the area-scanning test: the winds were so strong that the organizers considered calling it off, but team A-Tirma’s boat navigated flawlessly, racking up enough points to win the event even though a camera malfunction kept them from completing the obstacle-avoidance test.

Unless you’ve sailed, it’s hard to appreciate how difficult these challenges are for an autonomous vehicle. Planning far ahead is unreliable because the boat’s motive power source, the wind, isn’t constant; yet the boat has, relatively speaking, a lot of inertia and no brakes, so the robot must commit to maneuvers well in advance anyway. That any of the 2-4 meter long boats could stay inside a 20 meter circle is impressive. Oh, and did we mention that A-Tirma did all of this calculating and reacting on solar power?
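To get a feel for the planning problem, here’s a toy station-keeping loop in Python. To be clear, this is our own back-of-the-napkin sketch, not A-Tirma’s code; it just illustrates why inertia forces the controller to commit early, and why the wind’s no-go zone complicates even a simple “sail back to the middle” strategy.

```python
import math

# A toy station-keeping loop, not A-Tirma's actual code. Angles are in
# degrees; wind_deg is the bearing the wind blows FROM.

CIRCLE_RADIUS = 20.0     # contest circle, meters
REACT_MARGIN = 5.0       # commit to the turn early: the boat coasts for
                         # meters before the rudder and sails take effect
NO_GO_HALF_ANGLE = 45.0  # a sailboat can't point this close to the wind

def angle_diff(a, b):
    """Smallest signed difference between two bearings, in degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def station_keep(x, y, wind_deg):
    """Return a new target heading, or None to hold the current course."""
    if math.hypot(x, y) < CIRCLE_RADIUS - REACT_MARGIN:
        return None  # safely inside the circle: just loiter
    target = math.degrees(math.atan2(-y, -x)) % 360.0  # back toward center
    if abs(angle_diff(target, wind_deg)) < NO_GO_HALF_ANGLE:
        # the center is dead upwind: fall off to a close-hauled course
        # and let a later iteration tack back the other way
        target = (wind_deg + NO_GO_HALF_ANGLE) % 360.0
    return target
```

A real controller also has to account for leeway, current, and how long the boat takes to answer the helm, which is exactly what makes these contests hard.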

Because the wind is so fickle, drone sailboats are much less popular than drone motorboats, at least by the Hackaday Blogpost Metric ™. The Hackerboat project is trying out sails, but they’re still mostly working on powered propulsion. We do have an entry in the 2016 Hackaday Prize, but its development process looks to be in the doldrums. Still, sailing is the best way to go in the end, because wind power is essentially free on the open ocean, which means less work for the solar panels.

As far as role models go, you’ve basically got the entrants in the World Robotic Sailing Championship. So kudos to the A-Tirma team, and thanks to [Nikito] for the tip!

Grand Theft Auto V Used To Teach Self-Driving AI

For all the complexity involved in driving, it becomes second nature to respond to pedestrians, environmental conditions, even the basic rules of the road. When it comes to AI, teaching machine learning algorithms how to drive in a virtual world makes sense when the real one is packed full of squishy humans and other potential catastrophes. So, why not use the wildly successful virtual world of Grand Theft Auto V to teach machine learning programs to operate a vehicle?

The hard problem with this approach is gathering a sample set large enough for machine learning to be viable. The idea is this: the virtual world supplies training data far more efficiently than the time-consuming task of annotating object data from real-world images. In addition to scaling up the amount of data, researchers can manipulate weather, traffic, pedestrians and more to create complex conditions with which to train an AI.

It’s pretty easy to teach the “rules of the road”; we do it with 16-year-olds all the time. But those earliest drivers have already spent a lifetime observing the real world and watching their parents drive. The virtual world inside GTA V is fantastically realistic: humans are great pattern recognizers, and fickle gamers would cry foul at anything that doesn’t mirror real life. What we’re left with is a near-perfect source of test cases for machine learning to be applied to the hard part of self-driving: understanding the vastly variable world every vehicle encounters.

A team of researchers from Intel Labs and Darmstadt University in Germany created a program that automatically annotates the virtual world (as seen above), creating useful data for a machine learning program to consume. This isn’t a complete substitute for real-world experience, mind you, but the freedom to make a few mistakes before putting an AI behind the wheel of a real vehicle has the potential to speed up the development of autonomous vehicles. Read the paper the team published: Playing for Data: Ground Truth from Video Games.
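To get a concrete sense of what that ground truth buys you, here’s a hedged Python sketch of the consuming end: a dataset that pairs rendered frames with per-pixel class labels for training a segmentation network. The directory layout and file naming are invented for the example; this is not the team’s actual pipeline.

```python
import os

import numpy as np
import torch
from PIL import Image
from torch.utils.data import Dataset

class SyntheticDrivingDataset(Dataset):
    """Pairs of (game frame, per-pixel class labels); layout is hypothetical."""

    def __init__(self, root):
        self.root = root
        # assume frames/0001.png has a matching labels/0001.png
        self.names = sorted(os.listdir(os.path.join(root, "frames")))

    def __len__(self):
        return len(self.names)

    def __getitem__(self, i):
        name = self.names[i]
        img = Image.open(os.path.join(self.root, "frames", name)).convert("RGB")
        lbl = Image.open(os.path.join(self.root, "labels", name))  # class IDs
        img_t = torch.from_numpy(np.array(img)).permute(2, 0, 1).float() / 255.0
        lbl_t = torch.from_numpy(np.array(lbl)).long()  # road, car, pedestrian...
        return img_t, lbl_t
```

The point of the paper is that the labels half of each pair comes nearly for free from the game engine, instead of from humans tracing outlines frame by frame.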

Continue reading “Grand Theft Auto V Used To Teach Self-Driving AI”

HTC Vive Gives Autonomous Robots Direction

The HTC Vive is a virtual reality system designed to work with SteamVR. It goes beyond just a headset: two base stations track the headset and controller in space, turning an entire room into a virtual reality environment. The hardware is exciting for its potential to expand gaming and other VR experiences, but it’s already showing significant potential for hackers as well; in this case, for robot localization and navigation.

Autonomous robots generally use one of two basic approaches to locate themselves: onboard sensors and mapping to see the world around them (like getting your bearings while hiking), or sensors in the room that tell the robot where it is (like GPS telling you where you are in the city). Each method has its strengths and weaknesses, of course. Onboard sensors are traditionally expensive if you need very accurate position data, while GPS location data is far too coarse to be useful at any scale smaller than city streets.

[Limor] immediately saw the potential in the HTC Vive to solve this problem, at least for indoor applications. Using the Vive Lighthouse base stations, he’s able to locate the system’s controller in 3D space to within 0.3mm. He’s then able to use this data on a Linux system and integrate it into ROS (Robot Operating System). [Limor] hasn’t yet built a robot to utilize this approach, but the significant cost savings ($800 for a complete Vive, but only the Lighthouses and controller are needed) is sure to make this a desirable option for a lot of robot builders. And, as we’ve seen, integrating the Vive hardware with DIY electronics should be entirely possible.
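As a rough idea of how that bridge might look, here’s a hedged Python sketch that publishes a tracked pose into ROS. The `get_controller_pose()` function is a stand-in for whatever OpenVR bindings you end up using; only the rospy plumbing below is standard.

```python
import rospy
from geometry_msgs.msg import PoseStamped

def get_controller_pose():
    """Placeholder: return (x, y, z, qx, qy, qz, qw) from the Vive controller."""
    raise NotImplementedError  # wire up your OpenVR bindings here

def main():
    rospy.init_node("vive_tracker")
    pub = rospy.Publisher("/vive/controller_pose", PoseStamped, queue_size=10)
    rate = rospy.Rate(100)  # the Lighthouse system updates plenty fast for this
    while not rospy.is_shutdown():
        x, y, z, qx, qy, qz, qw = get_controller_pose()
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "lighthouse"
        msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = x, y, z
        msg.pose.orientation.x = qx
        msg.pose.orientation.y = qy
        msg.pose.orientation.z = qz
        msg.pose.orientation.w = qw
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    main()
```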

Continue reading “HTC Vive Gives Autonomous Robots Direction”

Hacklet 113 – New Robots

I start each day checking out the new and updated projects over on Hackaday.io. Each day one can find all manner of projects, from satellites to machine vision to rockets. One type of project that is always present is robots: robot arms, educational ’bots, autonomous robots, and mobile robots. This week, a few great robot projects showed up on Hackaday.io’s “new and updated” page, so I’m using the Hacklet to take a closer look.

We start with [Jack Qiao] and Autonomous home robot that does things. [Jack] is building a robot that can navigate his home. He’s learned that just creating a robot that can get itself from point A to point B in the average home is a daunting task. To make this happen, he’s using the Simultaneous Localization and Mapping (SLAM) algorithm, implemented with the help of the Robot Operating System (ROS). The robot started out as a test mule tethered to a laptop. It has since evolved to a wooden base with a mini-ITX motherboard. Mapping data comes in through a Kinect V2, which will soon be upgraded to a Neato XV-11 LIDAR system.
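For those new to ROS, the nice thing about this architecture is that the SLAM node (gmapping, RTAB-Map, and friends) publishes its evolving map as a standard `nav_msgs/OccupancyGrid` that any other node can watch. Here’s a minimal sketch of that pattern; this is generic ROS boilerplate, not [Jack]’s code.

```python
import rospy
from nav_msgs.msg import OccupancyGrid

def on_map(grid):
    # each cell is -1 (unknown), 0 (free), or up to 100 (occupied)
    cells = grid.data
    if not cells:
        return
    known = sum(1 for c in cells if c >= 0)
    rospy.loginfo("map %dx%d, %.0f%% explored",
                  grid.info.width, grid.info.height,
                  100.0 * known / len(cells))

rospy.init_node("map_watcher")
rospy.Subscriber("/map", OccupancyGrid, on_map)
rospy.spin()
```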


Next up is [Tyler Spadgenske] with TyroBot. TyroBot is a walking robot with some lofty goals, including walking a mile in a straight line without falling down. [Tyler’s] inspiration comes from robots such as Bob the Biped and Zowi. So far, TyroBot consists of legs and feet printed in PLA. [Tyler] is going to use a 32-bit processor for TyroBot’s brain, and wants to avoid the Arduino IDE at any cost (including writing his own IDE from scratch). This project is just getting started, so head on over to the project page and watch TyroBot’s progress!


Next is [Mike Rigsby] with Little Friend. Little Friend is a companion robot. [Mike] found that robots spend more time charging batteries than interacting, which wouldn’t do for a companion robot. His solution was to do away with batteries altogether: Little Friend is powered by supercapacitors. An 8 minute charge will keep this little bot going for 75 minutes. An Arduino with a motor shield controls Little Friend’s DC drive motors, as well as two animated eyes. If you can’t tell, [Mike] used a tomato as his inspiration. This keeps Little Friend in the cute zone, far away from the uncanny valley.
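The runtime figure passes a quick sanity check, too. A capacitor’s usable energy is E = 0.5 * C * (V_full^2 - V_min^2), and plugging in some guessed-at numbers (we don’t know [Mike]’s actual capacitance or voltages) yields an average power draw that’s entirely plausible for a bot this small:

```python
# Back-of-envelope supercapacitor budget. All values here are guesses,
# not Little Friend's actual parts.
C = 400.0     # farads, e.g. a bank of supercaps
V_full = 5.0  # volts after the 8 minute charge
V_min = 2.5   # volts where the electronics drop out

energy_j = 0.5 * C * (V_full**2 - V_min**2)  # usable joules
runtime_s = 75 * 60                          # the claimed 75 minutes
print(f"{energy_j:.0f} J usable -> {energy_j / runtime_s:.2f} W average draw")
# 3750 J usable -> 0.83 W average draw
```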


Finally we have the walking robot king, [Radomir Dopieralski], with Logicoma-kun. For the uninitiated, a Logicoma is a robot tank (or “logistics robot”) from the Ghost in the Shell series. [Radomir] decided to bring these cartoon tanks to life, at least in miniature. The bulk of Logicoma-kun is built from carefully cut and sculpted acrylic sheet. Movement is via the popular 9 gram servos found all over the internet. [Radomir] recently wrote an update outlining a new brain for Logicoma-kun: an Arduino Pro Mini will handle servo control, while the main computer will be an ESP8266 running MicroPython. I can’t wait to see this little ’bot take its first steps.
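With the ESP8266 doing the thinking and the Pro Mini minding the servos, the two presumably talk over a serial link. Here’s a hedged MicroPython sketch of what the ESP8266 side could look like; the two-byte framing is invented for illustration and is not [Radomir]’s actual protocol.

```python
# MicroPython on the ESP8266: send servo targets to the Pro Mini.
# Note that UART0 on the ESP8266 is shared with the REPL, so a real
# build has to work around that.
from machine import UART

uart = UART(0, 9600)

def set_servo(servo_id, angle_deg):
    # frame: one byte for which servo, one byte for the target angle
    angle = max(0, min(180, int(angle_deg)))
    uart.write(bytes([servo_id, angle]))

# a crude half-step: shift weight, then swing the free leg
set_servo(0, 70)   # left ankle
set_servo(2, 110)  # right hip
```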

If you want more robotic goodness, check out our brand new mobile robot list! Did I miss your project? Don’t be shy, just drop me a message on Hackaday.io. That’s it for this week’s Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!

Autonomous Musical Soundscapes From 42 Fans And 7 Lasers

[dmitry] writes in to let us know about a new project that combines lasers with fans and turns the resulting modulation of the light beams into an autonomous soundscape. The piece is called “divider” and is a large, wall-mounted set of rails upon which seven red lasers are mounted on one end with seven matching light sensors mounted on the other end. Interrupting the lasers’ paths are forty-two brushless fans. Four Arduino Megas control the unit.

Laser beams shining into light sensors don’t do much of anything on their own, but when spinning fan blades chop each beam, the interruptions modulate the light and turn the sensor readings on the far end into a changing electrical signal that can be played as sound. Light modulated by fan blades is the operating principle behind a Fan Synth, which we’ve discussed before as being a kind of siren (or you can go straight to that article’s fan synth demo video to hear what kind of sounds are possible from such a system).
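The pitch you hear is easy to predict: each blade crossing chops the beam once, so the fundamental is simply blade count times revolutions per second. A quick sketch, with fan speeds that are purely illustrative:

```python
def chop_frequency(blades, rpm):
    """Fundamental tone of a laser beam chopped by a spinning fan."""
    return blades * rpm / 60.0

print(chop_frequency(7, 1800))  # 210.0 Hz, roughly a G#3
print(chop_frequency(7, 3600))  # 420.0 Hz, an octave higher
```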

This project takes the fan synth concept further by not only increasing the number of lasers and fans, but by tying it all together into an autonomous system. The lasers are interrupted repeatedly and constantly, but never simultaneously. Listen to and watch it in action in the video below.

Continue reading “Autonomous Musical Soundscapes From 42 Fans And 7 Lasers”

Building A Swarm Of Autonomous Ocean Boats

There’s a gritty feel to the Hackerboat project. It doesn’t have slick and polished marketing, people lined up with bags of money to get in on the ground floor, or a flashy name (which I’ll get to in a bit). What it does have is a dedicated team of hackers who are building prototypes to solve some really big challenges. Operating on the ocean is tough on equipment, especially electronics. Time and tenacity have carried this team and their project far.

Continue reading “Building A Swarm Of Autonomous Ocean Boats”