Earth Rovers Explore Our Own Planet

While Mars is currently under close scrutiny by NASA and other space agencies, there is still a lot of exploring to do here on Earth. But if you would like to explore a corner of our own planet the same way NASA explores Mars, it’s possible to send your own rover somewhere and have it send back pictures and data for you, rather than going there yourself. This is what [Norbert Heinz]’s Earth Explorer robots do, and anyone can drive any of the robots to explore wherever they happen to be.

A major goal of the Earth Explorer robot is to be easy to ship. This is a smaller version of the same problem the Mars rovers have: how to pack the most capability into a robot with as little mass as possible. The weight is kept under 500 g, and the combined length, width, and height to no more than 90 cm. This is easy to do with some toy cars modified to carry a Raspberry Pi, a camera, and some radios and sensors. After that, the robots only need an interesting place to go and an Internet connection to communicate with Mission Control.
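[Norbert] doesn’t publish a software walkthrough here, so treat the following as a minimal sketch of what the Pi-side loop could look like: grab a camera frame, read whatever sensors are aboard, and push everything to a mission-control server. The `raspistill` capture, the endpoint URL, and the sensor values are all assumptions for illustration, not details from the project.

```python
#!/usr/bin/env python3
"""Telemetry loop sketch for an Earth Explorer style rover.

Illustrative only: the `raspistill` capture, the upload endpoint, and the
sensor values are assumptions, not details from [Norbert]'s build.
"""
import subprocess
import time

import requests  # pip install requests

MISSION_CONTROL_URL = "https://example.com/rover/upload"  # hypothetical endpoint
CAPTURE_CMD = ["raspistill", "-o", "/tmp/frame.jpg", "-w", "1280", "-h", "720"]


def read_sensors():
    """Placeholder for whatever sensors the rover actually carries."""
    return {"battery_v": 7.4, "heading_deg": 0.0}


while True:
    subprocess.run(CAPTURE_CMD, check=False)  # grab a camera frame
    with open("/tmp/frame.jpg", "rb") as frame:
        requests.post(MISSION_CONTROL_URL,
                      files={"image": frame},
                      data=read_sensors(),
                      timeout=30)
    time.sleep(10)  # a frame every ten seconds is plenty at rover speeds
```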

[Norbert] is currently looking for volunteers to host some of these robots, so if you’re interested head on over to the project page and get started. If you’d rather just drive, you can get your rover fix there as well. It’s an interesting project that should get people interested in exploring Earth and in robotics at the same time. And if you’d like to take the rover concept beyond simple exploration, there are other machines that can take care of the same planet they explore.


Hackaday Prize Entry: SafeRanger, A Roving Power Plant Monitor

Engineering student [Varun Suresh] designed his SafeRanger rover to inspect oil and gas power plants for abnormal temperatures as well as gas leaks. The rover explores critical areas of the plant, and data is sent to a control center for analysis.

[Varun] built his robot around a Devastator chassis kit from DFRobot, and equipped it with a FLIR Lepton thermal camera and an MQ2 gas sensor, both monitored by a Raspberry Pi. The twin DC gear motors are controlled by an L293D motor driver IC in conjunction with an Arduino Nano; steering commands come from a mobile app over an HC-05 Bluetooth module.
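The write-up doesn’t include source, but the monitoring loop is easy to picture: poll the gas sensor and the thermal camera, flag anything out of range, and phone home. Here is a minimal sketch along those lines; the thresholds, the ADC hookup for the analog MQ2, and the control-center endpoint are placeholders rather than details from [Varun]’s build.

```python
#!/usr/bin/env python3
"""Monitoring loop sketch for a SafeRanger-style rover.

Illustrative only: the thresholds, the ADC hookup for the analog MQ2, and
the control-center endpoint are placeholders, not [Varun]'s implementation.
"""
import time

import requests  # pip install requests

CONTROL_CENTER_URL = "https://example.com/saferanger/report"  # hypothetical
GAS_ALARM = 300      # raw ADC counts; tune against the MQ2 in clean air (assumed)
TEMP_ALARM_C = 80.0  # flag any hotspot above this temperature (assumed)


def read_mq2():
    """Placeholder: the MQ2 is analog, so a real build reads it through an ADC."""
    return 120


def read_lepton_max_c():
    """Placeholder: return the hottest pixel in the FLIR Lepton frame, in Celsius."""
    return 35.0


while True:
    gas = read_mq2()
    hotspot = read_lepton_max_c()
    report = {"gas_raw": gas, "hotspot_c": hotspot,
              "alarm": gas > GAS_ALARM or hotspot > TEMP_ALARM_C}
    try:
        requests.post(CONTROL_CENTER_URL, json=report, timeout=10)
    except requests.RequestException:
        pass  # keep roving even if the link drops for a moment
    time.sleep(2)
```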

We could see technology like this being implemented in a labyrinthine facility where a human inspector might have a difficult time reaching every nook and cranny. Or you could just let it wander around, looking for trouble.

Hackaday Prize Entry: InspectorBot Aims To Look Underneath

Why bother crawling into that tiny sewer tunnel and getting coated in Cthulhu knows what — not to mention possibly getting stuck — when you can roll a robot in there instead? That’s what InspectorBot does. It’s [Dennis]’ entry for The Hackaday Prize and a finalist for our Best Product competition.

InspectorBot is a low-profile rover designed to check out the dark recesses of sewers, crawlspaces, and other icky places where humans either won’t fit or don’t want to go. Armed with a Raspberry Pi computer, it sports a high-definition camera pointed up and a regular webcam pointing forward for navigation. It uses point-to-point WiFi for communication and rocks all-wheel drive controlled by a pair of L293D motor drivers.
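With a Raspberry Pi doing the driving, each L293D channel boils down to two direction pins plus an enable pin you PWM for speed. A tank-steer sketch along those lines might look like the following; the BCM pin numbers are assumptions for illustration, not [Dennis]’s actual wiring.

```python
#!/usr/bin/env python3
"""Tank-steer drive sketch for an InspectorBot-style rover.

Illustrative only: the BCM pin numbers are assumptions, and each L293D
channel is treated as two direction inputs plus an enable pin we PWM.
"""
import time

import RPi.GPIO as GPIO

LEFT = (5, 6, 12)     # (in1, in2, enable) for the left-side motors (assumed pins)
RIGHT = (20, 21, 13)  # (in1, in2, enable) for the right-side motors (assumed pins)

GPIO.setmode(GPIO.BCM)
pwms = {}
for in1, in2, en in (LEFT, RIGHT):
    GPIO.setup([in1, in2, en], GPIO.OUT)
    pwms[en] = GPIO.PWM(en, 1000)  # 1 kHz PWM on the enable pin sets speed
    pwms[en].start(0)


def drive(side, speed):
    """speed runs -100..100; the sign picks direction via the L293D inputs."""
    in1, in2, en = side
    GPIO.output(in1, speed > 0)
    GPIO.output(in2, speed < 0)
    pwms[en].ChangeDutyCycle(min(abs(speed), 100))


# Crawl forward for a couple of seconds, then stop and release the pins.
drive(LEFT, 60)
drive(RIGHT, 60)
time.sleep(2)
GPIO.cleanup()
```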

This seems like fertile ground for us. Pipe-crawlers, chimney-climbers, and crawlspace-slitherers all sound like they’d be helpful, particularly in conjunction with some kind of computer vision that lets the robot notice problems even when the operator doesn’t. Right now, [Dennis] has the chassis rolling and most of the current work is focused on software. Both cameras are now working, allowing the InspectorBot to send forward-looking and upward-looking video back to the operator at the same time. That alone is a great advance over the current crop of Raspberry Pi rovers and adds a lot of functionality to an easy-to-build platform.

Hackaday Prize Entry: Robo-Dog Learns To Heel

[Radu Motisan] is working on a small rover whose primary trick is being able to identify its owner. Robo-Dog is his proof of concept, a rover that uses five ultrasonic sensors to move toward the nearest obstruction. Obviously, this isn’t the same as being able to recognize one person from another, but it’s a start.

The sensors were home-built using ultrasonic capsules soldered into a custom board, with the tube-shaped enclosures made out of PVC pipe. He made an ultrasonic beacon that uses a 556 timer IC to emit 40 kHz pulses so he can get the hang of steering the robot purely with sound. If that fails, Robo-Dog also has an infrared proximity sensor in front. All of it is controlled by an ATmega128 board and a custom H-bridge motor controller.

[Radu] has been fine-tuning the algorithm, making Robo-Dog move faster to catch up with a target that’s far away, but slower to one that’s close by. It compares the readings from two sensors to compute the angle of approach.
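The real firmware runs on the ATmega128, but the steering idea fits in a few lines: average the two front readings to get range, which sets the speed, and use their difference over the sensor spacing to estimate bearing. Here is a rough sketch of that math with made-up constants; it is an illustration of the approach, not [Radu]’s code.

```python
import math

SENSOR_SPACING_M = 0.10  # distance between the two front sensors (made up)
MAX_SPEED = 100          # arbitrary full-scale drive command
SLOW_RADIUS_M = 0.30     # start slowing down inside this range (made up)


def steer(left_m, right_m):
    """Return (speed, bearing_deg) from the two front range readings."""
    distance = (left_m + right_m) / 2.0
    # Far-field approximation: path difference over sensor spacing gives bearing.
    ratio = max(-1.0, min(1.0, (left_m - right_m) / SENSOR_SPACING_M))
    bearing_deg = math.degrees(math.asin(ratio))
    # Faster when the target is far away, gentler as the rover closes in.
    speed = MAX_SPEED * min(1.0, distance / SLOW_RADIUS_M)
    return speed, bearing_deg


print(steer(1.20, 1.15))  # target off to one side and still far away: full speed
```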

Explore Venus With A Strandbeest Rover

There’s a little problem with sending drones to Venus: it’s too hostile for electronics. The surface temperature averages 867 °F and the surface pressure is about 90 atmospheres. The record for surviving there is 2 hours and 7 minutes, courtesy of Russia’s Venera 13 probe. To tackle the problem, JPL has created a concept for AREE (Automaton Rover for Extreme Environments), a mechanical robot designed to survive in that environment.

AREE consists of a Strandbeest configuration of multiple legs with a monster fan propelling it, and one can imagine it creeping over the Venusian landscape. While the Strandbeest mechanism might handle propulsion, the rover still has to navigate and transmit data. We’re not sure how a fully mechanical radio might work. Maybe like those propeller arrow-cutters that [Dain of the Iron Hills] busts out in the movie version of The Hobbit? Chemical rockets that somehow don’t spontaneously ignite? Or maybe it can just “transfer all energy to life support” and AC the heck out of the radio.

We’re space nerds here at Hackaday, so check out our piece about NASA employees’ talks at the 2016 Hackaday Superconference and our extracurricular tour of JPL.


3D-Printed Rover Rolls Light And Looks Right

[Rick Winscott]’s RO-V Remotely Operated Vehicle instructable shows you how to make this cool-looking and capable robot. The rover, a 1/10th-scale truggy, sports a chassis printed in silver and black PLA. It’s got a wireless router mounted on the back, and a webcam in a 2-servo gimbal up front. [Rick] made his own steering rack and pinion out of 3D-printed parts and brass M3 rods he threaded himself.

The simplified drive system nixes the front, rear, and center differentials, saving [Rick] printing time, complexity, and weight, which let him fit a second 4000 mAh battery. A TReX Jr motor controller runs a pair of Pololu gear motors. All of this is managed by a BeagleBone Black alongside a Spektrum DX6i 2.4 GHz transmitter and an OrangeRx 6-channel receiver. The DX6i typically finds use as an airplane/quad controller, but [Rick] reconfigured it to steer the rover: the left stick controls direction, and the right stick (elevator and aileron) controls the webcam servos.
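[Rick]’s actual BeagleBone code is on GitHub, but the stick mapping is simple enough to sketch: normalize each receiver channel, mix throttle and steering into left/right motor commands for the TReX Jr, and send the right stick straight to the gimbal servos. The pulse-width range, channel names, and the hardware hooks mentioned in the comments below are placeholders, not his implementation.

```python
def normalize(pulse_us, lo=1000, hi=2000):
    """Map an RC pulse width in microseconds to -1.0..+1.0 (range is assumed)."""
    return max(-1.0, min(1.0, (pulse_us - (lo + hi) / 2) / ((hi - lo) / 2)))


def mix(throttle, steering):
    """Differential mix: returns (left, right) motor commands in -1.0..+1.0."""
    left = max(-1.0, min(1.0, throttle + steering))
    right = max(-1.0, min(1.0, throttle - steering))
    return left, right


def update(channels_us):
    """channels_us: pulse widths from the OrangeRx; the channel names are assumed."""
    left, right = mix(normalize(channels_us["throttle"]),
                      normalize(channels_us["rudder"]))
    pan = normalize(channels_us["aileron"])    # right stick -> gimbal pan
    tilt = normalize(channels_us["elevator"])  # right stick -> gimbal tilt
    # The hardware-specific bits (TReX Jr serial commands, servo PWM) would
    # go here; they are left out of this sketch.
    return left, right, pan, tilt


print(update({"throttle": 1600, "rudder": 1500, "aileron": 1500, "elevator": 1700}))
```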

Enough technical talk. We think this rover is pretty in the face. Much of that attraction owes to the set of Dagu Wild Thumper wheels (an entirely reasonable name) and the awe-inspiring 100 mm shocks that jack up this whip so pleasingly. [Rick]’s elegant chassis and the silver-and-black color scheme don’t hurt one bit, either. The wheels are mostly for the cool factor, though; [Rick] recommends swapping the relatively modest Pololu 20D gear motors for higher-torque models if you’re planning any actual off-road extremeness. If you’re interested in making your own, you can download the chassis files from Tinkercad and the BeagleBone code from GitHub.

If it’s other rover projects you’re after, check out the duct rover and the solar-powered WiFi rover we published recently.

Arpeggio – The Piano SuperDroid

I never had much musical talent. Every now and then I would pick up a guitar or try to learn the piano, romanticising a glamorous career at some point. Arpeggio – the Piano SuperDroid (YouTube, embedded below) sure makes me glad I chose a different career path. This remarkable machine is the brainchild of [Nick Morris], who spent two years building it.

Although there are no detailed technical descriptions yet, at its heart this handsome robot consists of a set of machined ‘fingers’ driven by a set of actuators, most likely solenoids. The solenoids are controlled by proprietary software that combines traditional musical data with additional parameters to accurately mimic performances by your favourite pianists, right in your living room. Professional pianists, who were otherwise assuming excellent job security under Skynet, clearly have to reconsider now.
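There’s no source to pore over, but the core job of turning note data into finger motion is easy to imagine: step through a MIDI file, energize the solenoid mapped to each note, and scale the drive strength with velocity. Here is a toy sketch of that idea, assuming the `mido` library and a purely hypothetical solenoid driver; it is not [Nick]’s software.

```python
import time

import mido  # pip install mido

LOWEST_KEY = 21  # MIDI note number of the lowest key the fingers cover (assumed)


def fire_solenoid(index, strength):
    """Hypothetical driver: energize finger `index` with 0.0-1.0 drive strength."""


def release_solenoid(index):
    """Hypothetical driver: de-energize finger `index`."""


for msg in mido.MidiFile("performance.mid"):
    time.sleep(msg.time)  # iterating a MidiFile yields inter-message delays in seconds
    if msg.type == "note_on" and msg.velocity > 0:
        fire_solenoid(msg.note - LOWEST_KEY, msg.velocity / 127.0)
    elif msg.type in ("note_off", "note_on"):  # note_on with velocity 0 means release
        release_solenoid(msg.note - LOWEST_KEY)
```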
