India’s Moon Mission Is Far From Over

India’s Chandrayaan-2 mission to the Moon was, in a word, ambitious. Lifting off from the Satish Dhawan Space Centre on July 22nd, the mission hoped to simultaneously deliver an orbiter, lander, and rover to our nearest celestial neighbor. The launch and flight to the Moon went off without a hitch, and while there were certainly some tense moments, the spacecraft ultimately put itself into a stable lunar orbit and released the free-flying lander so it could set off on its independent mission.

Unfortunately, an anomaly occurred in the final moments of the Vikram lander’s descent. At this point the Indian Space Research Organisation (ISRO) still doesn’t know exactly what happened, but based on the live telemetry stream from the lander, some have theorized that the craft started tumbling or otherwise became unstable between three and four kilometers above the surface.

Telemetry indicates a suboptimal landing orientation

In fact, for a brief moment the telemetry display actually showed the Vikram lander completely inverted, with engines seemingly accelerating the spacecraft towards the surface of the Moon. It’s unclear whether this was an accurate depiction of the lander’s orientation in the final moments before impact or a glitch in the real-time display, but it’s certainly not what you want to see when your craft is just seconds away from touchdown.

But for Chandrayaan-2, the story doesn’t end here. The bulk of the mission’s scientific goals were always to be accomplished by the orbiter itself. There were of course a number of scientific payloads aboard the Vikram lander, and even the Pragyan rover that it was carrying down to the surface, but they were always secondary objectives at best. The ISRO was well aware of the difficulties involved in making a soft landing on the Moon, and planned their mission objectives accordingly.

Rather than feel sorrow over the presumed destruction of Vikram and Pragyan, let’s take a look at the scientific hardware aboard the Chandrayaan-2 orbiter, and the long mission that still lies ahead of it.

Continue reading “India’s Moon Mission Is Far From Over”

A 4G Rover And The Benefits Of A Shakedown Mission

Many moons ago, in the shadowy darkness of the 1990s, a young Lewin visited his elder cousin, an adept AMOS programmer who had managed to get his Amiga 500 to control an RC car with little more than a large pile of relays and guile. Everything worked well, but there was just one problem: once the car left the room, there was no way to see what was going on.

Why don’t you put a camera on it? Then you can drive it anywhere!

Lewin

This would go on to inspire the TKIRV project approximately 20 years later. The goal of the project is to build a rover outfitted with a camera, which is controllable over cellular data networks from anywhere on Earth. For its upcoming major expedition, the vehicle is to receive solar panels to enable it to remain operable in distant lands for extended periods without having to return to base to recharge.

The project continues to inch towards this goal, but as the rover neared completion the temptation to take it out for a spin became too strong to resist. What began as an exciting jaunt actually netted plenty of useful knowledge for the rover’s further development.
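The general trick with driving anything over a cellular link is that the modem is almost always stuck behind carrier-grade NAT, so the rover has to open an outbound connection to some broker or relay server and take its commands from there. Here’s a minimal sketch of that pattern in Python, assuming the paho-mqtt library (1.x-style callbacks); the broker address, topic name, and drive function are made up for illustration, not details from TKIRV:

# Hypothetical rover-side command listener over a cellular link.
# Assumes paho-mqtt 1.x; broker, topic, and set_drive() are placeholders,
# not details from the actual TKIRV project.
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"   # public broker reachable from the cell network
TOPIC = "rover/drive"           # made-up topic name

def set_drive(left, right):
    # Placeholder for whatever actually drives the motors (PWM, serial, etc.)
    print(f"left={left:+.2f} right={right:+.2f}")

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    cmd = json.loads(msg.payload)           # e.g. {"left": 0.5, "right": 0.5}
    set_drive(cmd.get("left", 0.0), cmd.get("right", 0.0))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=30)  # outbound connection works behind carrier NAT
client.loop_forever()

The operator’s side just publishes JSON to the same topic from wherever they happen to be, which is what makes the “anywhere on Earth” part work.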

Continue reading “A 4G Rover And The Benefits Of A Shakedown Mission”

Building A Robot Rover For Those Tough Indoor Missions

Making an outdoor rover is easy stuff, with plenty of folks already sending theirs roving across beaches and even alien worlds. Clearly the new frontier is indoor environments, a frontier which is helpfully being conquered by [Andreas Hoelldorfer]’s Mantis Rover.

OK, we’re kidding. This project started out life as a base for [Andreas]’s exquisite 3D printable robotic arm, but it’s even capable of carrying people around, as the embedded video after the break makes abundantly clear. The most eye-catching feature of the Mantis Rover is its Mecanum wheels, which allow it to move in any direction and make it perfect for those tight spots where getting stuck would be really awkward.
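If you haven’t played with Mecanum wheels before, the mixing math is refreshingly simple: each wheel’s speed is just a signed sum of the forward, strafe, and rotation commands. Here’s a generic sketch of the usual formula, using one common sign convention rather than anything pulled from the Mantis Rover firmware:

# Generic Mecanum wheel mixing, not taken from the Mantis Rover firmware.
# vx: forward speed, vy: sideways (strafe) speed, wz: rotation rate,
# all normalized to -1..1. Assumes the standard X-roller arrangement.
def mecanum_mix(vx, vy, wz):
    front_left  = vx + vy + wz
    front_right = vx - vy - wz
    rear_left   = vx - vy + wz
    rear_right  = vx + vy - wz
    # Scale so no wheel command exceeds full speed.
    m = max(1.0, abs(front_left), abs(front_right), abs(rear_left), abs(rear_right))
    return [w / m for w in (front_left, front_right, rear_left, rear_right)]

# Pure strafe to the right: the wheels spin in an alternating pattern.
print(mecanum_mix(0.0, 1.0, 0.0))   # [1.0, -1.0, -1.0, 1.0]

Feed it a pure strafe command and the wheels counter-rotate in the pattern that slides the chassis sideways, which is where that “any direction” party trick comes from.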

The Mecanum wheels are 3D printed, making the motors and the associated controllers the more complicated part of this package. Plans for the wheels involve casting some kind of rubber tread to make them gentler on the floors the rover has to drive over. The electronics include TMC5160 motor drivers and an STM32F407VET6 MCU, as well as a W5500-equipped custom ‘Robot Shield’.

It seems that there are still a lot of tweaks underway to make the project even more interesting. Maybe it’s the perfect foundation for your next indoor roving sessions at the office or local hackerspace?

Continue reading “Building A Robot Rover For Those Tough Indoor Missions”

3D Printed Rover Enjoys Long Walks On The Beach

More than a few hackers have put in the considerable time and effort required to build a rover inspired by NASA’s robotic Martian explorers, but unfortunately even the most well-funded home tinkerer can’t afford the ticket to send their creation offworld. So most of these builds don’t journey through anything more exciting than a backyard sandbox. Not that we can blame their creators; we think a homebrew rover will look just as cool in your living room as it would traipsing through a rock quarry.

But the DIY rover status quo clearly wasn’t sufficient for [Jakob Krantz], who decided the best way to test his new Curiosity-inspired rover was to let it frolic around on the beach for an afternoon. Judging by the video after the break, his beefy 3D printed bot proved more than up to the task, powering through wildly uneven terrain with little difficulty.

Beyond a few “real” bearings here and there, all of the key components for the rover are 3D printed. [Jakob] did borrow a couple of existing designs, like a printable bearing he found on Thingiverse, but for the most part he’s been toiling away at the design in Fusion 360, using images of the real Curiosity rover as his guide.

Right now, he’s controlling the rover with a standard 6 channel RC receiver. Four channels are mapped to the steering servos, and a fifth to the single electronic speed control that commands the six wheel motors. But he’s recently added an Arduino to the rover which will eventually be in charge of interpreting the RC commands. This will allow more complex maneuvers with fewer channels, such as the ability to rotate in place.
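Rotating in place on a Curiosity-style chassis means pointing the four corner wheels tangent to a circle around the rover’s center and running the two sides of the drivetrain in opposite directions, which is exactly the kind of mixing an Arduino can work out from a couple of RC channels. Here’s a rough sketch of the geometry; the dimensions and sign conventions are placeholders, not measurements from [Jakob]’s build:

# Generic point-turn geometry for a Curiosity-style rover; dimensions and
# sign conventions are placeholders, not measurements from [Jakob]'s rover.
import math

WHEELBASE = 0.30   # meters, front-to-rear distance between corner wheels (assumed)
TRACK     = 0.25   # meters, left-to-right distance between wheels (assumed)

def point_turn_angles():
    # Each corner wheel is steered so it lies tangent to a circle around the
    # rover's center; the middle wheels stay straight because they sit on
    # the rotation axis.
    angle = math.degrees(math.atan2(WHEELBASE / 2, TRACK / 2))
    return {
        "front_left":  +angle, "front_right": -angle,
        "rear_left":   -angle, "rear_right":  +angle,
    }

def point_turn_throttle(rate):
    # rate in -1..1: left wheels run one way, right wheels the other.
    return {"left": -rate, "right": +rate}

print(point_turn_angles())
print(point_turn_throttle(0.5))

Doing that mixing in firmware is what lets a single rotation channel command all four steering servos and both sides of the drive at once, instead of burning an RC channel per servo.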

We’re proud to count our very own [Roger Cheng] among the rover wrangling hackers of the world. An entire community has sprung up around his six-wheeled Sawppy, and the knowledge gained during its design and construction could be applicable to any number of other projects.

Continue reading “3D Printed Rover Enjoys Long Walks On The Beach”

Life At JPL Hack Chat

Join us on Wednesday, August 21st at noon Pacific for the Life at JPL Hack Chat with Arko!

There’s a reason why people use “rocket science” as a metaphor for things that are hard to do. Getting stuff from here to there when “there” is a billion miles away and across a hostile environment of freezing cold, searing heat, and pelting radiation isn’t something that’s easily accomplished. It takes a dedicated team of scientists and engineers working on machines that can reach out into the vastness of space and work flawlessly the whole time, and as much practice and testing as an Earth-based simulation can provide.

Arko, also known as Ara Kourchians, is a Robotics Electrical Engineer at the Jet Propulsion Laboratory, one of NASA’s research and development centers. Nestled on the outskirts of Pasadena against the flanks of the San Gabriel Mountains, JPL is the birthplace of the nation’s first satellite as well as the first successful interplanetary probe. They build the robots that explore the solar system and beyond for us; Arko gets to work on those space robots every day, and that might just be the coolest job in the world.

Join us on the Hack Chat to get your chance to ask all those burning questions you have about working at JPL. What’s it like to build hardware that will leave this world and travel to another? Get the inside story on how NASA designs and tests systems for space travel. And perhaps get a glimpse at what being a rocket scientist is all about.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, August 21 at 12:00 PM Pacific time. If time zones have got you down, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.

DIY Personal Assistant Robot Hears And Sees All

Who wouldn’t want a robot that can fetch them a glass of water? [Saral Tayal] didn’t just think about it; he jumped right in and built his own personal assistant robot. This isn’t just some remote-controlled rover, though. The robot actually listens to his voice and recognizes his face.

The body of the robot is the common “Rover 5” platform, to which [Saral] added a number of 3D printed parts. A forklift-like sled gives the robot the ability to pick things up. Some of the parts are more about form than function – [Saral] loves NASA’s Spirit and Opportunity Mars rovers, so he added some simulated solar cells and other greebles.

The Logitech webcam up front is very functional — images are fed to machine learning models, while audio is processed to listen for commands. This robot can find and pick up 90 unique objects.

The robot’s brains are a Raspberry Pi. It uses TensorFlow for object recognition. Some of the models [Saral] is using are pretty large – so big that the Pi could only manage a couple of frames per second at 100% CPU utilization. A Google Coral coprocessor sped things up quite a bit, while only using about 30% of the Pi’s processor.
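For anyone curious what that Coral hand-off looks like in code: the same quantized TensorFlow Lite model gets loaded with an Edge TPU delegate, and the Pi’s CPU is left doing little more than shuffling image data. A minimal sketch using the tflite_runtime package follows; the model and image file names are placeholders, not paths from [Saral]’s repo:

# Minimal sketch of Edge TPU-accelerated inference on a Raspberry Pi.
# Assumes tflite_runtime and libedgetpu are installed; the model and image
# paths are placeholders, not taken from [Saral]'s project.
import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="ssd_mobilenet_edgetpu.tflite",          # placeholder model name
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
_, height, width, _ = inp["shape"]

# Resize a frame to the model's input size and run it through the TPU.
frame = np.asarray(Image.open("frame.jpg").resize((width, height)), dtype=np.uint8)
interpreter.set_tensor(inp["index"], np.expand_dims(frame, axis=0))
interpreter.invoke()

# Typical SSD detection models emit boxes, class IDs, scores, and a count;
# the exact order varies by model, so inspect the output details for yours.
for out in interpreter.get_output_details():
    print(out["name"], interpreter.get_tensor(out["index"]).shape)

The delegate call is the whole trick: the graph runs on the Coral stick, and the Pi only handles the camera frames going in and the detection tensors coming out.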

It takes several motors to control the robot’s tracks and sled. This is handled by two Roboclaw motor controllers, which are themselves commanded by the Pi.
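As for “commanded by the Pi”: Roboclaw controllers speak a packet-serial protocol, and BasicMicro publishes a small Python library for it. Something along these lines would spin a pair of motors, though it’s a generic illustration under that assumption rather than [Saral]’s actual code, and the serial port, address, and speed values are placeholders:

# Generic illustration of driving a Roboclaw from a Raspberry Pi, assuming
# BasicMicro's roboclaw_3 Python library is installed. Port, address, and
# speeds are placeholders, not values from [Saral]'s robot.
import time
from roboclaw_3 import Roboclaw

rc = Roboclaw("/dev/serial0", 115200)   # Pi UART; could also be a USB serial port
rc.Open()

ADDRESS = 0x80                          # default packet-serial address

# Drive both channels forward at roughly half speed (0-127 scale)...
rc.ForwardM1(ADDRESS, 64)
rc.ForwardM2(ADDRESS, 64)
time.sleep(1.0)

# ...then stop.
rc.ForwardM1(ADDRESS, 0)
rc.ForwardM2(ADDRESS, 0)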

We’ve seen quite a few mobile robot rovers over the years, but [Saral]’s ‘bot is one of the most functional designs out there. Even better is the fact that it is completely open source. You can find the code and 3D models on his GitHub repo.

Check out a video of the personal assistant rover in action after the break.

Continue reading “DIY Personal Assistant Robot Hears And Sees All”

Muscle Wire BugBot And A Raspberry Pi Android With Its Eye On You At Maker Faire

I spent a good chunk of Saturday afternoon hanging out at the Homebrew Robotics Club booth at Maker Faire Bay Area. They had a ton of really interesting robot builds on display, and I just loved hearing about what went into these two in particular.

It’s obvious where BugBot gets its name. The six-legged walker is the creation of [Mark Johnston], who built the beast at a time when components for robots were much harder to come by. Each leg is driven by a very thin strand of muscle wire, which contracts when current is run through it and heats it up. One of the really tricky parts of the build was finding a way to attach this wire: it doesn’t take solder well and is easily damaged by heat, so attempts to solder it usually end in failure. His technique is to wrap the wire around the leg itself, then slide a small bit of brass tubing over it and make a crimp connection.

At the heart of the little bug is a PIC microcontroller that is point-to-point soldered to the rest of the components. This only caused real problems once, when Mark somehow bricked the chip and had to replace it. Look closely and you’ll see there are a lot of fiddly bits to work around to pull that off. As I said, robot building was more difficult before the explosion of components and breakout modules hit the scene. The wireless control components on this one were actually salvaged from children’s RC toys. They’re not great by any stretch of the imagination, but they were the best option at the time and they work! You can find a demo of the robot embedded after the jump.

Ralph Campbell (left) and Mark Johnston (right)

An Android robot was on display, but of course, I was most interested in seeing what was beneath the skin. In the image above you can see the mask sitting to the left of the “Pat” skeleton. Ralph Campbell has been working on this build, and plans to incorporate interactive features like facial recognition and gesture recognition to affect the gaze of the robot.

Inside each of the ping pong ball eyes is a Raspberry Pi camera (actually the Adafruit Spy Camera, because of its small board size). Ralph has a separate demonstration for facial recognition that he’s in the process of incorporating. But for me, it’s the mechanical design of the bot that I find fascinating.

The structure of the skull is coat hanger wire, lashed together and soldered using magnet wire. The eyes move thanks to a clever frame made out of paper clips. The servos to the side of each eye move the gaze up and down, while a servo beneath the eye takes care of left and right. A wooden match stick performs double duty: keeping the camera in place as the pupil of the eye, and allowing it to pivot along the paperclip track of the vertical actuator. It’s as simple as it can be and I find it quite clever!
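Closing the loop between that facial recognition demo and these eye servos mostly comes down to mapping where a face lands in the camera frame onto small pan and tilt corrections. Here’s a rough, purely illustrative sketch of that mapping; it isn’t Ralph’s code, and the gains and limits are made-up numbers:

# Illustrative face-tracking gaze correction; not Ralph Campbell's code.
# The gains and angle limits below are assumptions for the sketch.
def gaze_correction(face_x, face_y, frame_w, frame_h,
                    pan_deg, tilt_deg, gain=15.0):
    # Offset of the face center from the frame center, normalized to -1..1.
    dx = (face_x - frame_w / 2) / (frame_w / 2)
    dy = (face_y - frame_h / 2) / (frame_h / 2)
    # Nudge the eye servos toward the face; clamp to a safe mechanical range.
    pan_deg  = max(-45.0, min(45.0, pan_deg  + gain * dx))
    tilt_deg = max(-30.0, min(30.0, tilt_deg - gain * dy))  # image y grows downward
    return pan_deg, tilt_deg

# A face detected left of center and slightly high pulls the gaze that way.
print(gaze_correction(face_x=200, face_y=180, frame_w=640, frame_h=480,
                      pan_deg=0.0, tilt_deg=0.0))

Run that against each detection and the paperclip-and-matchstick gimbals will appear to follow whoever is standing in front of the booth.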

Continue reading “Muscle Wire BugBot And A Raspberry Pi Android With Its Eye On You At Maker Faire”