Amazingly Detailed Robotics Ground Vehicle Guide

[Andrey Nechypurenko] has put together an excellent design guide describing the development of his a20 ground vehicle, and he is open-sourcing all of the schematics and source code.

One of [Andrey]’s previous designs used a Pololu tracked chassis, but this time he designed everything from scratch. In his first post on the a20, [Andrey] describes the mechanical design of the vehicle, focusing in particular on the trade-offs between different drive systems, motor types, and approaches to chassis construction. He also covers the challenges of working with open-source design tools (FreeCAD) and other practical issues he ran into. His thorough documentation makes for an invaluable reference for future hackers.

[Andrey] was eager to take the system for a spin, so he quickly hacked a motor controller and radio receiver onto the platform (check out the video below). The a20's final brain will be a Raspberry Pi, and we look forward to more posts from [Andrey] on the software and electronic control system.

Continue reading “Amazingly Detailed Robotics Ground Vehicle Guide”

Simple Autonomy with an RC Boat

[Vlad] wrote in to tell us about his latest project—an RC boat that autonomously navigates between waypoints. Building an autonomous vehicle seems like a really complicated project, but [Vlad]’s build shows how you can make a simple waypoint-following vehicle without a background in autonomy and control systems. His design is inspired by the Scout autonomous vehicle that we’ve covered before.

[Vlad] started prototyping with an Arduino, a GPS module, and a digital compass. He wrote a quick sketch that uses the compass and GPS readings to control a servo that steers towards a waypoint. [Vlad] took his prototype outside and walked around to make sure that steering and navigation were working correctly before putting it in a boat. After a bit of tweaking, his controller steered correctly and advanced to the next waypoint after the GPS position was within 5 meters of its goal.
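
For a sense of what that sketch involves, here is a minimal illustration of the waypoint-steering math, not [Vlad]'s actual code: compute the bearing from the current GPS fix to the target, compare it with the compass heading, and turn the error into a servo deflection. The coordinates are placeholders, and readHeadingDeg() / readPosition() are hypothetical stand-ins for the compass and GPS drivers.

```cpp
// Hedged sketch of GPS + compass waypoint steering (not [Vlad]'s code).
#include <Servo.h>
#include <math.h>

Servo rudder;

struct Fix { double lat; double lon; };

// Example waypoints only -- replace with a real route
Fix waypoints[] = { {47.620500, -122.349300}, {47.621000, -122.348000} };
const int NUM_WP = sizeof(waypoints) / sizeof(waypoints[0]);
int target = 0;

double toRad(double d) { return d * M_PI / 180.0; }

// Initial great-circle bearing from a to b, in degrees (0-360)
double bearingDeg(Fix a, Fix b) {
  double dLon = toRad(b.lon - a.lon);
  double y = sin(dLon) * cos(toRad(b.lat));
  double x = cos(toRad(a.lat)) * sin(toRad(b.lat)) -
             sin(toRad(a.lat)) * cos(toRad(b.lat)) * cos(dLon);
  return fmod(atan2(y, x) * 180.0 / M_PI + 360.0, 360.0);
}

// Haversine distance between two fixes, in meters
double distanceM(Fix a, Fix b) {
  double dLat = toRad(b.lat - a.lat), dLon = toRad(b.lon - a.lon);
  double h = sin(dLat / 2) * sin(dLat / 2) +
             cos(toRad(a.lat)) * cos(toRad(b.lat)) * sin(dLon / 2) * sin(dLon / 2);
  return 2.0 * 6371000.0 * asin(sqrt(h));
}

// Hypothetical hardware stubs -- swap in the real compass and GPS drivers
double readHeadingDeg() { return 0.0; }
Fix readPosition()      { return waypoints[0]; }

void setup() { rudder.attach(9); }

void loop() {
  Fix here = readPosition();

  // Within 5 m of the goal: advance to the next waypoint (hold the last one)
  if (distanceM(here, waypoints[target]) < 5.0 && target < NUM_WP - 1) target++;

  // Heading error, wrapped to -180..180 degrees
  double error = bearingDeg(here, waypoints[target]) - readHeadingDeg();
  if (error > 180)  error -= 360;
  if (error < -180) error += 360;

  // Map the error onto a rudder deflection around center (90 degrees)
  rudder.write(constrain(90 + (int)error, 60, 120));
  delay(100);
}
```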

Next, [Vlad] took to the water. His first attempt was a home-built airboat, which looked awesome but unfortunately didn't work very well. Finally, he ended up buying a $20 boat off eBay and built a MOSFET-based motor controller to drive its dual thrusters. This design worked much better, and after a bit of PID tuning the boat was autonomously navigating between waypoints in the water. In the future, [Vlad] plans to use the skills he learned on this project to make an autopilot for the 38-foot catamaran his dad is building (an awesome project in itself!). Watch the video after the break for more details and to see the boat in action.
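
On the boat itself there is no rudder servo; steering comes from running the two thrusters at different speeds. Below is a rough sketch of how a heading-hold PID could be mixed onto two PWM-driven MOSFET channels. It is not [Vlad]'s controller: the pins, gains, loop timing, and the headingErrorDeg() helper are all placeholders.

```cpp
// Hedged sketch: PID on heading error, mixed as differential thrust.
const int LEFT_PWM = 5, RIGHT_PWM = 6;    // PWM pins driving the MOSFET gates

const float DT = 0.1;                     // assumed loop period in seconds
float Kp = 2.0, Ki = 0.05, Kd = 0.5;      // placeholder gains -- tune on the water
float integral = 0, lastError = 0;

// Placeholder: bearing-to-waypoint minus compass heading, wrapped to -180..180
// (the same math as the prototype sketch earlier in this article)
float headingErrorDeg() { return 0.0; }

void setup() {
  pinMode(LEFT_PWM, OUTPUT);
  pinMode(RIGHT_PWM, OUTPUT);
}

void loop() {
  float err = headingErrorDeg();
  integral += err * DT;
  float derivative = (err - lastError) / DT;
  lastError = err;

  float turn = Kp * err + Ki * integral + Kd * derivative;
  int base = 180;                         // cruise throttle, 0-255

  // Positive error means the target is to the right: speed up the left
  // thruster and slow the right one.
  analogWrite(LEFT_PWM,  constrain((int)(base + turn), 0, 255));
  analogWrite(RIGHT_PWM, constrain((int)(base - turn), 0, 255));
  delay((int)(DT * 1000));
}
```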

Continue reading “Simple Autonomy with an RC Boat”

Project Sea Rendering Autonomously Renders Sea Bottoms

[Geir] has created a pretty neat device; it's actually the second version of his autonomous boat that maps the depths of lakes and ponds. He calls it the Sea Rendering. The project is a fairly serious build: the hull is custom-made from fiberglass, propulsion comes from a simple DC motor, and the rudder is driven by an RC servo. A light and a flag adorn the top deck, making the small craft visible to larger boats that may be passing by. Seven batteries handle all of the power requirements.

The craft's course is pre-programmed in Mission Planner, and ArduPilot running on an Arduino steers it to the defined waypoints. An onboard GPS module determines the position of the boat while a transducer measures the depth of the water. Both position and depth values are then saved to an SD card. Those values can later be imported into a piece of software called Dr Depth, which generates a topographic map of the lake bottom.
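
The logging side of a setup like this is straightforward. Here is a minimal sketch of what writing one position-plus-depth record per sample to an SD card could look like; navigation on [Geir]'s boat is handled by ArduPilot, and readPosition() / readDepthM() below are hypothetical stand-ins for the GPS and transducer drivers. The resulting CSV is the kind of file a mapping package like Dr Depth can import.

```cpp
// Hedged sketch of GPS + depth logging to SD (not [Geir]'s firmware).
#include <SPI.h>
#include <SD.h>

const int SD_CS = 10;                  // SD chip-select pin (board dependent)

struct Fix { double lat; double lon; };

// Hypothetical hardware stubs -- replace with the real GPS and transducer code
Fix readPosition() { return {0.0, 0.0}; }
float readDepthM() { return 0.0; }

void setup() {
  SD.begin(SD_CS);
}

void loop() {
  Fix here = readPosition();
  File logFile = SD.open("depths.csv", FILE_WRITE);   // appends to the file
  if (logFile) {
    logFile.print(here.lat, 6);  logFile.print(',');
    logFile.print(here.lon, 6);  logFile.print(',');
    logFile.println(readDepthM(), 2);                 // depth in meters
    logFile.close();
  }
  delay(30000);                       // one record every 30 s (arbitrary rate)
}
```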

[Geir] has sent this bad boy out on an 18 km journey passing through 337 waypoints. That's pretty impressive! He estimates a run time of 24 hours at a top speed of 3 km/h, meaning it could potentially travel 72 km on a single charge while taking 700 depth measurements along the way.

Continue reading “Project Sea Rendering Autonomously Renders Sea Bottoms”

Autonomous Vehicle-Following Vehicle

Humanity has taken one step closer to Skynet becoming fully aware. [Ahmed], [Muhammad], [Salman], and [Suleman] have created a vehicle that can “chase” another vehicle as part of their senior design project. Now it’s just a matter of time before the machines take over.

The project itself is based on a gasoline-powered quad bike that the students first converted to electric for the sake of their project. It uses a single webcam to get information about its surroundings. This is a plus because it frees the robot from needing a stereoscopic camera or any other complicated equipment like a radar or laser rangefinder. With this information, it can follow a lead vehicle without getting any other telemetry.
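
The write-up doesn't spell out the team's vision pipeline, but a single camera is enough for a basic follower. As a hedged illustration only, the OpenCV snippet below tracks a distinctive colored marker on the lead vehicle and derives a steering command from the blob's horizontal offset, using blob size as a crude range proxy; the color thresholds and scaling constants are placeholders, and this is a stand-in technique rather than the students' actual algorithm.

```cpp
// Hedged sketch: color-blob tracking of a lead vehicle with one webcam.
#include <opencv2/opencv.hpp>
#include <cmath>
#include <cstdio>

int main() {
    cv::VideoCapture cam(0);            // first USB webcam
    if (!cam.isOpened()) return 1;

    cv::Mat frame, hsv, mask;
    while (cam.read(frame)) {
        // Threshold a distinctive color on the lead vehicle (placeholder range)
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(100, 120, 80), cv::Scalar(130, 255, 255), mask);

        cv::Moments m = cv::moments(mask, true);
        if (m.m00 > 500) {                          // enough pixels: target visible
            double cx = m.m10 / m.m00;              // blob centroid, x in pixels
            double steer = (cx - frame.cols / 2.0) / (frame.cols / 2.0); // -1..1
            double range = 1000.0 / std::sqrt(m.m00); // crude proxy: smaller = farther
            // steer and range would feed the quad bike's steering/throttle loops
            std::printf("steer=%.2f range~%.1f\n", steer, range);
        }
    }
    return 0;
}
```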

This project is interesting because it could potentially allow for large convoys with only one human operator at the front. Once self-driving cars become more mainstream, it could also cut costs considerably: only the lead vehicle would need the full self-driving sensor suite, while the vehicles behind could get by with much less hardware. Either way, we love seeing senior design projects with great real-world applications!

Continue reading “Autonomous Vehicle-Following Vehicle”

A Compact Underwater Vehicle: The Nanoseeker

The Nanoseeker is a compact underwater vehicle in a torpedo-like form factor. [John] designed the Nanoseeker as a completely enclosed vehicle: both the thruster and the control fins are housed within the diameter of the tube. The thruster is ducted, with vents on the sides and the control fins integrated into the back of the duct assembly.

[John] designed a compact PCB to drive the vehicle, which includes an STM32F4 alongside several sensors. An MPU-9150 provides IMU functionality and two dual motor driver ICs from TI control the throttle and the control fins. [John] also added a Bluetooth radio for remote control functionality. For those who want a closer look, an image of the schematic is up on his blog.

The board is running MicroPython, a small Python implementation optimized for microcontrollers. Although [John]'s hardware platform looks great, he's still getting started on the software. We look forward to seeing how the project develops, as it's one of the smallest underwater vehicles we've seen.

[via Dangerous Prototypes]

A Mechanically Scanned LIDAR For Autonomous Robots

[Patrick] has spent a lot of time around ground and aerial autonomous robots, and over the last few years he's noticed a particular need for teams in robotics competitions to break through the 'sensory bottleneck' and get good data about the surrounding environment for their navigation algorithms. The most well-funded teams in autonomous robotics competitions use LIDARs to scan the environment, but these are astonishingly expensive. With that, [Patrick] set out to create a cheaper solution.

Early this year, [Patrick] learned of an extremely cheap LIDAR sensor and started building a robotics distance-measurement unit around it.

His earlier experiments with mechanically scanned LIDAR centered on the XV-11, the distance sensor found in the Neato Robotics robot vacuum cleaner. Those experiments convinced him that a mechanically scanned LIDAR was the way forward for distance measurement on autonomous robots; the new, astonishingly inexpensive sensor is what makes rolling his own practical.

The basic idea of [Patrick]'s project is to take the PulsedLight LIDAR-Lite module, add a motor and a processing board, and sell a complete unit that outputs 360° of distance data to a robot's main control system. The entire system should cost under $150 when finished, a boon to any students, teams, or hobbyists building an autonomous vehicle.
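
The per-reading work is simple: trigger the LIDAR-Lite over I2C, read back a distance, and tag it with the current rotation angle. The sketch below follows the published LIDAR-Lite I2C quick-start sequence (write 0x04 to register 0x00, then read two bytes from 0x8F); the readAngleDeg() function is a hypothetical stand-in for whatever encoder or photo-interrupter the final unit uses, and it is not [Patrick]'s firmware.

```cpp
// Hedged sketch of one (angle, distance) reading from a spinning LIDAR-Lite.
#include <Wire.h>

const uint8_t LIDAR_ADDR = 0x62;      // LIDAR-Lite default I2C address

int readLidarCm() {
  Wire.beginTransmission(LIDAR_ADDR);
  Wire.write(0x00);                   // command register
  Wire.write(0x04);                   // trigger an acquisition
  Wire.endTransmission();
  delay(20);                          // crude wait; real code polls the busy flag

  Wire.beginTransmission(LIDAR_ADDR);
  Wire.write(0x8F);                   // distance high + low bytes
  Wire.endTransmission();
  Wire.requestFrom(LIDAR_ADDR, (uint8_t)2);
  int cm = Wire.read() << 8;
  cm |= Wire.read();
  return cm;
}

// Hypothetical: angle of the spinning head, from an encoder or photo-interrupter
float readAngleDeg() { return 0.0; }

void setup() {
  Wire.begin();
  Serial.begin(115200);
}

void loop() {
  // Stream (angle, distance) pairs; the host assembles them into a 360° scan
  Serial.print(readAngleDeg());
  Serial.print(',');
  Serial.println(readLidarCm());
}
```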

[Patrick]'s system is based on the PulsedLight LIDAR-Lite, a device that isn't shipping yet, but the team behind it says everything should be ready by the end of the month. All the better, because between the sensor and [Patrick]'s scanner, there's a lot of cool work to be done in the area of autonomous robots.

Adding an optical mouse sensor to an autonomous vehicle

[Tim] is getting his drone ready for SparkFun's 2013 Autonomous Vehicle Competition on June 8th. He has a pretty good start, but he's been having some problems accurately measuring travel distance. The technique he chose for the task was to glue magnets onto the vehicle's axles and monitor them with a hall effect sensor. Those sensors are finicky, though, and a few problems during testing prompted him to look into a redundant system. Right now he's experimenting with adding an optical mouse sensor to the autonomous vehicle.
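
For reference, the hall-sensor approach boils down to counting magnet passes in an interrupt and scaling by the wheel geometry, roughly like the sketch below. The pin, magnet count, and wheel circumference are placeholders for whatever [Tim]'s chassis actually uses.

```cpp
// Hedged sketch of hall-effect odometry: count magnet passes, convert to meters.
const byte HALL_PIN = 2;                   // interrupt-capable input pin
const int  MAGNETS_PER_REV = 4;            // magnets glued around the axle
const float WHEEL_CIRCUMFERENCE_M = 0.40;  // assumed wheel circumference

volatile unsigned long pulses = 0;

void onPulse() { pulses++; }               // one pulse per magnet pass

void setup() {
  pinMode(HALL_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(HALL_PIN), onPulse, FALLING);
  Serial.begin(115200);
}

void loop() {
  noInterrupts();
  unsigned long p = pulses;                // copy the shared counter safely
  interrupts();

  float meters = (float)p / MAGNETS_PER_REV * WHEEL_CIRCUMFERENCE_M;
  Serial.println(meters);                  // distance traveled so far
  delay(250);
}
```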

Recently we saw the same concept used for tracking the movement of a full-sized automobile. If it can work in that application, it should be perfect here, since this vehicle sits much closer to the ground and will be used in ideal conditions (flat pavement and clear weather). [Tim] cracked open an old HP mouse he had lying around and found an Avago ADNS-5020 sensor inside. After grabbing the datasheet, he discovered that it's driven over a simple SPI-style serial interface. Above you can see the Arduino Leonardo he used for the first tests.

[Tim] coded functions to monitor the chip, including some interesting ones, like measuring how in-focus the surface below the sensor is. This brings up a question: is there a limit on how fast the vehicle can travel before the sensor fails to report back accurately?
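
As a rough sketch of what polling the sensor looks like, the snippet below bit-bangs the ADNS-5020's three-wire serial interface and reads the motion deltas along with the SQUAL register, the "surface quality" figure [Tim] uses to judge focus. The pin assignments, timing, and register addresses come from common hobbyist code for this part and should be checked against the datasheet; this is not [Tim]'s sketch.

```cpp
// Hedged sketch: poll an ADNS-5020 for motion deltas and surface quality.
const int PIN_SCLK = 3, PIN_SDIO = 4, PIN_NCS = 5;   // placeholder pins

void pushByte(byte b) {                  // clock one byte out on SDIO
  pinMode(PIN_SDIO, OUTPUT);
  for (int i = 7; i >= 0; i--) {
    digitalWrite(PIN_SCLK, LOW);
    digitalWrite(PIN_SDIO, (b >> i) & 1);
    digitalWrite(PIN_SCLK, HIGH);
  }
}

byte pullByte() {                        // clock one byte in from SDIO
  byte b = 0;
  pinMode(PIN_SDIO, INPUT);
  for (int i = 7; i >= 0; i--) {
    digitalWrite(PIN_SCLK, LOW);
    digitalWrite(PIN_SCLK, HIGH);
    b |= (byte)digitalRead(PIN_SDIO) << i;
  }
  return b;
}

byte readReg(byte addr) {
  digitalWrite(PIN_NCS, LOW);
  pushByte(addr & 0x7F);                 // MSB clear = read
  delayMicroseconds(100);                // address-to-data delay
  byte v = pullByte();
  digitalWrite(PIN_NCS, HIGH);
  return v;
}

void setup() {
  pinMode(PIN_SCLK, OUTPUT);
  pinMode(PIN_NCS, OUTPUT);
  digitalWrite(PIN_NCS, HIGH);
  Serial.begin(115200);
}

void loop() {
  // Register addresses per the ADNS-5020 datasheet (verify for your part):
  // 0x02 Motion, 0x03 Delta_X, 0x04 Delta_Y, 0x05 SQUAL
  if (readReg(0x02) & 0x80) {            // motion occurred since last read
    int8_t dx   = (int8_t)readReg(0x03);
    int8_t dy   = (int8_t)readReg(0x04);
    byte  squal = readReg(0x05);         // higher = more trackable surface
    Serial.print(dx);    Serial.print(',');
    Serial.print(dy);    Serial.print(',');
    Serial.println(squal);
  }
  delay(10);
}
```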