There are a handful of companies trying to build the first autonomous car, but this project makes us think that they all might be heading in the wrong direction. [Thorstin] wanted to use a quadcopter to transport people, and built a working prototype of an autonomous quadcopter-esque vehicle that is actually capable of lifting a person.
The device isn't actually a quadcopter anymore; four rotors couldn't generate enough lift. It has sixteen rotors in total, making it a sexdecacopter (we suppose). This setup generates 282 pounds of static thrust which, as the video below shows, is enough to lift an average person off the ground along with the aluminum alloy frame and all of the lithium-ion batteries that power those motors.
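A quick back-of-the-envelope check shows why sixteen rotors were needed. The 282-pound thrust figure comes from the build; the pilot, frame, and battery masses below are our illustrative guesses, not published specs:

```python
# Rough hover check for the sixteen-rotor craft. Only the 282 lb
# static-thrust number is from the project; the masses are guesses.
LB_TO_KG = 0.4536

thrust_kg = 282 * LB_TO_KG        # total static thrust, ~128 kg
pilot_kg = 75                     # assumed average rider
frame_kg = 25                     # assumed aluminum frame (guess)
battery_kg = 15                   # assumed lithium-ion packs (guess)

takeoff_mass = pilot_kg + frame_kg + battery_kg
margin = thrust_kg / takeoff_mass  # thrust-to-weight ratio
print(f"thrust-to-weight: {margin:.2f}")
```

Anything over 1.0 hovers; with these numbers the margin is a slim ~1.1, which is why four rotors were never going to cut it.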
With the PID control system in place, the device is ready for takeoff! We like hobby projects that suddenly become life-sized and rideable, and we hope to see this one fully autonomous at some point too. Maybe soon we’ll see people ferried from waypoint to waypoint instead of being driven around in their ground-bound autonomous cars.
Continue reading “Autonomous Drones Now Carry People”
[Vlad] wrote in to tell us about his latest project: an RC boat that autonomously navigates between waypoints. Building an autonomous vehicle seems like a really complicated project, but [Vlad]'s build shows how you can make a simple waypoint-following vehicle without a background in autonomy and control systems. His design is inspired by the Scout autonomous vehicle that we've covered before.
[Vlad] started prototyping with an Arduino, a GPS module, and a digital compass. He wrote a quick sketch that uses the compass and GPS readings to control a servo that steers towards a waypoint. [Vlad] took his prototype outside and walked around to make sure that steering and navigation were working correctly before putting it in a boat. After a bit of tweaking, his controller steered correctly and advanced to the next waypoint once the GPS position was within 5 meters of its goal.
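For the curious, the math behind that kind of compass-plus-GPS navigation usually boils down to three small functions. This is our sketch of the standard approach, not [Vlad]'s actual code:

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees 0-360."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def steering_error(heading, target_bearing):
    """Signed error in degrees, wrapped to [-180, 180); negative means steer left."""
    return (target_bearing - heading + 180) % 360 - 180

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine)."""
    R = 6371000  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))
```

Each loop, the controller compares the compass heading against `bearing_to` the waypoint, feeds `steering_error` to the servo, and advances to the next waypoint when `distance_m` drops below the arrival radius (5 meters in [Vlad]'s case).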
Next, [Vlad] took to the water. His first attempt was a home-built airboat, which looked awesome but unfortunately didn't work very well. He finally bought a $20 boat off eBay and made a MOSFET-based motor controller to drive its dual thrusters. This design worked much better, and after a bit of PID tuning the boat was autonomously navigating between waypoints in the water. In the future [Vlad] plans to use the skills he learned on this project to make an autopilot for the 38-foot catamaran his dad is building (an awesome project by itself!). Watch the video after the break for more details and to see the boat in action.
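The PID loop being tuned here is the textbook one; a minimal version looks like this (gains and output limits are illustrative, not [Vlad]'s actual tuning):

```python
class PID:
    """Textbook PID controller with output clamping."""
    def __init__(self, kp, ki, kd, out_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        # Accumulate the integral term and estimate the derivative.
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to the actuator's range (e.g. full-left to full-right rudder).
        return max(-self.out_limit, min(self.out_limit, out))
```

Feed it the heading error each loop and the output becomes the steering command; tuning is then a matter of nudging `kp`, `ki`, and `kd` until the boat tracks without oscillating.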
Continue reading “Simple Autonomy with an RC Boat”
MIT engineers have developed a technique that addresses the challenge of manufacturing robots cheaply and accessibly. Like a plant unfolding its petals, a protein folding into shape, or an insect unveiling its wings, this autonomous origami design demonstrates how a mechanical creature can assemble itself and walk away. The technique opens up the possibility of unleashing swarms of flat robots into hard-to-reach places. Once on site, the robots mobilize from the ground up.
The team behind the project used flexible printed circuit boards made of paper and polystyrene, a synthetic aromatic polymer found in the commercially sold children's toy Shrinky Dinks™. Each hinge had embedded circuits mechanically programmed to fold at certain angles. Applying heat to the composite structure triggered the folding process; after about four minutes, the hinges cooled, allowing the polystyrene to harden. Some issues did arise during the initial design phase due to the electrical current running through the robots, roughly ten times that of a regular light bulb, which caused the original prototypes to burn up before the construction operation was completed.
In the long term, Core Faculty Member [Robert] would like to have a facility that provides everyday robotic assistance to anyone in the surrounding community. This place would be accessible to everyone in the neighborhood, helping to solve whatever problems might arise, which sounds awfully like a hackerspace to us. Whether the person required a device to detect gas leaks or a porch-sweeping robot, the facility would be there to aid the members living nearby.
A video of [Robert] and [Sam] describing the project comes up after the break:
Continue reading “Self-Assembling Origami Robots”
[Patrick] has spent a lot of time around ground and aerial autonomous robots, and over the last few years, he's noticed a particular need for teams in robotics competitions to break through the 'sensory bottleneck' and get good data about the surrounding environment for navigational algorithms. The most well-funded teams in autonomous robotics competitions use LIDARs to scan the environment, but these are astonishingly expensive. With that, [Patrick] set out to create a cheaper solution.
Early this year, [Patrick] learned of an extremely cheap LIDAR sensor, and he's now building a robotics distance measurement unit around it. His earlier experiments with mechanically scanned LIDAR centered on the XV-11, the distance sensor found in the Neato Robotics robot vacuum cleaner, and those experiments convinced him that a mechanically scanned LIDAR was the way forward for distance measurement on autonomous robots.
The basic idea of [Patrick]’s project is to take the PulsedLight LIDAR-Lite module, add a motor and processing board, and sell a complete unit that will output 360° of distance data to a robot’s main control system. The entire system should cost under $150 when finished; a boon to any students, teams, or hobbyists building an autonomous vehicle.
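The processing board's core job is simple: the LIDAR-Lite reports a single range, so the board pairs each reading with the motor's current angle and streams out a full rotation of points. A sketch of that step (the input format here is our invention, not the unit's actual protocol):

```python
import math

def scan_to_points(readings):
    """Convert (angle_deg, range_cm) pairs from one motor rotation
    into x/y points in the robot's frame.

    The pairing of motor angle with each single-range reading is the
    job of the scanning unit's processing board; this data layout is
    hypothetical.
    """
    points = []
    for angle_deg, range_cm in readings:
        if range_cm <= 0:        # drop invalid or missed returns
            continue
        a = math.radians(angle_deg)
        points.append((range_cm * math.cos(a), range_cm * math.sin(a)))
    return points
```

The robot's control system then consumes those points for mapping or obstacle avoidance, just as it would the output of a far more expensive scanning LIDAR.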
[Patrick]'s system is based on the PulsedLight LIDAR-Lite, a device that's not shipping yet, but the team behind it says everything should be ready by the end of the month. All the better, because between these two devices there's a lot of cool stuff to be done in the area of autonomous robots.
Mikey is [Mike's] autonomous robot. Like any good father, he's given the robot his name. Mikey is an Arduino-based robot that uses a Pixy camera for vision.
[Mike] started with a common 4WD robot platform. He added an Arduino Uno, a motor controller, and a Pixy. The Pixy sends directions to the Arduino via a serial link. Mikey’s original task was driving around and finding frogs on the floor. Since then, [Mike] has found a higher calling for Mikey: self charging.
One of the most basic features of life is eating. In the case of autonomous robots, that means self charging. [Mike] gave Mikey the ability to self charge by training the Pixy to detect a green square. The green square identifies Mikey's charging station. Probes mounted on 3D-printed brackets hold the positive leads while springs on the base of the station make contact with conductive tape on Mikey's belly. Once the circuit is complete, Mikey stops moving and starts charging.
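The docking behavior boils down to a small decision loop: keep the green square centered in the Pixy's frame, drive in, and stop the moment voltage appears on the belly contacts. Here's our hypothetical sketch of that logic; the function and thresholds are invented, not [Mike]'s firmware:

```python
# Hypothetical sketch of Mikey's docking logic. The Pixy reports
# detected color signatures as blocks with pixel coordinates.
FRAME_CENTER_X = 160   # Pixy frame is 320 px wide
DEADBAND = 20          # px of acceptable off-center error

def docking_step(block_x, contact_voltage):
    """Return a motor command given the green square's x position
    in the frame (None if not seen) and the charging-contact voltage."""
    if contact_voltage > 4.0:   # contacts closed: we're on the charger
        return "stop"
    if block_x is None:         # lost the square: search for it
        return "spin"
    error = block_x - FRAME_CENTER_X
    if error < -DEADBAND:
        return "left"
    if error > DEADBAND:
        return "right"
    return "forward"
```

Run once per frame, this steers the robot onto the contacts and halts it as soon as the charging circuit closes.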
Continue reading “Mikey, the Robot That Charges Itself”
Ever have one of those weekend projects that takes on a life of its own? [Michael] did, and the result is this PenguinBot. While [Michael’s] wife was away for the weekend he happened upon a broken toy penguin. The batteries had leaked inside, destroying the contacts. Rather than bin the toy, [Michael] made it awesome by turning it into an autonomous robot. [Michael’s] goal was to create a robot that could roam around the house avoiding obstacles, or follow a light source like a flashlight.
He started by pulling out most of the original electronics. Two dollar-store toy trains gave their lives, and their motors, to replace the penguin's original drive system. An Arduino Pro Mini became PenguinBot's brain. Sensors consist of two light-sensing CdS cells, an Adafruit sound sensor, and a MaxBotix ultrasonic sensor. With the ultrasonic sensor mounted on a servo, PenguinBot can detect obstacles in any direction. The CdS cells and some software allow PenguinBot to follow lights, like any good photovore robot should.
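The classic photovore trick with two CdS cells is differential drive: steer toward whichever side reads brighter. A minimal sketch of that rule, assuming the cells are wired so readings rise with light; the gains and speeds are illustrative, not [Michael]'s actual firmware:

```python
def photovore_drive(left_light, right_light, base_speed=120, gain=0.5):
    """Return (left_motor, right_motor) PWM values (0-255) from two
    CdS cell readings (0-1023 ADC counts, higher = brighter)."""
    error = left_light - right_light     # positive: light is to the left
    turn = int(gain * error)
    # Slow the side facing the light so the robot turns toward it.
    left_motor = max(0, min(255, base_speed - turn))
    right_motor = max(0, min(255, base_speed + turn))
    return left_motor, right_motor
```

With a flashlight off to the left, the left cell reads higher, the left motor slows, and PenguinBot curves toward the beam.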
Click past the break to see PenguinBot in action.
Continue reading “PenguinBot Follows Light, Goes Screech in the Night”
[Courtney] has been hard at work on OSkAR, an OpenCV-based speaking robot. OSkAR is [Courtney's] capstone project (pdf link) at Shepherd University in West Virginia, USA. The goal is for OSkAR to be an assistive robot. OSkAR will navigate a typical home environment, reporting objects it finds through speech synthesis software.
To accomplish this, [Courtney] started with a BeagleBone Black and a Logitech C920 webcam. The robot's body was built using LEGO Mindstorms NXT parts. This means that when not operating autonomously, OSkAR can be controlled via Bluetooth from an Android phone. On the software side, [Courtney] began with the stock Angstrom Linux distribution for the BBB. After running into video problems, she switched her desktop environment to Xfce. OpenCV provides the machine vision system. [Courtney] created models for several objects for OSkAR to recognize.
Right now, OSkAR’s life consists of wandering around the room looking for pencils and door frames. When a pencil or door is found, OSkAR announces the object, and whether it is to his left or his right. It may sound like a rather boring life for a robot, but the semester isn’t over yet. [Courtney] is still hard at work creating more object models, which will expand OSkAR’s interests into new areas.
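The left-or-right announcement is a one-liner once OpenCV hands back a bounding box: compare the box's center against the middle of the camera frame. A hypothetical version of that step, with our own function name and the C920's default frame width assumed:

```python
def describe_object(label, box_center_x, frame_width=640):
    """Build OSkAR-style speech for a detected object based on where
    its bounding box sits in the camera frame (camera facing forward,
    so frame-left is the robot's left). This is our sketch, not
    [Courtney]'s actual code."""
    side = "left" if box_center_x < frame_width / 2 else "right"
    return f"I see a {label} to my {side}."
```

The string then goes straight to the speech synthesis software, which is how a wandering robot ends up narrating the locations of pencils and door frames.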
Continue reading “Never Lose Your Pencil With OSkAR on Patrol”