[Jack Crossfire] took one of those inexpensive indoor helicopters and made it autonomous. He didn’t replace the helicopter’s hardware; instead, he augmented it and patched into the remote control to make a base station.
The position feedback is provided in much the same way that the Wii remote is used as a pointing device. On the gaming console there is a bar that goes under the TV with two IR LEDs in it. This is monitored by an IR camera in the Wii remote and used to calculate where you’re pointing the thing. [Jack’s] auto-pilot system uses two Logitech webcams with IR filters over the sensors. You can see them mounted on the horizontal bar in the cutout above. The helicopter itself has an IR LED added to it that is always on. The base station follows this beacon by moving the cameras with a pair of servo motors, calculating position and using it when sending commands to the remote control’s PCB.
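The position estimate boils down to triangulation: the two cameras sit a known distance apart on the bar, each pan angle defines a ray toward the IR beacon, and the rays intersect at the helicopter. Here’s a minimal sketch of the planar geometry (the function name, coordinate frame, and 2D simplification are ours, not [Jack]’s code):

```python
import math

def triangulate(baseline_m, angle_left_rad, angle_right_rad):
    """Locate an IR beacon from two pan angles, each measured from the
    camera bar toward the beacon. Cameras sit at (0, 0) and (baseline, 0);
    returns the beacon's (x, y) position in the plane of the bar."""
    t1 = math.tan(angle_left_rad)
    t2 = math.tan(angle_right_rad)
    x = baseline_m * t2 / (t1 + t2)  # intersection of the two bearing rays
    y = x * t1
    return x, y

# Beacon centered 1 m out from a 2 m bar: both cameras pan to 45 degrees.
print(triangulate(2.0, math.radians(45), math.radians(45)))  # roughly (1.0, 1.0)
```

The real rig does this in two axes (pan and tilt) to recover altitude as well, but the math is the same ray intersection repeated per plane.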
Don’t miss the demo video of the rig after the break.
Continue reading “Autonomous helicopter works like a Wii remote”
This is the MC Hawking robot built by the Noisebridge hackerspace in San Francisco. It’s a robotic electric wheelchair outfitted with a PC, an Xbox Kinect, and an Arduino. On the software side, it uses Ubuntu and the open source ROS platform. A few folks from Noisebridge were hacking away on the robot at Toorcamp to add a robotic arm and other upgrades.
One goal of the project was to build a hardware platform that lets software hackers work on autonomous applications without having to delve into the complexities of the hardware. Since an autonomous wheelchair can be dangerous, the robot proudly boasts that it does not obey Asimov’s three laws.
An example of an autonomous application for the MC Hawking is facial tracking, which uses the Kinect’s sensors to follow people around. The platform is now being used to develop the DORA Opensource Robot Assistant project, which hopes to use the robotic arm to grab a soda from the fridge 51 days from now.
[Jake] from Noisebridge pointed out that they are seeking people who are interested in working on the software side of the project. If you are in the Bay Area and haven’t visited Noisebridge, you need to. Check their website for lots of information on the group.
Check out a video of MC Hawking partying at Toorcamp after the break.
Continue reading “Toorcamp: MC Hawking Robotic Wheelchair”
[Eduard Ros] wrote in to show off the latest version of his Arduino powered autonomous rover (translated). You may remember seeing the first version of the build back in June. It started with a remote control truck body, adding an Arduino and some ultrasonic sensors for obstacle avoidance.
The two big wheels and the pair of sensors look familiar, but most of the other components are different from that version. The biggest change is the transition from four wheels to just three. This let him drop the servo motor which controlled steering. At first glance we thought this thing was going to pop some mad wheelies, but the direction of travel actually drags the third wheel behind the larger two. The motors themselves are different too, this time gear-reduced DC motors. The motor H-bridge is the same, but [Eduard] used a simple transistor-based inverter to reduce the number of pins needed to drive it from two down to just one. He also moved from an Arduino Uno to a Nano to reduce the footprint of the controller.
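The pin-saving trick works because an H-bridge’s two direction inputs are always driven as complements of each other, so a single transistor wired as a NOT gate can derive the second input from the first. A truth-table sketch of the idea (our own illustration, not [Eduard]’s firmware):

```python
def h_bridge_direction(direction_pin_high):
    """One MCU pin plus a transistor inverter drives both H-bridge inputs.
    Returns the logic levels seen at (IN1, IN2)."""
    in1 = direction_pin_high        # wired straight through to IN1
    in2 = not direction_pin_high    # the transistor inverts the same pin for IN2
    return in1, in2

print(h_bridge_direction(True))   # (True, False) → motor spins one way
print(h_bridge_direction(False))  # (False, True) → motor spins the other way
```

The trade-off is that the illegal brake/coast states (both inputs high or both low) are no longer reachable from that pin, which is fine if a separate enable pin handles stopping.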
[Eduard Ros] wrote in to show off his first attempt at building an autonomous rover (translated). As with many of these projects, he started with the base of a remote control toy truck. This solves many mechanical issues at once, like steering, locomotion, and power source.
He just needed a way to control the vehicle. The recent LayerOne badge hacks either did this through the wireless controller protocol or by adding an Arduino directly to the vehicle. [Eduard] chose the latter, and also included obstacle avoidance sensors in the process. We’ve seen quite a few builds that use these ultrasonic rangefinders. He decided to go a different route, adding two of them rather than mounting a single sensor on a servo motor to scan.
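With two fixed rangefinders, the avoidance logic can be as simple as comparing the left and right distances against a threshold. A rough sketch of that idea (the threshold and maneuver choices are our guesses, not [Eduard]’s actual firmware):

```python
def avoid(left_cm, right_cm, threshold_cm=30):
    """Pick a maneuver from two fixed ultrasonic distance readings."""
    if left_cm < threshold_cm and right_cm < threshold_cm:
        return "reverse"     # blocked on both sides: back out
    if left_cm < threshold_cm:
        return "turn right"  # obstacle on the left
    if right_cm < threshold_cm:
        return "turn left"   # obstacle on the right
    return "forward"         # path is clear

print(avoid(100, 100))  # forward
print(avoid(15, 100))   # turn right
print(avoid(100, 15))   # turn left
```

On the Arduino itself this would run in the main loop, with each reading taken from an echo-pulse timing measurement.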
The video after the break shows the vehicle successfully navigating through a tight space. This makes us wonder how much useful data can be gathered from the stationary sensors. We’re not familiar with how wide the horizontal sensitivity of these devices is. If you have some insight, please share your knowledge in the comments section.
Continue reading “Arduino rover doubles up on obstacle avoidance”
For [Gunnar]’s diploma thesis, he wanted to build an autonomous bicycle. There’s an obvious problem with this idea, though: how, exactly, does a robotic bicycle stand upright? His solution was a reaction wheel that keeps the bicycle balanced at all times.
A bicycle is basically an inverted pendulum, something we’ve seen controlled in a number of projects. To balance his driverless bike, [Gunnar] used a stabilizing wheel and an IMU to make sure the bicycle is always in the upright position. The bike measures its tilt and angular velocity, along with the speed of the stabilizing wheel. To correct a tilt to the left, the stabilizing wheel spins clockwise; to correct a rightward tilt, it spins counterclockwise.
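The control loop boils down to state feedback on the three measurements listed above: tilt, tilt rate, and wheel speed. A hedged sketch of such a controller (the gains and sign conventions are purely illustrative, not taken from [Gunnar]’s thesis):

```python
def wheel_command(tilt_rad, tilt_rate, wheel_speed,
                  kp=80.0, kd=10.0, kw=0.05):
    """Torque command for the reaction wheel.
    Convention: tilt > 0 means leaning right; command > 0 spins the
    wheel clockwise. The proportional and derivative terms produce a
    reaction torque opposing the lean; the kw term slowly bleeds off
    accumulated wheel speed so the motor doesn't saturate."""
    return -(kp * tilt_rad + kd * tilt_rate) - kw * wheel_speed

print(wheel_command(-0.05, 0.0, 0.0))  # leaning left → 4.0 (clockwise spin-up)
print(wheel_command(+0.05, 0.0, 0.0))  # leaning right → -4.0 (counterclockwise)
```

In practice the gains would come from a linearized inverted-pendulum model of the bike, and the wheel-speed term is what keeps the controller from winding the flywheel up to its limit.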
While [Gunnar]’s solution of a bike wheel used as a gyroscope is clever – it uses a common bicycle wheel, hugely reducing costs if someone wants to replicate this project – there’s not a whole lot of ground clearance. The size of the stabilizing wheel could probably be reduced by replacing the 7.4 kg steel wheel with a tungsten, osmium, or lead disk, possibly becoming small enough to fit inside the frame. Still, though, a very nice build that is sure to turn a few heads.
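The size saving from a denser disk is easy to estimate: for a disk of fixed thickness, mass scales as ρr² and the moment of inertia I = ½mr² scales as ρr⁴, so matching the steel wheel’s inertia means the radius shrinks with the fourth root of the density ratio. A back-of-the-envelope sketch (the densities are approximate and the starting radius is our assumption, since the thesis wheel’s exact dimensions aren’t given here):

```python
def matched_radius(r_m, rho_old, rho_new):
    """Radius of a fixed-thickness disk with the same moment of inertia.
    I = 0.5 * m * r^2 with m proportional to rho * r^2, so the product
    rho * r^4 is held constant."""
    return r_m * (rho_old / rho_new) ** 0.25

steel, tungsten, lead = 7850.0, 19300.0, 11340.0  # kg/m^3, approximate
r_steel = 0.30  # assumed starting radius in meters
print(matched_radius(r_steel, steel, tungsten))  # ~0.24 m, about 80% the radius
print(matched_radius(r_steel, steel, lead))      # ~0.27 m
```

The fourth-root scaling is why even osmium, the densest stable element, only buys a modest reduction in radius – though combined with pushing the mass toward the rim, it might be enough to tuck the wheel inside the frame.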
Continue reading “Self-stabilizing autonomous bicycle”
There is no shortage of government and entertainment-industry agencies champing at the bit to shut down the Pirate Bay for good. While the group has not suffered a permanent, service-ending raid like [Kim Dotcom] and the Megaupload crew, they are always thinking up novel ways to ensure that the site can endure whatever law enforcement throws at them.
In a recent blog post, representatives from the group unveiled plans to put their front-line servers in the clouds, courtesy of custom-made autonomous drones called “Low Orbit Server Stations.” The project is in its infancy, but the general idea is to mount small computers like the Raspberry Pi on GPS-controlled drones kept aloft 24×7 (presumably) using solar energy. These drones would communicate with clients on the ground via radio transmitters which they claim can provide a “100Mbps node up to 50km away.”
Calling the claims grandiose would be an understatement, but then again the Pirate Bay has proven to be a difficult organization to quash in any substantial way, so only time will tell.
[via The Daily What – Thanks, roboman2444]
[Vijay Kumar] is a professor at the University of Pennsylvania and the director of the GRASP lab where research centering around autonomous quadcopters is being met with great success. If you were intrigued by the video demonstrations seen over the last few years, you won’t want to miss the TED talk [Dr. Kumar] recently gave on the program’s research. We touched on this the other week when we featured a swarm of the robots in a music video, but there’s a lot more to be learned about what this type of swarm coordination means moving forward.
We’re always wondering where this technology will go since all of the experiments we’ve seen depend on an array of high-speed cameras to give positional feedback to each bot in the swarm. The image above is a screenshot taken about twelve minutes into the TED talk video (embedded after the break). Here [Dr. Kumar] addresses the issue of moving beyond those cameras. The quadcopter shown on the projection screen is one possible solution. It carries a Kinect depth camera and laser rangefinder. This is a mapping robot that is designed to enter an unknown structure and create a 3D model of the environment.
The benefits of this information are obvious, but this raises one other possibility in our minds. Since the robots are designed to function as an autonomous swarm, could they all be outfitted with cameras, and make up the positional-feedback grid for one another? Let us know what you think about it in the comments section.
Continue reading “[Vijay Kumar’s] TED talk on the state of quadcopter research”