Robots can easily make their way across a factory floor; with painted lines to follow, a factory makes for an ideal environment for a robot to navigate. A much more difficult test of computer vision lies in your living room. Finding a way around a coffee table without knocking over a lamp presents a huge challenge for any autonomous robot. Researchers at the Royal Institute of Technology in Sweden are working on this problem, but they need your help.
[Alper Aydemir], [Rasmus Göransson] and Prof. [Patric Jensfelt] at the Centre for Autonomous Systems in Stockholm created Kinect@Home. The idea is simple: by modeling hundreds of living rooms in 3D, the computer vision and robotics researchers will have a fantastic library to train their algorithms.
To help out the Kinect@Home team, all that is needed is a Kinect, just like the one lying disused in your cupboard. After signing up on the Kinect@Home site, you’re able to create a 3D model of your living room, den, or office right in your browser. This 3D model is then added to the Kinect@Home library for CV researchers around the world.
Upon the release of the Kinect, Microsoft showed off its golden child as the beginnings of a revolution in user interface technology. The skeleton and motion detection promised a futuristic, hand-waving “Minority Report-style” interface where your entire body controls a computer. The reality hasn’t exactly lived up to those expectations, but [Steve], along with his coworkers at Amulet Devices, has vastly improved the Kinect’s skeleton recognition so people can use a Kinect sitting down.
One huge drawback to using the Kinect for a Minority Report UI in a home theater is the fact that Microsoft’s skeleton recognition doesn’t work well when you’re sitting down. Instead of relying on the built-in skeleton recognition that comes with the Kinect, [Steve] rolled his own skeleton detection using Haar classifiers.
Detecting Haar-like features is a staple of computer vision; it’s a great, not-very-computationally-intensive way to detect faces and body positions with a simple camera. The software requires training, and [Steve]’s app spent several days training itself. The results were worth it, though: the Kinect now recognizes [Steve] waving his arm while he is lying down on the couch.
Not content to stop there, [Steve] also added voice recognition to his Kinect home theater controller; a fitting addition, as his employer makes a voice recognition remote control. The recognition software seems to work very well, even with the wistful Scottish accent [Steve] has honed over a lifetime.
[Steve]’s employer is giving away their improved Kinect software that works for both the Xbox and Windows Kinects. If you’re ever going to do something with a Kinect that isn’t provided with the SDKs and APIs we covered earlier today, this will surely be an invaluable resource.
You can check out [Steve]’s demo of the new Kinect software after the break.
Continue reading “Adding new features and controlling a Kinect from a couch”
Sure, squirrels may bother the average home owner, but few have attempted as creative a way to control them as this automated water turret. Check out the video after the break to see how this was accomplished, but if you’d rather just see how the squirrels reacted to getting squirted, fast forward to around 16:00. [Kurt] was sure this would be his solution; ultimately, though, he concluded that “squirrels don’t care.”
As for the presentation, it’s more about how to use OpenCV, the Open Source Computer Vision library. It’s quite a powerful piece of software, especially considering that something like this would cost thousands of dollars as a commercial package. An Arduino is used to interface the computer’s outputs to the real world and control a squirt gun. If you’d rather not program something like this yourself, you could always simply use a garden hose, as someone suggests just after the video. Continue reading “Birdwatching Meets a Computer-Controlled Water Cannon, Awesomeness Ensues”
[DJ Sures] has been pulling all-nighters lately to get his Parrot AR Drone build off the ground. Now that it’s up and flying around, he managed to get it to follow objects around the room using its onboard cameras.
For the build, [DJ Sures] used the AR Drone ‘flying video game’ quadrocopter. This toy has two onboard cameras whose feeds can be viewed over WiFi. All that’s needed is some interesting software to make things fun. The camera tracking of the EZ-Builder software was brought into the mix, so the AR Drone can be controlled via object or speech recognition, Wiimotes, tablets, or terminals.
[DJ Sures] has come up with some awesome (if slightly terrifying) builds, like a Bluetooth Teddy Ruxpin, a realistic Wall-E, and an awesome Omnibot 2000 refurb. This is his first flying hack, and the first to fully exploit the camera tracking of the EZ-Builder software. Check out [Sures]’ copter following him around a room after the break.
Continue reading “Easy camera tracking with a quadrocopter”
[DJ Sures] got his hands on a plastic Wall-E toy and decided to build a robot that includes a camera, voice recognition, and object tracking. The result is adorable so we’re putting this video before the break:
Continue reading “Modded Wall-E becomes a real robot”