Remember Furby? The cute reactive robot was all the rage a few years ago, when the strange chattering creature was found under many a Christmas tree. Most Furbys have been sadly neglected since then, but the Open Furby project aims to give the toy a new lease of life, transforming it into an open source social robot platform.
We’ve featured a few Furby hacks before, such as the wonderful Furby Gurdy and the Internet-connected Furby, but the Open Furby project aims to create an open platform rather than a one-off hack. It works by replacing the brains of the Furby with a FLASH controller that runs the Robot Operating System (ROS), making the Furby much easier to program and control. They have also replaced the eyes with small OLED screens, which means it can do things like show a weather forecast, a Facebook notification, etc.
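Since the new brain runs ROS, pushing a notification to those OLED eyes would presumably boil down to subscribing to a topic and rendering a short frame of text. Here is a minimal, entirely hypothetical sketch of that rendering step; none of these names come from the Open Furby code:

```python
# Hypothetical sketch: formatting notification events for the Furby's OLED
# "eyes". In a real ROS setup this would live in a node subscribed to a
# notification topic; here it's just the formatting logic.

def render_eye_text(event_type, payload, width=16):
    """Return a short string to draw on an eye OLED for a given event.

    event_type: an assumed category such as "weather" or "facebook".
    payload: human-readable detail, truncated to the display width.
    """
    icons = {"weather": "☀", "facebook": "f"}  # placeholder glyphs
    icon = icons.get(event_type, "?")
    text = f"{icon} {payload}"
    return text[:width]
```

Anything longer than the display width simply gets clipped, which is about as much sophistication as a 16-character eye deserves.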
It is still in the early stages, but it looks like an interesting project. Personally, I am waiting for the evil Furby that wants to kill you and eat your flesh with that nasty beak…
Continue reading “Open Furby Opens The Furby”
French robot-artist [Lyes Hammadouche] tipped us off to one of his latest works: a collaboration with [Ianis Lallemand] called Texel. A “texel” is apparently a time-pixel, and the piece consists of eight servo-controlled hourglasses that can tip themselves over in response to viewers walking in front of them. Besides making graceful wavelike patterns when people walk by, they also roughly record the amount of time that people have spent looking at the piece: the hourglasses sit straight up when nobody’s around, resulting in a discrete spatial representation of people’s attention to the piece. Texels.
We get jealous when we see artists playing around with toys like these. Texel uses LIDAR scanners (Kalman-filtered, naturally) to track the viewers, along with openFrameworks, OpenCV, and ROS. In short, everything you’d need to build a complex, human-interactive piece like this using completely open-source tools from beginning to end. Respect!
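For the curious, here is roughly what that Kalman filtering amounts to: a minimal 1-D constant-velocity filter smoothing a viewer's position from noisy LIDAR ranges. The noise parameters below are our own guesses for illustration, not values from Texel's code:

```python
# Minimal 1-D constant-velocity Kalman filter: smooths a noisy range
# measurement (e.g. a viewer's position along the row of hourglasses).
# q and r are assumed process/measurement noise values, chosen arbitrarily.

class Kalman1D:
    def __init__(self, q=0.01, r=0.5):
        self.x = [0.0, 0.0]                 # state: [position, velocity]
        self.p = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.q, self.r = q, r

    def step(self, z, dt=0.1):
        # Predict with a constant-velocity model.
        x, v = self.x
        x += v * dt
        p = self.p
        p00 = p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + self.q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + self.q
        # Update with the range measurement z (measurement model H = [1, 0]).
        k0 = p00 / (p00 + self.r)
        k1 = p10 / (p00 + self.r)
        self.x = [x + k0 * (z - x), v + k1 * (z - x)]
        self.p = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]
        return self.x[0]
```

Feed it raw ranges each scan and it converges on a steady position estimate instead of jittering with every noisy return, which is what keeps the hourglasses from twitching.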
Continue reading “Texel: Art Tracks You, Tracks Time”
[DJI], everyone’s favorite — but very expensive — drone company just announced the Manifold — an extremely capable high performance embedded computer for the future of aerial platforms. And guess what? It runs Ubuntu.
The unit features a quad-core ARM Cortex-A15 processor with an NVIDIA Kepler-based GPU and runs Canonical’s Ubuntu OS with support for CUDA, OpenCV, and ROS. The best part is that it’s compatible with third-party sensors, allowing developers to really expand a drone’s toolkit. Having such a powerful computer on board means you can collect and analyze data in one shot, rather than relaying the raw output down to your control hub.
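A quick back-of-the-envelope calculation shows why onboard analysis beats relaying raw output. The frame sizes and per-detection payload below are illustrative assumptions, not DJI specs:

```python
# Rough bandwidth comparison: relaying uncompressed 1080p30 video versus
# sending only detection results. All figures are illustrative assumptions.

def raw_video_mbps(width=1920, height=1080, fps=30, bytes_per_px=3):
    """Uncompressed video bitrate in megabits per second."""
    return width * height * bytes_per_px * 8 * fps / 1e6

def detections_kbps(objects_per_frame=5, bytes_per_object=32, fps=30):
    """Bitrate of sending only bounding boxes/labels, in kilobits per second."""
    return objects_per_frame * bytes_per_object * 8 * fps / 1e3
```

Under these assumptions the raw feed runs to roughly 1.5 gigabits per second, while detection metadata is tens of kilobits: a difference of four to five orders of magnitude in what the radio link has to carry.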
And because of the added processing power and the zippy GPU, drones using this device will have new artificial intelligence applications available, like machine learning and computer vision. Yeah, drones are going to be able to recognize and track people; it’s only a matter of time.
We wonder what this will mean for FAA regulations…
For those skeptical about the feasibility of Santa’s annual delivery schedule, here’s an autonomous piece of the puzzle that will bewilder even the most hard-hearted of non-believers.
The folks over at the Cluster of Excellence Cognitive Interaction Technology (CITEC) in Germany have whipped together a fantastic demo featuring Santa’s extra pair of helping hands. In the two-and-a-half-minute video, the robot executes a suite of impressive autonomous stocking-stuffing maneuvers: from recognizing the open hole in the stocking to grasping specific candies from the cluster of goodies available.
On the hardware side, the arms appear to be a KUKA variant, while on the software side, the visualizations are handled by RViz, a tool from the open-source Robot Operating System (ROS).
If some of the props in the video look familiar, it’s because the researchers at CITEC have already explored related research topics in perception, classification, and grasping. Who knew this pair of hands would be so jolly about clocking some overtime this holiday season? The entire video is set to a crisp computer-voiced jingle that serves as a sneaky summary of their approach to the project.
Now, if only we could set these hands off to do our other dirty work….
Continue reading “Santa’s Autonomous Helping Hands Let the Jolly ol’ Fellow Kick Back this Season”
On June 26th, 2014, Clearpath Robotics opened up the doors to their brand new 12,000 square foot robot lair by bringing out a PR2 to cut the ceremonial ribbon and welcome everyone inside. And instead of just programming the ‘locate and destroy’ ribbon sequence, the co-founders opted to use an Oculus Rift to control the robot tearing through the material with flailing arms.
This was accomplished by having [Jake], the robot, use a Kinect 2.0 that fed skeleton-tracking data through rosserial_windows, a Windows-based set of extensions for the Robot Operating System which we heard about in January. The software gathers a stream of data points, each with X, Y, and Z components, allowing [Jake] to locate himself within a 3D space. The data was then published directly into the PR2’s brain. Inject a little Python code, and the creature was able to route directions in order to move its arms.
Thus, by simply stepping in front of the Kinect 2.0 and putting on the Oculus Rift headset, anyone could teleoperate [Jake] to move around and wave his arms at oncoming ribbons. Once the ceremony was complete, [Jake] would leave the scene, journeying back into the newly created robot lair and leaving pieces of nylon and polyester everywhere.
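As a rough illustration of the mapping step described above, here is what turning a tracked hand point into a safe arm target might look like. The calibration offset and reach limit are made-up stand-ins, not the actual PR2 setup; the real system published these points into ROS via rosserial_windows:

```python
# Hypothetical sketch: map a Kinect hand position (x, y, z in the camera
# frame) to an arm target in the robot's frame. The offset and reach limit
# are illustrative placeholders for a real calibration.

def hand_to_arm_target(hand, camera_offset=(0.0, 0.0, 1.5), max_reach=0.8):
    """Map a Kinect hand point to a clamped arm target in the robot frame."""
    # Translate from camera coordinates into robot base coordinates.
    target = [h - o for h, o in zip(hand, camera_offset)]
    # Clamp to the arm's reachable sphere so the controller never gets an
    # unreachable setpoint.
    norm = sum(t * t for t in target) ** 0.5
    if norm > max_reach:
        target = [t * max_reach / norm for t in target]
    return tuple(target)
```

Clamping to the reachable workspace is the kind of small safety detail that keeps a flailing-arms demo from becoming a flailing-robot incident.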
An earlier (un-smoothed) version of the full system can be seen after the break:
Continue reading “Cutting Ribbons with Robots and an Oculus Rift”
[Kevin] brings us Golem, his latest robot project. Golem is crafted not of clay and stone like his namesake, but of T6 aluminum and servos. We don’t have a banana for scale, but Golem is big. Not [Jamie Mantzel]’s Giant Robot Project big, but at 2.5 feet (76.2 cm) in diameter and 16 lbs (7.3 kg), no one is going to call Golem a lightweight. With that kind of mass, standard R/C servos don’t stand much of a chance. [Kevin] pulled out all the stops and picked up Dynamixel MX-64 servos for Golem’s legs. Those servos alone put Golem’s cost well beyond the budget of the average hobbyist, but [Kevin] wasn’t done. He added an Intel NUC motherboard with a fourth-generation i5 processor, a 120 gigabyte solid-state drive, and 8 gigabytes of RAM. Sensing is handled by gyros, accelerometers, and an on-board compass module. We’re assuming from the lack of a GPS that Golem will mainly see indoor use. We definitely like the mini subwoofer mounted on Golem’s back. Hey, even robots gotta have their tunes.
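For anyone wondering what driving those servos involves: the Dynamixel MX-64 takes goal positions as 12-bit register values, 0 to 4095 across a full turn (about 0.088° per tick), so a leg controller needs a conversion along these lines. This is our own sketch, not [Kevin]’s code:

```python
# Convert a joint angle to a Dynamixel MX-64 goal-position register value.
# The MX-64's position mode spans 0-4095 over a full 360-degree rotation.

def angle_to_ticks(degrees):
    """Convert a joint angle in degrees (0-360) to an MX-64 position value."""
    if not 0.0 <= degrees <= 360.0:
        raise ValueError("MX-64 position mode covers 0-360 degrees")
    # 4096 ticks per revolution; the register tops out at 4095.
    return min(4095, round(degrees / 360.0 * 4096))
```

Multiply by eighteen joints updating at gait rate and you can see why [Kevin] sprang for servos with a proper serial bus rather than hobby PWM.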
Golem is currently walking under human control via a Dualshock 3 controller paired over Bluetooth. [Kevin]’s goal is to use Golem to learn the Robot Operating System (ROS). He’s already installed Ubuntu 13.04 and is ready to go. [Kevin] didn’t mention a vision system, but based on the fact that some of his other robots use the Xtion Pro Live, we’re hopeful. We can’t wait to see Golem’s first autonomous steps.
Continue reading “Hexapod Robot Terrifies Humans and Wallets”
Every once in a while we get a tip for a project that really, really, really blows our minds. This is one of them.
It looks like a basic catamaran with a few extra bells and whistles, except it is so much more than that. You’re looking at a fully Autonomous Surface Vehicle, complete with a piggybacking 6-rotor UAV. It’s decked out with cameras, sonar sensors, laser rangefinders, high-accuracy GPS-RTK tracking, and an IMU. Oh, and did we mention the autonomous 6-rotor UAV capable of taking off and landing on it?
It all started out as a simple experiment within ECHORD (the European Clearing House for Open Robotics Development), and since then it has become a fully funded project at UNINOVA, a Centre of Technology and Systems in Portugal.
The purpose of this mind-blowing robot team is to collect data on river environments. Think of it as Google Maps 2.0, which is almost an understatement for what it is capable of.
You seriously have to watch the video after the break.
Continue reading “RIVERWATCH: An Autonomous Surface-Aerial Marsupial Robot Team”