Amazing 3D telepresence system


It looks like the world of Kinect hacks is about to get a bit more interesting.

While many of the Kinect-based projects we see use one or two units, this 3D telepresence system developed by UNC Chapel Hill student [Andrew Maimone] under the guidance of [Henry Fuchs] has them all beat.

The setup uses up to four Kinect sensors in a single endpoint, capturing images from various angles before they are processed using GPU-accelerated filters. The video captured by the cameras is processed in a series of steps, filling holes and adjusting colors to create a mesh image. Once the video streams have been processed, they are overlaid with one another to form a complete 3D image.
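The hole-filling step can be sketched in miniature: a Kinect depth map marks missing samples as zero, and each hole can be patched with the median of its valid neighbors. This is a simplified CPU stand-in for the GPU-accelerated filters the project actually uses; the function name and kernel size are illustrative, not taken from the paper.

```python
import numpy as np

def fill_holes(depth, kernel=3):
    """Replace zero (missing) depth pixels with the median of valid neighbors.

    A toy, CPU-bound version of the hole-filling pass described above;
    the real pipeline runs equivalent filters on the GPU.
    """
    filled = depth.astype(float).copy()
    r = kernel // 2
    for y, x in np.argwhere(depth == 0):
        # Clip the neighborhood window at the image borders
        patch = depth[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
        valid = patch[patch > 0]
        if valid.size:
            filled[y, x] = np.median(valid)
    return filled
```

The same neighborhood idea extends to the color-adjustment pass, where overlapping camera views are blended instead of patched.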

The result is an awesome real-time 3D rendering of the subject and surrounding room that reminds us of this papercraft costume. The 3D video can be viewed at a remote station which uses a Kinect sensor to track your eye movements, altering the video feed’s perspective accordingly. The telepresence system also offers the ability to add in non-existent objects, making it a great tool for remote technology demonstrations and the like.

Check out the video below to see a thorough walkthrough of this 3D telepresence system.

Continue reading “Amazing 3D telepresence system”

Your robot stand-in has arrived

Meet TIPI, the Telepresence Interface by Pendulum Inversion. TIPI is something of a surrogate, giving physical presence to telecommuters by balancing an LCD screen and camera atop its six-foot frame. The user has full control of the robot’s movement, and their own camera image is shown on the display so that others interacting with the bot will know with whom they are conversing.

A pair of 12.5″ wheels connect to DC motors via a gearbox with a 37:1 ratio. These specs are necessary to recover from a sudden 20 degree loss of equilibrium, quite impressive for a bot of this stature. An Orangutan SVP board monitors a two-axis accelerometer and a gyroscope for accurate positioning data. This board automatically keeps balance while taking user commands from a second controller, a Beagle Board. The Beagle Board handles communications, including sending and receiving the video signals, and delivers incoming position control data to the Orangutan. Separating the two systems guards against a screen-shattering fall by making sure the hardware likely to face slow-down or lockup is physically separate from that responsible for balance.
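A balancing loop like the Orangutan’s presumably fuses the gyro and accelerometer into a single tilt estimate and drives the motors to cancel it. Here is a minimal sketch assuming a complementary filter feeding a PD controller; the function names, gains, and filter coefficient are invented for illustration and are not from TIPI’s firmware.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyro and accelerometer readings into one tilt estimate.

    The gyro integrates quickly but drifts over time; the accelerometer
    is noisy but drift-free. Blending the two gives a usable angle.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def balance_command(angle, angle_rate, kp=40.0, kd=2.5):
    """PD controller: motor effort proportional to tilt and tilt rate."""
    return kp * angle + kd * angle_rate
```

Run at a fixed loop rate, the estimate converges toward the accelerometer’s reading while staying responsive to fast gyro-sensed disturbances, which is what lets the bot catch a sudden 20 degree upset.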

Check out the video clip after the break to see some balancing goodness. It shouldn’t be hard to build your own version for much less than the $15k price tag carried by some commercial versions.

Continue reading “Your robot stand-in has arrived”

Wireless animatronic hand control


[Easton] was looking to enter his local science fair and needed a project that would wow the judges. After considering it for a bit, he decided that an animatronic hand would be a sure winner. Many animatronic projects we have seen are connected to a computer for control purposes, but his is a bit different.

[Easton] wanted to be able to control the hand in real time with his own movements, so he sewed some flex sensors onto a glove and wired them up to a custom Arduino shield he built. The Arduino is also connected to an XBee radio, allowing it to interface with his animatronic hand wirelessly.

He built the hand after studying anatomical drawings to better understand where finger joints were located and how they moved. He cut up pieces of flexible wire tubing to build the fingers, reinforcing them with Lego bricks. He ran fishing wire from the finger tips to five independent servos to provide the hand’s motion. Another Arduino with an XBee shield was used to control the hand and receive wireless signals from the glove.
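On the glove side, each flex sensor reading just needs to be scaled into a servo angle before it goes out over the XBee link. A minimal sketch of that mapping, in the spirit of Arduino’s map()/constrain() idiom; the calibration values here are made up, since in practice each sensor would be calibrated against the wearer’s straight and fully-bent finger positions.

```python
def flex_to_servo(raw, raw_min=200, raw_max=800, servo_min=0, servo_max=180):
    """Map a flex sensor's ADC reading onto a servo angle.

    raw_min/raw_max are illustrative calibration endpoints for the
    sensor's straight and fully-bent readings.
    """
    raw = max(raw_min, min(raw_max, raw))  # constrain to the calibrated range
    span = raw_max - raw_min
    return servo_min + (raw - raw_min) * (servo_max - servo_min) // span
```

Five such mappings, one per finger, are all the glove needs to send; the receiving Arduino simply writes each angle to its servo.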

Check out the video below to see why this project won [Easton] first place in the science fair.

Continue reading “Wireless animatronic hand control”

BAMF2010: QB goes to meetings, shoots lasers from eyes

No, it’s not an extra from Wall-E. “QB” is the latest telepresence robot from Silicon Valley firm Anybots. QB combines two-way videoconferencing with a Segway-style self-balancing platform. The idea is to provide mobility and more natural interaction than desktop-tethered conferencing can provide.

The 35 pound robot’s battery runs for six to eight hours, and the telescoping head allows the eye level to be adjusted to match the user’s natural viewpoint. What looks like stereo vision is actually a single camera on the left eye and a steerable laser pointer on the right.

Shipping this October for $15,000, QB will appeal mostly to businesses with specific telepresence needs. This is half the price of their prior QA model — and in time the technology may reach the mass-market level. Until then, we’ll just have to amuse ourselves by remotely attending meetings with our ankle-nipping Rovio robots.

British bots compete for attention


The British military held a competition to find the newest batch of robotic surveillance drones. The article mentions that they compete in a mockup village, but sadly we don’t get to see any of the action. We strongly recommend watching the video so you can see some of the robots. There is an interesting helicopter concept that has angled props for better stability and lateral motion, but more importantly you get to see the little guy pictured above. He very well could be Wall-E’s great-grandfather. Though his constant buzzing around during the interviews is slightly annoying, his little camera mount looking all around is instantly endearing. If he doesn’t win this contest, he may have a shot at the [crabfu] challenge.

[via Engadget]

Drive a robot in Australia over the web


BP Australia has commissioned an online game where you get to drive robots around an obstacle course. Make no mistake, these are real robots. Actually, they are modified versions of the Surveyor SRV-1 vehicles that are popular with research labs and schools everywhere.

Go to the website, get in the queue, and pray for no clouds. These babies are solar powered, so you’ll have to try to get in while it’s daytime in Australia. The entire set is built in miniature, so you feel like you’re driving a tank around a city.

[via Robots Dreams]

Gaming with Roombas


Yesterday we looked at the Pac-Man Roomba casemod. In the video, creator [Ron Tajima] expressed interest in seeing Roombas participate in real life games. So we did some digging around and found some used in an interesting augmented reality game. From Brown University, these modified iRobot Create units play various games, like tag, with an underlying goal of developing smarter robots.

The setup consists of a Java-powered client/server arrangement. The game server coordinates the Small Universal Robot Vehicles (SmURVs) and builds a database of events for future use. Players can also control the robots through a Java telepresence client.

The units themselves are made up of the iRobot Create with a Mini-ITX computer strapped to the top. They run Linux and communicate over WiFi with the server and players. They also have an IR emitter used in the games to “shoot” other units.

Gameplay has the server acting as the referee and humans acting only as instructors. The humans come into play when the robots are unable to respond based on their existing database of decision-making policies. Through the client, players are able to see exactly what the robot sees, with the addition of 3D overlays. Future plans for the game include removing the camera view and replacing it with nothing but these overlays. One of the final goals of the project was to create a 24/7/365 gaming experience similar to what is found in MMOs and Xbox Live applications today.
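The server’s referee role amounts to tallying the logged IR “shot” events into scores. A toy sketch of that idea; the event schema and field names are invented for illustration and are not from the Brown project’s actual codebase.

```python
def referee(events):
    """Tally confirmed hits per shooter from the server's event log.

    Each event is a dict like {"shooter": ..., "target": ..., "hit": bool};
    the server trusts the target's IR receiver to report whether the
    shot connected.
    """
    scores = {}
    for event in events:
        if event["hit"]:
            scores[event["shooter"]] = scores.get(event["shooter"], 0) + 1
    return scores
```

Keeping the scoring on the server, rather than on the robots, is what lets the game run continuously even as players and units drop in and out.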