On June 26th, 2014, Clearpath Robotics opened the doors to their brand-new 12,000 square foot robot lair by bringing out a PR2 to cut the ceremonial ribbon and welcome everyone inside. And instead of just programming a ‘locate and destroy’ ribbon sequence, the co-founders opted to use an Oculus Rift to control the robot as it tore through the material with flailing arms.
This was accomplished by having [Jake], the robot, utilize a Kinect 2.0 that fed skeleton tracking data via rosserial_windows, a Windows-based set of extensions for the Robot Operating System which we heard about in January. The software gathers a stream of data points, each with an X, Y, Z component, allowing [Jake] to locate himself within a 3D space. The data was then collected and published directly into the PR2’s brain. Inject a little Python code, and the creature was able to route directions in order to move its arms.
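The write-up doesn’t include the actual Python, but the core idea — turning raw skeleton points into arm joint angles before publishing them to the PR2 — can be sketched in a few lines. This is a minimal, hypothetical helper (the real PR2 controller takes full 7-DOF joint trajectories, and the function name and joint choice here are made up):

```python
import math

def arm_angles_from_skeleton(shoulder, elbow, wrist):
    """Map three Kinect skeleton points (x, y, z tuples, in metres)
    to a rough shoulder-pitch / elbow-flex pair for one arm."""
    # Vector from shoulder to elbow gives the upper-arm direction.
    ux, uy, uz = (elbow[i] - shoulder[i] for i in range(3))
    # Vector from elbow to wrist gives the forearm direction.
    fx, fy, fz = (wrist[i] - elbow[i] for i in range(3))
    # Shoulder pitch: how far the upper arm is raised from hanging straight down.
    shoulder_pitch = math.atan2(ux, -uy)
    # Elbow flex: angle between the upper arm and the forearm.
    dot = ux * fx + uy * fy + uz * fz
    mag = math.hypot(ux, uy, uz) * math.hypot(fx, fy, fz)
    elbow_flex = math.acos(max(-1.0, min(1.0, dot / mag)))
    return shoulder_pitch, elbow_flex
```

In the actual system, each result like this would be stuffed into a ROS message and published to the arm controller at the Kinect’s frame rate.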
Thus, by simply stepping in front of the Kinect 2.0 and putting on the Oculus Rift headset, anyone could teleoperate [Jake] to move around and wave its arms at oncoming ribbons. Once finished, [Jake] would leave the scene, journeying back into the newly created robot lair and leaving pieces of nylon and polyester everywhere.
An earlier (un-smoothed) version of the full system can be seen after the break:
Continue reading “Cutting Ribbons with Robots and an Oculus Rift”
[Chris] works as part of a small team of developers in Cambridge, Massachusetts in the US. [Timo], one of their core members, works remotely from Heidelberg, Germany. In order to make [Timo] feel closer to the rest of the group, they built him a telepresence robot.
It was a link to DoubleRobotics that got the creative juices flowing. [Chris] and his team wanted to bring [Timo] into the room, but they didn’t have a spare $2499 USD in their budget. Instead they mated a standard motorized pan/tilt camera base with an RFduino Bluetooth kit. An application running on [Timo’s] phone sends gyroscope status through the internet to the iPad on the robot. The robot’s iPad then sends that data via Bluetooth to the RFduino. The RFduino commands pan and tilt movements corresponding with those sensed by the gyroscope. A video chat application runs on top of all this, allowing [Timo] to look around the room and converse with his coworkers.
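The last hop in that chain — turning the phone’s gyroscope readings into servo positions for the pan/tilt base — boils down to an offset-and-clamp. Here’s a sketch of that mapping; the centre position and servo ranges are assumptions, not values from the project:

```python
def gyro_to_pan_tilt(yaw_deg, pitch_deg, pan_range=(0, 180), tilt_range=(45, 135)):
    """Map the remote phone's yaw/pitch (degrees, 0 = facing forward and level)
    onto pan/tilt servo angles, clamped to each servo's travel."""
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))
    # 90 degrees is assumed to be the centred 'looking straight ahead' position.
    pan = clamp(90 + yaw_deg, *pan_range)
    tilt = clamp(90 + pitch_deg, *tilt_range)
    return int(pan), int(tilt)
```

The RFduino on the robot would receive these two numbers over Bluetooth and drive the servos accordingly.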
All the source code is available via GitHub. The design didn’t work perfectly at first. [Chris] mentions the RFduino’s Bluetooth API is rather flaky when it comes to pairing operations. In the end the team was able to complete the robot and present it to [Timo] as a Valentine’s Day gift. For [Chris’] sake we hope [Timo] doesn’t spend too much of his time doing what his homepage URL would suggest: “screamingatmyscreen.com”
Before assuming that the title should be “web crawler,” just shush your shushin’ and check out the video after the break. The Pinoccio, as previously noted, is a board in development as a sort of web-enabled-by-default Arduino. This makes it perfect for a project like this one, where a little rover is controlled from 10,000 kilometers away, or around 6,000 miles for those of us who dwell in the US.
This setup uses a cell-phone accelerometer in Brazil to control this robot in Nevada. Although close to real time, the control has enough latency that it has to be accounted for. Something like this could easily be used for a telepresence ‘bot.
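One common way to make tilt control tolerable over a laggy link is a deadzone: the rover stays put while the phone is nearly level, so stale commands don’t send it creeping off. A minimal sketch of that shaping (the axis assignments and deadzone size are guesses, not from the project):

```python
def tilt_to_drive(ax, ay, deadzone=0.15):
    """Convert phone accelerometer tilt (ax, ay in g, roughly -1..1) into a
    (speed, turn) pair in -1..1. The deadzone keeps the rover still when the
    phone is near level -- handy when commands arrive hundreds of ms late."""
    def shape(v):
        if abs(v) < deadzone:
            return 0.0
        # Re-scale so output ramps smoothly from 0 just outside the deadzone.
        sign = 1.0 if v > 0 else -1.0
        return sign * min(1.0, (abs(v) - deadzone) / (1.0 - deadzone))
    # Tilting forward/back sets speed; tilting left/right sets turn.
    return shape(ay), shape(ax)
```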
If you want to build your own, the assembly time is estimated at 1 hour. Instructions, as well as source code, can be found on their page after the video. Although the Pinoccio board won’t be available until at least this summer, maybe this will give someone inspiration to try something similar in the meantime! Continue reading “Pinoccio Web Rover”
This telepresence upgrade lets an employee take part in the office from more than four thousand kilometers away. It’s an upgrade of their previous setup, which used a laptop on a rotating platform to add a bit of control to the video conferencing experience. But all that original version could do was swivel; this one lets you drive your virtual self around for fifteen hours between battery charges.
The real work is in the base of the robot, as the audio and video are handled by a tablet independently from the locomotion. The team spent about four hundred bucks to throw the thing together. It starts with a hunk of plywood. Two 3A motors were mated with lawnmower wheels for the front of the bot. Dragging under the back of the base are a couple of casters that make it possible to turn without skidding. A motor shield and a WiFi shield for the Arduino make it possible to control the thing over the Internet. They even added some functionality on the client side to use a PlayStation 3 controller. Check out the completed machine in the clip after the break.
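With two driven wheels up front and casters dragging behind, steering comes down to classic differential-drive mixing of the controller’s stick axes. A sketch of that mixing step (the scaling scheme here is a common approach, not necessarily the one in their Arduino sketch):

```python
def mix_differential(forward, turn):
    """Mix joystick forward/turn axes (-1..1) into left/right motor speeds
    for the two front wheels; the rear casters just follow."""
    left = forward + turn
    right = forward - turn
    # If either side exceeds full speed, scale both down together
    # so the turning ratio is preserved.
    m = max(1.0, abs(left), abs(right))
    return left / m, right / m
```

The client side would read the PlayStation 3 controller’s sticks, run them through something like this, and ship the two motor speeds over WiFi to the Arduino’s motor shield.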
Continue reading “Telepresence upgrade with a minimum of effort”
Here is a telepresence robot that uses an Android device and LEGO NXT parts. [Wolfgang] had an extra phone on hand and decided to put it to good use. The Mindstorms parts make it really easy to produce a small robot, and adding the phone really ups the computing and connectivity options available to him.
The Android device is able to control the NXT bot via Bluetooth. [Wolfgang] didn’t go into detail on that part, but you can get some pointers on the topic from this other Android-controlled Mindstorms project. [Wolfgang] wanted the ability to check in at home when he’s travelling. He uses nanohttpd on the Android device to serve up a simple web interface. It uses HTML5 to push a snapshot from the phone’s camera as user feedback, and provides a set of directional arrows which let him drive the bot around.
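The arrow-button interface amounts to a small command table: each button hits a URL, and the server translates it into a pair of NXT motor powers. A hypothetical version of that dispatch (the real one lives inside the nanohttpd handlers on the phone, and these power values are invented):

```python
# Each arrow button maps a command name to (left, right) NXT motor power,
# in the NXT's -100..100 range.
NXT_COMMANDS = {
    "forward": (75, 75),
    "back":    (-75, -75),
    "left":    (-50, 50),
    "right":   (50, -50),
    "stop":    (0, 0),
}

def handle_drive(cmd):
    """Return (left, right) motor power for an arrow command,
    stopping on anything unrecognised as a safe default."""
    return NXT_COMMANDS.get(cmd, NXT_COMMANDS["stop"])
```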
Obviously this thing is going to run out of juice if he’s away for too long. To combat that problem he included a battery which powers both the NXT parts and the phone. Now he just needs to build an inductive charging station and he’ll really be set.
Continue reading “NXT Android telepresence robot”
[Claire] sent in a project she’s been working on for the past few years. It’s called Botiful and aims to turn any Android phone into a mobile telepresence robot.
Botiful is built around the IOIO Android-to-Arduino dev board and provides a very clean way to interface your current cell phone with a tiny – and cute – robotic platform. The big feature of Botiful is its integration with Skype; just call a Botiful owner’s phone or tablet, and a panel pops up allowing you to control the robot, tilt the camera up and down, and even trigger robotic yes, no, and ‘dance’ gestures.
Because Botiful is based on the IOIO, there are a few pins available inside the bot for an I2C bus, PWM control, and even a serial output. It’s also possible to develop your own apps for Botiful, making for a neat mobile robotics platform.
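Canned gestures like ‘yes’, ‘no’, and ‘dance’ are usually just scripted sequences of timed motor moves. Here’s one plausible way an app might encode them — these particular sequences, angles, and timings are invented for illustration, not taken from Botiful’s firmware:

```python
GESTURES = {
    # Nod: tilt the camera down and up twice, then recentre.
    "yes":   [("tilt", 20, 0.3), ("tilt", -20, 0.3)] * 2 + [("tilt", 0, 0.2)],
    # Shake: pan left and right twice, then recentre.
    "no":    [("pan", -30, 0.3), ("pan", 30, 0.3)] * 2 + [("pan", 0, 0.2)],
    # Wiggle both axes for the 'dance'.
    "dance": [("pan", 25, 0.2), ("tilt", 15, 0.2),
              ("pan", -25, 0.2), ("tilt", -15, 0.2)],
}

def gesture_duration(name):
    """Total playback time in seconds for a named gesture,
    summing each (axis, degrees, seconds) step."""
    return sum(step[2] for step in GESTURES[name])
```

A playback loop would walk a gesture’s steps, commanding each axis via the IOIO’s PWM pins and sleeping for each step’s duration.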
Right now, Botiful is Android-only, but if [Claire] gets $100,000 out of her Kickstarter, she’s promised to add iDevice support. That seems fairly likely, as more than $60,000 has been pledged with three weeks to go. Pretty cool, and we can think of a few very useful asocial applications of the Botiful, including running cable in a drop ceiling and checking out that thing under your car.
Don’t have anyone to share activities with? Forget Siri, she’s just a disembodied voice in a box. You need to get yourself a shoulder-mounted robot pal.
The idea behind this design actually has something to do with telepresence. Let’s say you and your best friend want to go check out the local Hackerspace. The problem is that you met your best friend on the Internet and they live thousands of miles away. Well, just strap on your shoulder robot and have your friend log on. There’s a camera to give him or her feedback, and twenty degrees of freedom let them control the torso, arms, and head of the bot in a realistic and creepy way. This works much like a marionette, with motors pulling wires to actuate the robot’s movements. You can get a very brief look at this in the clip after the break.
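The marionette approach has a nice property: for a wire wrapped around a pulley at the joint, the wire travel a winch must reel in is just arc length, angle times radius. A one-liner to that effect (the pulley dimensions here are made up, not from the build):

```python
import math

def cable_travel(delta_angle_deg, pulley_radius_mm):
    """Wire length (mm) a winch motor must reel in to rotate a marionette
    joint by delta_angle_deg, assuming the wire wraps a joint pulley of
    the given radius: arc length = angle (radians) * radius."""
    return math.radians(delta_angle_deg) * pulley_radius_mm
```

So a 90-degree nod on a 10 mm pulley needs only about 16 mm of wire travel, which is why small motors can puppet all twenty degrees of freedom.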
Continue reading “Shoulder robot for the forever alone”