It’s always great fun to build your own robot. Sometimes, though, if you’re doing various projects or research, it’s easier to buy an existing robot and then use it to get down to business. That was very much the role of the Willow Garage PR2, but unfortunately, it’s no longer in production. However, as covered by The Robot Report, the design files have now been released for others to use.
The PR2 was built as an advanced platform with wide-ranging capabilities. It could manipulate objects with its two 7-degree-of-freedom arms, as well as perceive the real world with a variety of complex sensor packages. Researchers put it to work on all sorts of tasks, from playing pool to fetching beers and even folding laundry. That last one is still considered an unsolved problem that challenges even the best robots.
Rights to the PR2 robot landed in the hands of Clearpath Robotics after Willow Garage shut down in 2014. Clearpath is now providing access to the robot's design files on its website. This includes everything from wiring diagrams and schematics to assembly drawings, cable specs, and other background details. You'll have to provide some personal information to get access, but the documentation you desire is all there.
We actually got our first look at the PR2 robot many years ago, way back in 2009. If you decide to build your own from scratch, be sure to hit us up on the tips line.
On June 26th, 2014, Clearpath Robotics opened up the doors to their brand new 12,000 square foot robot lair by bringing out a PR2 to cut the ceremonial ribbon and welcome everyone inside. And instead of just programming the ‘locate and destroy’ ribbon sequence, the co-founders opted to use an Oculus Rift to control the robot tearing through the material with flailing arms.
This was accomplished by having [Jake], the robot, use a Kinect 2.0 that fed skeleton tracking data via rosserial_windows, a Windows-based set of extensions for the Robot Operating System which we heard about in January. The software gathers a stream of data points, each with X, Y, and Z components, allowing [Jake] to locate himself within a 3D space. That data was then published directly into the PR2's brain. Inject a little Python code, and the creature was able to route directions in order to move its arms.
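The heart of that pipeline is mapping each skeleton point from the Kinect's sensor frame into the robot's base frame before the arm can follow it. Here's a minimal, hypothetical sketch of that step in plain Python — the `Point` type, the `kinect_to_robot` function name, and the particular axis convention are illustrative assumptions, not taken from the actual PR2 code (which would do this through ROS transforms):

```python
# Hypothetical sketch of the skeleton-to-arm mapping described above.
# The axis convention below is an assumption: Kinect reports depth on
# its z axis, while a robot base frame typically points x forward.

from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float
    z: float


def kinect_to_robot(p: Point) -> Point:
    """Map a Kinect skeleton point (sensor frame, metres) into a
    robot-base frame: Kinect z (depth) becomes robot x (forward),
    Kinect x becomes robot -y (left-positive), Kinect y stays up."""
    return Point(x=p.z, y=-p.x, z=p.y)


# A tracked wrist one metre to the sensor's right and three metres away
# lands in front of the robot, slightly to its right:
wrist = kinect_to_robot(Point(x=1.0, y=1.2, z=3.0))
```

In the real system each remapped point would then be published as a ROS message for the arm controller to consume, rather than used directly like this.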
Thus, by simply stepping in front of the Kinect 2.0, and putting on the Oculus Rift headset, anyone could teleoperate [Jake] to move around and wave its arms at oncoming ribbons. Once completed, [Jake] would leave the scene, journeying back into the newly created robot lair leaving pieces of nylon and polyester everywhere.
An earlier (un-smoothed) version of the full system can be seen after the break:
Ever heard of the Robot Operating System? It's a BSD-licensed open-source system for controlling robots that runs on a wide variety of hardware. Over the years we've shared quite a few projects that run ROS, but nothing on how to actually use it. Lucky for us, a robotics company called Clearpath Robotics (who use ROS for everything) has graciously decided to share some tips and tricks on how to get started with ROS 101: An Introduction to the Robot Operating System.
The beauty of the ROS system is that it is made up of a series of independent nodes which communicate with each other using a publish/subscribe messaging model. This means the hardware doesn’t matter. You can use different computers, even different architectures. The example [Ilia Baranov] gives is using an Arduino to publish the messages, a laptop subscribed to them, and even an Android phone used to drive the motors — talk about flexibility!
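To see why publish/subscribe decouples the hardware, here's a toy message bus in plain Python — this is deliberately *not* the ROS API (no `rospy` here), just a minimal sketch of the model, with the `Bus` class and topic name made up for illustration:

```python
# A toy publish/subscribe bus illustrating the model ROS nodes use.
# Publishers and subscribers only agree on a topic name and message
# shape; neither knows what machine or architecture the other runs on.

from collections import defaultdict
from typing import Any, Callable


class Bus:
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subs[topic].append(callback)

    def publish(self, topic: str, msg: Any) -> None:
        for callback in self._subs[topic]:
            callback(msg)


bus = Bus()
readings: list[float] = []

# The "laptop" node subscribes to a range topic...
bus.subscribe("/sensor/range", readings.append)

# ...and the "Arduino" node publishes to it, never knowing who listens.
bus.publish("/sensor/range", 0.42)
```

Swap either side out — a phone driving the motors, a different board doing the sensing — and the other side never has to change, which is exactly the flexibility described above.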
It appears they will be doing a whole series of these 101 posts, so check it out — they've already released number two, ROS 101: A Practical Example. It even includes a ready-to-go Ubuntu disc image with ROS pre-installed to mess around with in VMware Player!
And to get you inspired for using ROS, check out this Android controlled robot using it! Or how about a ridiculous wheel-chair-turned-creepy-face-tracking-robot?