VR Telepresence Tank From Raspberry Pi, Google Cardboard, And Xbox Controller

It’s great to see different kinds of hardware and software tossed into a project together, allowing someone to mix things that don’t normally go together into something new. [Freddy Kilo] did just that with a project he calls his VR Robot Tank. It’s a telepresence device that uses a wireless Xbox controller to drive a tracked platform, which is itself headed by a Raspberry Pi.

The Pi has two cameras on a pan-tilt mount, and those cameras are both aimed and viewed via a Google Cardboard-like setup. A healthy dose of free software glues it together, allowing things like video streaming (with UV4L) and steering via the wireless controller (with xboxdrv). A bit of fiddling was required for some parts – viewing the stereoscopic cameras, for example, is done by opening and positioning two video windows just right so they can be seen through the headset lenses. It doesn’t warp the image to account for the lens distortion in the headset, and the wireless range might be limited, but the end result seems to work well enough.
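
If you want to play along at home, the drive side boils down to reading a pair of stick axes and mixing them into left and right track speeds. Below is a minimal Python sketch of that “arcade drive” mixing; the axis conventions and scaling are our own assumptions for illustration, since [Freddy Kilo]’s build reads the pad through xboxdrv and feeds its own motor driver.

```python
# Minimal sketch of "arcade drive" mixing: one stick axis for throttle,
# one for steering, producing left/right track speeds in [-1, 1].
# The axis conventions are assumptions; the actual project reads the
# Xbox pad via xboxdrv and drives its own motor hardware.

def mix_tank_drive(throttle, steering):
    """Map joystick axes (-1..1) to left/right track speeds (-1..1)."""
    left = throttle + steering
    right = throttle - steering
    # Normalize so neither track is commanded past full speed.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale

if __name__ == "__main__":
    print(mix_tank_drive(1.0, 0.3))   # full forward with a gentle right turn
    print(mix_tank_drive(0.0, -1.0))  # spin in place to the left
```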

The tank is driven with the wireless controller while a mobile phone mounted in a headset lets the user see through the cameras; motion sensing in the phone moves those cameras whenever you move your head to look around. Remote control hobbyists will recognize the project as doing essentially the same job as FPV setups for model aircraft (for example, drone racing or even snow sleds), but this project uses a completely different hardware and software toolchain. It demonstrates the benefits of having access to open tools to use as virtual “duct tape”, letting people stick different things together to test a concept. It proves that almost anything can be made to work if you’re willing to fiddle!

Continue reading “VR Telepresence Tank From Raspberry Pi, Google Cardboard, And Xbox Controller”

Shoot Darts At The Shins Of Total Strangers

[Michael Brumlow] found us and sent us a link. Within a few seconds, we were driving a webcam-enabled Nerf dart tank through his office and trying not to hit walls or get stepped on by his co-workers. Unfortunately, it was out of darts at the time, but you can find them all over the floor if you scout around.

All of the code details, including the link where you can test drive it yourself, are up on [Michael]’s GitHub. The brains are an Intel Edison board, and the brawn is supplied by an Arduino motor controller shield and (for the latest version) a chassis bought from China.

It runs fairly smoothly, considering the long round trip from [Michael]’s office in Texas, through wherever Amazon keeps their Web Services, over to us in Germany and back. Once we got used to the slight lag, and started using the keyboard’s arrow keys for control, we were driving around like a pro.

It’s got a few glitches still, like the camera periodically overheating and the robot wandering out of WiFi range. [Michael] said he’d try to keep it charged up and running while you give it a shot. The controls are multiplexed in the cloud, so your chance of steering it is as good as anyone else’s. It’ll be interesting to see what happens when thousands of Hackaday readers try to control it at once!
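
For the curious, “multiplexed in the cloud” can be as simple as a last-writer-wins endpoint: everyone posts commands, and the robot just polls for the most recent one. The sketch below is only a toy illustration with invented endpoints and field names; the real plumbing lives in [Michael]’s GitHub repo and runs through AWS.

```python
# Toy "everyone shares the wheel" multiplexer: visitors POST drive
# commands and the robot polls for whichever arrived most recently.
# Endpoints and field names are invented for illustration only.
from flask import Flask, jsonify, request

app = Flask(__name__)
latest = {"cmd": "stop"}

@app.route("/drive", methods=["POST"])
def drive():
    # Last writer wins: whoever POSTed most recently is steering.
    latest["cmd"] = request.get_json(force=True).get("cmd", "stop")
    return jsonify(ok=True)

@app.route("/poll", methods=["GET"])
def poll():
    # The robot fetches this a few times per second and acts on it.
    return jsonify(latest)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```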

It takes a certain kind of bravery to put your telepresence robot up on the open Internets. So kudos to you, [Michael], and we hope that you manage to get some work done this week, even though you will have all of Hackaday driving into your cubicle walls.

Robotic Tabletop

Remember pin art? That’s the little box full of pins that you can press an object into, and the pins take on its shape. You usually use your hand, but any small object works (including, if you are brave enough, your face). [Sean Follmer] (formerly at the MIT Media Lab) developed the reverse of this: a surface made of pins driven by motors. Under computer control, the surface can take on shapes all by itself.

In the video below, the square pins can be seen moving and manipulating blocks, even using them to build structures. By using the right sequence of pin motions, the blocks can be flipped and even stacked. Magnetic blocks offer even more options.
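
If you want a feel for the idea before building a table full of actuators, a pin surface is really just a height map updated over time. Here’s a toy Python sketch that sweeps a sine ridge across a grid of pins – the kind of coordinated motion that could nudge a block along. The grid size and pin travel are made-up numbers, not the real hardware’s parameters.

```python
# Toy height map for a motorized pin surface: a traveling sine ridge
# sweeps across the grid, the sort of wave that can push a block along.
# Grid size, pin travel, and timing are invented for illustration.
import numpy as np

ROWS, COLS = 16, 24
MAX_HEIGHT_MM = 50.0

def pin_heights(t, wavelength=8.0, speed=4.0):
    """Return a ROWS x COLS array of pin heights (mm) at time t (seconds)."""
    x = np.arange(COLS)
    ridge = 0.5 * (1.0 + np.sin(2 * np.pi * (x - speed * t) / wavelength))
    return np.tile(ridge * MAX_HEIGHT_MM, (ROWS, 1))

if __name__ == "__main__":
    frame = pin_heights(t=0.25)
    print(frame.shape, float(frame.min()), float(frame.max()))
```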

Continue reading “Robotic Tabletop”

Telepresence Robot Demo Unit Breaks Free Of Its Confinement

What happens when you put a telepresence robot online for the world to try out for free? Hilarity of course. Double Robotics is a company that builds telepresence robots. The particular robot in question is kind of like a miniature Segway with a tablet computer on top. The idea is you can control it with your own tablet from a remote location. This robot drives around with your face on the screen, allowing you to almost be somewhere when you can’t (or don’t want to) be there in person.

Double Robotics decided to make one of these units accessible to the Internet as a public demonstration. Of course, they couldn’t have one of these things just roaming about their facility unrestrained, so they ended up keeping it locked in an office. This gives users the ability to drive it around a little bit and get a feel for the robot. Of course, it didn’t take long for users to start wondering how they could break the robot free from its confinement.

One day, a worker left the office door cracked open ever so slightly. A user noticed this and, with enough patience and determination, managed to use the robot to work the door open. It appears the office was closed at the time, so no one was around to witness the event. A joy ride ensued, and the robot covered its tracks by locking itself back in the room and docking to the charging station.

While this isn’t a hack in the typical sense, it is a perfect example of the hacker mindset. You are given some new technology and explore it to the extent you’re supposed to. After that, many people would just toss it aside and not give it a second thought. Those with the hacker mindset are different, though. Our next thought is usually, “What else can I do with it?” This video demonstrates that in a fun and humorous way. Hopefully the company learns its lesson and puts a leash on that thing. Continue reading “Telepresence Robot Demo Unit Breaks Free Of Its Confinement”

Hacklet 16 – Terrific Telepresence Technology

This week’s Hacklet is all about being there when you can’t, through the magic of telepresence. More than just teleconferencing, telepresence takes things a step further to put the user in a remote space. That might be a robot platform, VR goggles, or actuators to interact with the remote environment. It’s also a field filled with opportunities for creative hackers!

We start with [PJK]’s Subterranean investigation device. [PJK] is exploring a castle for a hidden basement. To get there, his bot has to traverse a tiny passageway with a rubble floor. Nicknamed “Sid The Weedy”, [PJK]’s bot is radio controlled and uses a webcam to send images back to [PJK]. Much like the robots used to explore pyramids, [PJK] has gone with a track drive system. Unlike the pyramid bots, [PJK] is on a budget, so his track system is a modified chain with block treads. [PJK] doesn’t want to get too attached to his robot – he may well lose Sid on its maiden voyage.

Next up is [JackRC] with his Skype robot. [Jack] is building a relatively low-cost (around $200 USD) robot using the Skype API. Both his Mark I and Mark II models are based on R/C tanks. Tanks can carry a surprising amount of weight when you remove the turret and cannon. [Jack] added a mounting arm for a tablet and a robot arm for disarming bombs and/or angry children. His craftsmanship really shows through in the completed ‘bot. Without a size reference, it could pass for a police-issue bomb disposal robot!

[Gary Firestone] takes us to the skies with his Minimal Latency Oculus Rift FPV. [Gary] is using an Oculus Rift Head Mounted Display (HMD) for First Person View (FPV) piloting. His aircraft is a quadcopter. [Gary’s] video source is a GoPro camera. His quadcopter transmits the video on 5.8GHz using a standard analog video system. On the receiving end, a laptop captures the video, removes the fisheye warp from the GoPro lens, then re-warps the image for the Oculus. His latency is down around 50–100 ms, which is pretty good for a system capturing analog video.
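
The image pipeline is the interesting bit: undo the GoPro’s fisheye with a camera model, then pre-warp the flat image for the headset optics. Here’s a rough OpenCV sketch of both steps; the camera matrix, distortion coefficients, and warp strength are placeholders, not [Gary]’s actual calibration, and the capture source is whatever device your analog receiver shows up as.

```python
# Rough sketch of the FPV pipeline: undistort the GoPro fisheye, then
# apply a simple radial pre-warp for the HMD optics. All calibration
# numbers below are placeholders, not the real rig's values.
import cv2
import numpy as np

K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])            # placeholder GoPro intrinsics
dist = np.array([-0.30, 0.10, 0.0, 0.0])   # placeholder distortion coefficients

def radial_warp_maps(w, h, strength=0.25):
    """Build remap tables that push pixels outward with radius (pre-warp)."""
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    u, v = (xs - w / 2) / (w / 2), (ys - h / 2) / (h / 2)
    f = 1.0 + strength * (u * u + v * v)
    map_x = (u * f * (w / 2) + w / 2).astype(np.float32)
    map_y = (v * f * (h / 2) + h / 2).astype(np.float32)
    return map_x, map_y

cap = cv2.VideoCapture(0)                  # analog receiver on a capture device
ok, frame = cap.read()
if ok:
    flat = cv2.undistort(frame, K, dist)   # remove the fisheye
    mx, my = radial_warp_maps(*flat.shape[1::-1])
    warped = cv2.remap(flat, mx, my, cv2.INTER_LINEAR)
    cv2.imwrite("hmd_frame.png", warped)
```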

Next, [Brad] rolls cross-country with Chipbot: 4G Telepresence Rover Across America. [Brad] and his 5-year-old stepson are converting an R/C truck into a telepresence rover. Chipbot’s electronics have been given a major upgrade. [Brad] added a Raspberry Pi and an Arduino with an SN754410 chip for motor control. Connectivity is via WiFi using a TP-LINK router, or cellular using a 4G modem. Rather than a Raspberry Pi camera, [Brad] chose to go with a Ubiquiti IP camera. The Ubiquiti uses Power over Ethernet, so he’s added a PoE injector. Chipbot is still in development, but as [Brad’s] last update shows, it is already responding to commands from the interwebs. It’s been about a month since the last Chipbot update, so if you see [Brad], tell him to stop by Hackaday.io and let us know how things are progressing!
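
A common way to wire up a Pi-plus-Arduino rover is to let the Pi translate incoming web commands into a tiny serial message the Arduino turns into motor PWM. Below is a minimal pyserial sketch of that link; the single-line wire format and port name are our own inventions for illustration, not Chipbot’s actual protocol.

```python
# Minimal Pi-to-Arduino link: send signed left/right motor speeds as a
# short ASCII line the Arduino can parse and turn into H-bridge PWM.
# The "L<left>R<right>\n" format and port name are assumptions.
import serial  # pyserial

def send_drive(port, left, right):
    """Send signed speeds (-255..255) for the left and right motors."""
    left = max(-255, min(255, int(left)))
    right = max(-255, min(255, int(right)))
    port.write(f"L{left}R{right}\n".encode("ascii"))

if __name__ == "__main__":
    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
        send_drive(port, 200, 200)   # forward
        send_drive(port, 150, -150)  # pivot in place
        send_drive(port, 0, 0)       # stop
```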

Finally, we have [Joe Ferner] with his generically named Telepresence Robot. [Joe] is controlling his telepresence avatar with Google’s Android operating system. His on-board computer is a Nexus 7 tablet. A custom board with an STM32 ARM microcontroller allows the Nexus to interface with the robot’s motors and sensors. [Joe] is using a web interface to control his robot. The early demos are promising, as the telepresence bot has already been taken for a drive in Reston, VA by a user in Milwaukee, WI.

That’s a wrap for this episode of The Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!

Update – Check out our telepresence list right here!

Cutting Ribbons With Robots And An Oculus Rift

On June 26th, 2014, Clearpath Robotics opened up the doors to their brand new 12,000 square foot robot lair by bringing out a PR2 to cut the ceremonial ribbon and welcome everyone inside. And instead of just programming the ‘locate and destroy’ ribbon sequence, the co-founders opted to use an Oculus Rift to control the robot tearing through the material with flailing arms.

This was accomplished by having [Jake], the robot, utilize a Kinect 2.0 that fed skeleton-tracking data via rosserial_windows, a Windows-based set of extensions for the Robot Operating System which we heard about in January. The software gathers a stream of data points, each with an X, Y, Z component, allowing [Jake] to find himself within a 3D space. Then, the data was collected and published directly into the PR2’s brain. Inject a little Python code, and the creature was able to route directions in order to move its arms.
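
On the ROS side, the glue code amounts to taking a tracked joint position from the Kinect and republishing it as a pose goal an arm controller can chase. Here’s a hedged rospy sketch of that idea; the topic names and reference frame are invented for illustration, and Clearpath’s actual node certainly differs.

```python
#!/usr/bin/env python
# Sketch: republish a Kinect-tracked wrist position (arriving via
# rosserial_windows) as a pose goal for an arm controller.
# Topic names and the reference frame are assumptions, not the real setup.
import rospy
from geometry_msgs.msg import Point, PoseStamped

def on_wrist(point):
    goal = PoseStamped()
    goal.header.stamp = rospy.Time.now()
    goal.header.frame_id = "torso_lift_link"  # assumed reference frame
    goal.pose.position = point                # mirror the operator's wrist
    goal.pose.orientation.w = 1.0             # keep a neutral orientation
    pub.publish(goal)

if __name__ == "__main__":
    rospy.init_node("wrist_to_arm_goal")
    pub = rospy.Publisher("/arm_goal", PoseStamped, queue_size=1)
    rospy.Subscriber("/kinect/right_wrist", Point, on_wrist)
    rospy.spin()
```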

Thus, by simply stepping in front of the Kinect 2.0 and putting on the Oculus Rift headset, anyone could teleoperate [Jake] to move around and wave its arms at the oncoming ribbons. Once completed, [Jake] would leave the scene, journeying back into the newly created robot lair, leaving pieces of nylon and polyester everywhere.

An earlier (un-smoothed) version of the full system can be seen after the break:

Continue reading “Cutting Ribbons With Robots And An Oculus Rift”

Telepresence Robot Proves It’s A Small World After All

[Chris] works as part of a small team of developers in Cambridge, Massachusetts in the US. [Timo], one of their core members, works remotely from Heidelberg, Germany. In order to make [Timo] feel closer to the rest of the group, they built him a telepresence robot.

It was a link to DoubleRobotics that got the creative juices flowing. [Chris] and his team wanted to bring [Timo] into the room, but they didn’t have a spare $2499 USD in their budget. Instead, they mated a standard motorized pan/tilt camera base with an RFduino Bluetooth kit. An application running on [Timo’s] phone sends gyroscope data over the internet to the iPad on the robot. The robot’s iPad then sends that data via Bluetooth to the RFduino, which commands pan and tilt movements corresponding to those sensed by the gyroscope. A video chat application runs on top of all this, allowing [Timo] to look around the room and converse with his coworkers.
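
The head-tracking math itself is pleasantly simple: map the phone’s yaw and pitch onto pan and tilt servo angles, clamped to what the mount can reach. Here’s a small Python sketch of that mapping; the servo ranges and 90-degree center are assumptions, and in the real build the values travel phone to internet to iPad to Bluetooth to RFduino before any servo moves.

```python
# Toy mapping from head orientation (radians) to pan/tilt servo angles
# (degrees), clamped to an assumed mechanical range.
import math

def head_to_servo(yaw, pitch, pan_range=(10, 170), tilt_range=(45, 135)):
    """Map yaw/pitch in radians to (pan, tilt) servo angles in degrees."""
    def clamp(value, lo, hi):
        return max(lo, min(hi, value))
    pan = clamp(90 + math.degrees(yaw), *pan_range)
    tilt = clamp(90 + math.degrees(pitch), *tilt_range)
    return pan, tilt

if __name__ == "__main__":
    print(head_to_servo(0.0, 0.0))           # looking straight ahead -> (90, 90)
    print(head_to_servo(math.pi / 4, -0.3))  # glance right and slightly down
```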

All the source code is available via GitHub. The design didn’t work perfectly at first. [Chris] mentions the RFduino’s Bluetooth API is rather flaky when it comes to pairing operations. In the end the team was able to complete the robot and present it to [Timo] as a Valentine’s Day gift. For [Chris’] sake we hope [Timo] doesn’t spend too much of his time doing what his homepage URL would suggest: “screamingatmyscreen.com”

[Thanks Parker]