Re-imagining Telepresence With Humanoid Robots And VR Headsets

Don’t let the name of the Open-TeleVision project fool you; it’s a framework for improving telepresence and making robotic teleoperation far more intuitive than it otherwise would be. It accomplishes this in part by taking advantage of the remarkable technology packed into modern VR headsets like the Apple Vision Pro and Meta Quest. There are loads of videos on the project page, many of which demonstrate successful teleoperation across vast distances.

Teleoperation of robotic effectors typically takes some getting used to. The camera views are unusual, the limbs don’t move the same way arms do, and intuitive human things like looking around to get a sense of where everything is don’t translate well.

A stereo camera with gimbal streaming to a VR headset complete with head tracking seems like a very hackable design.

To address this, researchers provided the user with a robot-mounted, real-time stereo video stream (through which the user can turn their head and look around normally) and mapped the user's arm and hand movements to their humanoid robotic counterparts. This provides the feedback needed to manipulate objects and perform tasks in a much more intuitive way. In short, when our eyes, bodies, and hands look and work more or less the way we expect, it turns out it's far easier to perform tasks.

The research paper goes into detail about the different systems, but in essence, a stereo RGB and depth camera sits on a 3D-printed gimbal atop a humanoid robot frame like the Unitree H1, which is equipped with high-dexterity hands. A VR headset takes care of displaying a real-time stereoscopic video stream and letting the user look around. The user's hand tracking is mapped to the dexterous robotic hands and fingers. This lets a person look at, manipulate, and handle things without in-depth training. Perhaps slower and more clumsily than they would like, but in an intuitive way all the same.

Interested in taking a closer look? The GitHub repository has the necessary code, and while most of us will never be mashing ADD TO CART on something like the Unitree H1, the reference design for a stereo camera streaming to a VR headset and mirroring head tracking with a two-motor gimbal looks like the sort of thing that would be useful for a telepresence project or two.
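For a sense of how that head-tracking-to-gimbal mapping might hang together, here is a minimal sketch (not code from the repository): it assumes the head pose arrives as a quaternion in the usual Y-up convention, and the angle limits and sendToServos() bridge are placeholders.

```typescript
// Minimal sketch: map a VR headset's orientation to a yaw/pitch camera gimbal.
// Assumes a Y-up coordinate system with the head pose given as a quaternion
// (as WebXR provides); sendToServos() and the angle limits are placeholders.

type Quaternion = { x: number; y: number; z: number; w: number };

const clamp = (v: number, lo: number, hi: number) =>
  Math.min(hi, Math.max(lo, v));

// Extract yaw (rotation about Y) and pitch (rotation about X) in degrees,
// using a YXZ decomposition so "look left/right" and "look up/down" separate cleanly.
function quatToYawPitch(q: Quaternion): { yaw: number; pitch: number } {
  const pitch = Math.asin(clamp(2 * (q.w * q.x - q.y * q.z), -1, 1));
  const yaw = Math.atan2(2 * (q.w * q.y + q.x * q.z), 1 - 2 * (q.x * q.x + q.y * q.y));
  return { yaw: (yaw * 180) / Math.PI, pitch: (pitch * 180) / Math.PI };
}

// Hypothetical bridge to the gimbal's two servos (e.g. over serial to a microcontroller).
function sendToServos(yawDeg: number, pitchDeg: number): void {
  console.log(`gimbal -> yaw ${yawDeg.toFixed(1)}, pitch ${pitchDeg.toFixed(1)}`);
}

// Call once per rendered frame with the latest head pose.
function updateGimbal(headPose: Quaternion): void {
  const { yaw, pitch } = quatToYawPitch(headPose);
  sendToServos(clamp(yaw, -90, 90), clamp(pitch, -45, 45)); // respect gimbal travel limits
}
```

Running something like this once per rendered frame keeps the camera pointed wherever the operator is looking, with the clamps protecting the gimbal's mechanical travel.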

Continue reading “Re-imagining Telepresence With Humanoid Robots And VR Headsets”

Turning That Old Hoverboard Into A Learning Platform

[Isabelle Simova] is building Hoverbot, a flexible robotics platform made from Ikea plastic trays, JavaScript running on a Raspberry Pi, and parts scavenged from commonly available hoverboards.

Self-balancing scooters, a.k.a. hoverboards, are a great source of parts for such a project. Their high-torque, direct-drive brushless motors can drive loads of 100 kg or more, and you also get a matching motor controller board, a rechargeable battery, and its charging circuit. Most hoverboard controllers use the STM32F103, so flashing them with your own firmware is easy with an ST-Link V2 programmer.

The next set of parts your robot needs is sensors. Some are cheap and easily available, such as microphones, contact switches, or LDRs, while others, such as ultrasonic distance sensors or LiDARs, can cost a lot more. One source of cheap sensors is car parking-assist transducers. An aftermarket parking sensor kit usually consists of four transducers, a control box, cables, and a display. Using a logic analyzer, [Isabelle] shows how you can poke around the output port of the control box to reverse engineer the data stream and decipher the sensor data. Once the data structure is decoded, you can use some SPI bit-banging and voltage translation to interface it with the Raspberry Pi. Using the Pi also makes it easy to add a cheap web camera, microphone, and speakers to the Hoverbot.
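As a rough sketch of what reading that reverse-engineered stream could look like on the Pi (this is not [Isabelle]'s code; the GPIO numbers, frame length, and byte meanings are all assumptions), one can sample the control box's data line on each clock edge with the onoff library:

```typescript
// Rough sketch: sample a parking-sensor control box's serial output from a Raspberry Pi.
// The pin numbers, frame length, and byte layout are hypothetical; the real protocol has
// to be reverse engineered with a logic analyzer first. Remember the Pi's GPIO is 3.3 V
// only, so level translation sits between it and the control box.
import { Gpio } from 'onoff';

const clock = new Gpio(17, 'in', 'rising'); // hypothetical clock line
const data = new Gpio(27, 'in');            // hypothetical data line

const FRAME_BYTES = 4;       // assume one distance byte per transducer
let bits: number[] = [];

// On every rising clock edge, sample the data line and rebuild bytes MSB-first.
clock.watch((err) => {
  if (err) throw err;
  bits.push(data.readSync());
  if (bits.length === FRAME_BYTES * 8) {
    const frame: number[] = [];
    for (let i = 0; i < FRAME_BYTES; i++) {
      let byte = 0;
      for (let b = 0; b < 8; b++) byte = (byte << 1) | bits[i * 8 + b];
      frame.push(byte);
    }
    // Hypothetical interpretation: one distance reading (in cm) per sensor.
    console.log('distances (cm):', frame.join(', '));
    bits = [];
  }
});

// Release the GPIO lines cleanly on Ctrl-C.
process.on('SIGINT', () => {
  clock.unexport();
  data.unexport();
  process.exit(0);
});
```

Edge-triggered sampling like this only keeps up with fairly slow clocks; for faster signals the Pi's hardware SPI peripheral, or a small microcontroller sitting in between, is the safer bet.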

Ikea is a hacker's favourite, and its catalog offers a wide selection of fine, Swedish-engineered products that work nicely as enclosures for robots. [Isabelle] zeroed in on a deep, circular plastic tray from a storage table set, stiffened with some plywood reinforcement. The tray offers ample space to mount the two motors, two castor wheels, the battery, and the rest of the electronics. Most of the original hardware from the hoverboard comes in handy while putting it all together.

The software glue that holds all this together is JavaScript. The event-driven architecture of Node.js makes it a very suitable framework for Hoverbot. [Isabelle] has built a basic application allowing remote control of the robot. It includes a dashboard showing live video and audio streams from the robot, buttons for movement control, an input box for text-to-speech, ultrasonic sensor visualization, LED lighting control, a message log, and a status display for the motors. This makes the dashboard a useful debugging tool and a starting point for building more interesting applications. Check the build log for all the juicy details. Which other products from the Ikea catalog could be used to build a Hoverbot? How about a robotic chair?
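To illustrate the event-driven flavour of such an application, here is a hedged TypeScript sketch (not the actual Hoverbot code) of a WebSocket server that accepts drive commands from a browser dashboard and streams sensor readings back; the message shapes and the setMotorSpeeds() helper are invented for the example.

```typescript
// Minimal sketch of an event-driven robot control loop: a WebSocket server that
// accepts drive commands from a dashboard and pushes sensor readings back out.
import { WebSocketServer, WebSocket } from 'ws';

const wss = new WebSocketServer({ port: 8080 });

// Placeholder for whatever actually talks to the hoverboard motor controller.
function setMotorSpeeds(left: number, right: number): void {
  console.log(`motors -> left ${left}, right ${right}`);
}

wss.on('connection', (socket: WebSocket) => {
  socket.on('message', (raw) => {
    const msg = JSON.parse(raw.toString());
    if (msg.type === 'drive') {
      // e.g. { type: 'drive', left: 0.5, right: 0.5 }
      setMotorSpeeds(msg.left, msg.right);
    }
  });
});

// Broadcast (stand-in) ultrasonic readings to every connected dashboard twice a second.
setInterval(() => {
  const payload = JSON.stringify({ type: 'ultrasonic', cm: [120, 80, 95, 200] });
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  }
}, 500);
```

The dashboard side is then just a page that opens a WebSocket, sends drive messages when buttons are pressed, and redraws the sensor visualization whenever a reading arrives, which is exactly the kind of loosely coupled messaging Node.js is comfortable with.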

Continue reading “Turning That Old Hoverboard Into A Learning Platform”

Soon You’ll Sit Inside A Robot’s Head At Work

MIT's Computer Science and Artificial Intelligence Lab, CSAIL, has created a way to teleoperate a Baxter humanoid robot with an Oculus Rift VR headset. This project is partially aimed at making manufacturing jobs a hell of a lot more fun, er, telecommutable. It could even be a way to supervise robot workers from a distance.

In a nutshell, the user controls the robot remotely from within a virtual reality environment modeled as a control room with multiple sensor displays, making it feel like they are sitting inside the robot's head. Using hand controllers, users match their movements to the robot's to complete various tasks. If you've seen Pacific Rim, you are probably envisioning a Jaeger right about now — minus the psychic linking.
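One common way to implement that movement matching (not necessarily CSAIL's published method) is a clutched relative mapping: the robot's end effector follows the controller only while a trigger is held, and the operator's motion is scaled down for precision. A hedged sketch, in which Vec3, the scale factor, and sendEndEffectorTarget() are all illustrative:

```typescript
// Sketch of clutch-based relative mapping from a VR controller to a robot end effector.
type Vec3 = { x: number; y: number; z: number };

const SCALE = 0.5;                      // shrink operator motion for finer control
let clutchOrigin: Vec3 | null = null;   // controller position when the clutch engaged
let robotOrigin: Vec3 | null = null;    // end-effector position at the same instant

// Placeholder for the command actually sent to the robot's arm controller.
function sendEndEffectorTarget(p: Vec3): void {
  console.log(`target -> ${p.x.toFixed(3)}, ${p.y.toFixed(3)}, ${p.z.toFixed(3)}`);
}

// Call once per tracking frame.
function update(controllerPos: Vec3, robotPos: Vec3, triggerHeld: boolean): void {
  if (!triggerHeld) {
    // Clutch released: the robot holds position; forget the stored origins.
    clutchOrigin = null;
    robotOrigin = null;
    return;
  }
  if (!clutchOrigin || !robotOrigin) {
    // Clutch just engaged: remember where both the hand and the robot were.
    clutchOrigin = { ...controllerPos };
    robotOrigin = { ...robotPos };
  }
  // Move the end effector by the scaled offset of the hand from its clutch origin.
  sendEndEffectorTarget({
    x: robotOrigin.x + SCALE * (controllerPos.x - clutchOrigin.x),
    y: robotOrigin.y + SCALE * (controllerPos.y - clutchOrigin.y),
    z: robotOrigin.z + SCALE * (controllerPos.z - clutchOrigin.z),
  });
}
```

The clutch lets the operator reposition their hand (like lifting a mouse) without dragging the robot along, which makes long reaches comfortable even with scaled-down motion.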

Continue reading “Soon You’ll Sit Inside A Robot’s Head At Work”

Caption CERN Contest Turns Out Big Brains And Comic Brilliance

Week 1 of Hackaday's Caption CERN Contest is complete. We have to say that the Hackaday.io users outdid themselves with funny captions, but we also helped CERN add meaning to one of their orphan images. First, a few of our favorite captions:

The Funnies:

If you adjust that scope again, when I haven’t touched the controls, I’m donating you to a city college. – [Johnny B. Goode]

SAFTEY FIRST – The proper way to test a 6kv power supply for ripple on the output. – [milestogoh]

Dr. Otto Gunther Octavius – R&D some years before the accident. – [jlbrian7]

The prize, though, goes to Hackaday commenting superstar [DainBramage], who proved he knows us all too well with his Portal-inspired caption:

Here we see Doug Rattmann, one of Aperture’s best and brightest, perfecting our neurotoxin prior to delivery.

Congrats [DainBramage], enjoy your shirt from The Hackaday Store!

The Meaning of the Image:

Funny captions weren't the only thing in the comments, though – the image tickled [jlbrian7's] memory and led to a link to CERN Love. A four-year-old blog entry about robots at CERN turned out to be the key to unraveling the mystery of this captionless photo. The image depicts [Robert Horne] working with a prototype of the MANTIS system. MANTIS was a teleoperated manipulator system created to work in sections of the CERN facility that were unsafe for humans due to high levels of radioactivity. The MANTIS story is an epic hack in itself, so keep your eyes peeled for a future article covering it! We've submitted the information to CERN, and we're giving [jlbrian7] a T-shirt as well for his contribution to finding the actual caption for this image.

Get Started on Next Week:

The image for week 2 is already up, so head over and see for yourself. We’re eager for your clever captions. Ideally we can also figure out the backstory for each week’s randomly chosen image.