Hackaday Prize Entry: Telepresence with the Black Mirror Project

The future is VR, or at least that’s what it was two years ago. Until it arrives, there’s still plenty of time to experiment with virtual worlds, the Metaverse, and other high-concept sci-fi tropes from the 80s and 90s. Interactive telepresence is what the Black Mirror Project is all about. Their plan is to build interactive software on top of the JanusVR platform, a framework for creating immersive VR experiences.

The Black Mirror Project uses glTF for runtime 3D asset delivery to create environments ranging from simple telepresence to the mind-bending realities the team unabashedly compares to [Neal Stephenson]’s Metaverse.
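Under the hood, a glTF asset is just a JSON manifest pointing at binary buffers, which makes it easy to inspect what a scene actually delivers. Here’s a minimal sketch of peeking inside one; the file name scene.gltf is our own placeholder, not anything from the project:

```python
# Minimal sketch: peek inside a glTF asset to see what it delivers.
# Assumes a local file named "scene.gltf" (the JSON flavour of glTF);
# the file name and the particular asset are hypothetical.
import json

with open("scene.gltf") as f:
    gltf = json.load(f)

print("glTF version:", gltf.get("asset", {}).get("version"))
print("Nodes:  ", len(gltf.get("nodes", [])))
print("Meshes: ", len(gltf.get("meshes", [])))
print("Buffers:", [b.get("uri") for b in gltf.get("buffers", [])])
```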

For their hardware implementation, the team is looking at UDOO X86 single-board computers, with SSDs for storage and a bevy of sensors — gesture, light, accelerometer, magnetometer — feeding the computer data. There’s an Intel RealSense camera in the build, and the display is unlike anything we’ve seen in a VR setup before: a tensor display with multiple projection planes and variable backlighting, giving it a greater depth of field and wider field of view than almost any other display.

Telepresence Robot 2000 Leagues Under the Sea

Telepresence robots are now a reality: you can wheel around the office, talk to people, join a meeting, see what’s going on, and bump into your colleagues. But imagine telepresence applied to deep-sea exploration. Today we can all become oceanographers through the telepresence system created by [Bob Ballard] (known for locating the Titanic, discovering deep-sea geothermal vents, and more) and his team at the Inner Space Center. Put on your Submariner wristwatch, because it’s time for all of us to explore the ocean depths from the comfort of home or office.

Continue reading “Telepresence Robot 2000 Leagues Under the Sea”

The Internet Of Interactive Cats

[Tuco] is a cat who shares the space of [Micah Elizabeth Scott]. He is a large tabby tomcat, and he is polydactyl, which is to say he has a congenital excess of toes. He is an extremely active and engaging creature and enjoys playing and interacting with her. We covet [Tuco].

Sadly for the rest of us who love cats, unless we know [Micah] personally we’ll never have the opportunity to play with [Tuco]. She appreciates the cat-shaped void this leaves in our lives, and to help fill it she’s building a telepresence robot that lets the rest of us interact with him in real time.

Her idea is a flying robot equipped with a camera on a gimbal. Because mounting it on a multirotor platform would be a hazard, she’s instead making something closer to the aerial cameras you might know from sporting fixtures: a motorised platform suspended on a set of nylon ropes from the corners of her roof space, which can move at will by adjusting the length of each tether. The suggestion is that one day the device will be able to launch plastic bolts for [Tuco] to chase, and incorporate other interactive features to let online users engage with him.
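The geometry behind that kind of cable-suspended camera is pleasingly simple: to park the platform at a given point, each winch just needs to pay out rope equal to the straight-line distance from its anchor to that point. Here’s a minimal sketch of that calculation, with made-up room dimensions rather than [Micah]’s actual space:

```python
# Minimal sketch of the cable-camera geometry: given a target position for
# the platform, each winch must pay out rope equal to the distance from its
# corner anchor to that point. Room dimensions and the target position are
# made-up numbers, not [Micah]'s actual roof space.
from math import dist  # Python 3.8+

ROOM_W, ROOM_D, ROOM_H = 5.0, 4.0, 2.5          # metres, hypothetical
ANCHORS = [
    (0.0,    0.0,    ROOM_H),                    # the four upper corners
    (ROOM_W, 0.0,    ROOM_H),
    (0.0,    ROOM_D, ROOM_H),
    (ROOM_W, ROOM_D, ROOM_H),
]

def tether_lengths(target):
    """Rope length each windlass needs for the platform to sit at `target`."""
    return [dist(anchor, target) for anchor in ANCHORS]

# Hover 1.5 m up, roughly in the middle of the room:
for i, length in enumerate(tether_lengths((2.5, 2.0, 1.5))):
    print(f"winch {i}: {length:.2f} m")
```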

The video introducing the project, which we’ve placed below the break, shows progress so far: she has completed a prototype windlass mechanism and worked on reverse-engineering the gimbal mechanism for serial control. We’ll probably never meet [Tuco] in person, but we can’t wait to interact with him online.

Continue reading “The Internet Of Interactive Cats”

Keeping Humanity Safe from Robots at Disney

Almost every big corporation has a research and development organization, so it came as no surprise when we found a tip about Disney Research in the Hackaday Tip Line. And that the project in question turned out to involve human-safe haptic telepresence robots makes perfect sense, especially when your business is keeping the Happiest Place on Earth running smoothly.

That Disney wants to make sure their Animatronics are safe is good news, but the Disney project is about more than keeping guests healthy. The video after the break and the accompanying paper (PDF link) describe a telepresence robot with a unique hydrostatic transmission coupling it to the operator. The actuators are based on a rolling-diaphragm design that limits hydraulic pressure. In a human-safe system that’s exactly what you want.

The system is a hybrid hydraulic-pneumatic design; two actuators, one powered by water pressure and the other by air, oppose each other in each joint. The air-charged actuators behave like a mass-efficient spring that preloads the hydraulic actuator. This increases safety by allowing the system to be de-energized instantly by venting the air lines. What’s more, the whole system presents very low mechanical impedance, allowing haptic feedback to reach the operator through the system fluid. This provides enough sensitivity to handle an egg, thread a needle — or even bop a kid’s face with impunity.
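As a rough mental model of the haptic side, a sealed fluid line with equal-area diaphragms at each end transmits force one-to-one between the robot and the operator. The sketch below is only that, a toy model with invented numbers, not the maths from the paper:

```python
# Toy model of the hydrostatic-transmission idea: operator and robot joints
# share a sealed fluid line, so (ignoring friction and line losses) pressure
# is common to both ends and contact force reflects straight back through
# the fluid. Areas and forces are invented for illustration, not Disney's.
# The opposing air-charged actuator preloads this circuit; venting it drops
# the preload and de-energizes the joint.

A_OPERATOR = 5e-4   # m^2, effective diaphragm area, operator side (hypothetical)
A_ROBOT    = 5e-4   # m^2, effective diaphragm area, robot side (hypothetical)

def reflected_force(contact_force_robot):
    """Force (N) the operator feels when the robot end meets this contact force."""
    pressure = contact_force_robot / A_ROBOT    # Pa in the shared line
    return pressure * A_OPERATOR                # N back at the operator's hand

# Robot fingertip presses an egg with ~0.5 N; with matched areas the operator
# feels the same 0.5 N, i.e. a 1:1 hydrostatic transmission.
print(reflected_force(0.5))
```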

There are some great ideas here for robotics hackers, and you’ve got to admire the engineering that went into these actuators. For more research from the House of Mouse, check out this slightly creepy touch-sensitive smart watch, or this air-cannon haptic feedback generator.

Continue reading “Keeping Humanity Safe from Robots at Disney”

Smartphone-based Robotic Rover Project goes Open Source

[Aldric Négrier] wrote in to let us know that his DriveMyPhone project has been open sourced. The project is part telepresence, part remote-controlled vehicle, and part robotic rover concept, on which he says “I spent more time […] than I should have.” He has shared not just the CAD files but every detail, including tips on assembly. He admits that a robotic chassis for a smartphone might not seem like a particularly new idea today, but it was “an idea with more potential” back in 2010 when he first started.

The chassis is made to cradle a smartphone. Fire up your favorite videoconferencing software and you have a way to see where you’re going as well as hear (and speak to) your surroundings. Bluetooth communication between the phone and the chassis provides wireless control. That said, this unit is clearly designed to handle far more challenging terrain than the average office environment, and to be not only attractive but as accessible and open to repurposing and modification as possible.
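To get a feel for how thin a Bluetooth control layer like that can be, here’s a minimal sketch of poking a Bluetooth-serial chassis from a Linux laptop. The /dev/rfcomm0 port, baud rate, and single-letter commands are all assumptions for illustration, not DriveMyPhone’s actual protocol:

```python
# Minimal sketch of driving a Bluetooth-serial rover chassis for testing,
# assuming the chassis exposes an SPP serial port bound to /dev/rfcomm0 and
# understands single-character drive commands. Port name, baud rate, and
# command letters are hypothetical, not DriveMyPhone's real protocol.
import time
import serial  # pip install pyserial

COMMANDS = {"forward": b"F", "back": b"B", "left": b"L", "right": b"R", "stop": b"S"}

with serial.Serial("/dev/rfcomm0", 9600, timeout=1) as link:
    link.write(COMMANDS["forward"])   # creep forward for half a second...
    time.sleep(0.5)
    link.write(COMMANDS["stop"])      # ...then stop
```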

Continue reading “Smartphone-based Robotic Rover Project goes Open Source”

VR Telepresence Tank from Raspberry Pi, Google Cardboard, and Xbox Controller

It’s great to see different kinds of hardware and software tossed into a project together, allowing someone to mix things that don’t normally go together into something new. [Freddy Kilo] did just that with a project he calls his VR Robot Tank. It’s a telepresence device that uses a wireless Xbox controller to drive a tracked platform, which is itself headed by a Raspberry Pi.

The Pi has two cameras on a pan-tilt mount, and those cameras are both aimed and viewed via a Google Cardboard-like setup. A healthy dose of free software glues it together, allowing things like video streaming (with U4VL) and steering via the wireless controller (with xboxdrv). A bit of fiddling was required for some parts: viewing the stereoscopic cameras, for example, is done by opening and positioning two video windows just right so they line up with the headset lenses. It doesn’t warp the image to account for the lens distortion in the headset, and the wireless range might be limited, but the end result seems to work well enough.
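On the steering side, xboxdrv exposes the pad as a standard Linux joystick, so the Pi can read raw events straight from /dev/input/js0. The sketch below shows that plumbing with the motor output stubbed out as print statements; the device path and the scaling are our assumptions, not [Freddy Kilo]’s exact code:

```python
# Minimal sketch of reading a controller exposed by xboxdrv as a standard
# Linux joystick device. The device path and the idea of printing "motor"
# commands instead of driving real hardware are assumptions for illustration.
import struct

JS_EVENT = struct.Struct("IhBB")   # time (ms), value, event type, axis/button number
JS_EVENT_AXIS = 0x02               # axis-motion events

with open("/dev/input/js0", "rb") as js:
    while True:
        _, value, etype, number = JS_EVENT.unpack(js.read(JS_EVENT.size))
        if etype & JS_EVENT_AXIS and number in (0, 1):   # left stick X/Y
            # Scale the signed 16-bit axis value to a -100..100 command.
            print(f"axis {number}: {round(100 * value / 32767)}")
```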

The tank is driven with the wireless controller while a mobile phone mounted in a headset lets the user see through the cameras; motion sensing in the phone moves those cameras whenever you turn your head to look around. Remote-control hobbyists will recognize the project as doing essentially the same job as FPV setups for model aircraft (drone racing or even snow sleds, for example), but this project uses a completely different hardware and software toolchain. It demonstrates the benefit of having open tools to use as virtual “duct tape”, letting people stick different things together to test a concept, and it proves almost anything can be made to work if you have a willingness to fiddle!

Continue reading “VR Telepresence Tank from Raspberry Pi, Google Cardboard, and Xbox Controller”

Shoot Darts at the Shins of Total Strangers

[Michael Brumlow] found us and sent us a link. Within a few seconds, we were driving a webcam-enabled Nerf dart tank through his office and trying not to hit walls or get stepped on by his co-workers. Unfortunately, it was out of darts at the time, but you can find them all over the floor if you scout around.

All of the code details, including the link where you can test drive it yourself, are up on [Michael]’s GitHub. The brains are an Intel Edison board, and the brawn is supplied by an Arduino motor controller shield and (for the latest version) a chassis bought from China.

It runs fairly smoothly, considering the long round trip from [Michael]’s office in Texas, through wherever Amazon keeps their Web Services, over to us in Germany and back. Once we got used to the slight lag, and started using the keyboard’s arrow keys for control, we were driving around like a pro.

It’s still got a few glitches, like the camera periodically overheating and the robot wandering out of WiFi range. [Michael] said he’d try to keep it charged up and running while you give it a shot. The controls are multiplexed in the cloud, so your chance of steering it is as good as anyone else’s. It’ll be interesting to see what happens when thousands of Hackaday readers try to control it at once!
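We don’t know exactly how [Michael] multiplexes the controls in the cloud, but the general shape could be as simple as a last-command-wins relay: viewers post their key presses, and the robot periodically fetches whichever command arrived most recently. A hypothetical sketch:

```python
# A guess at the general shape of "controls multiplexed in the cloud":
# a tiny last-command-wins relay. Viewers POST arrow-key commands, the robot
# polls with GET and obeys whatever arrived most recently. Endpoints, port,
# and command format are made up; this is not [Michael]'s actual setup.
from http.server import BaseHTTPRequestHandler, HTTPServer

latest_command = b"stop"

class Relay(BaseHTTPRequestHandler):
    def do_POST(self):                       # a viewer sends a command
        global latest_command
        length = int(self.headers.get("Content-Length", 0))
        latest_command = self.rfile.read(length) or latest_command
        self.send_response(204)
        self.end_headers()

    def do_GET(self):                        # the robot asks what to do next
        self.send_response(200)
        self.end_headers()
        self.wfile.write(latest_command)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), Relay).serve_forever()
```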

It takes a certain kind of bravery to put your telepresence robot up on the open Internets. So kudos to you, [Michael], and we hope that you manage to get some work done this week, even though you will have all of Hackaday driving into your cubicle walls.