Robots Collaborate To Localize Themselves Precisely

Here’s the thing about robots. It’s hard for them to figure out where to go or what they should be doing if they don’t know where they are. Giving them some method of localization is key to their usefulness in almost any task you can imagine. To that end, [Guy Elmakis], [Matan Coronel] and [David Zarrouk] have been working on methods for pairs of robots to help each other in this regard.

As the research paper explains, the goal is real-time 3D localization between a pair of robots. The basic scheme has the robots take turns moving: while one robot moves, the other effectively acts as a landmark. The robots are equipped with inertial measurement units and turret-mounted cameras, which they use to track each other and their own movements. Each robot carries a Raspberry Pi 4 for processing image data and computing positions, and the pair communicates over Bluetooth to coordinate their efforts.
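For a flavor of how such a turn-taking scheme might work, here is a heavily simplified 2D sketch of our own. The paper's actual system works in full 3D with camera turrets and proper filtering; the function names and the fixed correction gain below are purely illustrative.

```python
import math

def leapfrog_step(mover, landmark, odom, measured_bearing):
    """One alternating-move update: dead-reckon, then correct heading
    using the camera bearing to the stationary partner robot."""
    x, y, th = mover              # (x, y, heading) of the moving robot
    dist, dth = odom              # distance travelled and heading change from the IMU
    x += dist * math.cos(th)      # dead reckoning alone drifts over time...
    y += dist * math.sin(th)
    th += dth
    # ...so use the parked robot as a landmark: compare the bearing the
    # camera turret actually measured against the bearing we'd predict.
    predicted = math.atan2(landmark[1] - y, landmark[0] - x) - th
    th += 0.5 * (measured_bearing - predicted)  # fixed gain stands in for a real filter
    return (x, y, th)

# The robots alternate: while A drives and corrects itself, B sits still, then they swap.
a = (0.0, 0.0, 0.0)               # robot A's pose estimate
b = (1.0, 0.0, math.pi)           # robot B, currently acting as the landmark
a = leapfrog_step(a, b, odom=(0.2, 0.05), measured_bearing=0.1)
```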

It’s an interesting technique that could have some real applications in swarm robotics, and in operations in areas where satellite navigation and other typical localization techniques are not practical. If you’re looking for more information, you can find the paper here. We’ve seen some other neat localization techniques for small robots before, too. Video after the break.

Continue reading “Robots Collaborate To Localize Themselves Precisely”

Achieving Human Level Competitive Robot Table Tennis

A team at Google has spent a lot of time recently playing table tennis, purportedly only for science. Their goal was to see whether they could construct a robot that would not only play table tennis but also keep up with practiced human players. In a paper available on arXiv, they detail what it took to make it happen. The team also set up a site with a simplified explanation and some videos of the robot in action.

Table tennis robot vs human match outcomes. B is beginner, I is intermediate, A is advanced. (Credit: Google)

In the end, it took twenty motion-capture cameras, a pair of 125 FPS cameras, a 6 DOF robot on two linear rails, a special table tennis paddle, and a very large annotated dataset on which to train the multiple convolutional neural networks (CNNs) that analyze the incoming visual data. This visual data was then combined with details like the paddle’s position to produce a value for the look-up table that forms the core of the high-level controller (HLC). The look-up table then decides which low-level controller (LLC) is picked to perform a given action. To prevent the LLCs’ CNNs from ‘forgetting’ their training data, a total of 17 different CNNs were used, one per LLC.
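As a rough illustration of that HLC/LLC split, here is a toy sketch of our own. The keys and controller names below are invented, and the real table is keyed on a much richer game state produced by the CNN perception stack.

```python
# Toy sketch of the architecture described above: a high-level controller
# consults a look-up table to pick which specialised low-level controller
# handles the next shot. Keeping one CNN per skill means training a new
# skill can't overwrite (catastrophically 'forget') an old one.
LLC_TABLE = {
    ("forehand", "topspin"): "llc_forehand_drive",
    ("forehand", "underspin"): "llc_forehand_push",
    ("backhand", "topspin"): "llc_backhand_drive",
    ("backhand", "underspin"): "llc_backhand_push",
}

def high_level_controller(ball_side: str, incoming_spin: str) -> str:
    """Return the name of the low-level controller to dispatch."""
    return LLC_TABLE.get((ball_side, incoming_spin), "llc_default")

print(high_level_controller("forehand", "topspin"))  # -> llc_forehand_drive
```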

The robot was tested against a range of players from a local table tennis club, which made it clear that while it could easily defeat beginners, intermediate players posed a serious threat. Advanced players completely demolished the table tennis robot. Clearly we do not have to fear our robotic table tennis playing overlords just yet, but the robot did receive praise for being an interesting practice partner. Continue reading “Achieving Human Level Competitive Robot Table Tennis”

Robot Arm Gives Kids The Roller Coaster Ride Of Their Lives

Unfortunately, [Dave Niewinski]’s kids are still too little to go on a real roller coaster. But they’re certainly big enough to be tossed around by this giant robot arm roller coaster simulator.

As to the question of why [Dave] has a Kuka KR 150 robot in his house, we prefer to leave that unasked and move forward. And apparently, this isn’t his first attempt at using the industrial robot as a motion simulator. That attempt revealed a few structural problems with the attachment between the rider’s chair and the robot’s wrist. After redesigning the frame with stouter metal and adding a small form-factor gaming PC and a curved monitor in front of the seat, [Dave] was ready to figure out how to make the arm simulate the motions of a roller coaster.

Now, if you ever thought the world would be a better place if only we had a roller coaster database complete with 4k 60 fps video captured from real coasters, you’re in luck. CoasterStats not only exists, but it also includes six-axis accelerometer data from real rides of coasters across Europe. That gave [Dave] the raw data he needed, but getting it translated into robot motions that simulate the feeling of the ride was a bit tricky. [Dave] goes into the physics of it all in the video below, but suffice it to say that the result is pretty cool.
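One classic trick in motion simulation, and a plausible ingredient of what [Dave] describes, is tilt coordination: since the arm can't accelerate forever, you tilt the rider so gravity stands in for sustained acceleration. Here is a minimal sketch of the idea, entirely our own illustration rather than [Dave]'s code.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tilt_coordination(ax, ay):
    """Fake sustained longitudinal/lateral acceleration (ax, ay, in m/s^2)
    by tilting the seat so gravity supplies the force. Angles are clamped
    so the illusion stays plausible and the rider stays comfortable."""
    pitch = math.asin(max(-0.5, min(0.5, ax / G)))  # lean back for a launch
    roll = math.asin(max(-0.5, min(0.5, ay / G)))   # bank into curves
    return math.degrees(pitch), math.degrees(roll)

# A sustained 0.4 g launch maps to roughly a 24-degree lean-back.
print(tilt_coordination(0.4 * G, 0.0))
```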

More after the break.

Continue reading “Robot Arm Gives Kids The Roller Coaster Ride Of Their Lives”

Obscure Sci Fi Robots

Even if you don’t like to build replicas of movie robots, you can often draw inspiration from cinema. Everyone knows Robby the Robot, Gort, and R2D2. But [Atomic Snack Bar] treats us to some lesser-known robots from movies in the 1930s, 40s, and 50s. While we are pretty up on movies, we have to admit that the video, which you can see below, has a few we didn’t know about.

The robots are mostly humanoid. The comedy vampire flick from the 1950s could have inspired Robby, who appeared four years later. The exception that proves the rule is the Twonky, a TV set turned robot turned mind controller.

Continue reading “Obscure Sci Fi Robots”

Re-imagining Telepresence With Humanoid Robots And VR Headsets

Don’t let the name of the Open-TeleVision project fool you; it’s a framework for improving telepresence and making robotic teleoperation far more intuitive than it otherwise would be. It accomplishes this in part by taking advantage of the remarkable technology packed into modern VR headsets like the Apple Vision Pro and Meta Quest. There are loads of videos on the project page, many of which demonstrate successful teleoperation across vast distances.

Teleoperation of robotic effectors typically takes some getting used to. The camera views are unusual, the limbs don’t move the same way arms do, and intuitive human things like looking around to get a sense of where everything is don’t translate well.

A stereo camera with gimbal streaming to a VR headset complete with head tracking seems like a very hackable design.

To address this, the researchers provided the user with a robot-mounted, real-time stereo video stream (through which the user can turn their head and look around normally) and mapped the user’s arm and hand movements onto humanoid robotic counterparts. This provides the feedback needed to manipulate objects and perform tasks in a much more intuitive way. In short, when our eyes, bodies, and hands look and work more or less the way we expect, it turns out it’s far easier to perform tasks.

The research paper goes into detail about the different systems, but in essence, a stereo depth and RGB camera sits on a 3D printed gimbal atop a humanoid robot frame like the Unitree H1, equipped with high-dexterity hands. A VR headset takes care of displaying the real-time stereoscopic video stream and letting the user look around. The user’s hand movements are tracked and mapped to the dexterous hands and fingers. This lets a person look at, manipulate, and handle things without in-depth training. Perhaps slower and more clumsily than they would like, but in an intuitive way all the same.
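The head-tracking half of that loop is conceptually simple: read the headset's orientation, clamp it to what the gimbal can physically do, and command the two motors. Here is a bare-bones sketch of how it might be structured; the limits and function names are our assumptions, and the project's actual code lives in the repository linked below.

```python
def headset_to_gimbal(yaw_deg: float, pitch_deg: float) -> tuple:
    """Clamp headset orientation to the pan/tilt travel of a
    two-motor camera gimbal. The travel limits are placeholders."""
    pan = max(-90.0, min(90.0, yaw_deg))    # yaw drives the pan motor
    tilt = max(-45.0, min(45.0, pitch_deg)) # pitch drives the tilt motor
    return pan, tilt

# Each frame: read the headset pose, aim the gimbal, stream stereo video back.
pan, tilt = headset_to_gimbal(yaw_deg=12.0, pitch_deg=-8.0)
print(pan, tilt)  # -> 12.0 -8.0
```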

Interested in taking a closer look? The GitHub repository has the necessary code, and while most of us will never be mashing ADD TO CART on something like the Unitree H1, the reference design for a stereo camera streaming to a VR headset and mirroring head tracking with a two-motor gimbal looks like the sort of thing that would be useful for a telepresence project or two.

Continue reading “Re-imagining Telepresence With Humanoid Robots And VR Headsets”

An RC Tracked Robot, Without The Pain

Small robots can be found at all levels from STEM toys for kids all the way through to complex hacker projects. Somewhere along that line between easy enough for anyone to build and interesting enough for hackers lies the PlayCar, from [ComfySpace]. It’s a small build-it-yourself tracked robot that’s controlled from your smartphone via an app.

At the PlayCar’s heart is a Raspberry Pi Zero 2 W, surrounded by a set of inexpensive off-the-shelf modules for power and motor control. The juice, meanwhile, comes from a set of AA batteries, and the motors are geared DC units. Having acquired all the components, you can download the 3D printable parts from Printables and the ComfySpace app for either iOS or Android.
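For anyone wanting to tinker along the same lines, driving a tracked base like this from a Pi is about as gentle as embedded Linux gets. Here is a minimal sketch using gpiozero's Robot class with a cheap H-bridge; the GPIO pin numbers are placeholders, not the PlayCar's actual wiring.

```python
from gpiozero import Robot
from time import sleep

# Each track gets a (forward, backward) pin pair on the H-bridge.
# Pin numbers below are placeholders; check them against your wiring.
robot = Robot(left=(4, 14), right=(17, 18))

robot.forward(speed=0.5)  # both tracks ahead at half speed
sleep(1)
robot.left()              # spin in place: tracks run in opposite directions
sleep(0.5)
robot.stop()
```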

It’s clear that ComfySpace is a start-up targeting the education sector, and we wish them every success. The approach of making an open platform is one we like, as it has the potential to create a community feeding back designs and add-ons rather than remaining proprietary. Take a look at the video below the break for more information.

Continue reading “An RC Tracked Robot, Without The Pain”

On the left, a translucent yellowy-tan android head with eyes set behind holes in the face. On the right, a bright pink circle with small green eyes. It is manipulated into the image of a smiling face via its topography.

A Robot Face With Human Skin

Many sci-fi robots have taken the form of their creators. In the increasingly blurry space between the biological and the mechanical, researchers have found a way to affix human skin to robot faces. [via NewScientist]

Previous attempts at affixing skin equivalent, “a living skin model composed of cells and extracellular matrix,” to robots worked, even on moving parts like fingers, but they typically relied on protrusions that limited range of motion and spoiled the aesthetics, both serious concerns for robots designed predominantly to interact with humans. Inspired by skin ligaments, the researchers have developed “perforation-type anchors” that use v-shaped holes in the underlying 3D printed surface to keep the skin equivalent taut and pliable like the real thing.

The researchers then designed a face that took advantage of the attachment method to give their robot a convincing smile. Combined with other research, this could soon give robots skin with touch, sweat, and self-repair capabilities, like Data’s partial transformation in Star Trek: First Contact.

We wonder what this extremely realistic humanoid hand might look like with this skin on the outside. Of course, that raises the question of whether we even need humanoid robots at all. If you want something less uncanny, maybe try animating your stuffed animals with this robotic skin instead?