Tracking Drone Flight Path Via Video, Using Cameras We Can Get

Calculating a three-dimensional position from two-dimensional projections is a literal textbook example in geometry, but those examples are the “assume a spherical cow” type of simplification, applicable only in an ideal world where the projections come from mathematically perfect cameras at precisely known locations with infinite resolution. Making things work in the real world is a lot harder. But not only have [Jingtong Li, Jesse Murray et al.] worked through the math of tracking a drone’s 3D flight from 2D video, they’ve released their MultiViewUnsynch software on GitHub so we can all play with it.
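For a taste of the underlying geometry, here is a minimal two-view triangulation sketch in Python using OpenCV. The projection matrices and pixel detections below are made-up numbers for illustration only; MultiViewUnsynch does far more work to estimate camera poses and cope with unsynchronized footage.

```python
# Minimal two-view triangulation sketch (not MultiViewUnsynch itself), assuming
# calibrated cameras with known 3x4 projection matrices P1 and P2.
import numpy as np
import cv2

# Hypothetical intrinsics and poses: camera 1 at the origin, camera 2 offset 1 m along X.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Matching 2D detections of the drone in each view (pixel coordinates, shape 2xN).
pts1 = np.array([[400.0], [250.0]])
pts2 = np.array([[350.0], [250.0]])

# OpenCV returns homogeneous 4xN points; divide through to get metric XYZ.
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
X = (X_h[:3] / X_h[3]).ravel()
print("Triangulated drone position (m):", X)
```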

Instead of laboratory-grade optical instruments, the cameras used in these experiments are available at our local consumer electronics store. A table in their paper Reconstruction of 3D Flight Trajectories from Ad-Hoc Camera Networks (arXiv:2003.04784) lists several Huawei cell phone cameras, a few Sony digital cameras, and a GoPro 3. Video cameras don’t need to be placed in any particular arrangement, because positions are calculated from their video footage. Correlating overlapping footage from dissimilar cameras is a challenge all by itself, since these cameras record at varying framerates ranging from 25 to 59.94 frames per second. Furthermore, these cameras all have rolling shutters, which adds an extra variable as scanlines in a frame are captured at slightly different times. This is not an easy problem.
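To get a feel for why mixed framerates and rolling shutters are such a headache, here is a back-of-the-envelope sketch of when a given scanline was actually captured. Every constant below is an illustrative assumption, not a value from the paper.

```python
# The capture time of a detection depends on the camera's clock offset, its
# framerate, and even which scanline the drone appears on in the frame.
def capture_time(frame_index, row, fps, clock_offset_s, readout_s, image_height):
    """Approximate capture time of a pixel row within a rolling-shutter frame."""
    return clock_offset_s + frame_index / fps + (row / image_height) * readout_s

# The same physical instant lands on very different (frame, row) pairs per camera:
t_gopro = capture_time(frame_index=300, row=120, fps=59.94,
                       clock_offset_s=0.0, readout_s=0.015, image_height=1080)
t_phone = capture_time(frame_index=125, row=700, fps=25.0,
                       clock_offset_s=0.42, readout_s=0.030, image_height=1080)
print(t_gopro, t_phone)  # MultiViewUnsynch has to solve for these offsets itself
```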

There is a lot of interest in tracking drone flights, especially those flying where they are not welcome. And not everyone has the budget for high-end equipment or the permission to emit electromagnetic signals. MultiViewUnsynch is not quite there yet, as it currently tracks a single target and processes video files after the fact. The eventual goal is to evolve this capability to track multiple targets on live video, and hopefully help reduce frustrating public embarrassments.

[IROS 2020 Presentation video (duration 14:45) requires free registration, available until at least Nov. 25th 2020.]

One Wheel Is All We Need To Roll Into Better Multirotor Efficiency

Multirotor aircraft enjoy many intrinsic advantages, but as machines that fight gravity with brute force, they are not known for energy efficiency. In the interest of stretching range, several air-ground hybrid designs have been explored. Flying cars, basically, meant to run on the ground when being airborne isn’t strictly necessary. But they all share the same challenge: components that make a car work well on the ground are range-sapping dead weight while in the air. [Youming Qin et al.] explored cutting that dead weight as much as possible and came up with Hybrid Aerial-Ground Locomotion with a Single Passive Wheel.

As the paper’s title makes clear, they went full minimalist with this design. Gone are the driveshaft, brakes, steering, even other wheels. All that remains is a single unpowered wheel bolted to the bottom of their dual-rotor flying machine. Minimizing the impact on flight characteristics is great, but how would that work on the ground? As a tradeoff, these rotors have to keep spinning even while in “ground mode”. They are responsible for keeping the machine upright, and they also have to handle tasks like steering. These and other control algorithm problems had to be sorted out before evaluating whether such a compromised ground vehicle is worth the trouble.

Happily, the result is a resounding “yes”. Even though the rotors have to continue running to do different jobs while on the ground, that takes far less effort than hovering in the air. Power consumption measurements indicate savings of up to 77%, and there are a lot of potential avenues for tuning still awaiting future exploration. Among them is to better understand interaction with ground effect, which is something we’ve seen enable novel designs. This isn’t exactly the flying car we were promised, but its development will still be interesting to watch among all the other neat ideas under development to keep multirotors in the air longer.

[IROS 2020 Presentation video (duration 10:49) requires no-cost registration, available until at least Nov. 25th 2020. Forty-two-second summary embedded below.]

Continue reading “One Wheel Is All We Need To Roll Into Better Multirotor Efficiency”

Robots Learning To Understand Their Surroundings

Today it is pretty easy to build a robot with an onboard camera and have fun manually driving through that first-person view. But builders with dreams of autonomy quickly learn there is a lot of work between camera installation and autonomously executing a “go to chair” command. Fortunately we can draw upon work such as View Parsing Network by [Bowen Pan, Jiankai Sun, et al.]

When a camera image comes into a computer, it is merely a large array of numbers representing red, green, and blue color values, and our robot has no idea what that image represents. Over the past few years, computer vision researchers have found pretty good solutions for the problems of image classification (“is there a chair?”) and segmentation (“which pixels correspond to the chair?”). While useful for building an online image search engine, this is not quite enough for robot navigation.
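As a rough illustration of the classification-versus-segmentation distinction, here is a sketch using an off-the-shelf segmentation architecture from torchvision. It is shown with random weights for brevity, and it is not the network from this paper; in practice you would load pretrained weights and feed in a properly normalized camera frame.

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# Default configuration: 21 Pascal VOC classes, random weights for this demo.
model = deeplabv3_resnet50().eval()
image = torch.rand(1, 3, 480, 640)          # stand-in for a normalized camera frame

with torch.no_grad():
    logits = model(image)["out"]            # shape: (1, 21 classes, 480, 640)
labels = logits.argmax(dim=1)               # per-pixel class index

# "Is there a chair?" vs. "which pixels are the chair?" (class 9 = chair in Pascal VOC)
chair_mask = labels == 9
print("chair present:", bool(chair_mask.any()), "| chair pixels:", int(chair_mask.sum()))
```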

A robot needs to translate those pixel coordinates into real-world layout, and this is the problem View Parsing Network offers to solve. Detailed in Cross-view Semantic Segmentation for Sensing Surroundings (DOI 10.1109/LRA.2020.3004325), the system takes in multiple camera views looking all around the robot. Results of image segmentation are then synthesized into a 2D top-down segmented map of the robot’s surroundings. (“Where is the chair located?”)
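The View Parsing Network learns that pixel-to-map transform end to end. As a much simpler geometric stand-in, here is a sketch that pushes segmented floor pixels into a top-down grid with a plain homography (inverse perspective mapping); every number in it is a hypothetical placeholder, not something from the paper.

```python
import numpy as np
import cv2

seg = np.zeros((480, 640), np.uint8)       # per-pixel class map from segmentation
seg[300:480, 200:440] = 9                  # pretend these pixels are "chair"

# Hypothetical homography mapping floor-plane pixels to a 200x200 top-down grid
# (in practice this would come from camera calibration and mounting geometry).
H = cv2.getPerspectiveTransform(
    np.float32([[0, 480], [640, 480], [400, 250], [240, 250]]),  # floor patch in image
    np.float32([[0, 199], [199, 199], [199, 0], [0, 0]]))        # corresponding grid cells

top_down = cv2.warpPerspective(seg, H, (200, 200), flags=cv2.INTER_NEAREST)
print("chair cells in top-down map:", int((top_down == 9).sum()))
```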

The authors documented how to train a view parsing network in a virtual environment, and described the procedure to transfer a trained network to run on a physical robot. Today this process demands a significantly higher skill level than “download Arduino sketch” but we hope such modules will become more plug-and-play in the future for better and smarter robots.

[IROS 2020 Presentation video (duration 10:51) requires free registration, available until at least Nov. 25th 2020. One-minute summary embedded below.]

Continue reading “Robots Learning To Understand Their Surroundings”

Quadcopter With Tensegrity Shell Takes A Beating And Gets Back Up

Many of us have become familiar with the distinctive sound of multirotor toys, a sound frequently punctuated by the sharp sound of a crash. We’d then have to pick it up and repair any damage before the flying fun could resume. This is fine for a toy, but autonomous fliers will need to shake it off and get back to work without human intervention. [Zha et al.] of UC Berkeley’s HiPeRLab have invented a resilient design to do so.

We’ve seen increased durability from flexible frames, but those left the propellers largely exposed. Protective bumpers and cages are not new, either, but this icosahedron (twenty-sided) tensegrity structure is far more durable than the norm. Tests verified it can survive impact with a concrete wall at a speed of 6.5 meters per second. Tensegrity is a lot of fun to play with, letting us build intuition-defying structures, and here the tensegrity elements dissipate impact energy, preventing damage to fragile components like propellers and electronics.

But surviving an impact and falling to the ground in one piece is not enough. For independent operation, it needs to be able to get itself back in the air. Fortunately the brain of this quadcopter has been taught the geometry of an icosahedron. Starting from the face it landed on, it can autonomously devise a plan to flip itself upright by applying bursts of power to select propeller motors. Rotating itself face by face, it works its way to an upright orientation for takeoff, at which point it is back in business.
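Conceptually, that recovery plan can be thought of as a shortest-path search over the icosahedron’s face-adjacency graph. Here is a toy sketch of the idea with a truncated, hypothetical face labeling; it is not the HiPeRLab team’s actual geometry or control strategy.

```python
from collections import deque

FACE_NEIGHBORS = {            # each icosahedron face touches exactly three others
    0: [1, 2, 3], 1: [0, 4, 5], 2: [0, 5, 6], 3: [0, 6, 4],
    4: [1, 3, 7], 5: [1, 2, 7], 6: [2, 3, 7], 7: [4, 5, 6],
}                             # (only 8 of the 20 faces shown, purely for illustration)

def righting_plan(landing_face, upright_face=0):
    """Breadth-first search for the shortest chain of face-to-face tip-over moves."""
    queue, seen = deque([[landing_face]]), {landing_face}
    while queue:
        path = queue.popleft()
        if path[-1] == upright_face:
            return path                       # e.g. [7, 4, 1, 0]
        for nxt in FACE_NEIGHBORS[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])

print(righting_plan(7))   # each hop would be one burst on selected propeller motors
```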

We have a long way to go before autonomous drone robots can operate safely and reliably. Right now the easy answer is to fly slowly, but that also drastically cuts into efficiency and effectiveness. Having flying robots that are resilient against flying mistakes at speed, and can also recover from those mistakes, will be very useful in exploration of aerial autonomy.

[IROS 2020 Presentation video (duration 14:16) requires free registration, available until at least Nov. 25th 2020. One-minute summary embedded below.]

Continue reading “Quadcopter With Tensegrity Shell Takes A Beating And Gets Back Up”

Flexible Actuators Spring Into Action

Most experiments in flexible robot actuators are based around pneumatics, but [Ayato Kanada] and [Tomoaki Mashimo] have been working on using a coiled spring as the moving component of a linear actuator, which they call the flexible ultrasonic motor (FUSM). [Yunosuke Sato] built on top of their work and assembled a pair of FUSMs into a closed-loop actuator with motion control in two dimensions.

A single FUSM is pretty interesting by itself: its coiled spring is the only mechanical moving part. An earlier paper published by [Kanada] and [Mashimo] laid out how to push the spring through a hole in a metal block acting as the stator of this motor. Piezoelectric devices attached to that block minutely distort it in a controlled manner, resulting in linear motion of the spring.

For closed-loop feedback, electrical resistance from the free end of the spring to the stator block can be measured and converted to linear distance to within a few millimeters. However, the acting end of the spring might be deformed via stretching or bending, which makes calculating its actual position difficult. Accounting for such deformation is a future topic for this group of researchers.
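The intuition behind the feedback is that more exposed spring means more resistive wire in the measurement path. Here is a toy sketch assuming a simple linear relationship, with made-up constants; the group’s real calibration is surely more involved.

```python
OHMS_PER_MM = 0.012          # hypothetical resistance of the coiled spring per mm
R_STATOR_CONTACT = 0.35      # hypothetical fixed contact/lead resistance in ohms

def spring_extension_mm(measured_ohms):
    """Convert a resistance reading into an estimated extension of the spring."""
    return max(0.0, (measured_ohms - R_STATOR_CONTACT) / OHMS_PER_MM)

print(spring_extension_mm(1.55))   # ~100 mm under these assumed constants
```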

This work was presented at IROS 2020 which, like many other conferences this year, moved online and became IROS On-Demand. After a no-cost online registration we can watch the 12-minute recorded presentation on this project or any other at the conference. The video includes gems such as an exaggerated animation of stator block deformation to illustrate how a FUSM works, and an example of the position calculation challenge where the intended circular motion actually resulted in an oval.

Speaking of conferences that have moved online, we have our own Hackaday Remoticon coming up soon!

Continue reading “Flexible Actuators Spring Into Action”

Morphing Robot Demonstrated At IROS


A morphing robot was demonstrated at the IROS conference this week. This orb has no rigid structure but uses some type of “inflation” system for locomotion. This robot concept is offered up by the iRobot company as part of a DARPA initiative they’re working on. The “inflation” is really a substance in the skin that can be converted from a liquid-like state to a solid-like one. They call this “The Jamming Concept” and give a layman’s explanation in the video we’ve embedded after the break.

When moving, this white ball is a churning, turning, bulging mass of terror. The just-about-to-hatch pods from Alien, or perhaps something from Doom 3, come to mind. The hexapod from IROS that we covered yesterday was amazing, but this really creeps us out. What’s more, this is footage from the iRobot prototypes of a year ago. The newer stuff can do much more, like having several of these things glob together into one unit.

We’re glad that [DarwinSurvior] sent us the tip on this one, but now we’re not going to be able to sleep at night.

Continue reading “Morphing Robot Demonstrated At IROS”