Triggering Lightning And Safely Guiding It Using A Drone

Every year lightning strikes cause a lot of damage — with the high-voltage discharges being a major risk to buildings, infrastructure, and the continued existence of squishy bags of mostly salty water. While some ways exist to reduce their impact such as lightning rods, these passive systems can only be deployed in select locations and cannot prevent the build-up of the charge that leads up to the plasma discharge event. But the drone-based system recently tested by Japan’s NTT, the world’s fourth largest telecommunications company, could provide a more proactive solution.

The idea is pretty simple: fly a drone that is protected by a specially designed metal cage close to a thundercloud, with a conductive tether leading back to the ground. By providing a direct, low-resistance path to ground, the built-up charge in said cloud will readily discharge into this cage and from there down the tether to the ground.

To test this idea, NTT researchers took commercial drones fitted with such a protective cage and exposed them to artificial lightning. The drones turned out to be fine up to 150 kA, around five times the peak current of a typical natural lightning strike. Afterwards, the full system was tested in a real thunderstorm, during which the drone took a hit and kept flying, although the protective cage partially melted.

Expanding on this experiment, NTT imagines that a system like this could protect cities and sensitive areas, and possibly even capture and store the discharged energy rather than just leading it to ground. While this latter idea would need some seriously effective charging technology, the idea of proactively discharging thunderclouds is perhaps not so crazy. We would need to see someone run the numbers on the potential effectiveness, of course, but we are all in favor of (safe) lightning experiments like this.
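
As a first pass at those numbers, here is a quick back-of-envelope sketch in Python. Both input values are rough textbook orders of magnitude for a cloud-to-ground flash, not figures from the NTT experiment.

```python
# Ballpark figures for a single cloud-to-ground flash -- both are rough
# textbook assumptions, not measurements from the NTT test.
charge_c = 15          # coulombs transferred per flash (typical order)
potential_v = 100e6    # volts between cloud and ground (typical order)

energy_j = charge_c * potential_v    # ~1.5 GJ dissipated by the flash
energy_kwh = energy_j / 3.6e6        # convert joules to kilowatt-hours

print(f"~{energy_j / 1e9:.1f} GJ, or ~{energy_kwh:.0f} kWh per flash")
```

A few hundred kWh per flash sounds useful until you remember that most of it is dissipated as heat, light, and thunder in the channel itself, and that what remains arrives in well under a second, which is why the charging technology would need to be so seriously effective.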

If you’re wondering why channeling lightning away from critical infrastructure is such a big deal, you may want to read up on Apollo 12.

Improving Flying Drones By Mimicking Flying Squirrels

With the ability to independently adjust the thrust of each of their four motors, quadcopters are exceptionally agile compared to more traditional aircraft. But in an effort to create an even more maneuverable drone platform, a group of South Korean researchers has studied adding flying squirrel tech to quadcopters. Combined with machine learning, this is said to significantly increase the prototype’s agility on an obstacle course.

Flying squirrels (tribe Pteromyini) have large skin flaps (the patagium) between their wrists and ankles, which they use to control their flight when they glide from tree to tree, along with their fluffy squirrel tail. With flights covering up to 90 meters, they also manage to use said tail and patagium to air-brake, which prevents them from smacking into a tree trunk at bone-jarring velocity.

By taking these principles and adding a similar mechanism to a quadcopter for extending a patagium-like membrane between its rotors, the researchers were able to develop a new controller (thrust-wing coordination control, TWCC) that coordinates extending the membranes with thrust from the brushless motors. Rather than relying on trial-and-error to develop the controller algorithms, the researchers trained a recurrent neural network (RNN), pre-training it on simulation data before the first flights and then refining the model through supervised learning.
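
The article doesn’t detail the controller’s internals, so the following is only a minimal sketch of what an RNN-based flight controller looks like in principle; all dimensions, weight initializations, and signal names are hypothetical placeholders rather than the researchers’ actual TWCC design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 9 state inputs (attitude, rates, position error),
# 16 hidden units, 5 outputs (4 motor thrusts plus membrane extension).
n_in, n_hid, n_out = 9, 16, 5
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))   # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden-to-hidden
W_hy = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden-to-output

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rnn_step(x, h):
    """One controller tick: update the hidden state, then emit commands."""
    h = np.tanh(W_xh @ x + W_hh @ h)
    y = W_hy @ h
    return h, sigmoid(y[:4]), sigmoid(y[4])  # thrusts and membrane in [0, 1]

h = np.zeros(n_hid)
state = np.zeros(n_in)  # level hover, no position error
h, thrusts, membrane = rnn_step(state, h)
```

In the real work the weights are what carry the behavior: they are pre-trained on simulated flights and then refined with supervised learning, rather than left random as here.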

During obstacle-avoidance experiments on a test track, the RNN-based controller worked quite well compared to a regular quadcopter. A disadvantage is of course that the range of these flying squirrel drones is reduced due to the extra weight and drag, but if one were to make flying drones that perch on surfaces between dizzying feats of aerial agility, this type of drone tech might just be the ticket.

Continue reading “Improving Flying Drones By Mimicking Flying Squirrels”

Supercon 2024: Killing Mosquitoes With Freaking Drones, And Sonar

Suppose that you want to get rid of a whole lot of mosquitoes with a quadcopter drone by chopping them up in the rotor blades. If you had really good eyesight and pretty amazing piloting skills, you could maybe fly the drone yourself, but honestly this looks like it should be automated. [Alex Toussaint] took us on a tour of how far he has gotten toward that goal in his amazingly broad-ranging 2024 Superconference talk. (Embedded below.)

The end result is an amazing 380-element phased sonar array that allows him to detect the location of mosquitoes in mid-air, identifying them by their particular micro-Doppler return signature. It’s a remarkable gadget called LeSonar2, which he has open-sourced, and which doubtless has many other applications at the tweak of an algorithm.

Rolling back in time a little bit, the talk starts off with [Alex]’s thoughts about self-guiding drones in general. For obstacle avoidance, you might think of using a camera, but cameras can be heavy and require a lot of expensive computation. [Alex] favored ultrasonic range finding. An array of ultrasonic range finders, though, can locate smaller objects far more precisely than the single ranger you probably have in mind. This got [Alex] into beamforming, and he built an early prototype, which we’ve actually covered in the past. If you’re into this sort of thing, the talk contains a very nice description of the necessary DSP.

[Alex]’s big breakthrough, though, came with shrinking down the ultrasonic receivers. The angular resolution you can achieve with a beamforming array is limited by the spacing between the microphone elements, and traditional ultrasonic devices like the ones we use in cars are kinda bulky. So here comes a hack: the TDK T3902 MEMS microphones work just fine up into the ultrasound range, even though they’re designed for human hearing. Combining 380 of these in a very tightly packed array, and pushing all of their parallel data into an FPGA for computation, led to the LeSonar2. Bigger transducers put out the ultrasound pulses, the FPGA does some very intense filtering and combining of the output of each microphone, and the resulting 3D range data is sent out over USB.
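
For a taste of that DSP, here is a minimal delay-and-sum beamformer in Python. The array geometry, sample rate, and crude integer-sample alignment are simplified stand-ins for what the FPGA does, in parallel and with far more care.

```python
import numpy as np

C = 343.0       # speed of sound in air, m/s
FS = 192_000    # assumed sample rate, comfortably above the ultrasound band

def delay_and_sum(signals, mic_xy, direction):
    """Steer a planar array toward a unit direction vector.
    signals: (n_mics, n_samples); mic_xy: (n_mics, 2) positions in meters.
    Each microphone's wavefront advance is undone by an integer-sample
    shift before summing, so echoes from 'direction' add coherently."""
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    delays_s = mic_xy @ direction / C           # per-mic wavefront advance
    shifts = np.round(delays_s * FS).astype(int)
    out = np.zeros(signals.shape[1])
    for sig, s in zip(signals, shifts):
        out += np.roll(sig, s)                  # align (wrapping at edges)
    return out / len(signals)
```

Scanning that steering direction over a grid and comparing the summed echo energy in each direction is what turns a pile of microphone samples into a 3D picture.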

After a marvelous demo of the device, we get to the end-game application: finding and identifying mosquitoes in mid-air. If you don’t want to kill flies, wasps, bees, or other useful pollinators while eradicating the tiny little bloodsuckers that are the drone’s target, you need to not only locate bugs, but also discriminate mosquitoes from the rest.

For this, he uses the micro-Doppler signatures produced by the different wing beats of the various insects. Wasps have a very wide-band Doppler echo: their relatively long, thin wings move slower at the roots than at the tips. Flies, on the other hand, have stubbier wings and emit a tighter echo signal. The mosquito signal is even tighter.
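
That discrimination step can be caricatured in a few lines of Python: estimate how wide the echo’s Doppler spectrum is, then bin by width. The sample rate, thresholds, and labels below are purely illustrative guesses, not values from [Alex]’s classifier.

```python
import numpy as np

FS = 192_000  # assumed sample rate, Hz

def doppler_bandwidth(echo):
    """RMS spectral width of an echo: wider spread = faster-moving wingtips."""
    spec = np.abs(np.fft.rfft(echo * np.hanning(len(echo)))) ** 2
    freqs = np.fft.rfftfreq(len(echo), d=1.0 / FS)
    mean_f = np.average(freqs, weights=spec)
    return np.sqrt(np.average((freqs - mean_f) ** 2, weights=spec))

def classify(echo):
    """Toy thresholds in Hz, chosen only to illustrate the ordering
    wasp > fly > mosquito described above."""
    bw = doppler_bandwidth(echo)
    if bw > 4_000:
        return "wasp"
    if bw > 1_500:
        return "fly"
    return "mosquito"
```
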

If you had told us that you could use sonar to detect mosquitoes at a distance of a few meters, much less locate them and differentiate them from their other insect brethren, we would have thought it impossible. But [Alex] and his team are building these devices, and you can even build one yourself if you want. So watch the talk, learn about phased arrays, and start daydreaming about what you would use something like this for.

Continue reading “Supercon 2024: Killing Mosquitoes With Freaking Drones, And Sonar”

Budget-Minded Synthetic Aperture Radar Takes To The Skies

Unless you work for the government or a large corporation, constrained designs are a fact of life. No matter what you’re building, there’s likely going to be a limit to the time, money, space, or materials you can work with. That’s good news, though, because constrained projects tend to be interesting projects, like this airborne polarimetric synthetic aperture radar.

If none of those terms make much sense to you, don’t worry too much. As [Henrik Forstén] explains, synthetic aperture radar is just a way to make a small radar antenna appear to be much larger, increasing its angular resolution. This is accomplished by moving the antenna across a relatively static target and doing some math to correlate the returned signal with the antenna position. We saw this with his earlier bicycle-mounted SAR.
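
For the curious, the "some math" at the heart of SAR is often time-domain back-projection, which fits in a dozen lines of Python. The straight flight path, carrier frequency, and simple interpolation below are illustrative simplifications, not [Henrik]’s actual processing chain.

```python
import numpy as np

C = 3e8    # propagation speed, m/s
FC = 6e9   # assumed carrier frequency, Hz

def backproject(pulses, antenna_x, ranges, grid_x, grid_y):
    """Time-domain back-projection for a straight flight path along x.
    For each image pixel, sample every range-compressed pulse at that
    pixel's distance, undo the carrier's round-trip phase, and sum, so
    returns from a real scatterer add coherently across the whole pass.
    pulses: complex array of shape (n_positions, n_range_bins)."""
    img = np.zeros((len(grid_y), len(grid_x)), dtype=complex)
    for x_a, pulse in zip(antenna_x, pulses):
        r = np.hypot(grid_x[None, :] - x_a, grid_y[:, None])  # pixel ranges
        sample = (np.interp(r, ranges, pulse.real)
                  + 1j * np.interp(r, ranges, pulse.imag))
        img += sample * np.exp(4j * np.pi * FC * r / C)  # undo carrier phase
    return np.abs(img)
```

The "synthetic aperture" is exactly that sum over antenna positions: after the math, a small physical antenna behaves like one as long as the flight path.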

For this project, [Henrik] shrunk the SAR set down small enough for a low-cost drone to carry. The build log is long and richly detailed and could serve as a design guide for practical radar construction. Component selection was critical, since [Henrik] wanted to use low-cost, easily available parts wherever possible. Still, there are some pretty fancy parts here, with a Zynq 7020 FPGA and a boatload of memory on the digital side of the custom PCB, and a host of specialized parts on the RF side.

The antennas are pretty cool, too; they’re stacked patch antennas made from standard FR4 PCBs, with barn-door feed horns fashioned from copper sheeting and slots positioned 90° to each other to provide switched horizontal and vertical polarization on both the receive and transmit sides. There are also a ton of details about how the radar set is integrated into the drone’s flight controller, as well as an interesting discussion of the autofocusing algorithm used to make up for the less-than-perfect positional accuracy of the system.

The resulting images are remarkably detailed, and almost look like visible-light images thanks to the obvious shadows cast by large objects like trees and buildings. We’re especially taken by the mapping of all combinations of transmit and receive polarization into a single RGB image; the result is ethereal.
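
Reproducing that last trick is straightforward once you have per-polarization magnitude images. The channel assignment below is an arbitrary choice, since the article doesn’t say which polarization combination maps to which color:

```python
import numpy as np

def polarimetric_rgb(hh, hv, vv):
    """Fold three polarization channels (e.g. |HH|, |HV|, |VV|; HV and VH
    are usually near-identical, so one cross-pol term stands in for both)
    into one RGB image by normalizing each channel independently."""
    rgb = []
    for band in (np.abs(hh), np.abs(hv), np.abs(vv)):
        lo, hi = band.min(), band.max()
        rgb.append((band - lo) / (hi - lo + 1e-12))  # scale into [0, 1]
    return np.stack(rgb, axis=-1)  # shape (rows, cols, 3)
```

Because different surfaces scatter the polarizations differently, the false color ends up encoding real physical structure, which is part of why the images look so otherworldly.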

DIY Drones Deliver The Goods With Printed Release

It seems like the widespread use of delivery drones by companies like Amazon and Walmart has been perpetually just out of reach. Of course, robotics is a tricky field, and producing a fleet of these machines reliable enough to be cost-effective has proven to be quite a challenge. But on an individual level, turning any drone into one that can deliver a package is not only doable, but something [Iloke-Alusala] demonstrates with their latest project.

The project aims to be able to turn any drone into a delivery drone, in this case using an FPV drone as the platform. Two hitch-like parts are 3D printed: one adds an attachment point to the drone, and the other attaches to the package, allowing the drone to easily pick up the package and then quickly drop it off. The real key to this build is the control mechanism. [Iloke-Alusala] used an ESP32 to tap into the communications between the receiver and the flight controller. When the ESP32 detects that a specific signal has been sent to the flight controller, it activates the mechanism on the 3D printed hitch to either grab onto a package or release it at a certain point.
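
The decision logic itself is simple enough to sketch. The channel endpoints and deadband below are typical RC values chosen for illustration, not the project’s actual configuration; on the real build the equivalent logic runs on the ESP32, reading the receiver-to-flight-controller link and driving the hitch servo.

```python
# Hypothetical mapping from an RC pulse width (microseconds) on an aux
# channel to a hitch action.  Endpoints are typical RC values.
GRAB_US = 1000       # switch low endpoint
RELEASE_US = 2000    # switch high endpoint

def hitch_command(pulse_us, deadband_us=200):
    """Return the hitch action for one pulse, ignoring mid-range noise."""
    if pulse_us >= RELEASE_US - deadband_us:
        return "release"   # open the printed latch, drop the package
    if pulse_us <= GRAB_US + deadband_us:
        return "grab"      # close the latch onto the package hitch
    return "hold"          # in the deadband: leave the latch alone
```

The deadband matters in practice: RC pulse widths jitter by tens of microseconds, and you really don’t want the latch chattering open mid-flight.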

While this is a long way from a fully autonomous fleet of delivery drones, it goes a long way toward showing that individuals can use existing drones to transport useful amounts of material, and it also sets up a way for an ESP32 to decode and act on a common drone control protocol, making it easy to expand a drone’s capabilities in other ways as well. After all, if we have search and rescue drones, we could also have drones that deliver help to those stranded.

Continue reading “DIY Drones Deliver The Goods With Printed Release”

Avian-Inspired Drones: How Studying Birds Of Prey Brings More Efficient Drones Closer

The EPFL LisRaptor with adjustable wings and tail.

Throughout evolution, powered flight has evolved and refined itself multiple times, across dinosaurs (birds), mammals (bats), and insects. So why is it that our human-made flying machines are so unlike them? The field of nature-inspired flying drones is a lively one, but one that is filled with challenges. In a recent video on the Ziroth YouTube channel, [Ryan Inis] takes a look at these efforts, in particular those of EPFL, whose RAVEN drone we had a look at recently.

Along with RAVEN, there is also another project (LisRaptor) based on the Northern Goshawk, a bird of prey found in both Europe and North America. While RAVEN mostly focused on the near-vertical take-off that smaller birds are capable of, this project studies the interactions between the bird’s wings and tail, and how these enable rapid changes to the bird’s flight trajectory and velocity while maintaining efficiency.

The video provides a good overview of this project. Where the LisRaptor differs from the animal is in having a rudder and a propeller, though the former should ideally not be necessary. Obviously the kinematics behind controlled flight are not at all easy, and the researchers spent a lot of time, aided by machine learning, running through configurations in search of the ideal and most efficient wing and tail setup. As these prototypes progress, they may one day lead to drones that are hard to differentiate from birds and bats.
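
The article doesn’t say which learning method the team used, but the shape of the search is easy to illustrate with plain random search over a toy two-parameter configuration space; the cost function, parameter names, and ranges here are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def efficiency(wing_ext, tail_angle):
    """Toy stand-in for a real aerodynamic model: a smooth score with a
    single best configuration at wing_ext=0.7, tail_angle=0.2 (invented)."""
    return -(wing_ext - 0.7) ** 2 - 0.5 * (tail_angle - 0.2) ** 2

# Sample configurations at random and keep the best one seen so far.
best, best_score = None, -np.inf
for _ in range(2000):
    cand = rng.uniform([0.0, -0.5], [1.0, 0.5])  # (extension, tail angle)
    score = efficiency(*cand)
    if score > best_score:
        best, best_score = cand, score
```

Machine-learning-guided search earns its keep when, unlike this toy, each evaluation means an expensive simulation or wind-tunnel run and the landscape has many local optima.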

Continue reading “Avian-Inspired Drones: How Studying Birds Of Prey Brings More Efficient Drones Closer”

FPV Flying In Mixed Reality Is Easier Than You’d Think

Flying a first-person view (FPV) remote controlled aircraft with goggles is an immersive experience that makes you feel as if you’re really sitting in the cockpit of the plane or quadcopter. Unfortunately, while you’re wearing the goggles, you’re also completely blind to the world around you. That’s why you’re supposed to have a spotter nearby to keep watch on the local meatspace while you’re looping through the air.

But what if you could have the best of both worlds? What if your goggles not only allowed you to see the video stream from your craft’s FPV camera, but also let you see the world around you? That’s precisely the idea behind mixed-reality goggles such as the Apple Vision Pro and Meta’s Quest; you just need to put all the pieces together. In a recent video, [Hoarder Sam] shows you exactly how to pull it off, and we have to say, the results look quite compelling.

Continue reading “FPV Flying In Mixed Reality Is Easier Than You’d Think”