Most of us associate echolocation with bats. These amazing creatures are able to chirp at frequencies beyond the limit of our hearing, and they use the reflected sound to map the world around them. It’s the perfect technology for navigating pitch-dark cave systems, so it’s easy to see why evolution drove bats down this path.
Humans, on the other hand, have far more limited hearing, and we’re not great chirpers, either. And yet, it turns out we can learn this remarkable skill, too. In fact, research suggests it’s far more achievable than you might think—for the sighted and vision-impaired alike!
Bounce That Sound
Before we talk about humans using echolocation, let’s examine how the pros do it. Bats are nature’s acoustic engineers, emitting rapid-fire ultrasonic pulses from their larynx that can range from 11 kHz to over 200 kHz. Much of that range is far beyond human hearing, which tops out at under 20 kHz. As these sound waves bounce off objects in their environment, the bat’s specialized ultrasonic-capable ears capture the returning echoes. Their brain then processes these echoes in real time, comparing the outgoing and incoming signals to construct a detailed 3D map of their surroundings. The differences in echo timing tell them how far away objects are, while variations in frequency and amplitude reveal information about size, texture, and even movement. Bats switch between constant-frequency chirps and frequency-modulated tones depending on where they’re flying and what they’re trying to achieve, such as navigating a dark cavern or chasing prey. This biological sonar is so precise that bats can use it to track tiny insects while flying at speed.
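Why go ultrasonic at all? A target much smaller than the wavelength reflects very little sound, so higher frequencies (shorter wavelengths) resolve smaller objects. As a rough sense of scale, assuming the standard speed of sound in air of about 343 m/s:

```python
# Back-of-envelope check on why bats use ultrasound: an object much
# smaller than the wavelength returns almost no echo, so shorter
# wavelengths (higher frequencies) can resolve smaller targets.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def wavelength_mm(frequency_hz: float) -> float:
    """Wavelength of a sound wave in air, in millimetres."""
    return SPEED_OF_SOUND / frequency_hz * 1000

# At the top of human hearing (~20 kHz) the wavelength is about 17 mm;
# at a bat-friendly 100 kHz it shrinks to about 3.4 mm, fine enough
# to pick out insect-sized prey.
print(f"{wavelength_mm(20_000):.1f} mm")
print(f"{wavelength_mm(100_000):.1f} mm")
```

This is only a first-order argument — real echo strength also depends on target shape and material — but it gives a feel for why a moth is effectively invisible to a 2 kHz tone yet a crisp reflector at 100 kHz.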
Humans can’t naturally produce sounds in the ultrasonic frequency range. Nor could we hear them if we did. That doesn’t mean we can’t echolocate, though—it just means we don’t have quite the same level of equipment as the average bat. Instead, humans can achieve relatively basic echolocation using simple tongue clicks. In fact, a 2021 research paper showed that skills in this area can be developed with as little as a 10-week training program. Over this period, researchers successfully taught echolocation to both sighted and blind participants using a combination of practical exercises and virtual training. A group of 14 sighted and 12 blind participants took part, with the former wearing blindfolds to negate their vision.
The aim of the research was to investigate click-based echolocation in humans. When a person makes a sharp click with their tongue, they’re essentially launching a sonic probe into their environment. As these sound waves radiate outward, they reflect off surfaces and return to the ears with subtle changes. A flat wall creates a different echo signature than a rounded pole, while soft materials absorb more sound than hard surfaces. The timing between click and echo precisely encodes distance, while differences between the echoes reaching each ear allow for direction finding.
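The arithmetic behind those two cues is straightforward. Here’s a minimal sketch, assuming the standard speed of sound in air and a typical ear-to-ear spacing (both values are approximations, not figures from the study):

```python
# Rough sketch of the geometry behind click-based echolocation.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
HEAD_WIDTH = 0.18       # approximate ear-to-ear distance in metres

def distance_from_echo(delay_s: float) -> float:
    """Distance to a reflector, given the click-to-echo delay.
    The sound travels out and back, so halve the round trip."""
    return SPEED_OF_SOUND * delay_s / 2

def max_interaural_delay() -> float:
    """Largest possible arrival-time difference between the two ears,
    for an echo coming from directly off to one side."""
    return HEAD_WIDTH / SPEED_OF_SOUND

# An echo arriving 12 ms after the click puts the wall about 2 m away.
print(f"{distance_from_echo(0.012):.2f} m")       # ~2.06 m
# The direction cue is tiny: well under a millisecond.
print(f"{max_interaural_delay() * 1000:.2f} ms")  # ~0.52 ms
```

The sub-millisecond interaural difference is a big part of why the skill takes training: the brain already resolves delays this small for ordinary hearing, but learning to apply that machinery to faint self-generated echoes takes practice.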
The training regime consisted of a variety of simple tasks. The researchers aimed to train participants on size discrimination, with participants facing two foam board disks mounted on metal poles. They had to determine which foam disk was larger using only their mouth clicks and their hearing. The program also included an orientation challenge, which used a single rectangular board that could be rotated to different angles. The participants again had to use clicks and their hearing to determine the orientation of the board. These basic tools allowed participants to develop increasingly refined echo-sensing abilities in a controlled environment.
Perhaps the most intriguing part of the training involved a navigation task in a virtually simulated maze. Researchers first created special binaural recordings of a mannequin moving through a real-world maze, making clicks as it went. They then created virtual mazes that participants could navigate using keyboard controls. As they navigated through the virtual maze, without vision, the participants would hear the relevant echo signature recorded in the real maze. The idea was to allow participants to build mental maps of virtual spaces using only acoustic information. This provided a safe, controlled environment for developing advanced navigation skills before applying them in the real world. Participants also attempted using echolocation to navigate in the real world, navigating freely with experimenters on hand to guide them if needed.
The most surprising finding wasn’t that people could learn echolocation; it was how accessible the skill proved to be. Previous assumptions about age and visual status being major factors in learning echolocation turned out to be largely unfounded. While younger participants showed some advantages in the computer-based exercises, the core skill of practical echolocation was accessible to all participants. After 10 weeks of training, participants answered the size discrimination task correctly over 75% of the time, and at increased range compared to when they began. Orientation discrimination also improved greatly over the test period, to a success rate of over 60% for the cohort. Virtual maze completion times also dropped by over 50%.
The study also involved a follow-up three months later with the blind members of the cohort. Participants credited the training with improving their spatial awareness, and some noted they had begun to use the technique to find doors or exits, or to make their way through unfamiliar places.
What’s particularly fascinating is how this challenges our understanding of basic human sensory capabilities. Echolocation doesn’t involve adding new sensors or augmenting existing ones—it’s just about training the brain to extract more information from signals it already receives. It’s a reminder that human perception is far more plastic than we often assume.
The researchers suggest that echolocation training should be integrated into standard mobility training for visually impaired individuals. Given the relatively short training period needed to develop functional echo-sensing abilities, it’s hard to argue against its inclusion. We might be standing at the threshold of a broader acceptance of human echolocation, not as an exotic capability, but as a practical skill that anyone can learn.
Reminds me of Richard Feynman’s party trick of discerning which books someone has handled by smell. Everyone assumed he used some other method, and that the smelling was a blind to throw them off. He writes that people would frequently make more and more outlandish guesses about how he “really” did it.
Industrialized humans don’t use smell much … but we probably could.
A huge part of our loss of smell is a function of considering it rude to go around closely sniffing things and people. Often it’s socially unacceptable to even acknowledge that things have distinctive odors. That, and the elevation of our nose above the ground.
You can easily smell if somebody has masturbated or had sex recently, what kind of food people ate ten hours ago, if somebody is menstruating or menopausal, arguably one can smell when somebody is pregnant, and you can often smell when people are sick or even just anxious versus normal sweat from exercise. And it’s absolutely impossible to discuss any of it. But your dog doesn’t mind noticing.
Thank goodness machines can’t do that (yet)!
done this since I was a toddler, and I’m not blind! very useful in the dark…
Done what, smell books to see who handled them or navigate by echolocation? I can see both being useful in the dark, depending on the circumstance.
I remember reading an article in the 80’s about a blind man who used echolocation. He said he hardly used his cane. A series of tests were performed and he showed that he could navigate easily past obstacles. Even in a busy city trial he had no real problems. He even rode a bicycle, though he said he did have a problem with a single wire fence he ran into. It was determined that he was primarily “seeing” an acoustic picture of his surroundings using ambient sounds and their reflections from the environment. However, he did use bat-like high frequency “chirps” as well. Maybe someday there will be a wearable sensor net like Miranda’s in the TOS episode “Is There in Truth No Beauty?” or Geordi La Forge’s (TNG) visor, providing ultra high resolution information in a small wearable device.
I thought I’d read several articles a dozen years ago about people using handheld mechanical clicker boxes and teaching themselves to echolocate and getting really quite good at it, easily able to negotiate public spaces.
It probably has the side effect of attracting dogs that have been trained using a clicker.
If you walk blindfolded, slowly, through a large room like a gym or a big conference room, you can hear the sound of the HVAC and other ambient noises change a few centimeters away from the wall. I “discovered” this as a kid. No idea why, but I suspect the effect is related to the Crown PZM microphone of years gone by. Which is why I walked into the gym wall a lot until I got the hang of it.
I have normal sight and I have tried to learn a little bit of echolocation. I happened to read about it some 15 years ago, and I then also remembered that as a child I was able to “hear walls” when I was near them. That childhood memory was likely only due to the ambient noise reflecting from the walls. But that memory was sufficient proof that I decided to occasionally try real echolocation using sharp tongue flicks. The sharpness of the sound is key to success. Obviously an impulse response would be best, but you can only approximate it with the tongue.
Background noise was really detrimental for the process. So traffic noise for example made it really difficult.
I have never bothered to really hone the skill, just tried it for fun. I usually did not cover my eyes, but that was mostly because I wanted to correlate the sound to the visual reference. Big, tall objects like buildings were rather obvious. A tree was discernible, if I remember correctly. Small corridors were difficult with limited training and also because I did not want to bother people…
But yes, it is a fun skill to try to learn, even by oneself.