If you’re a human or other animal with two ears, you’ll probably find great utility in your ability to identify the direction of sounds in the world around you. Of course, this is really just a minimal starting point for such abilities. When [John Duffy] set out to build his acoustic camera, he chose to use ninety-six microphones to get the job done.
The acoustic camera works by having an array of microphones laid out in a prescribed grid. By measuring the timing and phase differences of signals appearing at each microphone, it’s possible to determine the location of sound sources in front of the array. The more microphones, the better the data.
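The principle described above is essentially delay-and-sum beamforming: for each candidate direction, undo the arrival-time offsets the geometry would impose, sum the channels, and see where the summed signal is loudest. Here's a minimal sketch of the idea for a small linear array; the geometry, sample rate, and source angle are illustrative assumptions, not values from [John]'s build.

```python
import numpy as np

C = 343.0        # speed of sound, m/s
FS = 48_000      # sample rate, Hz
SPACING = 0.04   # mic spacing, m (assumed for the example)
N_MICS = 8       # a small linear array keeps the sketch readable

# Mic positions along a line, centered on the origin
mic_x = (np.arange(N_MICS) - (N_MICS - 1) / 2) * SPACING

def frac_delay(sig, delay_s):
    """Delay a signal by an arbitrary (fractional) time via an FFT phase shift."""
    freqs = np.fft.rfftfreq(len(sig), 1 / FS)
    spectrum = np.fft.rfft(sig) * np.exp(-2j * np.pi * freqs * delay_s)
    return np.fft.irfft(spectrum, len(sig))

def mic_delays(angle_deg):
    """Per-mic arrival-time offsets for a far-field source at a given angle."""
    return mic_x * np.sin(np.radians(angle_deg)) / C

def simulate(angle_deg, n=2048, seed=0):
    """Each mic hears the same broadband noise, offset by its geometry."""
    src = np.random.default_rng(seed).standard_normal(n)
    return np.stack([frac_delay(src, d) for d in mic_delays(angle_deg)])

def steered_power(signals, angle_deg):
    """Undo the delays for a candidate angle, sum the channels, measure power."""
    aligned = [frac_delay(s, -d) for s, d in zip(signals, mic_delays(angle_deg))]
    return np.mean(np.mean(aligned, axis=0) ** 2)

signals = simulate(angle_deg=25.0)
angles = np.arange(-90, 91)
best = int(angles[np.argmax([steered_power(signals, a) for a in angles])])
print(f"loudest direction: {best} degrees")
```

Only when the steering angle matches the true source direction do all the channels line up and add coherently; everywhere else they partially cancel, which is why more microphones sharpen the result. A 2D grid like [John]'s extends the same math to two steering angles.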
[John] goes into detail on how the project was achieved on the project blog. He outlines struggles such as assembly issues, shares tips on debugging the array, and explains how to wrangle so many microphones at once. Particularly impressive is the video of [John] using the device to track a sound to its source. This technology has potential applications in industry, such as locating compressed air leaks.
Overall, it’s a university research project done right, with a great writeup of the final results. [John]’s project would serve well as a jumping-off point for anyone trying to build something similar. Phased array techniques work in RF, too, as this MIT project demonstrates. Video after the break.