According to New Atlas, many drones already use ultrasonic waves to detect individual obstacles directly in their paths. The experimental new system, however, uses sound to locate all the walls in a room, providing guidance when vision- or lighting-based systems cannot.
The study was led by Professor Mireille “Mimi” Boutin of Purdue University in Indiana and Professor Gregor Kemper of the Technical University of Munich. They began by equipping a ready-made quadcopter with a speaker and four microphones. The microphones are arranged in a three-dimensional configuration rather than all lying in roughly the same plane.
As the drone moves through the room, its speaker emits short sound pulses. For each pulse, the microphone array detects the initial emission and the subsequent echoes reflected back by the room's four walls. The time elapsed between the emission and each echo is used to determine the distance to the reflecting wall, much as bats use echolocation. A mathematical method developed by the researchers, called echo classification, determines which measured distance corresponds to which wall. As a result, the system continuously tracks the drone's position relative to all the walls in the room.
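The distance calculation described above is simple time-of-flight arithmetic. Below is a minimal sketch of that step only, not the researchers' actual code: it assumes a speed of sound of 343 m/s and a speaker and microphone that are effectively co-located, so each echo travels twice the wall distance. The echo-classification step relies on the authors' algebraic method and is not reproduced here.

```python
# Time-of-flight sketch (illustrative, not the researchers' code).
# Assumption: speaker and microphone are co-located, so an echo's
# path is out to the wall and back, i.e. twice the wall distance.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def wall_distance(echo_delay_s: float) -> float:
    """Distance to a reflecting wall, given the round-trip echo delay in seconds."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# An echo arriving 10 ms after the pulse implies a wall about 1.7 m away.
print(round(wall_distance(0.010), 3))  # → 1.715
```

Each pulse yields one such delay per wall; the harder problem the paper addresses is deciding which delay belongs to which wall when several echoes arrive close together.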
The researchers hope that, once further developed, the technology could be used to guide drones in dark, rainy, or snowy conditions, in which systems based on computer vision, laser, or infrared light can fail. It is conceivable that the setup could also be used in smartphone-based systems that guide blind pedestrians, or in crash-avoidance systems for self-driving cars.
“The goal is to enable drones, or any other unmanned vehicle, to navigate using sound,” says Kemper. “Cars are already equipped with cameras. A new approach might also include sound sensors to complement the existing visual information and better represent reality.”
A paper on the study was recently published in the SIAM Journal on Applied Algebra and Geometry.