Naturally imbued with only a handful of the physical senses found across the animal kingdom, humankind now seeks to replicate the senses of other species through technological means, including echolocation delivered by cell phone.

Just as some blind people have learned to mimic the navigational sense found in certain mammal and bird species, including microchiropteran bats, whales, and dolphins, researchers say others may soon gain that sense by technological proxy, using an app on an Android phone or iPhone.

In the tradition of Aristotle, schoolchildren learn that humans experience the five common senses of sight, hearing, touch, smell, and taste, although scientists now say we may possess as many as 21 physical senses, ranging from the mundane sense of time to magnetoception, the navigational ability to detect the Earth's magnetic field.

Within a few years of mobile technology evolution, however, humans may gain the handheld power of echolocation, with an application that provides a 3D layout of a room as the user enters it. Computer science researchers from the United States and Switzerland have developed an algorithm that reconstructs the shape and contours of complex structures, such as Switzerland's Lausanne Cathedral, using data collected by four randomly placed microphones. Such technology could be used to improve video gaming and virtual reality, in addition to eliminating echoes from telephone calls.
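To make the geometry concrete: reconstruction of this kind rests on a standard acoustic idea, namely that an echo off a wall arrives as if it were emitted by a mirror image of the source behind that wall, so the echo's arrival time at a microphone encodes a distance. The sketch below is a minimal illustration of that image-source picture, not the researchers' code; the positions, the wall, and the speed of sound are assumptions made for the example.

```python
# Minimal sketch of the image-source idea behind acoustic room reconstruction.
# All positions and the wall below are hypothetical, chosen only for illustration.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, roughly at room temperature


def image_source(source, wall_point, wall_normal):
    """Mirror a source position across a wall given by a point on it and its normal."""
    n = wall_normal / np.linalg.norm(wall_normal)
    return source - 2.0 * np.dot(source - wall_point, n) * n


def first_order_echo_delay(source, mic, wall_point, wall_normal):
    """Arrival time of the first-order echo from one wall at one microphone."""
    img = image_source(source, wall_point, wall_normal)
    return np.linalg.norm(img - mic) / SPEED_OF_SOUND


# Example: a wall 3 m in front of the source, a microphone 1 m to the side.
src = np.array([0.0, 0.0, 1.5])
mic = np.array([1.0, 0.0, 1.5])
delay = first_order_echo_delay(src, mic,
                               wall_point=np.array([0.0, 3.0, 0.0]),
                               wall_normal=np.array([0.0, 1.0, 0.0]))
print(f"first-order echo arrives after {delay * 1000:.1f} ms")
```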

"We presented an algorithm for reconstructing the 3D geometry of a convex polyhedral room from a few acoustic measurements," the study authors wrote. "It requires a single sound emission and uses a minimal number of microphones. The proposed algorithm has essentially no constraints on the microphone setup. Thus, we can arbitrarily reposition the microphones, as long as we know their pairwise distances..."

With a few impulse responses, the team demonstrates how to reconstruct a convex polyhedral room. "Our method relies on learning from which wall a particular echo originates," the researchers wrote, noting challenges that include the difficulty of extracting echoes from the room impulse response and of discerning the order in which the microphones receive echoes bouncing off the walls. "Our main contribution is an algorithm that selects the 'correct' combinations of echoes, specifically those that actually correspond to walls. The need for assigning echoes to walls arises from the omnidirectionality of the source and the receivers..."
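One plausible way to phrase such an echo-sorting test, building on the pairwise-distance requirement above, is to ask whether a candidate set of echo delays (one per microphone) could all have come from a single mirrored image source, which can be checked with Euclidean distance matrices. The sketch below illustrates that idea under those assumptions; the authors' exact criterion, tolerances, and handling of noise may differ.

```python
# Hedged sketch of an echo-sorting test: do these echo delays, one per
# microphone, fit a single image source in 3D? The names and tolerances are
# illustrative, not taken from the paper.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s


def is_edm_of_3d_points(D, tol=1e-6):
    """True if D is (numerically) a squared-distance matrix of points in 3D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    G = -0.5 * J @ D @ J                     # Gram matrix of centred points
    eigvals = np.linalg.eigvalsh(G)
    scale = max(np.max(np.abs(eigvals)), 1.0)
    if np.min(eigvals) < -tol * scale:       # must be positive semidefinite
        return False
    rank = np.sum(eigvals > tol * scale)     # and embeddable in 3 dimensions
    return rank <= 3


def echoes_match_one_wall(mic_squared_dists, echo_delays, tol=1e-6):
    """Augment the microphone distance matrix with one candidate echo per microphone."""
    d = (SPEED_OF_SOUND * np.asarray(echo_delays)) ** 2   # squared path lengths
    n = mic_squared_dists.shape[0]
    D = np.zeros((n + 1, n + 1))
    D[:n, :n] = mic_squared_dists
    D[:n, n] = d
    D[n, :n] = d
    return is_edm_of_3d_points(D, tol)


# Quick self-check with synthetic, noise-free geometry (hypothetical numbers).
mics = np.array([[0.0, 0.0, 1.0], [1.2, 0.3, 1.1],
                 [0.4, 1.5, 0.9], [0.9, 0.8, 1.7]])
image_src = np.array([0.0, 6.0, 1.5])                     # a mirrored source
delays = np.linalg.norm(mics - image_src, axis=1) / SPEED_OF_SOUND
D_mics = np.sum((mics[:, None, :] - mics[None, :, :]) ** 2, axis=-1)
print(echoes_match_one_wall(D_mics, delays))              # expected: True
```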

In the experiments, reported Monday in the Proceedings of the National Academy of Sciences, the researchers said first-order echoes — as opposed to the repeating effect heard in yodeling or yelling into a cavernous space — described the space quite accurately. "Our algorithm opens the way for different applications in virtual reality, auralization, architectural acoustics, and audio forensics," the authors wrote. "For example, we can use it to design acoustic spaces with desired characteristics or to change the auditory perception of existing spaces."
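For intuition about why first-order echoes are so informative, consider the simplest case, which is not the paper's exact setup: a loudspeaker and microphone at the same spot. A first-order echo travels to a wall and back, so its delay converts directly into that wall's distance. The numbers below are hypothetical.

```python
# Round-trip conversion from a first-order echo delay to a wall distance,
# assuming a collocated loudspeaker and microphone (illustrative only).

SPEED_OF_SOUND = 343.0  # m/s


def wall_distance_from_echo(delay_seconds):
    """Distance to a wall from the round-trip delay of its first-order echo."""
    return SPEED_OF_SOUND * delay_seconds / 2.0


print(wall_distance_from_echo(0.020))  # a 20 ms echo -> about 3.4 m
```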

Aside from possible use in omnidirectional radar, the technology could change the way people navigate and interact within interior environments. In the future, a person entering a room while speaking into a cellphone could gain an immediate sense of both "how are you?" and "what's your 20?" (CB slang for one's location).

Source: Dokmanic I, Parhizkar R, Walther A, Lu YM, Vetterli M. Acoustic Echoes Reveal Room Shape. PNAS. 2013.