The “Photoacoustic Airborne Sonar System” could be mounted on drones to conduct underwater surveys and produce high-resolution maps of the deep ocean.
Engineers at Stanford University have developed an airborne method for imaging underwater objects by combining light and sound to cross the seemingly insurmountable barrier at the air-water interface.
The researchers predict their hybrid optical-acoustic system will one day enable drone-based biological marine surveys conducted from the air, large-scale aerial searches for sunken ships and aircraft, and mapping of the ocean depths at a speed and level of detail similar to that achieved for Earth’s landscapes. The latest research describing their “Photoacoustic Airborne Sonar System” was published in the journal IEEE Access.
“Airborne and spaceborne radar and laser-based, or LIDAR, systems have been able to map Earth’s landscapes for decades. Radar signals are even able to penetrate cloud cover and canopy cover. However, seawater is much too absorptive for imaging into the water,” said study leader Amin Arbabian, an associate professor of electrical engineering in Stanford’s School of Engineering. “Our goal is to develop a more robust system that can image even through murky water.”
The oceans cover about 70 percent of the Earth’s surface, yet only a small fraction of their depths has been captured in high-resolution imagery and maps.
The main obstacle has to do with physics: sound waves, for example, cannot pass from air into water, or vice versa, without losing most of their energy (more than 99.9 percent) through reflection at the boundary between the two media. A system that tried to see underwater using sound waves traveling from air into water and back into the air would suffer this loss twice, resulting in a 99.9999 percent energy reduction.
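The arithmetic behind that double loss is simple to check. A short sketch, using the article’s rounded 99.9 percent figure (an illustration, not a measured transmission coefficient):

```python
# Illustrative arithmetic for the reflection loss described above.
# The 99.9% figure is the article's rounded number, not a measured value.

reflection_loss = 0.999  # fraction of acoustic energy reflected at the air-water boundary

transmitted_once = 1 - reflection_loss          # energy surviving one crossing (0.1%)
transmitted_round_trip = transmitted_once ** 2  # air -> water -> air: two crossings

print(f"One crossing keeps {transmitted_once:.1%} of the energy")
print(f"A round trip keeps {transmitted_round_trip:.4%} (a 99.9999% reduction)")
```

Squaring the one-way transmission is why the round trip leaves only one part in a million of the original energy, matching the 99.9999 percent reduction quoted above.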
Similarly, electromagnetic radiation, an umbrella term that covers light, microwaves, and radar signals, also loses energy when passing from one physical medium to another, although the mechanism differs from that of sound. “Light also loses some energy from reflection, but the bulk of the energy loss comes from absorption by the water,” explained study first author Aidan Fitzpatrick, a Stanford graduate student in electrical engineering. Incidentally, this absorption is why sunlight cannot penetrate to the ocean depths, and why your phone, which relies on cellular signals, a form of electromagnetic radiation, cannot receive calls underwater.
The upshot is that the oceans cannot be mapped from the air and from space the way land can. To date, most underwater mapping has been achieved by attaching sonar systems to ships that trawl a given region of interest. But this technique is slow and costly, and it is inefficient for covering large areas.
Enter the Photoacoustic Airborne Sonar System (PASS), which combines light and sound to traverse the air-water interface. The idea grew out of another project that used microwaves for “contactless” imaging and characterization of underground plant roots. Some of PASS’s instruments were initially designed for that purpose in collaboration with the laboratory of Butrus Khuri-Yakub, a professor of electrical engineering at Stanford.
At its heart, PASS plays to the individual strengths of light and sound. “If we can use light in the air, where light travels well, and sound in the water, where sound travels well, we can get the best of both worlds,” Fitzpatrick said.
To do this, the system first fires a laser from the air that is absorbed at the water’s surface. When the laser is absorbed, it generates ultrasound waves that propagate down through the water column and reflect off underwater objects before racing back toward the surface.
The returning sound waves still lose most of their energy when they breach the water’s surface, but by generating the sound waves underwater with the laser, the researchers avoid losing that energy twice.
“We have developed a system that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging,” Arbabian said.
The reflected ultrasound waves are recorded by devices called transducers. Software is then used to piece the acoustic signals back together like an invisible jigsaw puzzle and reconstruct a three-dimensional image of the submerged feature or object.
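One standard way such acoustic reconstruction can work is delay-and-sum focusing: for each candidate image point, sum every transducer’s recording at the sample matching the travel time from that point to that sensor. The sketch below is a generic illustration of that idea, not the authors’ algorithm (which, among other things, also corrects for refraction); all names and numbers are assumptions.

```python
# Minimal delay-and-sum focusing sketch (illustrative, not the PASS algorithm).
import math

SOUND_SPEED = 1480.0  # m/s, approximate speed of sound in water (assumed)

def delay_and_sum(recordings, sensor_positions, pixel, sample_rate):
    """Sum each sensor's trace at the sample matching the travel time
    from `pixel` to that sensor. Echoes from a real reflector add up
    coherently; elsewhere they do not."""
    total = 0.0
    for trace, (sx, sy) in zip(recordings, sensor_positions):
        dist = math.hypot(pixel[0] - sx, pixel[1] - sy)
        idx = int(round(dist / SOUND_SPEED * sample_rate))
        if idx < len(trace):
            total += trace[idx]
    return total

# Synthetic check: a single reflector at (0.0, 1.0) m puts a spike in each
# trace at its travel time; delay-and-sum focuses brightly at that point.
sample_rate = 1e6  # samples per second (assumed)
sensors = [(-0.2, 0.0), (0.0, 0.0), (0.2, 0.0)]
reflector = (0.0, 1.0)
recordings = []
for sx, sy in sensors:
    trace = [0.0] * 2000
    t = math.hypot(reflector[0] - sx, reflector[1] - sy) / SOUND_SPEED
    trace[int(round(t * sample_rate))] = 1.0
    recordings.append(trace)

print(delay_and_sum(recordings, sensors, reflector, sample_rate))   # bright
print(delay_and_sum(recordings, sensors, (0.5, 0.5), sample_rate))  # dark
```

The “invisible jigsaw puzzle” intuition maps onto the coherent sum: only at the true reflector position do all the delayed echoes line up.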
“Just as light refracts, or bends, when it passes through water or any medium denser than air, ultrasound also refracts,” Arbabian explained. “Our image reconstruction algorithms correct for this bending that occurs when the ultrasound waves pass from water into air.”
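The bending Arbabian describes follows Snell’s law for acoustics, with sound speeds taking the role refractive indices play for light. A small sketch using commonly cited approximate sound speeds (the values and function are illustrative assumptions, not the paper’s implementation):

```python
# Snell's law for acoustics: sin(a1)/c1 = sin(a2)/c2, where a is the angle
# from the surface normal and c is the sound speed in each medium.
import math

C_AIR = 343.0     # approximate speed of sound in air, m/s (assumed)
C_WATER = 1480.0  # approximate speed of sound in water, m/s (assumed)

def refracted_angle_in_air(angle_in_water_deg: float) -> float:
    """Angle (degrees from vertical) of an ultrasound wave after it
    crosses from water into air."""
    sin_air = math.sin(math.radians(angle_in_water_deg)) * C_AIR / C_WATER
    return math.degrees(math.asin(sin_air))

# A wave traveling 20 degrees off vertical in water bends sharply toward
# the vertical as it exits into the slower medium (air).
print(refracted_angle_in_air(20.0))
```

Because sound travels much faster in water than in air, rays exiting the surface are compressed toward the vertical, and a reconstruction that ignored this would misplace underwater features.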
Drone ocean studies
Conventional sonar systems can reach depths of hundreds to thousands of meters, and the researchers hope their system will eventually reach similar depths.
To date, PASS has only been tested in the laboratory, in a container the size of a large fish tank. “Current experiments use static water, but we are currently working to deal with water waves,” Fitzpatrick said. “This is a challenging but we think feasible problem.”
The researchers say the next step will be to conduct tests in a larger setting and, eventually, in an open-water environment.
“Our vision for this technology is for it to operate from a helicopter or drone,” Fitzpatrick said. “We expect the system to be able to fly tens of meters above the water.”
Reference: “An Airborne Sonar System for Underwater Remote Sensing and Imaging” by Aidan Fitzpatrick, Ajay Singhvi and Amin Arbabian, 16 October 2020, IEEE Access.
DOI: 10.1109/ACCESS.2020.3031808