Acoustic camera

From Wikipedia, the free encyclopedia
An '''acoustic camera''' (or '''noise camera''') is an [[imaging]] device used to locate sound sources and to characterize them. It consists of a group of microphones, also called a ''[[microphone array]]'', from which signals are simultaneously collected and processed to form a representation of the location of the sound sources.


== Terminology ==
The term '''acoustic camera''' first appeared at the end of the 19th century: the physiologist J.R. Ewald<ref name=first_acoustic_camera>{{cite journal |title=none|last=Ewald |first=J.R.|date=1898 |journal=Wiener klinische Wochenschrift |volume=11 |page=721}}</ref> was investigating the function of the inner ear and introduced an analogy with the [[Ernst Chladni|Chladni plates]] (a domain nowadays called [[Cymatics]]), a device that makes the modes of vibration of a plate visible. He called this device an acoustic camera. The term was then widely used during the 20th century<ref name=laser_scanned_ac>{{cite journal |last1=Whitman |first1=R. L.|last2=Ahmed|first2=M. |last3=Korpel|first3=A. |date=1972 |title=A progress report on the laser scanned acoustic camera. |journal=Acoustical Holography |volume=20 |pages=11–32|publisher= Springer US|doi=10.1007/978-1-4615-8213-7_2|isbn=978-1-4615-8215-1}}</ref><ref name="pat_us_ac">{{ cite patent |country=US |number=3895340|title=Acoustic camera apparatus |status=patent}}</ref><ref name=Underwater>{{cite journal |last1=Hansen |first1=Rolf Kahrs|last2=Andersen|first2=Poul Arndt |date=1993 |title=3D acoustic camera for underwater imaging |journal=Acoustical Imaging |volume=20 |pages=723–727|publisher= Springer US|doi=10.1007/978-1-4615-2958-3_98|isbn=978-1-4613-6286-9}}</ref> to designate various types of acoustic devices, such as underwater localization systems<ref name=underwater_ac_first>{{cite journal |last1=Haslett |first1= R. W. G.|last2=Pearce|first2=G.|last3=Welsh|first3=A. W.|last4=Hussey|first4=K. |date=1966 |title=The underwater acoustic camera |journal=Acta Acustica United with Acustica |volume=17,4 |pages=187–203|publisher= S. Hirzel Verlag}}</ref> or active systems used in medicine.<ref name=active_ac>{{cite journal |last1=Maginness |first1=M. G.|last2=Plummer|first2=J. D.|last3=Meindl|first3=J. D.|date=1974 |title=An acoustic image sensor using a transmit-receive array |journal=Acoustical Holography|pages=619–631|publisher=Springer US|doi=10.1007/978-1-4757-0827-1_36|isbn=978-1-4757-0829-5}}</ref> Nowadays it designates any transducer array used to [[acoustic location|localize sound sources]] (the medium is usually air), especially when coupled with an [[camera|optical camera]].


== Technology ==
=== General principles ===
An acoustic camera generally consists of a microphone array and optionally an optical camera. The microphone signals – analog or digital – are acquired simultaneously or with known relative time delays so that the phase differences between them can be used. Because sound propagates in the medium (air, water, ...) at a finite, known speed, a sound source is perceived by the microphones at different time instants and at different sound intensities, which depend on both the sound source location and the microphone locations.
One popular method to obtain an acoustic image from the microphone measurements is [[beamforming]]: by delaying each microphone signal by an appropriate amount and adding the delayed signals, the signal coming from a specific direction <math> \left(\theta_0, \phi_0\right) </math> is amplified while signals coming from other directions are canceled. The power of the resulting signal is then calculated and reported on a power map at the pixel corresponding to the direction <math> \left(\theta_0, \phi_0\right) </math>. The process is repeated for each direction in which the power needs to be computed.
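The delay-and-sum idea can be sketched numerically. The following is a minimal narrowband simulation; all parameters (array geometry, source angle, frequency) are illustrative assumptions, not values from the article or from any cited system:

```python
import numpy as np

# Minimal narrowband delay-and-sum sketch with illustrative parameters.
c, f = 343.0, 2000.0              # speed of sound in air (m/s), frequency (Hz)
M, d = 16, 0.04                   # 16 microphones in a line, 4 cm apart
mic_x = np.arange(M) * d

# A plane wave from 25 degrees reaches each microphone with a
# direction-dependent delay, i.e. a phase shift at frequency f.
true_deg = 25.0
tau = mic_x * np.sin(np.deg2rad(true_deg)) / c
x = np.exp(-2j * np.pi * f * tau)            # one snapshot of the array

scan = np.linspace(-90.0, 90.0, 361)         # candidate directions, 0.5° steps
power = np.empty_like(scan)
for i, deg in enumerate(scan):
    tau_s = mic_x * np.sin(np.deg2rad(deg)) / c
    w = np.exp(-2j * np.pi * f * tau_s) / M  # steering vector for this angle
    # Undo the hypothesized delays and sum: signals from the steered
    # direction add coherently, others partially cancel.
    power[i] = np.abs(np.vdot(w, x)) ** 2

print(scan[np.argmax(power)])                # peak of the power map: 25.0
```

Repeating this power computation over a two-dimensional grid of directions, and overlaying the resulting map on an optical image, yields the acoustic image described above.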


This method has many advantages – it is robust, easy to understand, highly [[Parallel computing|parallelizable]] (because each direction can be computed independently), versatile (there exist many types of beamformers), and relatively fast. However, it has some drawbacks: it does not correctly model correlated sound sources, and the produced acoustic map has artifacts (also called side lobes or ghost sources). Various methods have been introduced to reduce the artifacts, such as DAMAS,<ref name=DAMAS>{{cite journal |last1=Brooks |first1=Thomas F.|last2=Humphreys|first2=William M.|date=2004 |title=Deconvolution Approach for the Mapping of Acoustic Sources|journal=NASA Invention Disclosure |volume=LAR-16907-1 |publisher=NASA Langley Research}}</ref> or to take into account correlated sources, such as CLEAN-SC,<ref name=CLEAN-SC>{{cite journal |last=Sijtsma |first=P.|date=2007 |title=CLEAN based on spatial source coherence|journal=International Journal of Aeroacoustics |volume=6 |issue=4|pages=357–374 |doi=10.1260/147547207783359459|s2cid=122396368}}</ref> both at the price of a higher computational cost.


When the sound sources are near the acoustic camera, the wavefronts are perceived by the array as spherical rather than planar, and the relative intensities measured by the different microphones also differ; both effects provide information that is absent when the sources are far from the camera. This enables the use of more effective methods such as [[acoustic holography]].
==== Reprojection ====
Results of far-field beamforming can be reprojected onto planar or non-planar surfaces.


===== Two-dimensional =====
Some acoustic cameras use two-dimensional acoustic mapping, which uses a unidirectional microphone array (e.g. a rectangle of microphones, all facing the same direction). Two-dimensional acoustic mapping works best when the surface to be examined is planar and the acoustic camera can be set up facing the surface perpendicularly. However, the surfaces of real-world objects are not often flat, and it is not always possible to optimally position the acoustic camera.<ref name = Car>[http://www.bebec.eu/Downloads/BeBeC2006/Papers/BeBeC-2006-17_Meyer_Doebler.pdf Meyer, Andy, and Döbler, Dirk. "Noise source localization within a car interior using 3D-microphone arrays." Proceedings of the BeBeC (2006).]</ref>


Additionally, the two-dimensional method of acoustic mapping introduces error into the calculations of the sound intensity at a point. Two-dimensional mapping approximates three-dimensional surfaces into a plane, allowing the distance between each microphone and the focus point to be calculated relatively easily. However, this approximation ignores the distance differences caused by surfaces having different depths at different points. In most applications of the acoustic camera, this error is small enough to be ignored; however, in confined spaces, the error becomes significant.<ref name="Car"/>
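The size of this error can be illustrated with a small computation. The geometry and numbers below are hypothetical, chosen only to show the effect: a microphone focuses on a point assumed to lie on a plane 3 m away, while the real surface is 0.2 m deeper at that point.

```python
import math

# Hypothetical geometry illustrating the planar-approximation error.
mic = (0.0, 0.0, 0.0)
assumed = (0.5, 0.3, 3.0)   # focus point on the assumed plane
actual = (0.5, 0.3, 3.2)    # the same point on the real, deeper surface

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

error = dist(mic, actual) - dist(mic, assumed)   # path-length error (m)
print(f"path-length error: {error * 100:.1f} cm")

# Whether the error matters depends on the wavelength: at 4 kHz
# (wavelength ~8.6 cm) it exceeds a full wavelength and ruins the phase
# alignment, while at 300 Hz (wavelength ~1.1 m) it is comparatively minor.
print(error / (343.0 / 4000.0))
```

This is why the error is negligible for distant, roughly flat scenes but becomes significant in confined spaces, where depth variations are large relative to the focus distance.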
===== Three-dimensional =====
Three-dimensional acoustic cameras fix the errors of two-dimensional cameras by taking into account surface depths, and therefore correctly measuring the distances between the microphone and each spatial point. These cameras produce a more accurate picture, but require a 3-D model of the object or space being analyzed. Additionally, if the acoustic camera picks up sound from a point in space that is not part of the model, the sound may be mapped to a random space in the model, or the sound may not show up at all. 3-D acoustic cameras can also be used to analyze confined spaces, such as room interiors; however, in order to do this, a microphone array that is omnidirectional (e.g. a sphere of microphones, each facing a different direction) is required, in addition to the 3-D model.<ref name="Car"/>


== Applications ==
There are many applications of the acoustic camera, most focusing on noise reduction. The camera is frequently used to reduce the noise emission of vehicles (such as cars, airplanes<ref name=aircraft_imaging>{{cite journal |first1=Brusniak|last1=Leon |first2=James R.|last2=Underbrink|first3=Robert W.|last3=Stoker |date=2006 |title=Acoustic imaging of aircraft noise sources using large aperture phased arrays.|journal=AIAA/CEAS Aeroacoustics Conference |volume=12}}</ref> and trains), of structures such as wind turbines,<ref name= wind_turbine>{{cite journal |first1=Lee|last1=Gwang-Se |first2=Cheolung|last2=Cheong|first3=Su-Hyun|last3=Shin |first4=Sung-Soo|last4=Jung|date=2012 |title=A case study of localization and identification of noise sources from a pitch and a stall regulated wind turbine|journal=Applied Acoustics |volume=73 8 |pages=817–827}}</ref> and of heavy machinery operations such as mining<ref name="Solving issues in the mining industry">{{cite web |last1=Oberholster |first1=Abrie J. |title=Solving issues in the mining industry |url=https://www.plm.automation.siemens.com/global/en/our-story/customers/university-pretoria/88501/ |website=Siemens Digital Industries Software |publisher=University Of Pretoria |access-date=November 12, 2021}}</ref> or drilling.


Acoustic cameras are not only used to measure the exterior emissions of products but also to improve the comfort inside the cabins of cars,<ref name="Car"/> trains, or airplanes. Spherical acoustic cameras are preferred for this type of application because the three-dimensional placement of the microphones makes it possible to localize sound sources in all directions.


Troubleshooting of faults that occur in machines and mechanical parts can be accomplished with an acoustic camera: to find where the problem lies, the sound map of a properly functioning machine can be compared with that of a dysfunctional one.


A similar setup of the acoustic camera can be used to study the noise inside passenger cars during train operation. Alternatively, the camera can be set up outside, in an area near the train tracks, to observe the train as it goes by. This can give another perspective on the noise that might be heard inside the train. Additionally, an outside setup can be used to examine the squealing of train wheels caused by a curve in the tracks.

Acoustic cameras may be used to aid legal enforcement against noise nuisances caused by people or motor vehicles. The epidemiologist Erica Walker has called this a "lazy" solution to the problem of noise, and expressed concern that acoustic cameras could be used to over-police ethnic minority neighbourhoods.<ref>{{cite news |newspaper=The Guardian |title=Honk honk! Can noise cameras reduce ‘potentially fatal’ sound pollution? |date=4 October 2023 |vauthors=Demopoulos A |url=https://www.theguardian.com/us-news/2023/oct/04/new-york-noise-cameras}}</ref>


== Challenges ==
=== Dynamic range ===


The effective dynamic range in the imaging plane can be interpreted as the maximum contrast achievable within the target area. It depends on the sound's wavelength and on the size of the array, and these physical constraints make it difficult for far-field acoustic cameras to resolve multiple low-frequency sources: the aperture would need to be very large, so measurements in this frequency range often give inconclusive or less definitive results.


=== Low frequencies in the far-field ===


The lowest frequency that can be localized with a far-field acoustic camera is primarily determined by the size of the array's aperture (its largest dimension). Challenges arise when dealing with low-frequency issues, particularly those below 300&nbsp;Hz, as they require large array sizes for effective sound source localization.
Alternatively, there are a number of effective solutions: acoustic vector sensors, either standalone or in an array configuration, and near-field acoustic cameras can both serve as valuable tools for addressing non-stationary issues. Methods that employ direct sound mapping using [[sound intensity probe]]s and/or [[particle velocity probe]]s offer robust alternatives for identifying and visualizing time-stationary sound sources.<ref>[https://www.microflown.com/products/sound-localization-systems Sound Source Localization Solutions]</ref>
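The aperture requirement can be made concrete with a standard diffraction rule of thumb (this estimate, and the 10° target below, are illustrative assumptions, not figures from the article): an aperture of size D resolves roughly λ/D radians at wavelength λ, so a target angular resolution implies a minimum array size.

```python
import math

# Rule-of-thumb sizing: angular resolution ~ wavelength / aperture.
c = 343.0  # speed of sound in air, m/s

def min_aperture_m(freq_hz, resolution_deg=10.0):
    lam = c / freq_hz
    return lam / math.radians(resolution_deg)

for f in (300.0, 1000.0, 5000.0):
    print(f"{f:6.0f} Hz -> aperture of at least {min_aperture_m(f):.2f} m")
```

With these assumed numbers, resolving a 300 Hz source to about 10° requires an array more than 6 m across, which illustrates why far-field cameras struggle below roughly 300 Hz.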


=== Computational power ===
The signal processing required by the acoustic camera is very intensive and needs powerful hardware and plenty of memory storage. Because of this, signal processing is frequently done after the recording of data, which can hinder or prevent the use of the camera in analyzing sounds that only occur occasionally or at varying locations. Cameras that do perform signal processing in real time tend to be large and expensive. Hardware and signal processing improvements can help to overcome these difficulties. Signal processing optimizations often focus on reduction of computational complexity, storage requirements, and memory bandwidth (rate of data consumption).<ref name = FPGA>[https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5537301&isnumber=5536941 Zimmermann, B.; Studer, C., "FPGA-based real-time acoustic camera prototype," Proceedings of the 2010 IEEE International Symposium on Circuits and Systems (ISCAS), May 30 – June 2, 2010, pp.1419]</ref>
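The scale of the workload can be sketched with a back-of-the-envelope operation count (the array size, image resolution, and frame length below are hypothetical): time-domain delay-and-sum costs roughly one multiply-add per microphone, per pixel, per sample.

```python
# Hypothetical sizing of a delay-and-sum beamforming workload.
mics = 64
pixels = 48 * 64            # acoustic image resolution
fs = 48_000                 # sample rate, Hz
samples = int(fs * 0.1)     # one 100 ms frame
ops_per_frame = mics * pixels * samples
print(f"{ops_per_frame / 1e9:.2f} GMAC per 100 ms frame")
```

Sustaining this in real time means on the order of 10 GMAC/s before any deconvolution such as DAMAS or CLEAN-SC is applied, which suggests why processing is often done offline or on dedicated hardware.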


== References ==
<references />

== Further reading ==
* Oberholster, Abrie J. "[https://www.plm.automation.siemens.com/global/en/our-story/customers/university-pretoria/88501/ Localizing the source of the sound efficiently]", University of Pretoria, 2021.


== External links ==


=== Manufacturer links ===
*https://www.sonavu.com
*https://precisereliability.com/product/sdt-sonavu/
*https://www.cae-systems.de/en/
*https://www.sevenbel.com/en
*https://www.sorama.eu/
*https://www.acoustic-camera.com/
*https://www.distran.ch/
*https://www.fluke.com/en-us/product/industrial-imaging/sonic-industrial-imager-ii900
*http://www.signalinterface.com/index.html
*http://smins.co.kr/en/
*https://www.microflown.com/products/sound-localization-systems/near-field-acoustic-camera
*https://www.bksv.com/
*https://acsoft.co.uk/product/acoustic-camera/
*https://www.flir.com/products/si124/
*https://soundcam.com/

[[Category:Acoustics]]
[[Category:Imaging]]

Latest revision as of 09:37, 27 October 2024
