Sound-field imaging, the visualization of spatial and temporal distribution of acoustical properties such as sound pressure, is useful for understanding acoustical phenomena. This study investigated the use of parallel phase-shifting interferometry (PPSI) with a high-speed polarization camera for imaging a sound field, particularly high-speed imaging of propagating sound waves. The experimental results showed that the instantaneous sound field, which was generated by ultrasonic transducers driven by a pure tone of 40 kHz, was quantitatively imaged. Hence, PPSI can be used in acoustical applications requiring spatial information of sound pressure.
© 2016 Optical Society of America
Imaging of invisible physical phenomena by means of optical techniques aids the understanding of these phenomena. For acoustics, an understanding of sound emission and propagation is required for various applications including the design of transducers, localization of noise sources, and evaluation of acoustic materials.
Optical measurement of sound, which acquires acoustical quantities using optical methods, is of growing interest because of its contactless nature. It can also be used to image a sound field (spatial and temporal distribution of acoustical quantities such as sound pressure and acoustic particle velocity). Its principle is to measure phase and/or amplitude modulation of light caused by sound.
Laser Doppler vibrometry (LDV) is often used for the optical measurement of a sound field. Sound pressure measurements using LDV were reported by Nakamura et al. [1] and Harland et al. [2, 3]. Soon after, two-dimensional (2D) sound field imaging using scanning LDV was achieved [4–10]. Three-dimensional sound field reconstruction based on tomography has been well researched [11–14], and optical wave microphones [15, 16] and optical feedback interferometry [17] have been developed as scanning-based imaging methods. The application of scanning-based imaging is limited to reproducible fields, such as sounds generated by a loudspeaker, because the 2D acoustical quantities are acquired over a number of measurements.
Using a camera instead of a scanning device enables 2D imaging with a single or a few acquisitions. The Schlieren method has been widely used for visualizing refractive index distributions. Imaging of ultrasonic waves in both water and air has been achieved [18, 19], and imaging of audible sound in air has been reported recently [20, 21]. However, it is difficult to acquire quantitative information with the Schlieren method. Television (TV) holography was proposed by Løkberg for quantitative sound field imaging using a camera [22, 23]. Since TV holography is based on phase-shifting interferometry, multiple acquisitions are required; thus, single-shot measurement was not achieved.
For quantitative and single-shot measurement of a sound field, Mizutani et al. investigated interferometric detection of ultrasound in water using a CCD camera [24]. Although they visualized the ultrasound in water in one frame, time-sequential images were not obtained. Matoba et al. reported a digital holographic method for sound detection in air [25]. They showed that a temporal sound waveform could be extracted from holographic images, whereas the spatial sound field could not be visualized. Until now, optical high-speed imaging of a sound field in air by means of a quantitative and single-shot phase measurement method has not been reported.
In this paper, this type of sound field imaging using parallel phase-shifting interferometry (PPSI) [26–29] with a high-speed polarization camera [30] is investigated. PPSI and the high-speed polarization camera enable the detection of high-speed fluctuations of the phase difference between the reference light and the light that travels through the test region. Once the phase-shifted images are captured in a single shot using PPSI, applying a phase-shifting algorithm yields the instantaneous phase distribution. Sound field information can then be calculated by considering the physical relation between the phase of light and the sound field. To the best of our knowledge, this report is the first to achieve quantitative and single-shot measurement of a sound field in air. Experiments on high-speed imaging of sound fields were conducted. The sound fields were generated by ultrasonic transducers driven by a 40 kHz sinusoidal signal. The frame rate of the camera was 100,000 frames per second (fps). We confirmed the validity of the method by comparing the experimental results with simulations. The results indicate that the sound fields were quantitatively visualized by the proposed method.
2.1 Acousto-optic effect caused by weak sound in air
Diffraction, deflection, or phase modulation of light caused by sound is well known as the acousto-optic effect [34, 35]. Several models that describe the effect exist, depending on the frequency and pressure range of the sound and on the kind of medium. For sound in air, a phase modulation model has been introduced, and the validity of the model has been quantitatively examined. The relation between the sound pressure p and the refractive index of air n is

n = 1 + (n0 − 1)[(p0 + p)/p0]^(1/γ),  (1)

where n0 is the refractive index of air in the static state, p0 is the static pressure, and γ is the specific heat ratio. For weak sound, Eq. (1) can be linearized as

n ≈ n0 + [(n0 − 1)/(γp0)] p.  (2)

Since the static pressure defined by the standard atmosphere is on the order of 10^5 Pa [36], this approximation is valid for sound pressures of several hundred pascals. (When the sound pressure is 1000 Pa, the relative error between Eq. (1) and Eq. (2) is 0.014.) Sound pressure that satisfies the linear approximation in this manner is defined as weak sound in this paper.
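The size of the linearization error can be checked numerically. The following is a minimal sketch, assuming standard-air values for n0, p0, and γ; the exact constants are illustrative, not values taken from the paper.

```python
import numpy as np

# Assumed standard-air constants (illustrative values).
n0, p0, gamma = 1.000279, 101325.0, 1.4

def n_exact(p):
    # Eq. (1): adiabatic pressure-density relation applied to n - 1.
    return 1 + (n0 - 1) * ((p0 + p) / p0) ** (1 / gamma)

def n_linear(p):
    # Eq. (2): first-order expansion of Eq. (1) around p = 0.
    return n0 + (n0 - 1) * p / (gamma * p0)

# Relative error of the sound-induced modulation at 1000 Pa stays small,
# so the linear model is adequate for weak sound.
p = 1000.0
err = abs(n_exact(p) - n_linear(p)) / (n_exact(p) - n0)
```
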
According to previous research, light propagated through the weak sound field can be expressed as [14, 33, 37]

E(t) = Eh exp[iφs(t)],  (3)

φs(t) = k ∫_L [n(r, t) − n0] dl,  (4)

φs(t) = [k(n0 − 1)/(γp0)] ∫_L p(r, t) dl,  (5)

where k is the wavenumber of light and L is the optical path. The term Eh represents the light propagated in a static field, where no temporal and/or spatial density variation of the medium (e.g., sound) exists. The equations indicate that the phase modulation of light caused by the weak sound is proportional to the line integral of the sound pressure along the optical path. Therefore, the weak sound field in air can be obtained by observing the phase of light.
2.2 Parallel phase-shifting interferometry (PPSI)
Phase-shifting interferometry is widely used for the measurement of displacement, vibration, and other physical phenomena. It reconstructs the phase under test from multiple phase-shifted images. In order to reduce the influence of environmental vibration and enable the measurement of non-static phenomena, single-shot acquisition of multiple phase-shifted images has been widely researched [26–29, 38–42]. The single-shot acquisition of multiple phase-shifted images by a single imaging device is called PPSI. The idea of using an array of phase-shifting elements for this purpose was first reported by Horwitz et al. [41, 42]. Awatsuji et al. proposed the use of a pixelated phase-shifting array device that directly causes four types of phase retardation [26]. Millerd et al. developed a polarization camera, whose phase-shifting elements consist of linear polarizers, and applied it to PPSI [27–29].
The principle of four-step PPSI is schematically illustrated in Fig. 1. The phase-shifting array device, which consists of four types of phase-shifting elements whose phase retardation angles θ are 0, π/2, π, and 3π/2, respectively, is mounted onto the image sensor. The recorded image includes the four phase-shifted images I(0), I(π/2), I(π), and I(3π/2), where I(θ) is the intensity of the interference fringe with phase retardation angle θ. The phase difference between the reference light and the light that travels through the test region can be reconstructed by

φ = arctan{[I(3π/2) − I(π/2)] / [I(0) − I(π)]}.  (6)

Since the phase obtained by Eq. (6) is wrapped into [−π, π), phase unwrapping should be performed to obtain a smooth phase distribution [43].
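The four-step reconstruction of Eq. (6) can be sketched in a few lines. The 2×2 superpixel layout assumed below is purely illustrative; the actual arrangement of the phase-shifting elements depends on the camera.

```python
import numpy as np

def wrapped_phase(mosaic):
    """Recover the wrapped phase from one PPSI frame via Eq. (6).

    Assumed (illustrative) 2x2 superpixel layout:
        I(0)      I(pi/2)
        I(3pi/2)  I(pi)
    """
    i0 = mosaic[0::2, 0::2].astype(float)  # I(0)
    i1 = mosaic[0::2, 1::2].astype(float)  # I(pi/2)
    i2 = mosaic[1::2, 1::2].astype(float)  # I(pi)
    i3 = mosaic[1::2, 0::2].astype(float)  # I(3pi/2)
    # Eq. (6); arctan2 keeps the correct quadrant in [-pi, pi).
    return np.arctan2(i3 - i1, i0 - i2)
```

The quadrant-aware arctan2 removes the sign ambiguity of a plain arctan, which is why the two intensity differences are passed separately.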
2.3 Extraction of sound field from optical phase distribution
Equation (2) indicates that the observed data contain a bias caused by n0. In order to acquire information related only to the sound pressure, the time-invariant component must be removed. Therefore, the time-averaged image is subtracted from the phase images after unwrapping. Since the sound field changes frame by frame, subtracting the time-averaged image removes time-invariant components such as a slope caused by the optical setup.
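The subtraction step can be sketched in a few lines; here `frames` stands for a hypothetical stack of unwrapped phase maps.

```python
import numpy as np

def remove_static(frames):
    """Subtract the temporal mean from a stack of unwrapped phase maps.

    frames has shape (n_frames, ny, nx). The mean over the frame axis
    holds everything that does not change frame by frame (the bias from
    n0 and any slope from the optical setup); what remains is the
    sound-induced fluctuation.
    """
    return frames - frames.mean(axis=0, keepdims=True)
```

This works because the sound component averages toward zero over many frames, while the static component is identical in every frame.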
3. Measurement system
A schematic of the measurement system is illustrated in Fig. 2. The system is based on the polarization interferometer [44, 45]. Linearly polarized light at an appropriate azimuth enters the polarized beam splitter, which divides the incident light into two orthogonally polarized beams. The orthogonally polarized beams are changed to circular polarization states by the quarter-wave plates Q1 and Q2, respectively. After reflection, the light passes back through Q1 and Q2 and becomes linearly polarized with principal axes perpendicular to those of the incident waves. The returned light that travels through the test region is reflected by the beam splitter, and the returned reference light is transmitted through the beam splitter. The quarter-wave plate Q3 makes the polarization states of the object and reference lights contra-rotating circular polarizations. The superposition of the contra-rotating circularly polarized lights becomes linearly polarized light whose angle θp is given by θp = ϕ/2, where ϕ is the phase difference of the two light waves. By controlling the azimuth of a linear polarizer in front of an image sensor, one can obtain an arbitrary phase-shifted image. By using a polarization camera whose phase-shifting elements consist of four types of linear polarizers with azimuths of 0, π/4, π/2, and 3π/4, respectively [30, 47], the four phase-shifted images are captured in a single image. Thus, the polarization camera enables the simultaneous acquisition of the four phase-shifted images in this system.
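The fringe formation described above can be verified with a small Jones-calculus sketch: a linear polarizer at azimuth θ placed after the contra-rotating circular beams produces a fringe whose phase is shifted by 2θ. Unit beam amplitudes are an assumption made for simplicity.

```python
import numpy as np

def fringe_intensity(phi, theta):
    """Intensity behind a linear polarizer at azimuth theta.

    obj and ref are contra-rotating circular Jones vectors with phase
    difference phi; amplitudes are set to 1 for simplicity.
    """
    obj = np.array([1.0, 1.0j]) / np.sqrt(2.0)                         # circular (+)
    ref = np.exp(1.0j * phi) * np.array([1.0, -1.0j]) / np.sqrt(2.0)   # circular (-)
    axis = np.array([np.cos(theta), np.sin(theta)])                    # polarizer axis
    amp = axis @ (obj + ref)                                           # projected field
    return np.abs(amp) ** 2
```

Working through the projection gives I = 1 + cos(ϕ − 2θ), so polarizer azimuths 0, π/4, π/2, and 3π/4 yield exactly the four quadrature fringes I(0), I(π/2), I(π), and I(3π/2) needed for Eq. (6).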
In our instrument, a laser with a wavelength of 532 nm and a power of 70 mW is used, together with a high-speed polarization camera (CRYSTA PI-1P, Photron Ltd.). The maximum number of active pixels of the image sensor is 1024 × 1024, which is available when the frame rate of the camera is less than 7000 fps. The number of active pixels decreases as the frame rate increases; the maximum frame rate of the camera is 1.55 Mfps. The light in the measurement volume is expanded to 10 cm in diameter and reduced by the lens system in front of the high-speed polarization camera to fit the image sensor size. The size of each phase-pixel in the measurement volume is approximately 0.2 mm × 0.2 mm (512 phase-pixels over 10 cm).
4. Experimental setup
The experimental setup is shown in Fig. 3. The mirror was positioned 0.3 m away from the front edge of the interferometer. The interferometer and mirror were mounted on a vibration isolation table. The ultrasonic transducer (MURATA MA40S4S) was positioned at the center of the optical path and was driven by a 40 kHz sinusoidal signal with an amplitude of 7.0 Vrms. The transducer was included inside the optical beam in order to emphasize its position within the image. The frame rate of the camera was 100,000 fps, and the active number of pixels was 96 × 160; the equivalent measurement cross-section was 18.8 mm × 31.3 mm. The wrapped phase was calculated from the intensity distribution of the obtained images using Eq. (6). For the 2D phase unwrapping described in subsection 2.2, Goldstein’s branch-cut algorithm was used [48]. Since the shadows of the transducer that appeared in the obtained images may cause unwrapping errors, unwrapping was performed only on the region that does not contain the shadows. In the experiments, the parts where x ≤ 2.4 mm in the result figures were ignored in the unwrapping process; that is, unwrapping was performed only on the parts where x is greater than 2.4 mm in the figures. In order to emphasize the position of the transducer, we did not eliminate the transducer parts from the results. The time-averaged image explained in subsection 2.3 was calculated by averaging 60 sequential images for each phase-pixel.
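Goldstein’s branch-cut algorithm is designed for noisy, partially masked data; for a smooth, noise-free field, the effect of unwrapping can be illustrated much more simply with numpy’s one-dimensional unwrap applied along each axis in turn. The phase ramp below is a hypothetical stand-in for the measured optical phase.

```python
import numpy as np

# Hypothetical smooth phase ramp exceeding 2*pi, standing in for the
# measured optical phase.
x = np.linspace(0.0, 4.0 * np.pi, 64)
true_phase = np.tile(x, (32, 1))               # 32 x 64 smooth phase map
wrapped = np.angle(np.exp(1j * true_phase))    # wrapped into [-pi, pi)

# Unwrap along the columns first, then along each row.
unwrapped = np.unwrap(np.unwrap(wrapped, axis=0), axis=1)
```

Real measurements contain noise and shadowed regions, which is why the branch-cut method and the x > 2.4 mm mask are used in the experiment rather than this axis-by-axis approach.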
The aim of this experiment was to confirm that the proposed method is capable of measuring an instantaneous 2D sound field. For experimental verification of an optical sound measurement method, Bertling et al. conducted a comparison of measured and simulated sound fields generated by a 40 kHz ultrasonic transducer [17]. They compared the instantaneous field, the amplitude of the sound, and the phase of the sound between the measured and simulated images. Since their procedure seems to be well established as a sound field imaging performance test, we chose similar experimental conditions. The most important difference from the previous method is that our method does not require a scanning process, because the 2D field is measured in a single shot.
In order to confirm the validity of our experimental results, the sound field generated by the transducer was simulated. The transducer used in the experiment has a disk-shaped diaphragm; it can be modeled as a vibrating circular plate whose vibration direction is normal to its surface. It is assumed that there is no sound source other than the transducer. By considering a plane including the circular plate in three-dimensional Euclidean space, the sound pressure at position r can be represented using the Rayleigh integral of the first kind [49]:

p(r, t) = (ρ0/2π) ∫_S [∂v/∂t](t − |r − rs|/c) / |r − rs| dS,  (8)

where ρ0 is the density of air, c is the speed of sound, v is the normal velocity of the plate, and rs is a position on the plate surface S. For numerical calculation, Eq. (8) can be discretized as

p(r, t) = C Σ_{m=1}^{M} sin[ω(t − |r − rm|/c)] / |r − rm|,  (9)

where rm is the position of the m-th point source, M is the number of point sources, ω is the angular frequency of the driving signal, and C is a constant.
Since the sound pressure integrated along the optical path is obtained in the interferometric measurement, the resultant phase is calculated by

φs(x, y, t) = [k(n0 − 1)/(γp0)] Σ_j p(x, y, zj, t) Δz,  (10)

where zj is the j-th grid point along the optical path and Δz is the grid interval.
To confirm the capability of the proposed method for quantitative sound field measurement, the simulated field is compared with the measured field. In order to simulate the sound field quantitatively, the constant C has to be determined appropriately. In this study, the constant C was determined experimentally so that the amplitudes of the reference microphone and of the simulation at the corresponding position coincide; that is, the simulated field is calibrated by the actual amplitude obtained from the reference microphone. The reference microphone was fixed 1 cm from the center of the vibrating plane. The peak value of the sound pressure measured by the reference microphone was 424 Pa. To make the simulated sound pressure 1 cm from the center of the point sources equal to 424 Pa, the constant C was set to 3.94.
In summary, the numerical simulation was conducted in the following steps: (i) the constant C was determined from the reference microphone measurement, (ii) the 3D distribution of sound pressure was calculated using Eq. (9), and (iii) the instantaneous phase distribution of light was obtained from the 3D sound field using Eq. (10).
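The three steps above can be sketched numerically as follows. The grid sizes, the disk radius (taken here as 4 mm), the probe position, and the air constants are illustrative assumptions, not values taken from the experiment.

```python
import numpy as np

c = 340.0                                  # speed of sound [m/s]
f = 40e3                                   # driving frequency [Hz]
omega = 2 * np.pi * f
n0, gamma, p0 = 1.000279, 1.4, 101325.0    # assumed air constants
k_light = 2 * np.pi / 532e-9               # wavenumber of the 532 nm laser

# Point sources on a disk (assumed radius 4 mm) lying in the y-z plane.
g = np.linspace(-4e-3, 4e-3, 9)
Y, Z = np.meshgrid(g, g)
inside = Y**2 + Z**2 <= (4e-3) ** 2
src = np.stack([np.zeros(inside.sum()), Y[inside], Z[inside]], axis=1)

def pressure(r, t, C):
    """Eq. (9): superposition of point sources, C * sum sin(w(t - d/c)) / d."""
    d = np.linalg.norm(r - src, axis=1)
    return C * np.sum(np.sin(omega * (t - d / c)) / d)

# Step (i): calibrate C so the simulated amplitude 1 cm from the disk
# center matches the 424 Pa measured by the reference microphone.
probe = np.array([1e-2, 0.0, 0.0])
t_grid = np.linspace(0.0, 1.0 / f, 200, endpoint=False)
amp = max(abs(pressure(probe, t, 1.0)) for t in t_grid)
C = 424.0 / amp

# Steps (ii)-(iii): sample the 3D pressure along the optical path (z) at
# one field point and convert the line integral to optical phase, Eq. (10).
z = np.linspace(-5e-2, 5e-2, 201)
dz = z[1] - z[0]
p_sum = sum(pressure(np.array([2e-2, 0.0, zj]), 0.0, C) for zj in z)
phi = k_light * (n0 - 1) / (gamma * p0) * p_sum * dz
```

In a full simulation, the last step would be repeated for every (x, y) phase-pixel and every time step to build the simulated images of Fig. 4.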
Figure 4 shows the experimental (left side) and simulated (right side) results when one transducer generates a 40 kHz sound field. Figure 4 illustrates (a) the instantaneous phase distribution caused by the sound field, (b) the amplitude of the sound field, and (c) the phase of the sound field. As can be seen in Fig. 4(a), the instantaneous field was imaged by the measurement. The results are shown in units of length, obtained by dividing the phase of light by the wavenumber of light; they can be converted to the acoustical quantity by using Eq. (5). It was confirmed that the values of both results were of the same order. These results suggest that the proposed system can measure a quantitative sound field. The phases of the sound fields agreed well, as shown in Fig. 4(c), while their amplitudes differed, as shown in Fig. 4(b). The slight differences in the pattern of the sound field can be attributed to the modeling error of the transducer in the simulation, because the actual transducer is a vibrating cone with a cylindrical enclosure; a slight systematic bias of the measurement system may also contribute. The random noise that clearly appeared in the results was due to the shot noise of the imaging device.
Figure 5 shows three time-sequential images of the instantaneous phase distribution. The time step between images was 10 μs. The blue lines in the figures indicate the position of the corresponding wavefront. Since the sound propagation distance within 10 μs is 3.4 mm when the speed of sound is 340 m/s, the experimental result agrees with the theoretical value. When the length of the short side of the image is 18.8 mm, as in Fig. 5, and the speed of sound is 340 m/s, a frame rate of more than about 18,000 fps is required for the same wavefront to appear in two sequential images. Our system meets this requirement easily because its maximum frame rate is 1,550,000 fps.
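The frame-rate requirement quoted above follows from a one-line calculation: the wavefront leaves the frame after side / c seconds, so capturing it in two sequential images needs at least c / side frames per second.

```python
c = 340.0        # speed of sound [m/s]
side = 18.8e-3   # short side of the imaged region [m]

# A wavefront crosses the short side in side / c seconds, so the minimum
# frame rate for the same wavefront to appear in two sequential images is:
fps_min = c / side   # roughly 1.8e4 fps
```
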
Figure 6 shows the experimental and simulated results when two transducers were driven by the same 40 kHz signal. The interference pattern of two sound waves appears in Fig. 6(a). Although noise exists in the experimental result, the interference pattern is consistent with that of the simulated result. From these experimental results, it can be concluded that the proposed system is able to image an instantaneous 2D sound field.
High-speed imaging of a sound field was realized by combining PPSI and a high-speed polarization camera. Since this method has remarkable advantages, including its high-speed, instantaneous, quantitative, and non-contact nature, it can be utilized in various acoustical applications such as the design of transducers, localization of noise sources, and evaluation of acoustic materials. Experiments were conducted to confirm that the proposed method could image a 2D sound field. The results suggest that instantaneous and quantitative imaging of a sound field can be achieved with the proposed system. Future work should include investigation of the properties of the sound measurement system, such as its linearity, accuracy, repeatability, and noise characteristics.
This work was supported by Grants-in-Aid for JSPS Fellows (16J06772, 15J08043).
References and links
1. K. Nakamura, M. Hirayama, and S. Ueha, “Measurements of air-borne ultrasound by detecting the modulation in optical refractive index of air,” in Proceedings of IEEE Ultrasonics Symposium (IEEE, 2002), pp. 609–612. [CrossRef]
2. A. R. Harland, J. N. Petzing, and J. R. Tyrer, “Non-invasive measurements of underwater pressure fields using laser Doppler velocimetry,” J. Sound Vibrat. 252(1), 169–177 (2002). [CrossRef]
3. A. R. Harland, J. N. Petzing, J. R. Tyrer, C. J. Bickley, S. P. Robinson, and R. C. Preston, “Application and assessment of laser Doppler velocimetry for underwater acoustic measurements,” J. Sound Vibrat. 265(3), 627–645 (2003). [CrossRef]
4. L. Zipser, H. Franke, E. Olsson, N. E. Molin, and M. Sjödahl, “Reconstructing two-dimensional acoustic object fields by use of digital phase conjugation of scanning laser vibrometry recordings,” Appl. Opt. 42(29), 5831–5838 (2003). [CrossRef] [PubMed]
5. L. Zipser and H. Franke, “Laser-scanning vibrometry for ultrasonic transducer development,” Sens. Actuators A Phys. 110(1–3), 264–268 (2004). [CrossRef]
6. A. R. Harland, J. N. Petzing, and J. R. Tyrer, “Nonperturbing measurements of spatially distributed underwater acoustic fields using a scanning laser Doppler vibrometer,” J. Acoust. Soc. Am. 115(1), 187–195 (2004). [CrossRef] [PubMed]
7. S. Frank and J. Schell, “Sound field simulation and visualisation based on laser Doppler vibrometer measurements,” in Forum Acousticum (2005), pp. 91–97.
8. R. Malkin, T. Todd, and D. Robert, “A simple method for quantitative imaging of 2D acoustic fields using refracto-vibrometry,” J. Sound Vibrat. 333(19), 4473–4482 (2014). [CrossRef]
9. K. Yatabe and Y. Oikawa, “PDE-based interpolation method for optically visualized sound field,” in Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing (IEEE, 2014), pp. 4738–4742. [CrossRef]
10. K. Yatabe and Y. Oikawa, “Optically visualized sound field reconstruction based on sparse selection of point sound sources,” in Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing (IEEE, 2015), pp. 504–508. [CrossRef]
11. Y. Oikawa, Y. Ikeda, M. Goto, T. Takizawa, and Y. Yamasaki, “Sound field measurements based on reconstruction from laser projections,” in Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing (IEEE, 2005), pp. 661–664.
12. E. Olsson and F. Forsberg, “Three-dimensional selective imaging of sound sources,” Opt. Eng. 48(3), 035801 (2009). [CrossRef]
13. Y. Oikawa, T. Hasegawa, Y. Ouchi, Y. Yamasaki, and Y. Ikeda, “Visualization of sound field and sound source vibration using laser measurement method,” in 20th International Congress on Acoustics, M. Burgess, ed. (Australian Acoustical Society, 2010), pp. 1–5.
15. T. Sakoda and Y. Sonoda, “Visualization of sound field with uniform phase distribution using laser beam microphone coupled with computerized tomography method,” Acoust. Sci. Technol. 29(4), 295–299 (2008). [CrossRef]
16. Y. Sonoda and Y. Nakazono, “Development of optophone with no diaphragm and application to sound measurement in jet flow,” Adv. Acoust. Vib. 2012, 1–17 (2012). [CrossRef]
17. K. Bertling, J. Perchoux, T. Taimre, R. Malkin, D. Robert, A. D. Rakić, and T. Bosch, “Imaging of acoustic fields using optical feedback interferometry,” Opt. Express 22(24), 30346–30356 (2014). [CrossRef] [PubMed]
18. R. B. Barnes and C. J. Burton, “Visual methods for studying ultrasonic phenomena,” J. Appl. Phys. 20(3), 286–294 (1949). [CrossRef]
19. J. A. Bucaro, “Visualization of ultrasonic waves in air,” J. Acoust. Soc. Am. 62(6), 1506–1507 (1977). [CrossRef]
20. M. J. Hargather, G. S. Settles, and M. J. Madalis, “Schlieren imaging of loud sounds and weak shock waves in air near the limit of visibility,” Shock Waves 20(1), 9–17 (2010). [CrossRef]
21. N. Chitanont, K. Yaginuma, K. Yatabe, and Y. Oikawa, “Visualization of sound field by means of Schlieren method with spatio-temporal filtering,” in Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing (IEEE, 2015), pp. 509–513. [CrossRef]
23. O. J. Lokberg, “Recording of sound emission and propagation in air using TV holography,” J. Acoust. Soc. Am. 96(4), 2244–2250 (1994). [CrossRef]
24. K. Mizutani, T. Ezure, K. Nagai, and M. Yoshioka, “Optical measurement of sound fields information using Mach-Zehnder interferometer,” Jpn. J. Appl. Phys. 40(Part 1, No. 5B), 3617–3620 (2001). [CrossRef]
26. Y. Awatsuji, M. Sasada, and T. Kubota, “Parallel quasi-phase-shifting digital holography,” Appl. Phys. Lett. 85(6), 1069–1071 (2004). [CrossRef]
27. J. E. Millerd, N. J. Brock, J. B. Hayes, M. B. North-Morris, M. Novak, and J. C. Wyant, “Pixelated phase-mask dynamic interferometer,” Proc. SPIE 5531, 304–314 (2004). [CrossRef]
28. J. Millerd, N. Brock, J. Hayes, M. North-Morris, B. Kimbrough, and J. Wyant, “Pixelated phase-mask dynamic interferometers,” in Fringe 2005 (Springer-Verlag, 2005), pp. 640–647.
29. M. Novak, J. Millerd, N. Brock, M. North-Morris, J. Hayes, and J. Wyant, “Analysis of a micropolarizer array-based simultaneous phase-shifting interferometer,” Appl. Opt. 44(32), 6861–6868 (2005). [CrossRef] [PubMed]
30. T. Onuma and Y. Otani, “A development of two-dimensional birefringence distribution measurement system with a sampling rate of 1.3 MHz,” Opt. Commun. 315, 69–73 (2014). [CrossRef]
31. T. Kakue, R. Yonesaka, T. Tahara, Y. Awatsuji, K. Nishio, S. Ura, T. Kubota, and O. Matoba, “High-speed phase imaging by parallel phase-shifting digital holography,” Opt. Lett. 36(21), 4131–4133 (2011). [CrossRef] [PubMed]
32. See, for example, J. E. Greivenkamp and J. H. Bruning, “Phase-shifting interferometry,” in Optical Shop Testing, D. Malacara, ed. (Wiley, 1992), pp. 501–598.
33. K. Ishikawa, K. Yatabe, Y. Ikeda, and Y. Oikawa, “Numerical analysis of acousto-optic effect caused by audible sound based on geometrical optics,” in 12th Western Pacific Acoustics Conference, K. M. Lim, ed. (Research publishing, 2015), pp. 165–169.
34. R. Adler, “Interaction between light and sound,” IEEE Spectr. 4(5), 42–54 (1967). [CrossRef]
35. B. E. A. Saleh and M. C. Teich, “Acousto-optics,” in Fundamentals of Photonics (John Wiley & Sons, Inc., 1991).
36. Bureau International des Poids et Measures, “Definition of the standard atmosphere,” http://www.bipm.org/en/CGPM/db/10/4/.
37. P. M. Morse and K. U. Ingard, Theoretical Acoustics (Princeton University, 1986), Ch. 13.
38. R. Smythe and R. Moore, “Instantaneous Phase Measuring Interferometry,” Opt. Eng. 23(4), 234361 (1984). [CrossRef]
40. C. L. Koliopoulos, “Simultaneous phase-shift interferometer,” Proc. SPIE 1531, 119–127 (1992). [CrossRef]
41. B. A. Horwitz and A. J. MacGovern, “Wavefront sensor employing novel D.C. shearing interferometer,” US patent 4575248 A (1986).
42. J. L. McLaughlin and B. A. Horwitz, “Real-time snapshot interferometer,” Proc. SPIE 680, 35–43 (1987). [CrossRef]
43. D. C. Ghiglia and M. D. Pritt, Two-Dimensional Phase Unwrapping: Theory, Algorithms and Software (Wiley, 1998).
47. N. Brock, B. Kimbrough, and J. Millerd, “A pixelated micropolarizer-based camera for instantaneous interferometric measurements,” Proc. SPIE 8160, 81600W (2011). [CrossRef]
48. R. M. Goldstein, H. A. Zebker, and C. L. Werner, “Satellite radar interferometry: Two-dimensional phase unwrapping,” Radio Sci. 23(4), 713–720 (1988). [CrossRef]
49. E. G. Williams, Fourier Acoustics (Academic, 1999), Ch. 8.