Abstract

Imaging turbid media is range limited. In contrast, sensing the medium’s optical properties is possible at greater depths using the iterative multi-plane optical properties extraction technique, which analyzes the reconstructed phase image of the reemitted light. The root mean square of the phase image yields two graphs with opposite behaviors that intersect at µ’s,cp. These graphs enable the extraction of a certain range of the reduced scattering coefficient, µ’s. Here, we aim to extend the range of µ’s detection by optical magnification. We use a modified diffusion theory and show how µ’s,cp shifts with varying magnification. The theoretical results were tested experimentally, showing that the technique can be adapted to different ranges of µ’s by changing the magnification.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Light-matter interactions are mainly associated with absorption, emission, and scattering phenomena. The attenuation, i.e., the total loss of intensity during the interaction, is related to the absorption and represented by the absorption coefficient µa. The scattering, in which the phase and direction of the incident light are changed due to light-matter interactions along the propagation through the medium, is represented by the reduced scattering coefficient µ’s. Since the absorption and scattering phenomena are frequency-dependent, their coefficients are wavelength-dependent as well [1]. Most optical imaging techniques can acquire high-resolution images of the surface. However, imaging inside a turbid medium is challenging even when it does not absorb the light. The phase, together with the direction of light, is lost, and imaging becomes more difficult as the optical path length within the medium grows. Despite the challenge, different methods for imaging under scattering conditions have been developed. The field of underwater imaging has made significant progress, driven especially by the growing field of autonomous underwater vehicles [2], mainly used for naval purposes. In recent years, various imaging technologies have been trying to overcome the scattering in underwater images; some are based on Sonar acoustic imaging, a technology that is constantly developing [3]. In parallel, other methods were developed based on light illumination [4], requiring compensation algorithms [5]. Another type of method uses the movement of the object [6]. Recent progress in the last two methods includes machine learning tools [7–9]. Yet, the above techniques are limited too; Sonar is limited to large or rigid objects and far distances [10], and optical methods are limited to short distances [11]. Indeed, imaging techniques can acquire important data that is also intuitive to understand.
However, in addition to their limitations, imaging techniques do not profess to retrieve all the information contained in the lost phase. Thus, some information seems to be lost together with the phase [12], and there are constant attempts to retrieve the phase in various fields such as biology [13], material science [14], astronomy [15], etc. Although the phase cannot be measured directly, some of the information can be obtained from a proper analysis of the medium's response to light, which is defined by the medium’s optical properties. Therefore, a sensing technique, the iterative multi-plane optical property extraction (IMOPE) [16–21], was suggested, aiming to detect changes in the reduced scattering coefficient. The IMOPE is a non-invasive technique for the detection of media scattering. It reconstructs and analyzes the phase of the light reemitted from the irradiated medium, based on the relation between the medium's scattering properties and the reemitted light phase. For reconstructing the phase, the IMOPE uses a multi-plane version of the iterative Gerchberg-Saxton (GS) algorithm [22]. The GS algorithm is an error reduction algorithm [23] for phase retrieval [24] and image reconstruction [25]. The IMOPE technique does not use the retrieved phase itself but its root mean square (RMS). After the RMS of the reemitted light phase is obtained, it is compared to the theoretical model, allowing the extraction of the medium’s µ’s. As mentioned above, the IMOPE technique was originally developed in the red regime of the electromagnetic (EM) spectrum (633 nm wavelength) [18,19,26], for medical applications using small distances. Later on, we extended the IMOPE to the other edge of the visible spectrum, the blue regime, applying it at a wavelength of 473 nm [27]. The red wavelength is mainly targeted at biological applications.
The blue wavelength, however, has a higher potential for underwater research, since it suffers the lowest absorption undersea [28,29]. Since underwater research will require significant magnification of the IMOPE technique, we show here the potential of the technique to be modified for different, yet small, magnifications. More significant magnifications, however, are left for future work.

In this work we extend the detection range of µ’s using the IMOPE technique in the blue regime. This extension allows tailoring the technique’s linear detection range to a desired µ’s range. First, we show an adaptation of the theoretical model to the blue regime. Then, we change the magnification of the image captured by the detector, showing how different magnifications affect the theoretical model. We also observe the change in the technique’s linear detection range for each magnification. Last, we present the magnification effect on the IMOPE using phantoms with known µ’s, and show how the experiments support the theoretical work.

2. Theoretical model

Inside a turbid medium, the propagation of radiation is commonly described by the Radiative Transport Equation (RTE) [30]. It is an integro-differential equation that accounts for a photon's possible displacements and directions and balances the energy losses and gains due to scattering and absorption within the medium. Different solutions for the RTE have been suggested over the years [31]; the most common one is the diffusion approximation (DA) [32]. The DA describes the diffusion reflection (DR), i.e., the intensity of the scattered light reflected from the surface, as well as the diffusive transmission. The DA uses the method of images with fixed boundary conditions [33], where the boundary (red dotted line in Fig. 1) can be defined either on the surface [32] or higher [34]. With the method of images, the light source is described as a real isotropic source beneath the surface together with its image source located symmetrically above the chosen boundary (black circles in Fig. 1). The real source is located at a depth of 1/µ’s beneath the surface, and its image is reflected about the chosen boundary. The resulting intensity is represented as a function of ρ, the cylindrical coordinate of the distance from the center of the source along the surface (green arrow in Fig. 1). This method works well for relatively large ρ, but it cannot describe the reflectance accurately at smaller distances [33]. To address this, Piao et al. [34] published a more accurate model for the DR at short paths ρ and for a low-scattering semi-infinite homogeneous medium. According to Piao’s model, the boundary is located at a distance of 2AD, where A=(1+Reff)/(1-Reff) (with Reff=0.477 [32]) is the mismatch factor and D=(1/3)/(µa+µ’s) is the diffusion coefficient. In Piao’s model, the real and image sources are referred to as master sources (black circles in Fig. 1), and an additional slave source lies beneath the surface, with its image on the other side of the boundary (gray circles in Fig. 1). This model, for shorter light paths and lower scattering coefficients, is referred to as the dual source configuration (while the simpler DA model, which includes the master sources only, is referred to as the single source configuration). The IMOPE’s theoretical model combines the intensity described by the single and dual source configurations, IRdual(ρ), with a phase model [19], φ(ρ), in order to describe the electromagnetic field and calculate its phase RMS as a function of µ’s:

$$RM{S_\varphi} = \frac{\sqrt{\sum_{\rho \in \gamma} \left| \sqrt{|I_{Rdual}(\rho)|}\, e^{i(\varphi(\rho) - \varphi_{av})} - e^{i0} \right|^2}}{\sqrt{\sum_{\rho} \left| \sqrt{|I_{Rdual}(\rho)|} \right|^2}}\tag{1}$$

The distance ρ represents the distance between the center of the pencil beam illumination and a specific pixel. However, the actual pathlength of the light beneath the surface is longer than ρ and therefore the phase is accumulated along this larger distance. The ratio between the actual optical pathlength and ρ is the differential pathlength factor (DPF):

$$DPF = \frac{1}{\rho}\frac{\partial}{\partial \mu_a}\left[ \ln\left( \frac{S}{\Psi} \right) \right]\tag{2}$$

Fig. 1. A schematic illustration of the image method as used in the DA model. The medium (light blue area) is illuminated by the laser beam (blue arrow). The light is scattered in the medium (fading blue half circle). To model the light propagation, the DA model uses the method of images, and refers to the field at each point ρ (green circle) as if it were created by the two master sources (black circles) located symmetrically above and beneath a chosen boundary (red dotted line). The DA model does not describe well the area close to the center; therefore, Piao’s model describes the field using two additional sources, real and image slave sources (gray circles), located symmetrically above and beneath the chosen boundary (red dotted line).

In the DPF calculation, S is the intensity of the real master source and Ψ is the steady-state photon fluence rate at the detector.
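To make the DPF computation concrete, the derivative defining the DPF can be evaluated numerically. The sketch below is an illustration under stated assumptions, not the authors' implementation: it models Ψ with the standard single-source diffusion fluence for a semi-infinite medium, using the source depth z0 = 1/µ’s, boundary distance zb = 2AD and Reff = 0.477 defined above. Since S does not depend on µa, only -∂lnΨ/∂µa survives in the derivative.

```python
import numpy as np

def fluence(rho, mua, mus_p, reff=0.477):
    """Steady-state photon fluence rate at the surface, single-source
    diffusion model for a semi-infinite medium (extrapolated boundary)."""
    D = (1.0 / 3.0) / (mua + mus_p)            # diffusion coefficient
    A = (1.0 + reff) / (1.0 - reff)            # mismatch factor
    mu_eff = np.sqrt(mua / D)                  # effective attenuation
    z0 = 1.0 / mus_p                           # real source depth
    zb = 2.0 * A * D                           # extrapolated boundary distance
    r1 = np.sqrt(z0**2 + rho**2)               # distance to real source
    r2 = np.sqrt((z0 + 2.0 * zb)**2 + rho**2)  # distance to image source
    return (np.exp(-mu_eff * r1) / r1 - np.exp(-mu_eff * r2) / r2) / (4.0 * np.pi * D)

def dpf(rho, mua, mus_p, dmua=1e-5):
    """DPF = (1/rho) * d/dmua[ln(S/Psi)]; S is independent of mua, so only
    -d(ln Psi)/dmua survives. Evaluated by a central finite difference."""
    lo = np.log(fluence(rho, mua - dmua, mus_p))
    hi = np.log(fluence(rho, mua + dmua, mus_p))
    return -(hi - lo) / (2.0 * dmua) / rho
```

For a scattering-dominated, weakly absorbing medium (e.g. µa = 0.002 mm⁻¹, µ’s = 1 mm⁻¹) the DPF comes out well above 1, i.e. the actual photon path is several times longer than ρ, as the text describes.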

Thus, the phase itself is:

$$\varphi(\rho) = \frac{2\pi n}{\lambda} \cdot DPF \cdot \rho\tag{3}$$
where $n$ is the refractive index of the object and λ is the laser source wavelength. It was previously shown [26] that the phase image can be divided into two regions of interest (ROIs) according to the phase RMS. The latter behaves differently in the single scattering and multiple scattering areas, which are defined according to the transport mean free path (MFP’) of the photon within the medium:
$$MFP^{\prime} = \frac{1}{\mu_s^{\prime}}\tag{4}$$

where the multiple scattering area is obtained at ρ>MFP’ (referred to as the ‘ring’), and the single scattering area is the area closer to the light source (blue arrow in Fig. 1), where ρ<MFP’ (referred to as the ‘center’). The phase RMS of each ROI is calculated separately.
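A minimal sketch of this ROI split and per-ROI RMS might look as follows. Note the simplifications: the RMS here is a plain per-ROI, mean-subtracted statistic without the intensity weighting of the full model, and the illumination is assumed centered in the image.

```python
import numpy as np

def roi_phase_rms(phase, pixel_mm, mus_p):
    """Split a reconstructed phase image into 'center' (rho < MFP') and
    'ring' (rho >= MFP') ROIs and return the RMS of each ROI.
    Simplified: per-ROI mean subtraction, no intensity weighting."""
    mfp = 1.0 / mus_p                           # transport mean free path
    c = (phase.shape[0] - 1) / 2.0
    yy, xx = np.indices(phase.shape)
    rho = np.hypot(xx - c, yy - c) * pixel_mm   # distance from beam center, mm
    out = {}
    for name, mask in (("center", rho < mfp), ("ring", rho >= mfp)):
        vals = phase[mask]
        out[name] = np.sqrt(np.mean((vals - vals.mean()) ** 2))
    return out

# synthetic example: phase growing linearly with rho
n = 101
c = (n - 1) / 2.0
yy, xx = np.indices((n, n))
rho = np.hypot(xx - c, yy - c) * 0.05           # 0.05 mm pixels
rms = roi_phase_rms(0.1 * rho, pixel_mm=0.05, mus_p=1.0)
```

For this synthetic phase, the ring ROI spans a wider range of ρ than the center and therefore shows the larger RMS.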

The basic IMOPE technique uses the dual source configuration (Piao’s model) to describe the intensities at both the single and multiple scattering areas. This is due to the improvement this model offers for areas close to the illumination point, and since for an optical magnification of M=1 both ROIs are relatively close to the center. When magnifying the field of view, we found that calculating the intensity of the multiple scattering area (i.e., the ring) with the single source configuration alone fits the experimental results better, as this is the model that better describes the larger distances. For the single scattering area (i.e., the center), however, the dual source configuration was retained.

3. Materials and methods

3.1 IMOPE technique

The IMOPE is a technique originally developed to extract the reduced scattering coefficient. It evaluates µ’s from the phase image reconstructed by the GS algorithm. The basic GS algorithm requires two intensity images taken at two planes of the electromagnetic field. At the entrance plane the intensity is denoted Ien, and at the exit plane Iex. The intensity at each plane is captured by a camera. The mathematical expression for the field propagation is the Fresnel transform (FRT) [35,36]. From the intensities and the field propagation equations, the algorithm aims to reconstruct the field's phase, which was lost when the intensities were captured. Since the GS algorithm converges with respect to the intensity’s RMS error, but not necessarily to the global minimum [22], a multi-plane improved version was suggested [37,38]. The multi-plane GS technique uses N planes, rather than two planes only, along the propagation axis. Thus, the IMOPE uses N light intensity images taken at N planes along the z-axis (Fig. 2(a), along the light-blue arrows). The method starts by applying the multi-plane GS algorithm. The result is a phase image at the desired Nth plane, which is the reconstructed phase of the reemitted light. The average value of the phase image ${\varphi }$ is then subtracted from the received phase image. As mentioned, the received phase images can be separated into their ROIs, where the border between them is the MFP’ (Eq. (4), and the orange circle in Fig. 2(b)). However, in order to explore materials with unknown µ’s, we automated the process of extracting the border between the ROIs and experimentally confirmed it using phantoms with known µ’s. Once the border location is known, the RMS of each ROI in the phase image can be calculated and compared with the theoretical model (Fig. 2(c), where the blue line is the center ROI and the red line is the outer ring ROI), and thus the reduced scattering coefficient µ’s is extracted.
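The multi-plane GS loop can be sketched as follows, assuming a Fresnel transfer-function propagator; the grid size, iteration count and forward/backward sweep schedule here are illustrative choices, not the authors' exact implementation.

```python
import numpy as np

def fresnel_prop(field, dz, wavelength, dx):
    """Propagate a complex field by dz using the Fresnel transfer function."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    fxx, fyy = np.meshgrid(fx, fx)
    h = np.exp(-1j * np.pi * wavelength * dz * (fxx**2 + fyy**2))
    return np.fft.ifft2(np.fft.fft2(field) * h)

def multiplane_gs(intensities, dz, wavelength, dx, n_iter=10):
    """Multi-plane GS: sweep forward and backward through the N measured
    intensity planes, enforcing the measured amplitude at each plane while
    keeping the propagated phase."""
    amps = [np.sqrt(i) for i in intensities]
    field = amps[0].astype(complex)                    # flat initial phase
    for _ in range(n_iter):
        for k in range(1, len(amps)):                  # forward sweep
            field = fresnel_prop(field, dz, wavelength, dx)
            field = amps[k] * np.exp(1j * np.angle(field))
        for k in range(len(amps) - 2, -1, -1):         # backward sweep
            field = fresnel_prop(field, -dz, wavelength, dx)
            field = amps[k] * np.exp(1j * np.angle(field))
    return np.angle(field)                             # phase at the first plane

# synthetic demo: a Gaussian beam "recorded" at N=4 planes
n, dx, wl, dz = 64, 20e-6, 473e-9, 0.635e-3
x = (np.arange(n) - n / 2) * dx
xx, yy = np.meshgrid(x, x)
src = np.exp(-(xx**2 + yy**2) / (2 * (4 * dx)**2)).astype(complex)
planes, f = [], src.copy()
for _ in range(4):
    planes.append(np.abs(f)**2)
    f = fresnel_prop(f, dz, wl, dx)
phase = multiplane_gs(planes, dz, wl, dx)
```

In the actual technique the recorded camera images take the place of the synthetic `planes`, and the returned phase image is the input to the ROI analysis described above.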

Fig. 2. The IMOPE technique’s process. (a) The experimental setup consists of a laser source (1) connected to an attenuator and a linear polarizer (2). In front of them sits the sample (3). The detector (4), the second linear polarizer (5) and the lens (6) sit on a motorized stage. The stage moves while taking the N intensity images required for the GS algorithm. In order to change the magnification, we change the distances between the detector and the lens and between the lens and the object. These distances are represented by the light blue arrows. (b) An example of a phase image extracted by the GS algorithm from a phantom with µ’s=0.75 mm-1. The phase image is divided by the orange circle into the two ROIs: the center (single scattering area, ρ<1/µ’s) and the ring (multiple scattering area, ρ>1/µ’s). (c) The RMS of each ROI of the phase image is calculated and compared to the theoretical graph of phase RMS as a function of µ’s. Fitting the two calculated values to the ring (red line) and center (blue line) graphs yields the µ’s of the measured object.

3.2 Optical setup

The experimental setup for the light intensity images (Fig. 2(a)) is composed of a laser with a wavelength of λ=473nm, a beam diameter of 1.2mm and a power of 100mW; an attenuator that transmits 1% of the laser's intensity; polarizers, for optical clearing purposes; and a lens (focal length f=75mm) to focus the illumination beam. A CMOS camera was used for the intensity image acquisition. The lens, polarizer and camera are set on a moving stage at a small angle θ from the laser source, hence the distances between the images are corrected appropriately. In order to study the effect of the magnification M on the IMOPE technique, we changed the distances between the lens, the camera and the sample (blue arrows in Fig. 2(a)). The samples are set on a 3-axis micrometer stage to enable fine-tuning during the experiments [37].

3.3 Agar-based phantoms

Agar-based phantoms played a significant role in this research. The phantoms are suitable for this purpose because their µ’s can be controlled accurately, using a known lipid concentration, while their absorption coefficient remains relatively low. According to Mie theory, the size of the unit cell and the material type determine the scattering coefficient of a substance; the diameter of the intralipid’s unit cell varies between 25-625nm [39]. The phantoms were first used for the validation of the IMOPE technique under 473nm laser illumination [27] and the different magnifications. The phantoms are composed of Intralipid (IL) (Intralipid 20% Emulsion, Sigma-Aldrich, Israel) at varying concentrations, 1% Agarose powder (Agarose, low gelling temperature, Sigma-Aldrich, Israel) and water [21]. µ’s varies as a function of the intralipid concentration [1,21]. The IL concentrations of the phantoms used in the following experiments are 0.23%, 0.495%, 0.71%, 1.06%, 1.27%, 1.48%, 1.85%, 2.13%, 2.39% and 2.62%, corresponding to µ’s values of 0.25, 0.5, 0.7, 1.03, 1.23, 1.4, 1.77, 2.03, 2.28 and 2.49 mm-1 [40].
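As a quick consistency check (our own illustration, not part of the original protocol), the quoted concentration-to-µ’s pairs can be fit with a straight line, reflecting the linear dependence cited from [1,21]:

```python
import numpy as np

# IL volume concentrations (%) and corresponding mu_s' values (mm^-1) from the text
conc = np.array([0.23, 0.495, 0.71, 1.06, 1.27, 1.48, 1.85, 2.13, 2.39, 2.62])
mus_p = np.array([0.25, 0.5, 0.7, 1.03, 1.23, 1.4, 1.77, 2.03, 2.28, 2.49])

# least-squares linear fit: mu_s' ~ a * conc + b
a, b = np.polyfit(conc, mus_p, 1)
residuals = mus_p - (a * conc + b)
```

The residuals stay well below the spacing between neighboring phantom values, consistent with the linear relation between IL concentration and µ’s.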

4. Results and discussion

The theoretical model was first applied to the blue regime, using different magnifications (M: 1, $\frac{2}{3}$, $\frac{1}{2}$, 0.4 and $\frac{1}{3}$, represented in Fig. 3(a) by red, yellow, green, blue and purple, respectively). For each magnification, the solid lines describe the center ROI and the dashed lines describe the ring ROI. Our results show that different magnifications change the theoretical graphs: for a lower M the graphs present a steeper slope and reach saturation earlier. Also, the ring graphs start at lower µ’s values. For each magnification M, the curves of the ring and the center have a crossing point around a phase RMS value of 1; we noticed that this crossing point shifts with M. Extracting the µ’s value at which the crossing occurs, µs,cp, for each M (asterisks in Fig. 3(b)) yields the linear fit (line in Fig. 3(b)) µs,cp=M.
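Numerically, a crossing point such as µs,cp can be extracted from two sampled curves by locating the sign change of their difference and interpolating linearly; the sketch below uses toy curves with opposite trends rather than the actual model output.

```python
import numpy as np

def crossing_point(mu, rms_center, rms_ring):
    """Locate the mu_s' value where the center and ring RMS curves cross,
    by linear interpolation around the sign change of their difference."""
    d = np.asarray(rms_center) - np.asarray(rms_ring)
    k = np.flatnonzero(np.sign(d[:-1]) != np.sign(d[1:]))[0]  # first sign change
    t = d[k] / (d[k] - d[k + 1])                              # interpolation fraction
    return mu[k] + t * (mu[k + 1] - mu[k])

# toy curves with opposite behaviors, crossing at mu_s' = 1
mu = np.linspace(0.2, 2.0, 50)
cp = crossing_point(mu, mu, 2.0 - mu)
```

Applied to the theoretical center and ring curves of each magnification, this yields the µs,cp values plotted against M in Fig. 3(b).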

Fig. 3. The results of the theoretical model’s adaptations. (a) The theoretical model for the different magnifications $M$: $1,\frac{2}{3},\frac{1}{2},0.4$ and $\frac{1}{3}$ are represented by red, yellow, green, blue and purple, respectively. For each magnification, the solid lines describe the center (i.e., the single scattering ROI) and the dashed lines describe the ring (multiple scattering ROI). (b) With the change in magnification, we noticed that the crossing point of the ROIs’ RMS graphs shifts along the µ’s axis. The black asterisks show the extracted µs,cp, the µ’s value at which the crossing point occurs for every magnification M, plotted versus M. The error bars are defined according to the simulation resolution.

Next, we performed the measurements on the phantoms using the basic M=1 setup, i.e., for each phantom N intensity images were taken using the optical setup described above and their phase RMS was calculated. Since the µ’s values of the phantoms were known, we could calculate a theoretical radius discriminating between the ROIs of each phase image obtained for every phantom; this border is the phantom’s MFP’ (Eq. (4)). We aimed to find the range of MFP’ which can be detected by our method. Therefore, we used an algorithm that scans the phase image and finds the border radius for each phantom. This algorithm is useful when measuring samples with unknown µ’s; here, however, we used it to examine the capabilities of the IMOPE.

For M=1, the extracted border radius values (asterisks in Fig. 4(a)) do not match the theory (line in Fig. 4(a)) for the lower µ’s values. As expected, the IMOPE cannot detect the radii of the lower µ’s since they are larger than the size of the image, and thus cannot be captured by the detector. Adjusting the optical setup to M=$\frac{1}{2}$, we saw a significant improvement: we were able to reveal the radii with the algorithm (asterisks in Fig. 4(b)), achieving a good fit between them and the theoretical MFP’ curve (solid line in Fig. 4(b)) even for low µ’s values.

Fig. 4. The influence of magnification on the radius that discriminates between ROIs (theoretical and experimental results are the lines and asterisks, respectively). For (a) M=1, lower µ’s deviate from the theory, and for (b) M=$\frac{1}{2}$, lower µ’s fit the theory. Inset: the optical setup of each configuration is presented above each graph, where the magnification M is changed by adjusting the distances before and after the lens.

Next, by changing the distances between the camera, the lens and the phantoms, we created several magnifications (M=1, $\frac{1}{2},\;\frac{1}{3},\;$2 and 3, presented in Fig. 5(a)-(e), respectively). We present the theoretical phase RMS graphs of the ring and the center (red and blue solid lines, respectively), together with the experimental results (red and blue asterisks, respectively). We see that the basic IMOPE technique, for M=1 (Fig. 5(a)), cannot yield ring values for low µ’s. In addition, the high µ’s values suffer from saturation. The limitations at both edges of the µ’s axis narrow the linear range of informative µ’s in which the analysis is possible (between 0.7mm-1 and 1.77mm-1). These limitations were expected, due to the image size limitation; when $\mu _s^{\prime}$ is low, the image captured by the detector is smaller than the MFP’ (Fig. 4(a)). High $\mu _s^{\prime}$ values are limited due to the low number of pixels which fulfill the condition ρ<MFP’, causing the differences between the phase RMS values to become negligible. For M<1 (Fig. 5(b) and (c)), we see that the experimental results, i.e., the phase RMS of the ring and the center (red and blue asterisks, respectively), are in high agreement with the theoretical model. The µ’s values at which the agreement occurs are lower for M<1 than for M=1. However, for higher µ’s we reach saturation at lower values. For M=$\frac{1}{2}$ (Fig. 5(b)) the informative µ’s range is between 0.5mm-1 and 1.4mm-1, and for M=$\frac{1}{3}$ (Fig. 5(c)) between 0.25mm-1 and 1.23mm-1.

Fig. 5. The influence of magnification on the phase RMS. The figures present the phase RMS versus µ’s graphs for an image magnification M of (a) $1$, (b) $\frac{1}{2}$, (c) $\frac{1}{3}$, (d) $2$ and (e) $3$. The solid lines present the theoretical graphs, blue for the center ROI and red for the ring ROI. The experimental results of the center and ring ROIs are represented by blue and red asterisks.

Zooming in, i.e., configuring the setup to M>1 (Fig. 5(d) and (e)), solved the sensitivity problem expressed by the saturation at the higher µ’s values, as it shifts the linear range to the left. For M=2, the ring’s phase RMS is in good correlation with the theory for µ’s between 1.03mm-1 and 2.03mm-1, and for M=3 the informative µ’s range appears to be above 2.03mm-1. The phase RMS graphs of the centers in this range show a linear behavior with some offset from the theoretical values, as expected, since the diffusion-based theory is inaccurate close to the light source [33]. In other words, this deviation does not indicate a mismatch of the method, but an inaccuracy of the DA-based theory in describing the area close to the center. Nonetheless, at the low µ’s values the ring ROIs were not captured by the detector (red points are missing) and the center values are saturated. Table 1 summarizes the µ’s ranges that were found to be most informative due to their linear behavior. As mentioned previously, the GS algorithm requires N images of the same phantom, taken from N planes with a distance Δz between them. All the above experiments were done, for the different magnifications, with the same constant distance Δz=0.635mm. We wanted to verify that the deterioration of the fit between the experimental results and the theoretical graphs for M>1 (Fig. 5(d) and (e)) does not indicate that Δz should have been scaled according to M. Hence, for the $M = 1,\;\frac{1}{2},\;\frac{1}{3}$ magnifications (Fig. 6(a), (b) and (c), respectively) we compared the theoretical results with the experimental results for Δz (Fig. 6, blue and red asterisks for center and ring, respectively) and 2Δz (Fig. 6, blue and red circles for center and ring, respectively). Note that the above analysis presents a case in which the reconstruction depth (i.e., (N-1)Δz) of each magnification remains the same when changing the number of images. The results show that there is no significant difference between the phase RMS obtained for the two analysis methods (Δz vs. 2Δz).

Fig. 6. Influence of the distance between image planes on the phase RMS. Note that we scan the same total distance; hence the double-distance analysis (blue and red circles for the center and the ring, respectively) includes half the images compared to the regular analysis (blue and red asterisks for the center and the ring, respectively). The figures present different magnifications: (a) $M = 1$, (b) $M = \frac{1}{2}$ and (c) $M = \frac{1}{3}$. For all cases, the blue solid lines present the center ROI theory and the red solid lines present the ring ROI theory.

Table 1. A summary of the $\mu _s^{\prime}$ ranges that were found to be most informative due to their linear behavior.

In this paper, we examined the influence of the optical magnification of the IMOPE’s optical system on its linear range. The change in magnification effectively changes the pixel size, and hence influences the RMS statistics of the phase. Theoretical DA-based models for the intensity profiles, combined with the phase model (Eq. (3)), predicted a shift in the linear range of the phase RMS values in both the center (ρ<MFP’) and ring (ρ>MFP’) ROIs of the phase image. Experimental results from phantoms with varying scattering coefficients confirmed this behavior, although some offset from the expected theoretical values was observed. Changing the optical magnification therefore extends the range of detection, or enhances the detection accuracy, for different µ’s ranges, and may also enable scanning deeper into the medium. To verify this, we first had to check that the µ’s we used meet the conditions required for the phase reconstruction. The change in the phase is extracted from the intensity image captured by a detector with a pixel size Δx. Each pixel contains data for the phase change along Δz. Thus, Δz must be large enough to enable a phase change along it, but small enough that not too many phase changes are averaged by the pixel. The condition on Δz is given by:

$$\Delta z < \frac{\Delta x^2}{\lambda}\tag{5}$$
where λ is the used wavelength.
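As a numeric sanity check of this condition (with an assumed detector pixel size of Δx=20µm, which is not stated in the text, and the additional assumption that for M≠1 the effective object-plane pixel scales as Δx/M):

```python
# Sampling condition for the GS plane spacing: dz < dx**2 / wavelength
wavelength = 473e-9   # m, laser wavelength used in the paper
dz = 0.635e-3         # m, plane spacing used in the experiments
dx = 20e-6            # m, ASSUMED detector pixel size (not given in the text)
m = 1.0 / 3.0         # magnification of the deepest-scan configuration

dz_max = dx**2 / wavelength            # bound using the detector-plane pixel
dx_eff = dx / m                        # ASSUMED effective object-plane pixel
dz_max_eff = dx_eff**2 / wavelength    # bound using the effective pixel

single_ok = dz < dz_max                # single spacing, detector pixel
triple_ok = 3 * dz < dz_max_eff        # triple spacing (1.905 mm) at M = 1/3
```

Under these assumptions both the single spacing of 0.635 mm and the triple spacing of 1.905 mm used at M=1/3 satisfy the condition, consistent with the discussion of Fig. 7 below.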

First, we reconstructed the phase from the phantoms using the same analysis for each magnification; the analysis used the same distance Δz=0.635mm between the intensity planes and the same number of planes (N=7) recorded by the detector, scanning a total depth of Δz⋅(N-1)=D=3.81mm for each phantom. Later, when we changed the number of images N and the distance Δz while scanning the same depth (D=3.81mm), the phase RMS of each magnification remained the same. In other words, we have shown that different Δz and N with a constant reconstruction depth D yield the same experimental phase RMS results (Fig. 6). In addition, we examined the effect of changing the reconstruction depth itself: taking different Δz with the same number of images results in a different scanned depth D for the same magnification. Therefore, for M=$\frac{1}{3}$ we compared two reconstruction depths based on the same number of images with different distances between them: single and triple distance Δz between the image planes (asterisks and circles, respectively, in Fig. 7). The distances between the images are therefore 0.635 and 1.905mm for the single and triple distances, and the total depths D are 3.81 and 11.43mm, respectively. The reconstruction along the different depths yielded the same phase RMS values for the same phantoms, confirming that the Δz we chose fulfills the condition of Eq. (5). Note that the results shown in Fig. 7 were obtained with the M=$\frac{1}{3}$ configuration. The center is therefore smaller, and the averaging over the pixels is too coarse; this explains the deterioration of the fit between the theory and the experimental results. In contrast, the ring theory and results highly agree along the µ’s axis.

Fig. 7. Influence of reconstruction depth on the phase RMS. The reconstruction depth was changed by taking the same number of images and changing the distance between them; single and triple distances Δz between the images (corresponding to 0.635 and 1.905 mm represented by asterisks and circles, respectively). The theoretical graphs (lines) and experimental results (points) of image magnification M=$\frac{1}{3}$ are presented. The red and the blue lines and points represent the ring and the center ROIs, respectively.

5. Conclusion

The IMOPE technique aims to extract the reduced scattering coefficient µ’s from the phase RMS reconstructed from a turbid medium. It is based on a multi-plane version of the Gerchberg-Saxton algorithm for phase retrieval. The optical setup takes several images that are fed to the algorithm, which results in a reconstructed phase image. The phase image is analyzed and, from a comparison with the theoretical model, µ’s can be extracted. In this research, we have searched for the IMOPE’s optimal range of detection for different magnifications. We show that the technique’s linear range is limited at both ends of the µ’s axis: at the lower values of µ’s it is limited by the size of the image captured by the detector, while at the higher µ’s values it is limited by the saturation the phase RMS reaches with a low number of pixels. The work presented here improves the capabilities of the IMOPE technique for the detection of lower and higher reduced scattering coefficients by changing the magnification of the image. In fact, we suggest adapting the setup in order to tailor the technique to a desired µ’s range (summarized in Table 1). We did so by varying the magnification of the image and finding the linear µ’s range for each magnification. The phantom experiments supported this theoretical work as well. For a magnification of M<1, the sensitivity for lower µ’s is improved since they are included in the linear range. For a magnification of M>1, the linear range includes higher µ’s values; since these values no longer suffer from saturation, the sensitivity of the technique in these ranges is improved. In addition, we noted the crossing point between the center and ring phase RMS curves, which shifts with the magnification M.
Overall, this paper shows that the IMOPE, designed to detect changes in biological tissues, can be extended to other fields beyond its original purpose; it can be tailored to detect wider ranges of µ’s than necessary for biological research, as well as larger distances within turbid media. At the blue wavelength, for example, it could be utilized for underwater research. The IMOPE itself is a promising sensing method that breaks through the imaging limitations and retrieves information about otherwise invisible areas. Here we have shown the great potential of the IMOPE to be extended to various applications.

Acknowledgments

The research conceptualization, project administration and funding acquisition were carried out by D.F. In addition, D.F. together with H.D. were responsible for the supervision and methodology. The software and the theoretical model were written by I.Y. and C.S. The experiments, investigation and formal analysis were performed by C.S., and the final validation was conducted by C.S. and H.D. The original draft was written by C.S. and H.D. D.F. and R.A. were responsible for the review and editing of the paper.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. H. Assadi, R. Karshafian, and A. Douplik, “Optical scattering properties of intralipid phantom in presence of encapsulated microbubbles,” Int. J. Photoenergy (2014).

2. S. Chutia, N. M. Kakoty, and D. Deka, “A review of underwater robotics, navigation, sensing techniques and applications,” Proceedings of the Advances in Robotics 8, 1–6 (2017). [CrossRef]  

3. J. Joslin, “Imaging sonar review for marine environmental monitoring around tidal turbines,” (2019).

4. M. Massot-Campos and G. Oliver-Codina, “Optical sensors and methods for underwater 3D reconstruction,” Sensors 15(12), 31525–31557 (2015). [CrossRef]  

5. H. Lu, Y. Li, Y. Zhang, M. Chen, S. Serikawa, and H. Kim, “Underwater optical image processing: a comprehensive review,” Mobile Netw Appl 22(6), 1204–1211 (2017). [CrossRef]  

6. G. S. Kumar, U. V. Painumgal, M. C. Kumar, and K. Rajesh, “Autonomous underwater vehicle for vision based tracking,” Procedia Computer Science 133, 169–180 (2018). [CrossRef]  

7. N. Wang, Y. Wang, and M. J. Er, “Review on deep learning techniques for marine object recognition: Architectures and algorithms,” Control Engineering Practice, 104458 (2020).

8. D. Gomes, A. S. Saif, and D. Nandi, “Robust Underwater Object Detection with Autonomous Underwater Vehicle: A Comprehensive Study,” in Proceedings of the International Conference on Computing Advancements (2020), pp. 1–10.

9. Z. Chen, Z. Zhang, F. Dai, Y. Bu, and H. Wang, “Monocular vision-based underwater object detection,” Sensors 17(8), 1784 (2017). [CrossRef]  

10. G. Neves, M. Ruiz, J. Fontinele, and L. Oliveira, “Rotated object detection with forward-looking sonar in underwater applications,” Expert Systems with Applications 140, 112870 (2020). [CrossRef]  

11. D. Berman, D. Levy, S. Avidan, and T. Treibitz, “Underwater single image color restoration using haze-lines and a new quantitative dataset,” IEEE Trans. Pattern Anal. Mach. Intell. (2020).

12. J. Bertolotti, E. G. van Putten, C. Blum, A. Lagendijk, W. L. Vos, and A. P. Mosk, “Non-invasive imaging through opaque scattering layers,” Nature 491(7423), 232–234 (2012). [CrossRef]  

13. S. Wang, L. Xue, J. Lai, Y. Song, and Z. Li, “Phase retrieval method for biological samples with absorption,” J. Opt. 15(7), 075301 (2013). [CrossRef]  

14. I. Häggmark, W. Vågberg, H. M. Hertz, and A. Burvall, “Comparison of quantitative multi-material phase-retrieval algorithms in propagation-based phase-contrast X-ray tomography,” Opt. Express 25(26), 33543–33558 (2017). [CrossRef]  

15. R. A. Gonsalves, “Perspectives on phase retrieval and phase diversity in astronomy,” in Adaptive Optics Systems IV (International Society for Optics and Photonics, 2014), p. 91482P.

16. I. Yariv, G. Rahamim, E. Shliselberg, H. Duadi, A. Lipovsky, R. Lubart, and D. Fixler, “Detecting nanoparticles in tissue using an optical iterative technique,” Biomed. Opt. Express 5(11), 3871–3881 (2014). [CrossRef]  

17. I. Yariv, Y. Kapp-Barnea, E. Genzel, H. Duadi, and D. Fixler, “Detecting concentrations of milk components by an iterative optical technique,” J. Biophotonics 8(11-12), 979–984 (2015). [CrossRef]  

18. I. Yariv, M. Haddad, H. Duadi, M. Motiei, and D. Fixler, “New optical sensing technique of tissue viability and blood flow based on nanophotonic iterative multi-plane reflectance measurements,” Int. J. Nanomed. 11, 5237–5244 (2016). [CrossRef]  

19. I. Yariv, H. Duadi, and D. Fixler, “An optical method to detect tissue scattering: theory, experiments and biomedical applications,” presented at SPIE BiOS, 2019.

20. I. Yariv, H. Duadi, and D. Fixler, An optical method to detect tissue scattering: theory, experiments and biomedical applications (SPIE, 2019).

21. I. Yariv, H. Duadi, and D. Fixler, “Depth Scattering Characterization of Multi-Layer Turbid Media Based on Iterative Multi-Plane Reflectance Measurements,” IEEE Photonics J. 12(5), 1–13 (2020). [CrossRef]  

22. R. W. Gerchberg and W. O. Saxton, “A practical algorithm for the determination of phase from image and diffraction plane pictures,” Optik 35, 237–246 (1972).

23. R. Gerchberg, “Super-resolution through error energy reduction,” Optica Acta: International Journal of Optics 21, 709–720 (1974). [CrossRef]  

24. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21(15), 2758–2769 (1982). [CrossRef]  

25. D. Fixler, H. Duadi, R. Ankri, and Z. Zalevsky, “Determination of coherence length in biological tissues,” Lasers Surg. Med. 43(4), 339–343 (2011). [CrossRef]  

26. I. Yariv, H. Duadi, and D. Fixler, “Optical method to extract the reduced scattering coefficient from tissue: theory and experiments,” Opt. Lett. 43(21), 5299–5302 (2018). [CrossRef]  

27. I. Yariv, C. Shapira, H. Duadi, and D. Fixler, “Media Characterization under Scattering Conditions by Nanophotonics Iterative Multiplane Spectroscopy Measurements,” ACS Omega 4(10), 14301–14306 (2019). [CrossRef]  

28. A. Yamashita, M. Fujii, and T. Kaneko, “Color registration of underwater images for underwater sensing with consideration of light attenuation,” in Proceedings 2007 IEEE international conference on robotics and automation (IEEE2007), pp. 4570–4575.

29. J. Mueller, G. S. Fargion, and C. R. McClain, “Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Volume 6; Special Topics in Ocean Optics Protocols and Appendices; Revised,” (2003).

30. K. Sen and S. J. Wilson, Radiative transfer in curved media (World Scientific, 1990).

31. R. Ankri and D. Fixler, “Gold nanorods based diffusion reflection measurements: current status and perspectives for clinical applications,” Nanophotonics 6(5), 1031–1042 (2017). [CrossRef]  

32. R. C. Haskell, L. O. Svaasand, T.-T. Tsay, T.-C. Feng, M. S. McAdams, and B. J. Tromberg, “Boundary conditions for the diffusion equation in radiative transfer,” J. Opt. Soc. Am. A 11(10), 2727–2741 (1994). [CrossRef]  

33. S. L. Jacques and B. W. Pogue, “Tutorial on diffuse light transport,” J. Biomed. Opt. 13(4), 041302 (2008). [CrossRef]  

34. D. Piao and S. Patel, “Simple empirical master–slave dual-source configuration within the diffusion approximation enhances modeling of spatially resolved diffuse reflectance at short-path and with low scattering from a semi-infinite homogeneous medium,” Appl. Opt. 56(5), 1447–1452 (2017). [CrossRef]  

35. Z. Zalevsky, R. G. Dorsch, and D. Mendlovic, “Gerchberg–Saxton algorithm applied in the fractional Fourier or the Fresnel domain,” Opt. Lett. 21(12), 842–844 (1996). [CrossRef]  

36. D. Mendlovic, Z. Zalevsky, and N. Konforti, “Computation considerations and fast algorithms for calculating the diffraction integral,” Journal of Modern Optics 44(2), 407–414 (1997). [CrossRef]  

37. D. Sazbon, Z. Zalevsky, and E. Rivlin, “Qualitative real-time range extraction for preplanned scene partitioning using laser beam coding,” Pattern Recognition Letters 26(11), 1772–1781 (2005). [CrossRef]  

38. E. Grossman, R. Tzioni, A. Gur, E. Gur, and Z. Zalevsky, “Optical through-turbulence imaging configuration: experimental validation,” Opt. Lett. 35(4), 453–455 (2010). [CrossRef]  

39. H. J. Van Staveren, C. J. Moes, J. van Marle, S. A. Prahl, and M. J. Van Gemert, “Light scattering in Intralipid-10% in the wavelength range of 400–1100 nm,” Appl. Opt. 30(31), 4507–4514 (1991). [CrossRef]  

40. D. Fixler, J. Garcia, Z. Zalevsky, A. Weiss, and M. Deutsch, “Pattern projection for subpixel resolved imaging in microscopy,” Micron 38(2), 115–120 (2007). [CrossRef]  

[Crossref]

Painumgal, U. V.

G. S. Kumar, U. V. Painumgal, M. C. Kumar, and K. Rajesh, “Autonomous underwater vehicle for vision based tracking,” Procedia Computer Science 133, 169–180 (2018).
[Crossref]

Patel, S.

Piao, D.

Pogue, B. W.

S. L. Jacques and B. W. Pogue, “Tutorial on diffuse light transport,” J. Biomed. Opt. 13(4), 041302 (2008).
[Crossref]

Prahl, S. A.

Rahamim, G.

Rajesh, K.

G. S. Kumar, U. V. Painumgal, M. C. Kumar, and K. Rajesh, “Autonomous underwater vehicle for vision based tracking,” Procedia Computer Science 133, 169–180 (2018).
[Crossref]

Rivlin, E.

D. Sazbon, Z. Zalevsky, and E. Rivlin, “Qualitative real-time range extraction for preplanned scene partitioning using laser beam coding,” Pattern Recognition Letters 26(11), 1772–1781 (2005).
[Crossref]

Ruiz, M.

G. Neves, M. Ruiz, J. Fontinele, and L. Oliveira, “Rotated object detection with forward-looking sonar in underwater applications,” Expert Systems with Applications 140, 112870 (2020).
[Crossref]

Saif, A. S.

D. Gomes, A. S. Saif, and D. Nandi, “Robust Underwater Object Detection with Autonomous Underwater Vehicle: A Comprehensive Study,” in Proceedings of the International Conference on Computing Advancements (2020), pp. 1–10.

Saxton, W. O.

R. W. Gerchberg and W. O. Saxton, “A practical algorithm for the determination of phase image and diffraction plane pictures,” Optik 35, 237–246 (1972).

Sazbon, D.

D. Sazbon, Z. Zalevsky, and E. Rivlin, “Qualitative real-time range extraction for preplanned scene partitioning using laser beam coding,” Pattern Recognition Letters 26(11), 1772–1781 (2005).
[Crossref]

Sen, K.

K. Sen and S. J. Wilson, Radiative transfer in curved media (World Scientific, 1990).

Serikawa, S.

H. Lu, Y. Li, Y. Zhang, M. Chen, S. Serikawa, and H. Kim, “Underwater optical image processing: a comprehensive review,” Mobile Netw Appl 22(6), 1204–1211 (2017).
[Crossref]

Shapira, C.

I. Yariv, C. Shapira, H. Duadi, and D. Fixler, “Media Characterization under Scattering Conditions by Nanophotonics Iterative Multiplane Spectroscopy Measurements,” ACS Omega 4(10), 14301–14306 (2019).
[Crossref]

Shliselberg, E.

Song, Y.

S. Wang, L. Xue, J. Lai, Y. Song, and Z. Li, “Phase retrieval method for biological samples with absorption,” J. Opt. 15(7), 075301 (2013).
[Crossref]

Svaasand, L. O.

Treibitz, T.

D. Berman, D. Levy, S. Avidan, and T. Treibitz, “Underwater single image color restoration using haze-lines and a new quantitative dataset,” IEEE transactions on pattern analysis and machine intelligence (2020).

Tromberg, B. J.

Tsay, T.-T.

Tzioni, R.

Vågberg, W.

Van Gemert, M. J.

van Marie, J.

van Putten, E. G.

J. Bertolotti, E. G. van Putten, C. Blum, A. Lagendijk, W. L. Vos, and A. P. Mosk, “Non-invasive imaging through opaque scattering layers,” Nature 491(7423), 232–234 (2012).
[Crossref]

Van Staveren, H. J.

Vos, W. L.

J. Bertolotti, E. G. van Putten, C. Blum, A. Lagendijk, W. L. Vos, and A. P. Mosk, “Non-invasive imaging through opaque scattering layers,” Nature 491(7423), 232–234 (2012).
[Crossref]

Wang, H.

Z. Chen, Z. Zhang, F. Dai, Y. Bu, and H. Wang, “Monocular vision-based underwater object detection,” Sensors 17(8), 1784 (2017).
[Crossref]

Wang, N.

N. Wang, Y. Wang, and M. J. Er, “Review on deep learning techniques for marine object recognition: Architectures and algorithms,” Control Engineering Practice104458 (2020).

Wang, S.

S. Wang, L. Xue, J. Lai, Y. Song, and Z. Li, “Phase retrieval method for biological samples with absorption,” J. Opt. 15(7), 075301 (2013).
[Crossref]

Wang, Y.

N. Wang, Y. Wang, and M. J. Er, “Review on deep learning techniques for marine object recognition: Architectures and algorithms,” Control Engineering Practice104458 (2020).

Weiss, A.

D. Fixler, J. Garcia, Z. Zalevsky, A. Weiss, and M. Deutsch, “Pattern projection for subpixel resolved imaging in microscopy,” Micron 38(2), 115–120 (2007).
[Crossref]

Wilson, S. J.

K. Sen and S. J. Wilson, Radiative transfer in curved media (World Scientific, 1990).

Xue, L.

S. Wang, L. Xue, J. Lai, Y. Song, and Z. Li, “Phase retrieval method for biological samples with absorption,” J. Opt. 15(7), 075301 (2013).
[Crossref]

Yamashita, A.

A. Yamashita, M. Fujii, and T. Kaneko, “Color registration of underwater images for underwater sensing with consideration of light attenuation,” in Proceedings 2007 IEEE international conference on robotics and automation (IEEE2007), pp. 4570–4575.

Yariv, I.

I. Yariv, H. Duadi, and D. Fixler, “Depth Scattering Characterization of Multi-Layer Turbid Media Based on Iterative Multi-Plane Reflectance Measurements,” IEEE Photonics J. 12(5), 1–13 (2020).
[Crossref]

I. Yariv, C. Shapira, H. Duadi, and D. Fixler, “Media Characterization under Scattering Conditions by Nanophotonics Iterative Multiplane Spectroscopy Measurements,” ACS Omega 4(10), 14301–14306 (2019).
[Crossref]

I. Yariv, H. Duadi, and D. Fixler, “Optical method to extract the reduced scattering coefficient from tissue: theory and experiments,” Opt. Lett. 43(21), 5299–5302 (2018).
[Crossref]

I. Yariv, M. Haddad, H. Duadi, M. Motiei, and D. Fixler, “New optical sensing technique of tissue viability and blood flow based on nanophotonic iterative multi-plane reflectance measurements,” Int. J. Nanomed. 11, 5237–5244 (2016).
[Crossref]

I. Yariv, Y. Kapp-Barnea, E. Genzel, H. Duadi, and D. Fixler, “Detecting concentrations of milk components by an iterative optical technique,” J. Biophotonics 8(11-12), 979–984 (2015).
[Crossref]

I. Yariv, G. Rahamim, E. Shliselberg, H. Duadi, A. Lipovsky, R. Lubart, and D. Fixler, “Detecting nanoparticles in tissue using an optical iterative technique,” Biomed. Opt. Express 5(11), 3871–3881 (2014).
[Crossref]

I. Yariv, H. Duadi, and D. Fixler, “An optical method to detect tissue scattering: theory, experiments and biomedical applications,” presented at the SPIE BiOS2019.

I. Yariv, H. Duadi, and D. Fixler, An optical method to detect tissue scattering: theory, experiments and biomedical applications (SPIE, 2019).

Zalevsky, Z.

D. Fixler, H. Duadi, R. Ankri, and Z. Zalevsky, “Determination of coherence length in biological tissues,” Lasers Surg. Med. 43(4), 339–343 (2011).
[Crossref]

E. Grossman, R. Tzioni, A. Gur, E. Gur, and Z. Zalevsky, “Optical through-turbulence imaging configuration: experimental validation,” Opt. Lett. 35(4), 453–455 (2010).
[Crossref]

D. Fixler, J. Garcia, Z. Zalevsky, A. Weiss, and M. Deutsch, “Pattern projection for subpixel resolved imaging in microscopy,” Micron 38(2), 115–120 (2007).
[Crossref]

D. Sazbon, Z. Zalevsky, and E. Rivlin, “Qualitative real-time range extraction for preplanned scene partitioning using laser beam coding,” Pattern Recognition Letters 26(11), 1772–1781 (2005).
[Crossref]

D. Mendlovic, Z. Zalevsky, and N. Konforti, “Computation considerations and fast algorithms for calculating the diffraction integral,” Journal of Modern Optics 44(2), 407–414 (1997).
[Crossref]

Z. Zalevsky, R. G. Dorsch, and D. Mendlovic, “Gerchberg–Saxton algorithm applied in the fractional Fourier or the Fresnel domain,” Opt. Lett. 21(12), 842–844 (1996).
[Crossref]

Zhang, Y.

H. Lu, Y. Li, Y. Zhang, M. Chen, S. Serikawa, and H. Kim, “Underwater optical image processing: a comprehensive review,” Mobile Netw Appl 22(6), 1204–1211 (2017).
[Crossref]

Zhang, Z.

Z. Chen, Z. Zhang, F. Dai, Y. Bu, and H. Wang, “Monocular vision-based underwater object detection,” Sensors 17(8), 1784 (2017).
[Crossref]

ACS Omega (1)

I. Yariv, C. Shapira, H. Duadi, and D. Fixler, “Media Characterization under Scattering Conditions by Nanophotonics Iterative Multiplane Spectroscopy Measurements,” ACS Omega 4(10), 14301–14306 (2019).
[Crossref]

Appl. Opt. (3)

Biomed. Opt. Express (1)

Expert Systems with Applications (1)

G. Neves, M. Ruiz, J. Fontinele, and L. Oliveira, “Rotated object detection with forward-looking sonar in underwater applications,” Expert Systems with Applications 140, 112870 (2020).
[Crossref]

IEEE Photonics J. (1)

I. Yariv, H. Duadi, and D. Fixler, “Depth Scattering Characterization of Multi-Layer Turbid Media Based on Iterative Multi-Plane Reflectance Measurements,” IEEE Photonics J. 12(5), 1–13 (2020).
[Crossref]

Int. J. Nanomed. (1)

I. Yariv, M. Haddad, H. Duadi, M. Motiei, and D. Fixler, “New optical sensing technique of tissue viability and blood flow based on nanophotonic iterative multi-plane reflectance measurements,” Int. J. Nanomed. 11, 5237–5244 (2016).
[Crossref]

J. Biomed. Opt. (1)

S. L. Jacques and B. W. Pogue, “Tutorial on diffuse light transport,” J. Biomed. Opt. 13(4), 041302 (2008).
[Crossref]

J. Biophotonics (1)

I. Yariv, Y. Kapp-Barnea, E. Genzel, H. Duadi, and D. Fixler, “Detecting concentrations of milk components by an iterative optical technique,” J. Biophotonics 8(11-12), 979–984 (2015).
[Crossref]

J. Opt. (1)

S. Wang, L. Xue, J. Lai, Y. Song, and Z. Li, “Phase retrieval method for biological samples with absorption,” J. Opt. 15(7), 075301 (2013).
[Crossref]

J. Opt. Soc. Am. A (1)

Journal of Modern Optics (1)

D. Mendlovic, Z. Zalevsky, and N. Konforti, “Computation considerations and fast algorithms for calculating the diffraction integral,” Journal of Modern Optics 44(2), 407–414 (1997).
[Crossref]

Lasers Surg. Med. (1)

D. Fixler, H. Duadi, R. Ankri, and Z. Zalevsky, “Determination of coherence length in biological tissues,” Lasers Surg. Med. 43(4), 339–343 (2011).
[Crossref]

Micron (1)

D. Fixler, J. Garcia, Z. Zalevsky, A. Weiss, and M. Deutsch, “Pattern projection for subpixel resolved imaging in microscopy,” Micron 38(2), 115–120 (2007).
[Crossref]

Mobile Netw Appl (1)

H. Lu, Y. Li, Y. Zhang, M. Chen, S. Serikawa, and H. Kim, “Underwater optical image processing: a comprehensive review,” Mobile Netw Appl 22(6), 1204–1211 (2017).
[Crossref]

Nanophotonics (1)

R. Ankri and D. Fixler, “Gold nanorods based diffusion reflection measurements: current status and perspectives for clinical applications,” Nanophotonics 6(5), 1031–1042 (2017).
[Crossref]

Nature (1)

J. Bertolotti, E. G. van Putten, C. Blum, A. Lagendijk, W. L. Vos, and A. P. Mosk, “Non-invasive imaging through opaque scattering layers,” Nature 491(7423), 232–234 (2012).
[Crossref]

Opt. Express (1)

Opt. Lett. (3)

Optica Acta: International Journal of Optics (1)

R. Gerchberg, “Super-resolution through error energy reduction,” Optica Acta: International Journal of Optics 21, 709–720 (1974).
[Crossref]

Optik (1)

R. W. Gerchberg and W. O. Saxton, “A practical algorithm for the determination of phase image and diffraction plane pictures,” Optik 35, 237–246 (1972).

Pattern Recognition Letters (1)

D. Sazbon, Z. Zalevsky, and E. Rivlin, “Qualitative real-time range extraction for preplanned scene partitioning using laser beam coding,” Pattern Recognition Letters 26(11), 1772–1781 (2005).
[Crossref]

Procedia Computer Science (1)

G. S. Kumar, U. V. Painumgal, M. C. Kumar, and K. Rajesh, “Autonomous underwater vehicle for vision based tracking,” Procedia Computer Science 133, 169–180 (2018).
[Crossref]

Proceedings of the Advances in Robotics (1)

S. Chutia, N. M. Kakoty, and D. Deka, “A review of underwater robotics, navigation, sensing techniques and applications,” Proceedings of the Advances in Robotics 8, 1–6 (2017).
[Crossref]

Sensors (2)

M. Massot-Campos and G. Oliver-Codina, “Optical sensors and methods for underwater 3D reconstruction,” Sensors 15(12), 31525–31557 (2015).
[Crossref]

Z. Chen, Z. Zhang, F. Dai, Y. Bu, and H. Wang, “Monocular vision-based underwater object detection,” Sensors 17(8), 1784 (2017).
[Crossref]

Other (11)

D. Berman, D. Levy, S. Avidan, and T. Treibitz, “Underwater single image color restoration using haze-lines and a new quantitative dataset,” IEEE transactions on pattern analysis and machine intelligence (2020).

R. A. Gonsalves, “Perspectives on phase retrieval and phase diversity in astronomy,” in Adaptive Optics Systems IV (International Society for Optics and Photonics, 2014), p. 91482P.

H. Assadi, R. Karshafian, and A. Douplik, “Optical scattering properties of intralipid phantom in presence of encapsulated microbubbles,” Int. J. Photoenergy (2014).

J. Joslin, “Imaging sonar review for marine environmental monitoring around tidal turbines,” (2019).

N. Wang, Y. Wang, and M. J. Er, “Review on deep learning techniques for marine object recognition: Architectures and algorithms,” Control Engineering Practice104458 (2020).

D. Gomes, A. S. Saif, and D. Nandi, “Robust Underwater Object Detection with Autonomous Underwater Vehicle: A Comprehensive Study,” in Proceedings of the International Conference on Computing Advancements (2020), pp. 1–10.

I. Yariv, H. Duadi, and D. Fixler, “An optical method to detect tissue scattering: theory, experiments and biomedical applications,” presented at the SPIE BiOS2019.

I. Yariv, H. Duadi, and D. Fixler, An optical method to detect tissue scattering: theory, experiments and biomedical applications (SPIE, 2019).

A. Yamashita, M. Fujii, and T. Kaneko, “Color registration of underwater images for underwater sensing with consideration of light attenuation,” in Proceedings 2007 IEEE international conference on robotics and automation (IEEE2007), pp. 4570–4575.

J. Mueller, G. S. Fargion, and C. R. McClain, “Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Volume 6; Special Topics in Ocean Optics Protocols and Appendices; Revised,” (2003).

K. Sen and S. J. Wilson, Radiative transfer in curved media (World Scientific, 1990).

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.



Figures (7)

Fig. 1.
Fig. 1. A schematic illustration of the image method as used for the DA model. The medium (light blue area) is illuminated by the laser beam (blue arrow). The light is scattered in the medium (fading blue half-circle). To model the light propagation, the DA model uses the images method and treats the field at each point ρ (green circle) as if it were created by two master sources (black circles) located symmetrically above and beneath a chosen boundary (red dotted line). The DA model does not describe the area close to the center well; therefore, Piao’s model describes the field using two additional sources, real and image slave sources (gray circles), located symmetrically above and beneath the chosen boundary (red dotted line).
Fig. 2.
Fig. 2. The IMOPE technique’s process. (a) The experimental setup consists of a laser source (1) connected to an attenuator and a linear polarizer (2). The sample (3) sits in front of them. The detector (4), the second linear polarizer (5) and the lens (6) sit on a motorized stage. The stage moves while taking the N intensity images required for the GS algorithm. To change the magnification, we change the distances between the detector and the lens and between the lens and the object. These distances are represented by the light blue arrows. (b) An example of a phase image extracted by the GS algorithm from a phantom with µ’s = 0.75 mm−1. The orange circle divides the phase image into two ROIs: the center (single scattering area, ρ < 1/µ’s) and the ring (multiple scattering area, ρ > 1/µ’s). (c) The RMS of each ROI of the phase image is calculated and compared to the theoretical graph of phase RMS as a function of µ’s. Matching the two calculated values to the ring (red line) and center (blue line) graphs yields the µ’s of the measured object.
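The ROI split and per-ROI RMS described in panels (b)–(c) can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' implementation: the function name `roi_rms`, the pixel pitch argument, and the square-image geometry are all assumptions; the discriminating radius follows the single-scattering criterion ρ < 1/µ’s, mapped to the image plane through the magnification M.

```python
import numpy as np

def roi_rms(phase, mu_s_prime, pixel_mm, magnification=1.0):
    """Split a reconstructed phase image into center/ring ROIs by the
    radius 1/mu_s' (object-plane units) and return the RMS of each ROI
    after removing that ROI's mean phase. Illustrative sketch only."""
    ny, nx = phase.shape
    y, x = np.indices((ny, nx))
    # radial distance of every pixel from the image center,
    # converted to object-plane mm via the magnification M
    rho = np.hypot(x - nx / 2, y - ny / 2) * pixel_mm / magnification
    center = rho < 1.0 / mu_s_prime   # single-scattering ROI
    ring = ~center                     # multiple-scattering ROI

    def rms(p):
        return np.sqrt(np.mean((p - p.mean()) ** 2))

    return rms(phase[center]), rms(phase[ring])
```

A uniform phase image gives zero RMS in both ROIs, as expected, since RMS here measures phase variation about the ROI mean.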
Fig. 3.
Fig. 3. The results of the theoretical model’s adaptations. (a) The theoretical model for the different magnifications $M$: $1,\frac{2}{3},\frac{1}{2},0.4$ and $\frac{1}{3}$, represented by red, yellow, green, blue and purple, respectively. For each magnification, the solid lines describe the center (i.e., the single scattering ROI) and the dashed lines describe the ring (multiple scattering ROI). (b) With the change in the magnification, the crossing point of the ROIs’ RMS graphs shifts along the µ’s axis. The black asterisks show the extracted µ’s,cp, the µ’s value at which the crossing point occurs for each magnification M, presented versus M. The error bars are defined according to the simulation resolution.
Fig. 4.
Fig. 4. The influence of magnification on the radius that discriminates between ROIs (theoretical and experimental results are the lines and asterisks, respectively). For (a) M = 1, lower µ’s deviate from theory, and for (b) M = $\frac{1}{2}$, lower µ’s fit theory. Inset: The optical setup of each configuration is presented above each graph, where the magnification M is changed by adjusting the distances before and after the lens.
Fig. 5.
Fig. 5. The influence of magnification on the phase RMS. The figures present the phase RMS versus µ’s graphs for image magnification M of (a) $1$, (b) $\frac{1}{2}$, (c) $\frac{1}{3}$, (d) $2$ and (e) $3$. The solid lines present the theoretical graphs, blue for the center ROI and red for the ring ROI. The experimental results of the center and ring ROIs are represented by blue and red asterisks.
Fig. 6.
Fig. 6. Influence of the distance between image planes on the phase RMS. Note that we scan the same total distance; hence the double-distance analysis (blue and red circles for the center and the ring, respectively) includes half as many images as the regular analysis (blue and red asterisks for the single-distance analysis of the center and the ring, respectively). The figures present different magnifications: (a) $M = 1$, (b) $M = \frac{1}{2}$ and (c) $M = \frac{1}{3}$. For all cases, the blue solid lines present the center ROI theory and the red solid lines present the ring ROI theory.
Fig. 7.
Fig. 7. Influence of the reconstruction depth on the phase RMS. The reconstruction depth was changed by taking the same number of images and changing the distance between them: single and triple distances Δz between the images (corresponding to 0.635 and 1.905 mm, represented by asterisks and circles, respectively). The theoretical graphs (lines) and experimental results (points) for image magnification M = $\frac{1}{3}$ are presented. The red and the blue lines and points represent the ring and the center ROIs, respectively.
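The single/double/triple-distance analyses of Figs. 6 and 7 amount to subsampling the captured z-stack of intensity images. A minimal sketch under stated assumptions (hypothetical function and parameter names, a NumPy array stack) that also checks the enlarged spacing against the sampling bound Δz < Δx²/λ used by the technique:

```python
import numpy as np

def subsample_stack(stack, k, dz, dx, wavelength):
    """Keep every k-th intensity image from a z-stack, multiplying the
    plane spacing dz by k (as in the single/double/triple-distance
    analyses), and report whether the new spacing still satisfies the
    sampling condition dz_new < dx**2 / wavelength. Illustrative only;
    all units must be consistent (e.g., mm throughout)."""
    dz_new = k * dz
    ok = dz_new < dx ** 2 / wavelength
    return stack[::k], dz_new, ok
```

For example, tripling the spacing of a 12-plane stack leaves 4 planes; whether the enlarged Δz still meets the bound depends on the pixel size and wavelength chosen.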

Tables (1)

Table 1. A summary of the µ’s ranges that were found to be most informative due to their linear behavior.

Equations (5)


$$RMS_{\varphi}=\sqrt{\frac{\sum_{\rho\in\gamma}\left|\,|I_{R_{dual}}(\rho)|\left(\exp\!\left(i(\varphi(\rho)-\varphi_{av})\right)-e^{i0}\right)\right|^{2}}{\sum_{\rho}\left|\,|I_{R_{dual}}(\rho)|\,\right|^{2}}}$$

$$DPF=\frac{1}{\rho\,\mu_{a}}\left[\ln\!\left(\frac{S}{\Psi}\right)\right]$$

$$\varphi(\rho)=\frac{2\pi n}{\lambda}\,DPF\cdot\rho$$

$$MFP'=\frac{1}{\mu_{s}'}$$

$$\Delta z<\frac{\Delta x^{2}}{\lambda}$$
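The RMS_φ expression above can be transcribed directly into NumPy as a sanity check (noting that $e^{i0}=1$). This is a sketch under assumptions: the function name and the availability of the reconstructed amplitude $|I_{R_{dual}}(\rho)|$ and phase $\varphi(\rho)$ as arrays are illustrative, not part of the published implementation.

```python
import numpy as np

def phase_rms(amplitude, phase):
    """Amplitude-weighted phase RMS: each pixel's unit phasor, referenced
    to the mean phase phi_av, is weighted by the reconstructed amplitude
    |IR_dual(rho)|, and the result is normalized by the total weight."""
    phi = phase - phase.mean()                              # phi(rho) - phi_av
    num = np.sum(np.abs(amplitude * (np.exp(1j * phi) - 1.0)) ** 2)
    den = np.sum(np.abs(amplitude) ** 2)
    return np.sqrt(num / den)
```

A spatially uniform phase gives RMS_φ = 0, since every phasor coincides with the mean-phase reference; phase variation across the image raises the RMS.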
