Abstract

In this study, a coaxial, multi-aperture polarization imaging system was designed to obtain dehazed images under bad weather conditions by effectively utilizing polarization information. The system captures four polarization azimuth images simultaneously by integrating four polarizers with different transmission directions in front of the sub-apertures. It uses the energy of the incident light efficiently while retaining the full resolution of the camera. To solve the image registration problem, a translation registration method for the sub-aperture polarized images was implemented based on the phase-only correlation algorithm. Moreover, a polarization dehazing model suited to this system is employed to remove the haze effect from the captured images. Experimental data demonstrate that the system achieves a pleasing visual effect, and its image quality was verified by several objective evaluation methods, including the line spread function and quantified detail information. The core advantages of the system are real-time operation and good image quality in terms of contrast, detail, and clarity. The system can therefore be adapted effectively to long-distance, high-resolution, real-time imaging in bad weather conditions such as haze.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Images of outdoor scenes are often degraded by bad weather, in which atmospheric phenomena such as fog and haze reduce the visibility of the scene. In bad weather the aerosol contains additional suspended particles, so the reflected light is scattered. This scattering fades colors and reduces the contrast of the observed objects, and such degraded images often have negative effects on environmental monitoring, remote sensing, target detection, etc. [1–3].

In recent years, restoring degraded images by image processing has attracted increasing attention. Some methods remove the haze effect directly from the intensity image, such as the dark channel prior [4], the Retinex theory [5] and homomorphic filtering [6]. However, some image information is inevitably lost in bad weather, even though all of the available information is valuable in dehazing applications [1,4,7]; intensity-only methods therefore cannot fully exploit the scene information to recover a clear image. In addition, these methods may take a long time to remove the haze effect in practical applications [7,8]. At present, some researchers have focused on employing polarization information to image through turbid media [9–11]. Notably, the polarization characteristics of the atmospheric scattered light differ from those of the light reflected by the target [10,12]. Therefore, the extra information in polarized images can be exploited in polarization dehazing. In this field, researchers have established an atmospheric dehazing model using feature information obtained from polarization azimuth images [1,13,14]. The traditional way to obtain polarization azimuth information is to rotate a polarizer in front of the lens [11,15]. However, manually rotating the polarizer may shake the camera, resulting in rotation errors and inaccurate polarization information. Moreover, images of a variety of scenes cannot be acquired effectively in real time this way, which limits wide application [16]. For that reason, researchers have used a dichroic prism to obtain images at four different azimuth angles simultaneously, which eliminates the error caused by manually rotating the polarizer [17]. Unfortunately, the low light-energy utilization makes it hard for this method to obtain valid images in low-light conditions, because the incident beam is split by the prism before being received by the different detectors [18,19].

In this paper, a novel polarization dehazing system based on a coaxial and multi-aperture polarimetric camera is proposed. The system adopts an improved polarization dehazing algorithm and sub-aperture polarized image registration to reconstruct clear images in real time. Compared with the traditional method, this system avoids rotating the polarizer and increases the utilization of light energy. Several groups of experiments were conducted to show the dehazing effect. Three objective evaluation methods, namely the Mean Structural Similarity (MSSIM), the Line Spread Function (LSF) and high-frequency information statistics, certified that the dehazing system presented in our study significantly removes the impact of haze on images.

2. Polarization dehazing model

The degradation of image quality is a well-known problem, usually caused by poor atmospheric conditions such as hazy weather through the light scattering effect. Unwanted scattering components enter the imaging light path and are received by the detector. The scattered sunlight can be seen as a major source of optical noise, which reduces image clarity, as shown in Fig. 1 [15]. In this case, the intensity received by the detector is composed of the target light and the total airlight, as shown in Eq. (1):

$$I = I_c \cdot e^{-\beta d} + I_\infty \left(1 - e^{-\beta d}\right),$$
where Ic is the clear target image (Ic = I∞·ρ), ρ is the scene irradiance, β is the scattering coefficient related to the composition of the atmospheric particles, d is the distance from the target to the camera, and I∞ is the intensity image of the airlight in a region without any object. The two terms on the right of Eq. (1) can be explained separately using the attenuation model and the atmospheric light model shown in Fig. 2 [19,20].
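As a sanity check of Eq. (1), the forward scattering model can be sketched in a few lines of NumPy. The image, airlight value, scattering coefficient, and depth map below are arbitrary illustrative values, not data from the system:

```python
import numpy as np

def apply_haze(clear, airlight, beta, depth):
    """Forward model of Eq. (1): I = I_c * exp(-beta*d) + I_inf * (1 - exp(-beta*d))."""
    t = np.exp(-beta * depth)          # direct-transmission term e^(-beta*d)
    return clear * t + airlight * (1.0 - t)

# Toy example: a uniform target fades toward the airlight value as depth grows.
clear = np.full((4, 4), 200.0)                   # clear target intensity I_c
depth = np.linspace(0.0, 3.0, 16).reshape(4, 4)  # distance map d
hazy = apply_haze(clear, airlight=255.0, beta=1.0, depth=depth)
```

At zero depth the pixel equals the clear value; with growing depth it converges to the airlight intensity, which is exactly the degradation the dehazing model must invert.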

 

Fig. 1. Scattering imaging model under hazy weather.


 

Fig. 2. Variation of light intensity with scene distance.


According to Ref. [1], the information received by the detector can be divided into two parts, the atmospheric light and the attenuated light. By utilizing the polarization characteristics of the light intensity, the atmospheric light can be estimated effectively [21]. The atmospheric light is considered to be partially polarized, so the Degree of Linear Polarization (DoLP) of the atmospheric light can be represented by Eq. (2):

$$DoLP_\infty = \frac{I_{\infty\max} - I_{\infty\min}}{I_{\infty\max} + I_{\infty\min}} = \frac{\left(S_{1E}^2 + S_{2E}^2\right)^{1/2}}{S_{0E}},$$
where DoLP∞ is a matrix representing the degree of linear polarization in the region without any object, with the same dimensions as the raw intensity image. I∞max and I∞min are the maximal and minimal intensity images of the airlight in the region without any object, respectively [11]. S0E, S1E and S2E are the Stokes vector elements of the image region without any object [14]. The total airlight intensity image is given by Eq. (3):
$$I_\infty \left(1 - e^{-\beta d}\right) = I^t \cdot \frac{DoLP_A}{DoLP_\infty},$$
where It is the captured intensity image of the target scene (It = Imax + Imin); Imax and Imin are the maximal and minimal intensity images of the target scene obtained when rotating the polarizer, respectively. DoLPA is a matrix containing the degree of linear polarization of each pixel in an image captured by our system. Once the total airlight intensity is obtained from Eq. (3), the dehazed image can be recovered as shown in Eq. (4):
$$I_c = I^t\left(1 - \frac{DoLP_A}{DoLP_\infty}\right) \bigg/ \left(1 - \frac{I^t \, DoLP_A}{I_\infty \, DoLP_\infty}\right).$$
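A minimal NumPy sketch of Eqs. (2)–(4), assuming Imax, Imin and the sky-region estimates DoLP∞ and I∞ are already available as arrays or scalars (the function and variable names are ours, not from the paper; the eps guard against division by zero is our addition):

```python
import numpy as np

def polarization_dehaze(i_max, i_min, dolp_inf, i_inf, eps=1e-6):
    """Estimate the airlight via DoLP (Eq. 3) and recover the clear image (Eq. 4)."""
    i_total = i_max + i_min                                 # I^t = Imax + Imin
    dolp_a = (i_max - i_min) / np.maximum(i_total, eps)     # per-pixel DoLP, cf. Eq. (2)
    airlight = i_total * dolp_a / dolp_inf                  # total airlight, Eq. (3)
    transmission = np.maximum(1.0 - airlight / i_inf, eps)  # estimate of e^(-beta*d)
    return (i_total - airlight) / transmission              # Eq. (4)
```

For example, a pixel with i_max = 106, i_min = 74, a sky DoLP of 0.4 and I∞ = 160 yields an airlight of 80 and a recovered value of 200.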
On the basis of this polarization dehazing model, I∞, Imax and Imin must be measured to obtain a clear image, as shown in Eqs. (2) and (3). Unfortunately, it is very difficult to manually find the polarization azimuths that satisfy the Imax and Imin conditions. In this study, two sets of images of the target scene were captured, as shown in Fig. 3: Figs. 3(a) and 3(b) are candidates for the maximum-intensity image of this scene, while Figs. 3(c) and 3(d) are approximations of the minimum-intensity image. As can be seen from Fig. 3, it is hard to tell which group corresponds to the most accurate Imax and Imin in experiment, since the images of the 3(a)/3(c) group are quite similar to those of the 3(b)/3(d) group. In fact, the polarization azimuths of Figs. 3(a) and 3(b) differ by 5 degrees. Figure 3(e) shows the differences in the DoLP information obtained from the two sets of polarization azimuth images, where brighter points represent greater differences. The polarization results therefore differ significantly when different researchers subjectively adjust the polarization azimuths to determine Imax and Imin, and high-accuracy data are difficult to obtain in practice. To solve this problem, a coaxial and multi-aperture polarimetric camera system has been designed based on the Stokes theory to obtain the polarization characteristics.

 

Fig. 3. (a) and (c) are a set of mutually orthogonal polarization azimuth images. (b) and (d) are a set of polarization azimuth images that differ by 5 degrees from (a) and (c), respectively. (e) Difference in polarization between two sets of images.


The polarization state of the beam imaging the scene can be described by four measurable scalars [16]. In this study, the total airlight intensity image can be calculated directly from the captured images, as shown in Eq. (5):

$$I_\infty \left(1 - e^{-\beta d}\right) = \frac{S_0 \cdot DoLP_A}{DoLP_\infty},$$
where S0 is the Stokes vector element expressing the total light intensity of the target scene. By adopting this camera and optimizing the parameters of the traditional dehazing model, a dehazing model that does not require the orthogonally polarized images is built. The dehazed image can be expressed by Eq. (6):
$$I_N = I^t\left(1 - \frac{\sqrt{S_1^2 + S_2^2}\,/\,S_0}{\sqrt{S_{1E}^2 + S_{2E}^2}\,/\,S_{0E}}\right)\left(1 - \frac{\sqrt{S_1^2 + S_2^2}}{I_\infty \cdot \sqrt{S_{1E}^2 + S_{2E}^2}\,/\,S_{0E}}\right)^{-1},$$
where S1 and S2 are the Stokes vector elements of the target scene. With the captured images, the dehazed image can be represented by Eq. (7):
$$\begin{aligned} I_N ={}& S_0\left( 1 - \frac{\gamma_{90} \sqrt{(-\varepsilon_{45})^2 + (-\varepsilon'_{45})^2 + 2I_0^2 + I_{135}^2 + 2\left(w_{45}^{135} - w_0^{45} - w_0^{135} - w_{45}^{90}\right)}}{S_0 \sqrt{(-\gamma_{45})^2 + (-\gamma'_{45})^2 + 2I_{0\infty}^2 + I_{135\infty}^2 + 2\left(\bar{w}_{45}^{135} - \bar{w}_0^{45} - \bar{w}_0^{135} - \bar{w}_{45}^{90}\right)}} \right)\\ & \times \left( 1 - \frac{\gamma_{90} \sqrt{(-\varepsilon_{45})^2 + (-\varepsilon'_{45})^2 + 2I_0^2 + I_{135}^2 + 2\left(w_{45}^{135} - w_0^{45} - w_0^{135} - w_{45}^{90}\right)}}{I_\infty \sqrt{(-\gamma_{45})^2 + (-\gamma'_{45})^2 + 2I_{0\infty}^2 + I_{135\infty}^2 + 2\left(\bar{w}_{45}^{135} - \bar{w}_0^{45} - \bar{w}_0^{135} - \bar{w}_{45}^{90}\right)}} \right)^{-1}, \end{aligned}$$
where
$$\left\{ \begin{aligned} & -\varepsilon_{45} = I(45^\circ) - I(90^\circ), && -\varepsilon'_{45} = I(0^\circ) - 2I(45^\circ),\\ & \gamma_{90} = I(0^\circ)_\infty + I(90^\circ)_\infty, && -\gamma_{45} = I(45^\circ)_\infty - I(90^\circ)_\infty,\\ & -\gamma'_{45} = I(0^\circ)_\infty - 2I(45^\circ)_\infty, && w_\alpha^\beta = I(\alpha^\circ) \cdot I(\beta^\circ),\\ & \bar{w}_\alpha^\beta = I(\alpha^\circ)_\infty \cdot I(\beta^\circ)_\infty. \end{aligned} \right.$$
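The Stokes elements used above can be formed directly from the four sub-aperture images. A common convention, assumed here since the paper does not spell it out, is S0 = (I0 + I45 + I90 + I135)/2, S1 = I0 − I90, S2 = I45 − I135:

```python
import numpy as np

def stokes_dolp(i0, i45, i90, i135, eps=1e-6):
    """Linear Stokes elements and DoLP from four polarization azimuth images."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # 0/90 degree difference
    s2 = i45 - i135                      # 45/135 degree difference
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, eps)
    return s0, s1, s2, dolp
```

For instance, light fully polarized along 0° (I0 = 1, I90 = 0, I45 = I135 = 0.5) gives S0 = 1 and DoLP = 1, as expected.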

The sub-aperture images have pixel translation errors due to the displacement among the four detecting positions of the multi-aperture camera. Figures 4(a-d) show different sub-aperture polarized images taken simultaneously with the camera system shown in Fig. 4(h). The system is composed of four Point Grey GS3-U3-41C6C-C cameras, each with a resolution of 2048×2048 pixels, and polarizers at 0°, 45°, 90°, and 135° (0° means that the transmission axis is parallel to the X-axis of the image) are installed in front of the sub-camera lenses. The brightness of the images from the four azimuths differs because each image has a different polarization azimuth. The deviations among the different sub-aperture polarized images can also be seen by direct observation of Figs. 4(a-d). The intensity image shown in Fig. 4(e) is calculated from two orthogonal polarization azimuth images, and Fig. 4(f) is the DoLP result calculated from Figs. 4(a-d). The DoLP information in this image exhibits a ghosting effect, which strongly influences the accuracy of the dehazing algorithm. Furthermore, the reduced clarity after image superposition is quantified in Fig. 4(g): the intensity curve along column 264, from row 360 to row 500, traverses the windows of the red building in the image. This line passes through the most noticeable building in Figs. 4(a) and 4(e), clearly demonstrating the loss of clarity in the superposed images. Therefore, correcting the pixel offset errors among the sub-aperture polarized images is necessary.

 

Fig. 4. (a-d) Multi-aperture system images of I(0°), I(45°), I(90°) and I(135°), respectively. (e) The intensity image. (f) The DoLP result calculated by the Stokes method. (g) Gray-value statistics of more than 140 pixels along column 264 of (a) and (e). (h) The coaxial and multi-aperture polarimetric camera.


3. Sub-aperture polarized image registration model

The translation errors of corresponding pixels between two images are corrected with a method based on the phase-only correlation algorithm. As shown in Fig. 4(h), the sub-cameras are fixed in a steel frame, so no rotation or scaling errors exist among them; only a translation transformation relates the captured images. In addition, polarized images are very sensitive to pixel errors, so a high-accuracy translation registration algorithm is essential.

A sub-pixel matching algorithm based on the Fourier transform can be executed quickly with a small amount of computation, without interference from the polarization information and noise in haze images. The method mainly applies the Phase-Only Correlation (POC) between two images [22]. Curve fitting is then performed using several points near the maximum POC value to find the actual maximum POC position, from which an accurate estimate of the sub-pixel translation is obtained.

A translation of the signal in the spatial domain manifests as a phase change in the frequency domain, which is exploited when performing pixel registration between each pair of the four polarized images. The images are therefore processed by the Fourier transform as in Eq. (8):

$$\left\{ \begin{aligned} f_i(x, y) &= f_j(x - x_0, y - y_0)\\ F_j(u, v) &= F_i(u, v)\, e^{-i 2\pi \left(u x_{i0}/M + v y_{i0}/N\right)} \end{aligned} \right. \quad \left(i = 45^\circ,\; j \in \{0^\circ, 90^\circ, 135^\circ\}\right),$$
where fi(x, y) and fj(x, y) denote images from different polarization azimuths, M×N is the size of the image, and fi(x, y) is the shifted result, i.e., the image fj(x, y) moved by (x0, y0) in the horizontal and vertical directions, respectively. Fi(u, v) and Fj(u, v) are the Fourier transforms of fi(x, y) and fj(x, y), respectively. The images from any two polarization azimuths of the system have the same Fourier amplitude but different phases, and the phase difference is determined by the pixel displacement between the two polarized images. The phase difference can therefore be represented by the inverse Fourier transform of the normalized cross-power spectrum, giving the pulse function in Eq. (9):
$$c_j(u, v) = F^{-1}\left[\frac{F_i^{\ast}(u, v)\, F_j(u, v)}{|F_i(u, v)|\,|F_j(u, v)|}\right] = \frac{1}{MN}\,\frac{\sin[\pi(x - x_{i0})]}{\sin\left[\frac{\pi}{M}(x - x_{i0})\right]}\,\frac{\sin[\pi(y - y_{i0})]}{\sin\left[\frac{\pi}{N}(y - y_{i0})\right]},$$
where Fi*(u, v) is the conjugate of Fi(u, v), and the phase of the cross-power spectrum equals the phase difference between the two images. The pulse function is nonzero only at the position (xi0, yi0), which is exactly the pixel shift to be determined. Since M and N are usually large, the values of π(x−xi0)/M and π(y−yi0)/N are close to 0, so Eq. (9) can be approximated by the sinc function in Eq. (10):
$$c_j(u, v) \approx \frac{\sin[\pi(x - x_{i0})]}{\pi(x - x_{i0})}\,\frac{\sin[\pi(y - y_{i0})]}{\pi(y - y_{i0})} = \operatorname{sinc}(x - x_{i0})\operatorname{sinc}(y - y_{i0}).$$
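A compact NumPy sketch of the integer-pixel POC stage of Eqs. (8)–(10); the eps guard against zero spectral magnitudes and the wrap-around handling of negative shifts are our additions:

```python
import numpy as np

def poc_shift(f_i, f_j, eps=1e-12):
    """Integer-pixel translation of f_j relative to f_i by phase-only correlation."""
    cross = np.conj(np.fft.fft2(f_i)) * np.fft.fft2(f_j)       # cross-power spectrum
    poc = np.fft.ifft2(cross / np.maximum(np.abs(cross), eps)).real
    y0, x0 = np.unravel_index(np.argmax(poc), poc.shape)       # peak of the pulse function
    if y0 > f_i.shape[0] // 2:                                 # wrap to signed shifts
        y0 -= f_i.shape[0]
    if x0 > f_i.shape[1] // 2:
        x0 -= f_i.shape[1]
    return y0, x0
```

For a circularly shifted copy of an image the peak is an exact delta, so `poc_shift(img, np.roll(img, (3, -4), axis=(0, 1)))` returns `(3, -4)`.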
However, an additional calculation of the actual peak of cj(u, v) is necessary when xi0 and yi0 are not integers. Hence, sub-pixel motion estimation is indispensable and further improves the registration accuracy.

Since the neighborhood of the sinc function's maximum can be approximated by a quadric surface, quadratic surface fitting based on the least-squares method can be applied [23]. By fitting a surface to the phase correlation values near the peak, the sub-pixel translation parameters are obtained, yielding the coordinate displacements among the four polarization azimuth images.
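The idea can be illustrated with a simpler stand-in for the full least-squares quadric fit: a parabola through the three correlation samples per axis around the integer peak (a common simplification of the paper's 2-D surface fit; the function names are ours):

```python
def subpixel_peak(poc, y0, x0):
    """Refine an integer correlation peak with a three-point parabola fit per axis."""
    def parabola_offset(left, center, right):
        # Vertex offset of the parabola through (-1, left), (0, center), (1, right).
        denom = left - 2.0 * center + right
        return 0.0 if denom == 0.0 else 0.5 * (left - right) / denom
    dy = parabola_offset(poc[y0 - 1, x0], poc[y0, x0], poc[y0 + 1, x0])
    dx = parabola_offset(poc[y0, x0 - 1], poc[y0, x0], poc[y0, x0 + 1])
    return y0 + dy, x0 + dx
```

For a correlation surface that is exactly quadratic near its peak, this refinement recovers the fractional peak position without error.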

After obtaining the image shifts, the sizes and object pixel positions of the multiple polarized images are aligned by cropping the image edges. With these aligned images, a clear image can be obtained without blur. Figure 5(a) shows the sum of the errors of the four polarized images before registration, and Fig. 5(b) shows the total errors after registration; the experimental results show that most of the errors between each pair of polarized images are eliminated. In addition, we plotted the pixel statistics along row 939 of the four images. Before registration the images are displaced, so the position of the maximum intensity differs in Fig. 5(c). As clearly shown in Fig. 5(d), after registration the intensity curves of the four polarized images follow similar trends. These results show that our registration method is effective; more results are given in the next section.

 

Fig. 5. Pixel statistics before and after registration. (a) result before registration. (b) result after registration. (c) pixel statistics of (a). (d) pixel statistics of (b).


4. Experiments and analysis

The coaxial and multi-aperture polarimetric camera is applied to collect four images of different polarization azimuths simultaneously. The dehazed image is obtained by feeding the information of these images into the dehazing algorithm. Figure 6 shows the dehazing results of the images in Figs. 4(a-d) before and after registration.

 

Fig. 6. Dehazed image of Figs. 4(a-d). (a) is the dehazed image before registration. (b) is the dehazed image after registration.


Figure 6(a) illustrates the unregistered dehazed image. The pixel mismatch among the four-aperture polarimetric cameras blurs the targets and degrades the clarity of the image, and the noise in the sky region is severe due to the mismatch errors. Figure 6(b) shows the result after image registration, where the details of the selected region are clearly enhanced and the color information in the scene is restored better. From this subjective evaluation of the dehazed images, the dehazing system performs well after registration. Table 1 compares the MSSIM values [24] of the two images before and after registration, which reflect their structural similarity; when an image is blurred, its MSSIM values are low. As can be seen from Table 1, the MSSIM values of Fig. 6(b) are much higher than those of Fig. 6(a). The experimental results certify that the registered dehazed image retains the original structure information to a large extent.


Table 1. Mean Structural Similarity Value
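For reference, the per-window score underlying the MSSIM of Table 1 can be sketched as follows; the constants c1 and c2 are the standard values for 8-bit images from Ref. [24], and MSSIM is the mean of this score over a sliding window:

```python
import numpy as np

def ssim_window(x, y, c1=6.5025, c2=58.5225):
    """SSIM score of one window pair; MSSIM averages this over all windows."""
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (x.var() + y.var() + c2))
```

An image window compared with itself scores 1; blurring lowers the covariance term and hence the score, which is why the misregistered result in Fig. 6(a) yields lower values.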

In addition, the Edge Spread Function (ESF) and the LSF were utilized to evaluate the blurring degree of the reconstructed images. A high-precision Fermi function was used to fit the ESF of the edge region, transforming the two-dimensional coordinate information into a one-dimensional variable x that represents the distance from each point in the edge area to the fitted straight line [25]. The Fermi function in Eq. (11), used for the ESF fitting, is robust to random noise. The LSF is obtained by analytically differentiating the ESF, as in Eq. (12) [25,26]:

$$F(x) = \frac{a}{1 + \exp\left[(x - b)/c\right]} + D,$$
$$LSF(x) = \frac{dF(x)}{dx} = \frac{-a \exp\left[(x - b)/c\right]}{c\left\{\exp\left[(x - b)/c\right] + 1\right\}^2},$$
where a, b, c and D are four fitting parameters; a detailed explanation and calculation are elaborated in Ref. [26]. Choosing the same position on the building in Figs. 6(a) and 6(b), the ESF and LSF were fitted to the edge region as shown in Fig. 7. Figures 7(a) and 7(b) show the ESF evaluation results; the slope of the boundary transition in Fig. 7(a) is significantly smaller than that in Fig. 7(b). Figure 7(c) compares the two results by LSF more intuitively. In the horizontal direction, the same conclusion is obtained from the curves in Figs. 7(d-f). This local analysis of edge sharpness shows that image sharpness is greatly improved after registration, so implementing registration in the dehazing system is necessary.
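Equations (11) and (12) translate directly into code; the check below verifies that the analytic LSF matches a numerical derivative of the Fermi ESF (the parameter values are illustrative only, not fitted to our data):

```python
import numpy as np

def fermi_esf(x, a, b, c, D):
    """Fermi edge-spread function of Eq. (11)."""
    return a / (1.0 + np.exp((x - b) / c)) + D

def lsf(x, a, b, c):
    """Line spread function: analytic derivative of the Fermi ESF, Eq. (12)."""
    e = np.exp((x - b) / c)
    return -a * e / (c * (e + 1.0) ** 2)
```

In practice a, b, c and D are obtained by least-squares fitting of F(x) to the measured edge profile, after which the LSF width characterizes the blur.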

 

Fig. 7. Building boundary definition evaluation result. (a) and (d) are the statistical results of using ESF on the horizontal and vertical boundaries of Fig. 6(a), respectively. (b) and (e) are the statistical results of using ESF on the horizontal and vertical boundaries of Fig. 6(b), respectively. (c) and (f) are the comparison of the evaluation results of Fig. 6(a) and (b) by LSF.


In addition, clear-scene results of this dehazing system are demonstrated in Fig. 8. Figures 8(a) and 8(e) show the raw images, while Figs. 8(c) and 8(g) show the dehazed images after registration. The distant objects become clear, and the building regions show better contrast and detail than in the haze images. Moreover, the intensity histograms of the red, green, and blue channels are shown in Figs. 8(b), 8(d), 8(f) and 8(h), representing the intensity distributions of the corresponding images. In Figs. 8(b) and 8(f), the pixel intensities span a range of approximately 150 gray levels (the red and green channels range from about 30 to 180, the blue channel from about 40 to 200), which is relatively concentrated. The intensity distributions after dehazing, shown in Figs. 8(d) and 8(h), are about 1.7 times wider than before. Hence, the dehazed images have better visual effects.

 

Fig. 8. Haze images, dehazed images and color statistical images. (a) and (e) are the haze images. (b), (d), (f) and (h) are the RGB color statistics images of (a), (c), (e) and (g), respectively. (c) and (g) are dehazed images after registration.


Besides, the high-frequency information of the images, extracted by the Canny edge detection algorithm [27], is shown in Fig. 9. The high-frequency information of each haze image is compared with that of its dehazed version under the same Canny parameters. It can be intuitively seen that the detailed texture information after processing by the dehazing system is more abundant. As shown in Fig. 9(b), the high-frequency information of the windows in the building is restored better than in Fig. 9(a), and the distant building and surface details in Fig. 9(d) are also reconstructed. The number of extracted edge pixels is employed to represent the richness of detail in the scene; the counts for the haze images and the dehazed images of the two scenes are 7511, 7425, 14536 and 38577 pixels, respectively. The improvement of high-frequency information makes the targets look more realistic, raises the level of detail, and helps recover the information of distant objects in the scene.
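The detail statistic is, in principle, an edge-pixel count. Since the Canny detector itself requires an image-processing library, a simple gradient-magnitude threshold serves as a hedged stand-in here (the threshold value is arbitrary, and this simplification omits Canny's smoothing, non-maximum suppression and hysteresis):

```python
import numpy as np

def highfreq_pixel_count(img, thresh):
    """Count pixels whose gradient magnitude exceeds a threshold,
    a simplified stand-in for the Canny edge-pixel statistic."""
    gy, gx = np.gradient(img.astype(float))   # finite-difference gradients
    return int(np.count_nonzero(np.hypot(gx, gy) > thresh))
```

A dehazed image with sharper edges and richer texture produces a larger count, matching the trend of the pixel numbers reported above.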

 

Fig. 9. The Canny edge detection results. (a) and (c) are high-frequency information extraction results of the haze images in Figs. 8(a) and 8(e), respectively. (b) and (d) are high-frequency information extraction results of the dehazed images in Figs. 8(c) and 8(g), respectively.


The coaxial and multi-aperture imaging system has an obvious advantage: in both intuitive observation and objective evaluation of the intensity images, the processed images show improved quality in terms of contrast, detail, and clarity. These evaluation indicators agree well with the conclusions of the subjective assessments above.

5. Conclusion

This study presents a novel polarization dehazing system. With a coaxial and multi-aperture polarimetric camera that captures the linearly polarized information simultaneously, the designed system solves the problems of poor real-time performance in polarized image acquisition, low energy utilization and manual operation errors, enhancing its practicability under haze weather conditions. Along with the imaging system, the dehazing algorithm is improved to realize clear imaging of the scene without the need to acquire Imax and Imin, freeing the system from the complex measurement process. Multiple images of different polarization azimuths are corrected by the sub-aperture polarized image registration model. Compared with the haze image, the dehazed image is reconstructed with richer colors, higher contrast, approximately 3 times more detail information and 1.5 times greater MSSIM values. These advantages make the system effectively adaptable to long-distance, high-resolution and real-time imaging in haze weather.

Funding

National Natural Science Foundation of China (NSFC) (61705175); Natural Science Foundation of Shaanxi Province (2018JQ6010); China Postdoctoral Science Foundation (2017M613063); the Fundamental Research Funds for the Central Universities (20103196515); the Innovation Fund of Xidian University (20109196769).

Acknowledgements

We thank Associate Professor Rui Gong from Xidian University for checking the English grammar of this paper. Thanks to Kui Yang from Xidian University for helping to acquire experimental data in this study.

References

1. Y. Y. Schechner, S. G. Narasimhan, and S. K. Nayar, “Polarization-based vision through haze,” Appl. Opt. 42(3), 511–525 (2003). [CrossRef]  

2. R. L. Spellicy, “Optical Environmental Monitoring in Industry,” Opt. Photonics News 6(1), 22–26 (1995). [CrossRef]  

3. A. Akula, R. Ghosh, S. Kumar, and H. K. Sardana, “Moving target detection in thermal infrared imagery using spatiotemporal information,” J. Opt. Soc. Am. A 30(8), 1492–1501 (2013). [CrossRef]  

4. K. He, J. Sun, and X. Tang, “Single image haze removal using dark channel prior,” IEEE Trans. Pattern Anal. Mach. Intell. 33(12), 2341–2353 (2011). [CrossRef]  

5. G. Orsini and G. Ramponi, “A modified retinex for image contrast enhancement and dynamics control,” in Proceedings of IEEE Conference on International Conference on Image Processing (IEEE, 2003), pp. 393–396.

6. M. J. Seow and V. K. Asari, “Ratio rule and homomorphic filter for enhancement of digital colour image,” Neurocomputing 69(7-9), 954–958 (2006). [CrossRef]  

7. J. Liang, L. Ren, H. Ju, W. Zhang, and E. Qu, “Polarimetric dehazing method for dense haze removal based on distribution analysis of angle of polarization,” Opt. Express 23(20), 26146–26157 (2015). [CrossRef]  

8. F. Liu, L. Cao, X. Shao, P. Han, and X. Bin, “Polarimetric dehazing utilizing spatial frequency segregation of image,” Appl. Opt. 54(27), 8116–8122 (2015). [CrossRef]  

9. X. Li, H. Hu, L. Zhao, H. Wang, Y. Yu, L. Wu, and T. Liu, “Polarimetric image recovery method combining histogram stretching for underwater imaging,” Sci. Rep. 8(1), 12430 (2018). [CrossRef]  

10. X. Xiao, B. Javidi, G. Saavedra, M. Eismann, and M. Martinez-Corral, “Three-dimensional polarimetric computational integral imaging,” Opt. Express 20(14), 15481–15488 (2012). [CrossRef]  

11. F. Liu, P. Han, Y. Wei, K. Yang, S. Huang, X. Li, G. Zhang, L. Bai, and X. Shao, “Deeply seeing through highly turbid water by active polarization imaging,” Opt. Lett. 43(20), 4903–4906 (2018). [CrossRef]  

12. M. I. Mishchenko, Y. S. Yatskiv, V. K. Rosenbush, and G. Videen, Polarimetric Detection, Characterization, and Remote Sensing (Springer, 2011).

13. J. Mudge and M. Virgen, “Real time polarimetric dehazing,” Appl. Opt. 52(9), 1932–1938 (2013). [CrossRef]  

14. L. Cao, X. Shao, F. Liu, and L. Wang, “Dehazing method through polarimetric imaging and multi-scale analysis,” Proc. SPIE 9501, 950111 (2015). [CrossRef]  

15. Y. Y. Schechner and N. Karpel, “Recovery of underwater visibility and structure by polarization analysis,” IEEE J. Oceanic Eng. 30(3), 570–587 (2005). [CrossRef]  

16. J. Liang, L. Ren, E. Qu, B. Hu, and Y. Wang, “Method for enhancing visibility of hazy images based on polarimetric imaging,” Photonics Res. 2(1), 38–44 (2014). [CrossRef]  

17. W. Zhang, J. Liang, L. Ren, H. Ju, E. Qu, Z. Bai, Y. Tang, and Z. Wu, “Real-time image haze removal using an aperture-division polarimetric camera,” Appl. Opt. 56(4), 942–947 (2017). [CrossRef]  

18. F. Chen, A. Henry, M. S. Brian, and R. A. Gonzalo, “Compressive spectral polarization imaging by a pixelized polarizer and colored patterned detector,” J. Opt. Soc. Am. A 32(11), 2178–2188 (2015). [CrossRef]  

19. S. G. Narasimhan and S. K. Nayar, “Shedding light on the weather,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2003), pp. I-665–I-672.

20. S. G. Narasimhan and S. K. Nayar, “Vision and the atmosphere,” Int. J. Comput. Vis. 48(3), 233–254 (2002). [CrossRef]  

21. M. Sulami, I. Glatzer, R. Fattal, and M. Werman, “Automatic recovery of the atmospheric light in hazy images,” International Conference on Computational Photography (IEEE, 2014), pp. 1–11.

22. C. M. Persons, D. B. Chenault, M. W. Jones, and C. A. Farlow, “Automated registration of polarimetric imagery using Fourier transform techniques,” Proc. SPIE 4819, 107–117 (2002). [CrossRef]  

23. H. S. Stone, M. T. Orchard, E.-C. Chang, and S. A. Martucci, “A fast direct Fourier-based algorithm for subpixel registration of images,” IEEE Trans. Geosci. Remote Sensing 39(10), 2235–2243 (2001). [CrossRef]  

24. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE Trans. on Image Process. 13(4), 600–612 (2004). [CrossRef]  

25. P. Han, F. Liu, K. Yang, J. Ma, J. Li, and X. Shao, “Active underwater descattering and image recovery,” Appl. Opt. 56(23), 6631–6638 (2017). [CrossRef]  

26. P. T. Alexis and M. M. Jonathan, “Measurement of the modulation transfer function of infrared cameras,” Opt. Eng. 34(6), 1808–1817 (1995). [CrossRef]  

27. J. Canny, “A Computional Approach to Edge Detection,” IEEE Trans. Pattern Anal. Mach. Intell. PAMI-8(6), 679–698 (1986). [CrossRef]  

References

  1. Y. Y. Schechner, S. G. Narasimhan, and S. K. Nayar, “Polarization-based vision through haze,” Appl. Opt. 42(3), 511–525 (2003). [CrossRef]
  2. R. L. Spellicy, “Optical Environmental Monitoring in Industry,” Opt. Photonics News 6(1), 22–26 (1995). [CrossRef]
  3. A. Akula, R. Ghosh, S. Kumar, and H. K. Sardana, “Moving target detection in thermal infrared imagery using spatiotemporal information,” J. Opt. Soc. Am. A 30(8), 1492–1501 (2013). [CrossRef]
  4. K. He, J. Sun, and X. Tang, “Single image haze removal using dark channel prior,” IEEE Trans. Pattern Anal. Mach. Intell. 33(12), 2341–2353 (2011). [CrossRef]
  5. G. Orsini and G. Ramponi, “A modified retinex for image contrast enhancement and dynamics control,” in Proceedings of the IEEE International Conference on Image Processing (IEEE, 2003), pp. 393–396.
  6. M. J. Seow and V. K. Asari, “Ratio rule and homomorphic filter for enhancement of digital colour image,” Neurocomputing 69(7-9), 954–958 (2006). [CrossRef]
  7. J. Liang, L. Ren, H. Ju, W. Zhang, and E. Qu, “Polarimetric dehazing method for dense haze removal based on distribution analysis of angle of polarization,” Opt. Express 23(20), 26146–26157 (2015). [CrossRef]
  8. F. Liu, L. Cao, X. Shao, P. Han, and X. Bin, “Polarimetric dehazing utilizing spatial frequency segregation of image,” Appl. Opt. 54(27), 8116–8122 (2015). [CrossRef]
  9. X. Li, H. Hu, L. Zhao, H. Wang, Y. Yu, L. Wu, and T. Liu, “Polarimetric image recovery method combining histogram stretching for underwater imaging,” Sci. Rep. 8(1), 12430 (2018). [CrossRef]
  10. X. Xiao, B. Javidi, G. Saavedra, M. Eismann, and M. Martinez-Corral, “Three-dimensional polarimetric computational integral imaging,” Opt. Express 20(14), 15481–15488 (2012). [CrossRef]
  11. F. Liu, P. Han, Y. Wei, K. Yang, S. Huang, X. Li, G. Zhang, L. Bai, and X. Shao, “Deeply seeing through highly turbid water by active polarization imaging,” Opt. Lett. 43(20), 4903–4906 (2018). [CrossRef]
  12. M. I. Mishchenko, Y. S. Yatskiv, V. K. Rosenbush, and G. Videen, Polarimetric Detection, Characterization, and Remote Sensing (Springer, 2011).
  13. J. Mudge and M. Virgen, “Real time polarimetric dehazing,” Appl. Opt. 52(9), 1932–1938 (2013). [CrossRef]
  14. L. Cao, X. Shao, F. Liu, and L. Wang, “Dehazing method through polarimetric imaging and multi-scale analysis,” Proc. SPIE 9501, 950111 (2015). [CrossRef]
  15. Y. Y. Schechner and N. Karpel, “Recovery of underwater visibility and structure by polarization analysis,” IEEE J. Oceanic Eng. 30(3), 570–587 (2005). [CrossRef]
  16. J. Liang, L. Ren, E. Qu, B. Hu, and Y. Wang, “Method for enhancing visibility of hazy images based on polarimetric imaging,” Photonics Res. 2(1), 38–44 (2014). [CrossRef]
  17. W. Zhang, J. Liang, L. Ren, H. Ju, E. Qu, Z. Bai, Y. Tang, and Z. Wu, “Real-time image haze removal using an aperture-division polarimetric camera,” Appl. Opt. 56(4), 942–947 (2017). [CrossRef]
  18. F. Chen, A. Henry, M. S. Brian, and R. A. Gonzalo, “Compressive spectral polarization imaging by a pixelized polarizer and colored patterned detector,” J. Opt. Soc. Am. A 32(11), 2178–2188 (2015). [CrossRef]
  19. S. G. Narasimhan and S. K. Nayar, “Shedding light on the weather,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2003), pp. I-665–I-672.
  20. S. G. Narasimhan and S. K. Nayar, “Vision and the atmosphere,” Int. J. Comput. Vis. 48(3), 233–254 (2002). [CrossRef]
  21. M. Sulami, I. Glatzer, R. Fattal, and M. Werman, “Automatic recovery of the atmospheric light in hazy images,” in Proceedings of the IEEE International Conference on Computational Photography (IEEE, 2014), pp. 1–11.
  22. C. M. Persons, D. B. Chenault, M. W. Jones, and C. A. Farlow, “Automated registration of polarimetric imagery using Fourier transform techniques,” Proc. SPIE 4819, 107–117 (2002). [CrossRef]
  23. H. S. Stone, M. T. Orchard, E.-C. Chang, and S. A. Martucci, “A fast direct Fourier-based algorithm for subpixel registration of images,” IEEE Trans. Geosci. Remote Sens. 39(10), 2235–2243 (2001). [CrossRef]
  24. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE Trans. Image Process. 13(4), 600–612 (2004). [CrossRef]
  25. P. Han, F. Liu, K. Yang, J. Ma, J. Li, and X. Shao, “Active underwater descattering and image recovery,” Appl. Opt. 56(23), 6631–6638 (2017). [CrossRef]
  26. P. T. Alexis and M. M. Jonathan, “Measurement of the modulation transfer function of infrared cameras,” Opt. Eng. 34(6), 1808–1817 (1995). [CrossRef]
  27. J. Canny, “A Computational Approach to Edge Detection,” IEEE Trans. Pattern Anal. Mach. Intell. PAMI-8(6), 679–698 (1986). [CrossRef]


Figures (9)

Fig. 1. Scattering imaging model under hazy weather.
Fig. 2. Variation of light intensity with scene distance.
Fig. 3. (a) and (c) are a set of mutually orthogonal polarization azimuth images. (b) and (d) are a set of polarization azimuth images that differ by 5 degrees from (a) and (c), respectively. (e) Difference in polarization between the two sets of images.
Fig. 4. (a)–(d) Multi-aperture system images of I(0°), I(45°), I(90°), and I(135°), respectively. (e) The intensity image. (f) The DoLP result obtained with the Stokes method. (g) Gray-value statistics of more than 140 pixels along column 264 of (a) and (e). (h) The coaxial and multi-aperture polarimetric camera.
Fig. 5. Pixel statistics before and after registration. (a) Result before registration. (b) Result after registration. (c) Pixel statistics of (a). (d) Pixel statistics of (b).
Fig. 6. Dehazed images of Figs. 4(a)–4(d). (a) The dehazed image before registration. (b) The dehazed image after registration.
Fig. 7. Building-boundary sharpness evaluation results. (a) and (d) are the statistical results of applying the ESF to the horizontal and vertical boundaries of Fig. 6(a), respectively. (b) and (e) are the corresponding results for Fig. 6(b). (c) and (f) compare the LSF evaluation results of Figs. 6(a) and 6(b).
Fig. 8. Haze images, dehazed images, and color statistics. (a) and (e) are the haze images. (b), (d), (f), and (h) are the RGB color statistics of (a), (c), (e), and (g), respectively. (c) and (g) are the dehazed images after registration.
Fig. 9. Canny edge detection results. (a) and (c) are high-frequency information extracted from the haze images in Figs. 8(a) and 8(e), respectively. (b) and (d) are high-frequency information extracted from the dehazed images in Figs. 8(c) and 8(g), respectively.

Tables (1)

Table 1. Mean Structural Similarity Value

Equations (13)

$$I = I_c e^{-\beta d} + I_\infty \left(1 - e^{-\beta d}\right), \tag{1}$$
$$\mathrm{DoLP} = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}} = \frac{\left(S_{1E}^2 + S_{2E}^2\right)^{1/2}}{S_{0E}}, \tag{2}$$
$$I_\infty \left(1 - e^{-\beta d}\right) = I_t \frac{\mathrm{DoLP}}{\mathrm{DoLP}_A}, \tag{3}$$
$$I_c = I_t \left(1 - \frac{\mathrm{DoLP}}{\mathrm{DoLP}_A}\right) \Big/ \left(1 - \frac{I_t\,\mathrm{DoLP}}{I_\infty\,\mathrm{DoLP}_A}\right). \tag{4}$$
$$I_\infty \left(1 - e^{-\beta d}\right) = S_0 \frac{\mathrm{DoLP}}{\mathrm{DoLP}_A}, \tag{5}$$
$$I_N = I_t \left(1 - \frac{\sqrt{S_1^2 + S_2^2}\,/\,S_0}{\sqrt{S_{1E}^2 + S_{2E}^2}\,/\,S_{0E}}\right) \times \left(1 - \frac{\sqrt{S_1^2 + S_2^2}}{I_\infty \sqrt{S_{1E}^2 + S_{2E}^2}\,/\,S_{0E}}\right)^{-1}, \tag{6}$$
$$I_N = S_0 \left(1 - \frac{\gamma_{90} \times \sqrt{(\varepsilon_{45})^2 + (\varepsilon_{-45})^2 + 2I_0^2 + I_{135}^2 + 2\left(w_{45,135} - w_{0,45} - w_{0,135} - w_{45,90}\right)}}{S_0 \times \sqrt{(\gamma_{45})^2 + (\gamma_{-45})^2 + 2I_0^2 + I_{135}^2 + 2\left(\bar{w}_{45,135} - \bar{w}_{0,45} - \bar{w}_{0,135} - \bar{w}_{45,90}\right)}}\right) \times \left(1 - \frac{\gamma_{90} \times \sqrt{(\varepsilon_{45})^2 + (\varepsilon_{-45})^2 + 2I_0^2 + I_{135}^2 + 2\left(w_{45,135} - w_{0,45} - w_{0,135} - w_{45,90}\right)}}{I_\infty \times \sqrt{(\gamma_{45})^2 + (\gamma_{-45})^2 + 2I_0^2 + I_{135}^2 + 2\left(\bar{w}_{45,135} - \bar{w}_{0,45} - \bar{w}_{0,135} - \bar{w}_{45,90}\right)}}\right)^{-1}, \tag{7}$$
$$\begin{cases} \varepsilon_{45} = I(45^\circ) - I(90^\circ) \\ \varepsilon_{-45} = I(0^\circ) - 2I(45^\circ) \\ \gamma_{90} = \bar{I}(0^\circ) + \bar{I}(90^\circ) \\ \gamma_{45} = \bar{I}(45^\circ) - \bar{I}(90^\circ) \\ \gamma_{-45} = \bar{I}(0^\circ) - 2\bar{I}(45^\circ) \\ w_{\alpha\beta} = I(\alpha^\circ)\, I(\beta^\circ) \\ \bar{w}_{\alpha\beta} = \bar{I}(\alpha^\circ)\, \bar{I}(\beta^\circ) \end{cases} \tag{8}$$
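The Stokes-based dehazing chain above (Stokes parameters, DoLP, airlight estimate, radiance recovery) can be sketched in a few lines of NumPy. This is an illustrative sketch rather than the authors' implementation: the image names i0…i135, the sky_mask used to estimate the airlight parameters, and the eps guard are all assumptions.

```python
import numpy as np

def polarimetric_dehaze(i0, i45, i90, i135, sky_mask, eps=1e-6):
    """Recover scene radiance from four polarization-azimuth images.

    i0..i135: co-registered images behind 0/45/90/135-degree polarizers;
    sky_mask: boolean mask over a haze-only (sky) region, used to
    estimate the airlight DoLP and the intensity at infinity.
    """
    # Stokes parameters from the four sub-aperture images
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135

    # Degree of linear polarization of the total received light
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)

    # Airlight DoLP and intensity at infinity, estimated over the sky
    dolp_a = dolp[sky_mask].mean()
    i_inf = s0[sky_mask].mean()

    # Airlight map and transmission, then the recovered radiance
    airlight = s0 * dolp / dolp_a
    transmission = np.clip(1.0 - airlight / i_inf, eps, 1.0)
    return (s0 - airlight) / transmission
```

With fully analyzed, partially polarized airlight and unpolarized direct transmission this recovers the haze-free radiance exactly; in practice dolp_a and i_inf are biased by any residual object signal inside sky_mask.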
$$\begin{cases} f_i(x, y) = f_j(x - x_{i0},\, y - y_{i0}) \\ F_j(u, v) = F_i(u, v)\, e^{\,i 2\pi \left(u x_{i0}/M + v y_{i0}/N\right)} \end{cases} \quad \left(i = 45^\circ,\ j \in \{0^\circ, 90^\circ, 135^\circ\}\right), \tag{9}$$
$$c_j(x, y) = \mathcal{F}^{-1}\!\left[\frac{F_i(u, v)\, F_j^{*}(u, v)}{\left|F_i(u, v)\right| \left|F_j(u, v)\right|}\right] = \frac{1}{MN} \cdot \frac{\sin\left[\pi (x - x_{i0})\right]}{\sin\left[\frac{\pi}{M} (x - x_{i0})\right]} \cdot \frac{\sin\left[\pi (y - y_{i0})\right]}{\sin\left[\frac{\pi}{N} (y - y_{i0})\right]}, \tag{10}$$
$$c_j(x, y) \approx \frac{\sin\left[\pi (x - x_{i0})\right]}{\pi (x - x_{i0})} \cdot \frac{\sin\left[\pi (y - y_{i0})\right]}{\pi (y - y_{i0})} = \operatorname{sinc}(x - x_{i0})\, \operatorname{sinc}(y - y_{i0}). \tag{11}$$
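The phase-only correlation relation above lends itself to a compact NumPy sketch. The function name and the integer-pixel peak search below are illustrative; sub-pixel refinement of the sinc-shaped peak is omitted.

```python
import numpy as np

def phase_correlation_shift(ref, img):
    """Estimate the translation between two sub-aperture images via
    phase-only correlation. Returns (dy, dx) such that
    np.roll(img, (dy, dx), axis=(0, 1)) aligns img with ref."""
    F_ref = np.fft.fft2(ref)
    F_img = np.fft.fft2(img)

    # Normalized cross-power spectrum: keep only the phase difference
    cross = F_ref * np.conj(F_img)
    cross /= np.abs(cross) + 1e-12

    # The inverse transform concentrates energy in a sinc-shaped peak
    corr = np.real(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)

    # Map peaks in the upper half of the range to negative shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx
```

For circularly shifted inputs the peak location is exact; for real sub-aperture images, windowing and interpolation around the peak improve robustness.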
$$F(x) = \frac{a}{1 + \exp\left[-(x - b)/c\right]} + D, \tag{12}$$
$$\mathrm{LSF}(x) = \frac{dF(x)}{dx} = \frac{a \exp\left[-(x - b)/c\right]}{c \left\{\exp\left[-(x - b)/c\right] + 1\right\}^2}, \tag{13}$$
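The logistic edge-spread model and its analytic derivative can be written down directly; the parameter names a, b, c, D follow the equations, and any least-squares routine (e.g. scipy.optimize.curve_fit) could fit them to a measured edge profile. A minimal NumPy sketch:

```python
import numpy as np

def esf(x, a, b, c, d):
    """Logistic edge-spread-function model: amplitude a, edge position b,
    edge width c, baseline offset d."""
    return a / (1.0 + np.exp(-(x - b) / c)) + d

def lsf(x, a, b, c):
    """Line-spread function: the analytic derivative of esf with respect
    to x (the offset d drops out). Peak value a/(4c) at x = b."""
    e = np.exp(-(x - b) / c)
    return a * e / (c * (e + 1.0) ** 2)
```

A narrower LSF (smaller c) corresponds to a sharper building boundary, which is how the dehazed results before and after registration are compared in Fig. 7.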
