Abstract

The analysis of polarized filtered images has been proven useful in image dehazing. However, the current polarization-based dehazing algorithms are based on the assumption that the polarization is only associated with the airlight. This assumption does not hold up well in practice since both object radiance and airlight contribute to the polarization. In this study, a new polarization hazy imaging model is presented, which considers the joint polarization effects of the airlight and the object radiance in the imaging process. In addition, an effective method to synthesize the optimal polarized-difference (PD) image is introduced. Then, a decorrelation-based scheme is proposed to estimate the degree of polarization for the object from the polarized image input. After that, the haze-free image can be recovered based on the new polarization hazy imaging model. The qualitative and quantitative experimental results verify the effectiveness of this new dehazing scheme. As a by-product, this scheme also provides additional polarization properties of the objects in the image, which can be used in extended applications, such as scene segmentation and object recognition.

© 2014 Optical Society of America

1. Introduction

The image quality of outdoor scenes can be severely degraded by atmospheric aerosols, which scatter and absorb the target signal out of the optical path and scatter unwanted light into the optical path from the surroundings. To eliminate such negative effects, several dehazing methods have been developed in recent years. The existing dehazing methods can be divided into two categories: single image dehazing [1–5] and polarization-based dehazing [6–10]. Because of their ill-posed nature, single image dehazing schemes rely on various assumptions to eliminate ambiguity, including the dark channel prior [1], local consistency [2], and neighboring smoothness with maximized image contrast [3]. Polarization-based dehazing methods use as few as two polarized images taken through a polarizer at different orientations. The representative schemes in polarization-based dehazing were proposed by Schechner et al. [6–8, 10], who assumed that only airlight is polarized (the OAP assumption), as shown in Fig. 1(b). In particular, Namer and Schechner [8] pointed out that this assumption works well in most cases, with the exception of occasional specular dielectric objects in a scene. Furthermore, they corrected the airlight map by detecting incorrect areas and re-estimating the polarization property of the airlight. This correction sharply increases the computational cost, while only obvious mistakes, such as areas of water bodies and shiny construction materials, can be corrected.

 

Fig. 1 The components of a hazy image. (a) components of intensity. (b) Schechner et al.’s view of polarization’s components. (c) the proposed view of polarization’s components.


In fact, the polarizations of the airlight and the object radiance both commonly contribute to the polarization of images, as shown in Fig. 1(c). Schemes based on the overly strong OAP assumption do not work well in certain situations, especially when the polarization of the object radiance is much greater than that of the airlight. Moreover, related studies have shown that polarization-based methods can also provide polarization information about the scene, which is beneficial to extended image processing applications such as object recognition [11–14], scene segmentation [15–18] and material classification [12, 19, 20].

Consequently, we propose a new dehazing method that jointly considers the polarization effects of both airlight and object radiance. The remainder of this paper is organized as follows. First, the related optical models are introduced in Section 2, including both the polarization imaging model and the polarization hazy imaging model. Then, we show that the polarization of object radiance cannot be ignored, based on the observation data presented in Section 3. Next, in Section 4, the new dehazing algorithm is described in detail. Subsequently, the dehazing results and their comparison with existing schemes are shown in Section 5. Finally, the conclusion and discussion are presented in Section 6.

2. Optical models

2.1 Stokes vector and polarization imaging

A common representation of a polarization state is the Stokes vector representation [21]. In this representation, a 4 × 1 column vector is assembled over a scene with x-y spatial coordinates as

$$S(x,y)=\begin{pmatrix}S_0(x,y)\\S_1(x,y)\\S_2(x,y)\\S_3(x,y)\end{pmatrix}=\begin{pmatrix}I(x,y,0^\circ)+I(x,y,90^\circ)\\I(x,y,0^\circ)-I(x,y,90^\circ)\\I(x,y,45^\circ)-I(x,y,-45^\circ)\\I_L(x,y)-I_R(x,y)\end{pmatrix}. \tag{1}$$

where S0 represents the total intensity of the remitted and collected light; S1 represents the difference in intensities between the horizontal and vertical linearly polarized components; S2 represents the difference in intensities between linearly polarized components traveling at 45° and −45° with respect to the x-axis; and S3 represents the difference in intensities between right and left circularly polarized light. In this study, we have chosen to ignore S3 because circular polarization is relatively rare in airlight, and it has not appeared as a major component in natural scenes.

The Stokes vector for a partially polarized beam [21] can be considered as a superposition of a completely polarized Stokes vector and a non-polarized Stokes vector. The polarized portion of the beam represents a net polarization ellipse traced by the electric field vector as a function of time. From the Stokes vector, the degree of linear polarization (DoLP) ρλ and the orientation angle of polarization (AOP) αλ ellipse are given by:

$$\rho_\lambda=\frac{\sqrt{(S_1^\lambda)^2+(S_2^\lambda)^2}}{S_0^\lambda}. \tag{2}$$
$$\alpha_\lambda=\frac{1}{2}\arctan\!\left(\frac{S_2^\lambda}{S_1^\lambda}\right). \tag{3}$$
Here, λ ∈ {λR, λG, λB} represents the regular RGB wavebands of the imaging system.

Suppose a perfect linear polarizer is placed in front of a camera. The observed image intensity Iλ(x,y,θ) at pixel (x,y) is a function of the angle θ between the transmission axis of the polarization analyzer and a reference direction adopted in prior studies [22, 23], and can be described as:

$$I_\lambda(x,y,\theta)=\frac{1}{2}S_0^\lambda(x,y)+\frac{1}{2}S_1^\lambda(x,y)\cos 2\theta+\frac{1}{2}S_2^\lambda(x,y)\sin 2\theta. \tag{4}$$
where S0λ, S1λ and S2λ are the first three Stokes parameters. These parameters can be obtained by acquiring three input polarized images with the linear polarizer set at different orientations.
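Under Eq. (4), three measurements are enough to solve linearly for S0, S1 and S2; the DoLP and AoP of Eqs. (2) and (3) then follow. A minimal NumPy sketch of this recovery (the function names are ours, and the angles 0°, 45°, 90° are one convenient choice of orientations, not the only one):

```python
import numpy as np

def stokes_from_three(i0, i45, i90):
    """Recover S0, S1, S2 from intensities behind a linear polarizer at
    0, 45 and 90 degrees, using I(theta) = (S0 + S1 cos2t + S2 sin2t)/2."""
    s0 = i0 + i90            # the S1 terms cancel (cos0 = 1, cos180 = -1)
    s1 = i0 - i90
    s2 = 2.0 * i45 - s0      # I(45) = (S0 + S2)/2
    return s0, s1, s2

def dolp_aop(s0, s1, s2, eps=1e-12):
    """Degree of linear polarization (Eq. (2)) and angle of polarization
    (Eq. (3)); eps avoids division by zero in dark pixels."""
    dolp = np.sqrt(s1**2 + s2**2) / (s0 + eps)
    aop = 0.5 * np.arctan2(s2, s1)
    return dolp, aop
```

The same two routines apply per pixel and per RGB channel.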

2.2 Polarization hazy imaging model

The following equation approximately represents the optical energy transferred through a homogeneous atmospheric medium by absorption and scattering processes [24, 25]. The total intensity of an image is proportional to its optical energy; therefore, the image intensity equals the first element of the Stokes vector.

$$S_0(x,y)=S_0^D(x,y)+S_0^A(x,y). \tag{5}$$
S0(x,y) is the intensity of the image captured by the camera at position (x, y). S0D(x,y) is called the direct transmission, which describes how the scene radiance is attenuated due to the atmosphere. S0A(x,y) is called airlight or path radiance. It originates from the environmental illumination, a portion of which is scattered into the line-of-sight by atmospheric particles. The expressions of S0D(x,y) and S0A(x,y) are:
$$S_0^D(x,y)=J(x,y)\,t(x,y). \tag{6}$$
$$S_0^A(x,y)=A\,\bigl(1-t(x,y)\bigr). \tag{7}$$
Here, J is the scene radiant intensity, A is the airlight radiant intensity corresponding to an object at an infinite distance (e.g., the horizon), and t is the transmission map, which can be expressed as:
$$t(x,y)=\exp\bigl(-\beta d(x,y)\bigr). \tag{8}$$
Here, β is the extinction coefficient due to scattering and absorption. Equation (8) shows that the medium transmittance depends mainly on the distance d between the object and the observer. Thus, the transmission map t can also be regarded as a scaled range map.

As reported in [6, 8, 23], Eq. (2) can be equivalently expressed as Eq. (9)

$$\rho(x,y)=\frac{I_{\max}(x,y,\theta_{\max})-I_{\min}(x,y,\theta_{\min})}{I_{\max}(x,y,\theta_{\max})+I_{\min}(x,y,\theta_{\min})}=\frac{\Delta I(x,y)}{S_0(x,y)}. \tag{9}$$

where Imax(x,y,θmax) denotes the maximum intensity at position (x, y) obtained when rotating the polarizer, with θmax the corresponding angle of the polarization analyzer; Imin(x,y,θmin) is the minimum intensity, obtained at analyzer angle θmin. We define ΔI as the polarized-difference (PD) image and S0 as the polarized-sum (PS) image [13, 23, 26]. Equation (9) is more suitable for wideband signals.

Similarly, the DoLP of the direct transmission S0D(x,y) and the airlight S0A(x,y) can be defined as:

$$\rho_D(x,y)=\frac{I^D_{\max}(x,y,\theta_{\max})-I^D_{\min}(x,y,\theta_{\min})}{I^D_{\max}(x,y,\theta_{\max})+I^D_{\min}(x,y,\theta_{\min})}=\frac{\Delta D(x,y)}{S_0^D(x,y)}. \tag{10}$$
$$\rho_A(x,y)=\frac{I^A_{\max}(x,y,\theta_{\max})-I^A_{\min}(x,y,\theta_{\min})}{I^A_{\max}(x,y,\theta_{\max})+I^A_{\min}(x,y,\theta_{\min})}=\frac{\Delta A(x,y)}{S_0^A(x,y)}. \tag{11}$$
Then, the polarization hazy image formation model can be expressed as:
$$\rho(x,y)\,S_0(x,y)=\rho_D(x,y)\,S_0^D(x,y)+\rho_A(x,y)\,S_0^A(x,y). \tag{12}$$
Note that A, J, S0, S0A, S0D, β, ρ, ρD and ρA are functions of the light wavelength λ. Since RGB channels were available in the camera used in this study, the analysis for each channel can be performed independently.

3. The influence of the polarization of object radiance

To prove that the polarization of object radiance cannot be ignored in most cases, we selected 40,000 polarized images from two years of observation data for a fixed scene. The observed scene includes sky, a mountain, buildings and trees, as shown in Fig. 2(a). Directly in front of the imaging device, several features were positioned at the following linear distances from the camera: a lawn (0–130 m), a pond (130–280 m), a reservoir (280–1000 m), buildings and trees (1000–6000 m), and a mountain (beyond 6000 m). The underlying topography of the light path can be categorized as a complex surface type, as shown in Fig. 2(b). Owing to the high humidity of the region, haze is quite common throughout the year, so hazy images are easy to capture.

 

Fig. 2 The description of the observed scene. (a) an observed scene. (b) the underlying topography of the light path. (c) the scene segmentation results.


Each collected image was then segmented into five regions, as shown in Fig. 2(c). The mean DoLP of each region was computed for each image group. The mean DoLP of the sky region is taken as the DoLP of the airlight ρA, and those of the other regions are taken as the common DoLP ρ of both the airlight and the object radiance. The statistical results are shown in Fig. 3, where each point represents the DoLP of the corresponding region and the points on the same dotted line belong to the same image group.

 

Fig. 3 The statistical results for mean of DoLP of each region. Each dot represents the DoLP of the corresponding region and each line corresponds to a polarized image group.


Schechner et al. assumed that light emanating from scene objects is not polarized, so that its energy is evenly distributed between the polarization components [6]. This means ΔD = 0 and ρD = 0. According to Eqs. (9) and (11), it then follows that the mean DoLP of the sky region should be greater than or equal to that of the other regions. However, this is not the case in the results shown in Fig. 3, where the mean DoLP of the sky region is often less than, or close to, that of the other regions. We believe this is due to the polarization of the object radiance. Therefore, the assumption that polarization is associated only with the airlight is not always true.

Moreover, we further quantitatively analyzed the relationship between ρA and ρD. From Eqs. (12) and (5), it can be derived that ρD − ρA = (S0/S0D)·(ρ − ρA). If ρ = ρA, then ρD = ρA. If ρ is greater than ρA, then, since S0/S0D > 1, ρD is much larger than ρA; that is, ρD > ρ > ρA. This shows that the polarization of the object radiance in fact contributes greatly to the image polarization and consequently cannot be ignored.
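The identity above can be checked numerically from the imaging model alone. The sketch below uses arbitrary per-pixel values chosen only to exercise Eqs. (5)–(7) and (12); none of the numbers come from the paper's data:

```python
import numpy as np

# Hypothetical values for a single pixel (illustration only).
J, t, A = 0.8, 0.6, 1.0          # scene radiance, transmission, airlight
rho_D, rho_A = 0.30, 0.10        # DoLP of direct transmission and of airlight

S0D = J * t                      # Eq. (6)
S0A = A * (1.0 - t)              # Eq. (7)
S0 = S0D + S0A                   # Eq. (5)
rho = (rho_D * S0D + rho_A * S0A) / S0   # image DoLP from Eq. (12)

# rho_D - rho_A equals (S0/S0D) * (rho - rho_A), so rho_D > rho > rho_A here.
assert np.isclose(rho_D - rho_A, (S0 / S0D) * (rho - rho_A))
assert rho_D > rho > rho_A
```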

In fact, neither ρD nor ρA should be ignored in the analysis of polarization images, for two main reasons. First, the polarization of airlight varies with meteorological conditions. Second, the polarization of object radiance depends on the incident angle and the properties of the material surface. Because the polarizations of airlight and object radiance are dynamic and change considerably with the specific situation, neither can be assumed to dominate, and therefore neither can be ignored.

According to the above observation and analysis, a well-founded conclusion can be drawn that both the polarization of airlight and that of object radiance typically contribute to the polarization of images.

4. Improved dehazing algorithm

4.1 Image restoration model

Since the polarization of the object radiance cannot be ignored, we will use the new polarization hazy model presented in Section 2, which considers both the polarization effects of the airlight and the object radiance, to remove the undesired haze.

Combining Eqs. (5), (7), (9), and (12), we get the expression of the transmission map t as follows:

$$t(x,y)=1-\frac{\Delta I(x,y)-\rho_D(x,y)\,S_0(x,y)}{A\,\bigl(\rho_A(x,y)-\rho_D(x,y)\bigr)}. \tag{13}$$
Combining Eqs. (5)–(7) and (13), we can get the expression of the intensity of the scene radiance J (i.e., the dehazed image) as follows:
$$J(x,y)=\frac{\Delta I(x,y)-\rho_A(x,y)\,S_0(x,y)}{\rho_D(x,y)\,\bigl(1-S_0(x,y)/A\bigr)+\Delta I(x,y)/A-\rho_A(x,y)}. \tag{14}$$
To get the dehazed image using Eq. (14), we need to estimate the following parameters: the PD image ΔI, the infinity atmospheric intensity A, the DoLP of the airlight ρA, and the DoLP of the object ρD. The PD image ΔI can be obtained from the polarized images Imax and Imin; the method to synthesize it will be described in Section 4.2. The infinity atmospheric intensity A and the DoLP of the airlight ρA are global parameters, and their estimation will be introduced in Section 4.3. As for the DoLP of the direct transmission, i.e., the DoLP of the object ρD, a decorrelation-based method will be proposed to estimate it, explained in detail in Sections 4.4 and 4.5.
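Once these parameters are in hand, the recovery itself is a pair of pointwise formulas. A minimal sketch of Eqs. (13) and (14) (the function name and the clipping guard `t_min` are our additions; after clipping we compute J as (S0 − A(1−t))/t, which is algebraically equivalent to Eq. (14) when t is not clipped):

```python
import numpy as np

def recover(delta_I, s0, A, rho_A, rho_D, t_min=0.05):
    """Transmission from Eq. (13), then the haze-free radiance.
    All inputs are same-shape arrays or scalars."""
    t = 1.0 - (delta_I - rho_D * s0) / (A * (rho_A - rho_D))
    t = np.clip(t, t_min, 1.0)          # guard against noisy estimates
    J = (s0 - A * (1.0 - t)) / t        # equivalent to Eq. (14) if unclipped
    return t, J
```

Feeding in values simulated forward through Eqs. (5)–(7) and (12) returns the original t and J exactly.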

4.2 Synthesis of PD image ΔI

Tyo et al. [13, 26, 27] proposed a measuring method for a PD image that automatically selects the optimal orthogonal directions of the polarizer. However, since the polarization properties of the object radiance vary across the scene, the best PD image cannot be captured by relying on a single pair of optimal orthogonal directions. Here, we propose a synthesis method to obtain the optimal PD image.

According to Eqs. (2) and (3), Eq. (4) can be equivalently described as [22]

$$I_\lambda(x,y,\theta)=\frac{1}{2}\bigl(1-\rho_\lambda(x,y)\bigr)S_0^\lambda(x,y)+\rho_\lambda(x,y)\,S_0^\lambda(x,y)\cos^2\!\bigl(\alpha_\lambda(x,y)-\theta\bigr). \tag{15}$$
According to Eq. (15), the observed image intensity Iλ(x,y,θ) is maximal at θ = αλ and minimal at θ = αλ ± π/2. Since the polarization properties of the observed objects differ (i.e., α differs among objects in a scene), we cannot directly capture the images Imax and Imin by rotating the polarizer. However, ρ, α and S0 can be solved from three input polarized images taken with the linear polarizer set at different orientations, and the images Imax and Imin can then be synthesized as:
$$I_{\max}(x,y,\theta_{\max})=\frac{1}{2}\bigl(1+\rho(x,y)\bigr)S_0(x,y),\qquad I_{\min}(x,y,\theta_{\min})=\frac{1}{2}\bigl(1-\rho(x,y)\bigr)S_0(x,y). \tag{16}$$
Clearly, the PD image can be obtained by subtracting Imin from Imax.
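The synthesis of Eq. (16) reduces to a few array operations on the Stokes maps; note that the resulting PD image is simply ρ·S0. A sketch (function name ours; `eps` guards dark pixels):

```python
import numpy as np

def synthesize_pd(s0, s1, s2, eps=1e-12):
    """Synthesize I_max, I_min and the PD image from the Stokes
    parameters (Eq. (16)), instead of rotating the polarizer."""
    rho = np.sqrt(s1**2 + s2**2) / (s0 + eps)   # DoLP, Eq. (2)
    i_max = 0.5 * (1.0 + rho) * s0
    i_min = 0.5 * (1.0 - rho) * s0
    return i_max, i_min, i_max - i_min          # PD image = rho * s0
```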

4.3 Estimation of ρA and A

In this study, the atmosphere is assumed to be homogeneous; therefore, ρA and A are constant over all pixels in the image. As in prior studies [6–8], the pixel values of the sky region in the image can be used to estimate these two parameters, since they correspond to the airlight radiance from an object at an infinite distance. The two parameters are estimated as:

$$A=\frac{1}{|\Omega|}\sum_{(x,y)\in\Omega}\bigl(I_{\max}(x,y,\theta_{\max})+I_{\min}(x,y,\theta_{\min})\bigr),\qquad \rho_A=\frac{1}{|\Omega|}\sum_{(x,y)\in\Omega}\frac{I_{\max}(x,y,\theta_{\max})-I_{\min}(x,y,\theta_{\min})}{I_{\max}(x,y,\theta_{\max})+I_{\min}(x,y,\theta_{\min})}. \tag{17}$$
Here, Ω denotes the selected sky region and |Ω| represents the number of pixels in Ω. We detect the sky region automatically according to the prior method [28].
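Given a boolean sky mask, Eq. (17) is two masked means. A sketch (function name ours; the mask here stands in for the output of the sky detector [28]):

```python
import numpy as np

def estimate_airlight(i_max, i_min, sky_mask):
    """Global airlight parameters A and rho_A from the sky region,
    Eq. (17); the sky stands in for an object at infinite distance."""
    mx = i_max[sky_mask]
    mn = i_min[sky_mask]
    A = np.mean(mx + mn)
    rho_A = np.mean((mx - mn) / (mx + mn))
    return A, rho_A
```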

4.4 Estimation of ρD and t

The transmission map t depends on the scene depth and the atmospheric attenuation coefficient β, while the dehazed image J represents the inherent traits of the scene objects. Therefore, it is reasonable to assume that they are statistically uncorrelated [4, 23] over a localized set of pixels sharing the same ρD. This can be formulated as Cov_ω(x,y)(t(x,y), J⁻¹(x,y)) = 0 for (x,y) ∈ ω(x,y). However, this does not always hold in practice due to noise. Therefore, we estimate an optimal value for ρD:

$$\rho_D(x,y)=\arg\min_{\rho_D(x,y)}\left|\mathrm{Cov}_{\omega(x,y)}\!\bigl(t(x,y),\,J^{-1}(x,y)\bigr)\right|. \tag{18}$$
where Cov_ω(x,y) denotes the covariance computed over the local neighborhood ω(x,y). If we let E(ρD) = Cov_ω(x,y)(t(x,y), J⁻¹(x,y)) and substitute Eqs. (13) and (14), we obtain:
$$E(\rho_D)=\frac{\mathrm{Cov}\!\left\{\rho_D S_0(x,y)-\Delta I(x,y),\ \dfrac{\rho_D S_0(x,y)-\Delta I(x,y)-A(\rho_D-\rho_A)}{\Delta I(x,y)-\rho_A S_0(x,y)}\right\}}{(\rho_D-\rho_A)\,A^2}. \tag{19}$$
If the pixels in the same patch belong to different objects, ρD varies across ω(x,y). To deal with this problem, we assign a weight to each pixel in ω(x,y).

The optimal solution can be obtained by solving dE²(ρD)/dρD = 0. The method described here is similar to the independent component analysis (ICA) method reported in prior work [4].
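The decorrelation criterion of Eq. (18) can also be illustrated by brute force: sweep candidate values of ρD for one patch and keep the one that makes t and J⁻¹ least correlated. This is only a sketch of the criterion, not the paper's closed-form solution of dE²/dρD = 0; the function name, candidate grid and degeneracy guard are our additions:

```python
import numpy as np

def estimate_rho_d(delta_I, s0, A, rho_A, candidates=None):
    """Pick rho_D for one local patch by minimizing |Cov(t, 1/J)|,
    i.e. a grid-search version of Eq. (18)."""
    if candidates is None:
        candidates = np.linspace(0.0, 0.8, 81)
    best, best_cost = None, np.inf
    for rho_d in candidates:
        if abs(rho_d - rho_A) < 1e-3:
            continue                          # Eq. (13) degenerates at rho_D = rho_A
        t = 1.0 - (delta_I - rho_d * s0) / (A * (rho_A - rho_d))
        J = (s0 - A * (1.0 - t)) / t          # direct transmission over t
        cov = np.mean(t / J) - np.mean(t) * np.mean(1.0 / J)
        if abs(cov) < best_cost:
            best, best_cost = rho_d, abs(cov)
    return best
```

On data simulated with t and J uncorrelated inside the patch, the search lands on the true ρD.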

4.5 Refining ρD(x,y) and t(x,y)

It is inevitable that some artifacts, such as block effects and estimation errors, are introduced into the results of the proposed scheme, as shown in Figs. 6(d) and 6(e). To overcome these problems, we apply image guided filtering [29] to post-process the estimated transmission map t. This filter is an excellent edge-preserving smoothing operator, so it can be used not only to suppress errors by smoothing over neighboring pixels but also to preserve edges and structures. In this section, we use the one-dimensional symbol i to denote the pixel index for simplicity, and rewrite the refined transmission map t̂ and the original transmission map t in vector form. The filtering output at pixel i is expressed as:

$$\hat{t}_i=\sum_j W_{ij}(I)\,t_j. \tag{20}$$
The filter kernel Wij can be expressed as:
$$W_{ij}(I)=\frac{1}{|\omega_k|^2}\sum_{k:\,(i,j)\in\omega_k}\left(1+\frac{(I_i-\mu_k)(I_j-\mu_k)}{\sigma_k^2+\varepsilon}\right). \tag{21}$$
where μk and σk² are the mean and variance of the guidance image I in the window ωk; Ii and Ij are the intensities of I at pixels i and j, respectively; ε is a regularization parameter; and |ωk| is the number of pixels in ωk. More details about this filter can be found in [29].

Moving the filter kernel through all positions, we obtain the refined transmission map t̂. Then, we substitute t̂ into Eq. (13) to obtain the refined DoLP map ρ̂D.
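In practice the kernel of Eq. (21) is never formed explicitly; the box-filter formulation of the guided filter [29] produces the same output in linear time. A sketch for a gray-scale guide (function name and defaults are ours; SciPy's `uniform_filter` supplies the box means):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=20, eps=1e-3):
    """Gray-scale guided filter via local linear models q = a*I + b;
    equivalent to applying the kernel W_ij of Eq. (21)."""
    mean = lambda a: uniform_filter(a, size=2 * radius + 1)
    mI, mp = mean(guide), mean(src)
    var_I = mean(guide * guide) - mI * mI
    cov_Ip = mean(guide * src) - mI * mp
    a = cov_Ip / (var_I + eps)        # local linear coefficients
    b = mp - a * mI
    return mean(a) * guide + mean(b)  # averaged coefficients, then evaluate
```

Here the hazy intensity image would serve as `guide` and the rough transmission map as `src`.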

Note that only the transmission map t of the R channel is refined in our experiments, in order to reduce the computational cost. According to Eq. (8), the refined t̂g and t̂b can then be obtained by a simple calculation: t̂g = exp((βg/βr)·ln t̂r) and t̂b = exp((βb/βr)·ln t̂r), where
$$\frac{\beta_g}{\beta_r}=\frac{1}{N}\sum_{(x,y)}\frac{\ln t_g(x,y)}{\ln t_r(x,y)},\qquad \frac{\beta_b}{\beta_r}=\frac{1}{N}\sum_{(x,y)}\frac{\ln t_b(x,y)}{\ln t_r(x,y)},$$
and N is the number of image pixels.
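Since Eq. (8) gives ln t = −βd, the per-pixel ratio ln t_g / ln t_r estimates βg/βr directly. A sketch of this channel transfer (function name and the clipping that keeps ln t finite and nonzero are ours):

```python
import numpy as np

def transfer_transmission(t_r_refined, t_g_rough, t_r_rough, t_min=1e-3):
    """Carry the refined R-channel transmission over to another channel
    using the extinction-coefficient ratio beta_g / beta_r of Eq. (8)."""
    tr = np.clip(t_r_rough, t_min, 1.0 - 1e-6)
    tg = np.clip(t_g_rough, t_min, 1.0 - 1e-6)
    ratio = np.mean(np.log(tg) / np.log(tr))   # estimate of beta_g / beta_r
    return np.exp(ratio * np.log(np.clip(t_r_refined, t_min, 1.0 - 1e-6)))
```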

5. Experimental results

5.1 Overview

Figure 4 shows the flowchart of the proposed scheme, which consists of three steps. The first step synthesizes the polarized images Imax and Imin. In the second step, we estimate the key model parameter ρD with a decorrelation-based method; the other parameters, ρA and t, are also estimated in this step. Finally, in the third step, in order to correct the estimates and eliminate block effects, we apply image guided filtering to the transmission map t. The improved t is then used to refine ρD. After that, the recovered image J is obtained according to Eq. (14).

 

Fig. 4 Flowchart of our proposed method.


5.2 Experimental data

Aiming at developing schemes for both continental and marine aerosols, we selected two typical hazy image sets to test our algorithm. First, we captured several sets of images at different levels of atmospheric visibility in the city of Hefei, a typical inland city in China with air pollution; here, image degradation is mostly due to continental aerosols. Typical examples are shown in Figs. 5(a)–5(d). Then, we captured sets of images on the island of Jinmen in the coastal East China Sea region, as shown in Figs. 5(e) and 5(f). The degradation of these images is mainly due to marine aerosols. These polarized images were captured with an automatically rotating polarizer. The next step is to restore the degraded images using the proposed algorithm.

 

Fig. 5 Experimental images.


5.3 Dehazing experiments

In this section, the performance of the proposed scheme is tested on polarized hazy images. Taking Scene 1 as an example [as shown in Fig. 5(a)], the flow of the process and the interim results are shown in Fig. 6. Since objects have different polarization properties, as shown in Fig. 6(c), the PD map outlines the contours of the objects. Accordingly, the PD map can be of great use for image processing applications, such as recognition and scene segmentation. There are some isolated bright spots distributed in the sky and building regions in Fig. 6(e), which are obviously errors. These errors are corrected in the refined transmission map, as shown in Fig. 6(f). Furthermore, the refined transmission map succeeds in capturing the sharp edge discontinuities. Comparing the results shown in Figs. 6(d) and 6(g), the refined DoLP map of the object is much smoother. Figure 6(i) shows the dehazing result with the rough ρD, which has apparent errors in the regions outlined by the red rectangle and more noise in the sky region than the final dehazing result in Fig. 6(j). The contrast of Fig. 6(j) is 0.2501, whereas the contrast of the hazy image is 0.0970. It can be seen that the proposed scheme effectively improves image contrast and recovers details. Here, the contrast is calculated according to reference [30].

 

Fig. 6 Key parameter estimation and dehazing results for Scene 1. (a) synthesizing image Imax. (b) synthesizing image Imin. (c) polarized-difference (PD) image. (d) estimated rough DoLP map ρD. (e) Estimated rough transmission map t of R-channel. (f) refined transmission map t^ of R-channel. (g) final DoLP map ρ^D. (h) sky region detection. (i) dehazing results with rough ρD.(j) dehazing results with refined ρ^D.


To evaluate the performance of proposed scheme on hazy images caused by marine aerosols, we also conducted experiments on Scenes 5 and 6. As can be seen in Fig. 7, the proposed approach can unveil the details and recover vivid color information even in some very dense haze regions. The experimental results show that the proposed scheme can also work well under the hazy weather of marine aerosols.

 

Fig. 7 Experimental results for Scenes 5 and 6. (a) dehazing result of Scene 5. (b) dehazing result of Scene 6.


5.4 Dehazing comparison with and without considering ρD

Different from the existing polarization-based dehazing algorithms, the proposed algorithm considers the polarization of both airlight and object radiance. As mentioned in prior studies [6–8], if only the polarization of airlight is considered, the real radiance of a scene, J, is expressed as follows:

$$J(x,y)=\frac{\Delta I(x,y)-\rho_A(x,y)\,S_0(x,y)}{\Delta I(x,y)/A-\rho_A(x,y)}. \tag{22}$$
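Eq. (22) is a one-line special case of Eq. (14) with ρD = 0. A sketch, included only for the comparison that follows (function name ours):

```python
import numpy as np

def recover_airlight_only(delta_I, s0, A, rho_A):
    """Recovery under the OAP assumption, Eq. (22): only the airlight
    is taken to be polarized (rho_D = 0)."""
    return (delta_I - rho_A * s0) / (delta_I / A - rho_A)
```

When the object radiance really is unpolarized the formula is exact; when it is not, the mis-attributed polarization distorts J, which is what the experiments below measure.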

Taking Scene 2 as an example, we perform several experiments to compare the proposed scheme with those that only consider the polarization of airlight [6–8]. Comparing Fig. 8(a) (contrast: 0.1170) with Fig. 8(b) (contrast: 0.1932), it can be seen that there is a significant improvement in image contrast, especially in the region outlined by the red rectangle in the dehazing results. The improvement is more apparent in the enlarged regions shown in Figs. 8(c) and 8(d).

 

Fig. 8 Comparison experiment with and without ρD for Scene 2. (a) dehazed image considering only the polarization of airlight. (b) our dehazed image. (c) magnified region of the red rectangle in (a). (d) magnified region of the red rectangle in (b).


To further evaluate the effectiveness of the proposed model, another two sets of experiments were carried out; the results are shown in Fig. 9, with the input images shown in Figs. 5(c) and 5(d). These are polarized images of the same scene at different atmospheric visibilities. For Fig. 5(c), the DoLP of the sky region is similar to that of the object region; for Fig. 5(d), the DoLP of the sky region is less than that of the object region. Figures 9(a) and 9(c) (contrasts: 0.1170 and 0.0344) are the dehazed images obtained without considering the DoLP of the object; they have lower contrast than the results shown in Figs. 9(b) and 9(d) (contrasts: 0.1932 and 0.0572).

 

Fig. 9 Dehazing results for Scenes 3 and 4. (a) and (c) are the dehazed images without considering the DoLP of the object. (b) and (d) are our dehazed images.


We calculated the noise levels of the dehazed images according to reference [31], as shown in Table 1. From these experimental results, we can see that the noise level is higher in the dehazed images obtained without considering ρD. The value of ΔI/A in Eq. (22) is usually small; when ρA is also very small, image noise can easily be amplified in the process of image recovery. For the model characterized by Eq. (14), the denominator is larger than that of Eq. (22) due to the contribution of the term ρD(1 − S0/A). Therefore, the proposed scheme can recover the dehazed image with a reduced noise level.


Table 1. Noise level comparison of dehazed images with and without considering ρD

5.5 Dehazing comparison with the method of Schechner et al.

In the prior work reported in [6–8], the input images are the best- and worst-polarized images, because polarization is assumed to be associated only with the airlight. Therefore, in order to compare our results with those of previous studies [6–8], we used the best- and worst-polarized images to approximate the Imax and Imin images.

Comparing Figs. 10(b) and 10(c), it can be seen that the proposed scheme achieves better performance, especially in the region of specular reflection (outlined by the red circle in the dehazing results). This is because the scheme in [6] assumes that the light reflected from scene objects has no significant polarization, an assumption that fails in regions of specular reflection. To deal with this situation, additional steps have to be applied to process the specular surface areas individually, as in [8]. In contrast, the proposed scheme simultaneously considers the polarization of light from both object radiance and airlight, and consequently achieves better results without additional post-processing. The quantitative comparison between Figs. 10(b) and 10(c) is also shown in Table 2.

 

Fig. 10 Comparison with Schechner’s dehazing results. (a) the two-group input polarized images. Each group has a worst-polarized image and a best-polarized image. (b) Schechner’s results. (c) our dehazed images.



Table 2. Image quality of the dehazed results for the six scenes

5.6 The quantitative evaluation of dehazed images

Due to the lack of ground-truth images, we adopt the no-reference natural image quality evaluator (NIQE) [32] to quantitatively assess and compare the dehazing results. NIQE evaluates overall image quality, including image structure, noise level, blur, edge sharpness, and so on. The NIQE model constructs a collection of quality-aware features and fits them to a multivariate Gaussian (MVG) model. The quality-aware features are derived from a spatial natural scene statistics (NSS) model based on locally normalized luminance coefficients. Mittal et al. [33] showed that the normalized luminance coefficients of natural images closely follow a Gaussian-like distribution, whereas degraded images deviate from this distribution. Thus, the quality of a given test image is expressed as the distance between an MVG fit of the NSS features extracted from the test image and an MVG model learned from a corpus of natural images. Mathematically, this is defined as:

$$D(\nu_1,\nu_2,\Sigma_1,\Sigma_2)=\sqrt{(\nu_1-\nu_2)^T\left(\frac{\Sigma_1+\Sigma_2}{2}\right)^{-1}(\nu_1-\nu_2)}. \tag{23}$$
where ν1, ν2 and Σ1, Σ2 are the mean vectors and covariance matrices of the natural MVG model and the distorted image's MVG model, respectively.
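Given the two fitted models, Eq. (23) is a Mahalanobis-style distance with a pooled covariance. A minimal sketch (function name ours; the MVG fitting itself is not shown):

```python
import numpy as np

def mvg_distance(nu1, nu2, sigma1, sigma2):
    """NIQE-style distance between two multivariate Gaussian models,
    Eq. (23): sqrt(d^T ((S1+S2)/2)^-1 d) with d = nu1 - nu2."""
    d = np.asarray(nu1, float) - np.asarray(nu2, float)
    pooled = (np.asarray(sigma1, float) + np.asarray(sigma2, float)) / 2.0
    return float(np.sqrt(d @ np.linalg.inv(pooled) @ d))
```

A smaller distance means the test image's statistics are closer to those of natural images.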

We ran the NIQE code (http://live.ece.utexas.edu/research/Quality/index.htm) to evaluate the dehazed images. The results are shown in Table 2. Values in Table 2 represent the distance between the model statistics of clear natural images and those of the dehazed image; the smaller the value, the closer the dehazed image is to clear images. The quantitative evaluation shows that all of the dehazed images produced by the proposed scheme are improved over the images processed by the existing schemes that consider only the polarization of airlight.

6. Conclusion

In this paper, a new approach is presented for the recovery of haze-free images from polarized hazy images. The main difference between this scheme and the existing polarization dehazing algorithms is that this scheme jointly considers the polarization of the airlight and the object radiance. Three novel aspects are studied: (1) a new polarization hazy imaging model including ρA and ρD; (2) a decorrelation-based algorithm to obtain ρD from polarized hazy images; and (3) an effective method to synthesize the optimal PD image. After obtaining these key parameters, the haze-free image can be recovered according to the polarization hazy imaging model. The qualitative and quantitative experimental results show that our dehazing scheme can considerably improve image quality in contrast, detail and signal-to-noise ratio.

This scheme can also generate two valuable derivative results: the PD image and the DoLP of the object. Unlike the traditional approach, which measures along two optimal orthogonal directions of the polarizer [26], the PD image here is obtained by synthesizing the polarized images Imax and Imin. Since the polarization properties of the observed scene vary, the optimal PD image cannot actually be measured directly in practice; the proposed scheme, however, can obtain it, and it can be applied to object perception, especially in foggy weather or underwater surroundings. In addition, this scheme can separate the DoLP of an object from the DoLP of the image, yielding a much more stable and accurate DoLP of the object compared with the traditional approach, which uses the DoLP of the image as a substitute. Since the DoLP of an object represents an essential attribute of the object, it will be useful in object recognition and scene segmentation.

Note that the proposed scheme requires the input images to contain some sky area in order to estimate the airlight parameters. However, the sky sometimes cannot be seen within the field of view. Tarel [3] proposed the concept of an atmospheric veil and an algorithm for inferring it. In future work, we will apply an atmospheric veil of polarized images to obtain the polarization of the airlight.

Acknowledgments

The authors thank Prof. Rao Ruizhong and Dr. Wu Pengfei for fruitful discussions. This work was supported by the National Natural Science Foundation of China (No.61175033) and the Fundamental Research Funds for Central Universities (No.2010HGXJ0018, No.2012HGCX0001).

References and links

1. K. M. He, J. Sun, and X. O. Tang, “Single image haze removal using dark channel prior,” IEEE Trans. Pattern Anal. Mach. Intell. 33(12), 2341–2353 (2010).

2. C. H. Yeh, L.-W. Kang, M.-S. Lee, and C.-Y. Lin, “Haze effect removal from image via haze density estimation in optical model,” Opt. Express 21(22), 27127–27141 (2013).

3. J. P. Tarel and N. Hautiere, “Fast visibility restoration from a single color or gray level image,” in Proceedings of IEEE International Conference on Computer Vision and Pattern Recognition (IEEE, 2009), pp. 2201–2208.

4. R. Fattal, “Single image dehazing,” ACM Trans. Graph. 27(3), 988–992 (2008).

5. R. T. Tan, “Visibility in bad weather from a single image,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2008), pp. 1–8.

6. Y. Y. Schechner, S. G. Narasimhan, and S. K. Nayar, “Instant dehazing of images using polarization,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2001), pp. 325–332.

7. Y. Y. Schechner, S. G. Narasimhan, and S. K. Nayar, “Polarization-based vision through haze,” Appl. Opt. 42(3), 511–525 (2003).

8. E. Namer and Y. Y. Schechner, “Advanced visibility improvement based on polarization filtered images,” Proc. SPIE 5888, 36–45 (2005).

9. E. Namer, S. Shwartz, and Y. Y. Schechner, “Skyless polarimetric calibration and visibility enhancement,” Opt. Express 17(2), 472–493 (2009).

10. S. Shwartz, E. Namer, and Y. Y. Schechner, “Blind haze separation,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2006), pp. 1984–1991.

11. M. Saito, Y. Sato, K. Ikeuchi, and H. Kashiwagi, “Measurement of surface orientations of transparent objects using polarization in highlight,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 1999), pp. 381–386.

12. H. Chen and L. B. Wolff, “Polarization phase-based method for material classification and object recognition in computer vision,” Proc. SPIE 2599, 54–63 (1996).

13. J. S. Tyo, M. P. Rowe, E. N. Pugh, Jr., and N. Engheta, “Target detection in optically scattering media by polarization-difference imaging,” Appl. Opt. 35(11), 1855–1870 (1996).

14. K. Yemelyanov, M. Lo, E. Pugh, Jr., and N. Engheta, “Display of polarization information by coherently moving dots,” Opt. Express 11(13), 1577–1584 (2003).

15. L. B. Wolff, “Using polarization to separate reflection components,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 1989), pp. 363–369.

16. M. Ben-Ezra, “Segmentation with invisible keying signal,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2000), pp. 32–37. [CrossRef]  

17. K.M. Yemelyanov, S.S. Lin, E.N. Pugh, Jr., and N. Engheta, “Polarization-based segmentation for enhancement of target detection in adaptive polarization-difference imaging,” in Frontiers in Optics, OSA Technical Digest Series (Optical Society of America, 2005), paper JWA51.

18. S. S. Lin, K. M. Yemelyanov, E. N. Pugh Jr, and N. Engheta, “Separation and contrast enhancement of overlapping cast shadow components using polarization,” Opt. Express 14(16), 7099–7108 (2006). [CrossRef]   [PubMed]  

19. L. B. Wolff, “Polarization-based material classification from specular reflection,” IEEE Trans. Pattern Anal. Mach. Intell. 12(11), 1059–1071 (1990). [CrossRef]  

20. S. Tominaga and A. Kimachi, “Polarization imaging for material classification,” Opt. Eng. 47(12), 123201 (2008). [CrossRef]  

21. M. Bass, Devices, Measurements, and Properties, Vol. 2 of Handbook of Optics (McGraw-Hill, 1995), Chap. 22.

22. M. W. Hyde, S. C. Cain, J. D. Schmidt, and M. J. Havrilla, “Material classification of an unknown object using turbulence-degraded polarimetric imagery,” IEEE Trans. Geosci. Remote Sens., 264–276 (2010). [CrossRef]  

23. K. M. Yemelyanov, S. S. Lin, E. N. Pugh Jr, and N. Engheta, “Adaptive algorithms for 2-channel polarization sensing under various polarization statistics with nonuniform distributions,” Appl. Opt. 45(22), 5504–5520 (2006). [CrossRef]   [PubMed]  

24. R. C. Henry, S. Mahadev, S. Urquijo, and D. Chitwood, “Color perception through atmospheric haze,” J. Opt. Soc. Am. A 17(5), 831–835 (2000). [CrossRef]   [PubMed]  

25. S. G. Narasimhan and S. K. Nayar, “Vision and the atmosphere,” Int. J. Comput. Vis. 48(3), 233–254 (2002). [CrossRef]  

26. J. S. Tyo, “Design of optimal polarimeters: maximization of signal-to-noise ratio and minimization of systematic error,” Appl. Opt. 41, 619–630 (2002). [CrossRef]   [PubMed]  

27. J. S. Tyo, “Optimum linear combination strategy for an N-channel polarization sensitive vision or imaging system,” J. Opt. Soc. Am. A 15(2), 359–366 (1998). [CrossRef]  

28. D. Hoiem, A. A. Efros, and M. Hebert, “Automatic photo pop-up,” ACM Trans. Graph. 24(3), 577–584 (2005). [CrossRef]  

29. K. M. He, J. Sun, and X. Tang, “Guided image filtering,” in Proceedings of European Conference on Computer Vision, K. Daniilidis, P. Maragos, N. Paragios, eds. (Berlin, 2010), pp. 1–14.

30. R. M. Haralick, K. Shanmugam, and I. H. Dinstein, “Textural features for image classification,” IEEE Trans. Syst. Man Cybern. 3(6), 610–621 (1973). [CrossRef]  

31. S. Pyatykh, J. Hesser, and L. Zheng, “Image noise level estimation by principal component analysis,” IEEE Trans. Image Process. 22(2), 687–699 (2013). [CrossRef]   [PubMed]  

32. A. Mittal, R. Soundararajan, and A. C. Bovik, “Making a completely blind image quality analyzer,” IEEE Signal Process. Lett. 20(3), 209–212 (2013).

33. A. Mittal, A. K. Moorthy, and A. C. Bovik, “No-reference image quality assessment in the spatial domain,” IEEE Trans. Image Process. 21(12), 4695–4708 (2012). [CrossRef]   [PubMed]  




Figures (10)

Fig. 1. The components of a hazy image. (a) Components of intensity. (b) Schechner et al.’s view of the polarization components. (c) The proposed view of the polarization components.

Fig. 2. Description of the observed scene. (a) An observed scene. (b) The underlying topography of the light path. (c) The scene segmentation results.

Fig. 3. Statistical results for the mean DoLP of each region. Each dot represents the DoLP of the corresponding region, and each line corresponds to a polarized image group.

Fig. 4. Flowchart of the proposed method.

Fig. 5. Experimental images.

Fig. 6. Key parameter estimation and dehazing results for Scene 1. (a) Synthesized image Imax. (b) Synthesized image Imin. (c) Polarized-difference (PD) image. (d) Estimated rough DoLP map ρ_D. (e) Estimated rough transmission map t of the R channel. (f) Refined transmission map t̂ of the R channel. (g) Final DoLP map ρ̂_D. (h) Sky region detection. (i) Dehazing result with the rough ρ_D. (j) Dehazing result with the refined ρ̂_D.

Fig. 7. Experimental results for Scenes 5 and 6. (a) Dehazing result for Scene 5. (b) Dehazing result for Scene 6.

Fig. 8. Comparison experiment with and without ρ_D for Scene 2. (a) Dehazed image considering only the polarization of the airlight. (b) Our dehazed image. (c) Magnified view of the red rectangle in (a). (d) Magnified view of the red rectangle in (b).

Fig. 9. Dehazing results for Scenes 3 and 4. (a), (c) Dehazed images without considering the DoLP of the object. (b), (d) Our dehazed images.

Fig. 10. Comparison with Schechner’s dehazing results. (a) The two groups of input polarized images; each group has a worst-polarized and a best-polarized image. (b) Schechner’s results. (c) Our dehazed images.

Tables (2)

Table 1. Noise-level comparison of dehazed images with and without considering ρ_D.

Table 2. Image quality of the dehazed results for the six scenes.

Equations (12)


$$\mathbf{S}(x,y)=\begin{pmatrix}S_0(x,y)\\ S_1(x,y)\\ S_2(x,y)\\ S_3(x,y)\end{pmatrix}=\begin{pmatrix}I(x,y,0^{\circ})+I(x,y,90^{\circ})\\ I(x,y,0^{\circ})-I(x,y,90^{\circ})\\ I(x,y,45^{\circ})-I(x,y,135^{\circ})\\ I_L(x,y)-I_R(x,y)\end{pmatrix}.$$
$$\rho^{\lambda}=\frac{\sqrt{(S_1^{\lambda})^2+(S_2^{\lambda})^2}}{S_0^{\lambda}}.$$
$$\alpha^{\lambda}=\frac{1}{2}\arctan\!\left(\frac{S_2^{\lambda}}{S_1^{\lambda}}\right).$$
$$I^{\lambda}(x,y,\theta)=\frac{1}{2}S_0^{\lambda}(x,y)+\frac{1}{2}S_1^{\lambda}(x,y)\cos 2\theta+\frac{1}{2}S_2^{\lambda}(x,y)\sin 2\theta.$$
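The angle-of-polarization and Malus-type intensity relations above translate directly to code. A small sketch (names are ours), using `arctan2` so the quadrant of the angle is handled correctly:

```python
import numpy as np

def aop(s1, s2):
    """Angle of polarization alpha = (1/2) arctan(S2/S1), in radians.
    arctan2 keeps the correct quadrant when S1 < 0."""
    return 0.5 * np.arctan2(s2, s1)

def intensity_at(theta, s0, s1, s2):
    """Intensity behind a linear polarizer at orientation theta (radians),
    predicted from the first three Stokes parameters."""
    return 0.5 * s0 + 0.5 * s1 * np.cos(2 * theta) + 0.5 * s2 * np.sin(2 * theta)
```

For instance, a pixel with (S0, S1, S2) = (4, 2, 0) gives intensity 3 at θ = 0° and 1 at θ = 90°, recovering the two orthogonal measurements that defined S0 and S1.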
$$S_0(x,y)=S_0^{D}(x,y)+S_0^{A}(x,y).$$
$$S_0^{A}(x,y)=A\,(1-t(x,y)).$$
$$\rho(x,y)\,S_0(x,y)=\rho_D(x,y)\,S_0^{D}(x,y)+\rho_A(x,y)\,S_0^{A}(x,y).$$
$$t(x,y)=1-\frac{\Delta I(x,y)-\rho_D(x,y)\,S_0(x,y)}{A\,\bigl(\rho_A(x,y)-\rho_D(x,y)\bigr)}.$$
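Reading the transmission relation as t = 1 − (ΔI − ρ_D·S0) / (A·(ρ_A − ρ_D)), which follows from substituting S0^A = A(1 − t) into the polarization-mixing equation, a minimal sketch (names assumed, not from the paper):

```python
import numpy as np

def transmission(delta_i, s0, airlight, rho_a, rho_d):
    """Rough transmission map t from the PD image delta_i = Imax - Imin,
    the total intensity s0, the airlight radiance at infinity 'airlight',
    and the DoLP of the airlight (rho_a) and of the object (rho_d)."""
    t = 1.0 - (delta_i - rho_d * s0) / (airlight * (rho_a - rho_d))
    return np.clip(t, 1e-3, 1.0)  # keep t inside a physically valid range
```

Setting rho_d = 0 reduces this to the only-airlight-polarized (OAP) estimate t = 1 − ΔI/(A·ρ_A).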
$$I_{\max}(x,y,\theta_{\max})=\tfrac{1}{2}\bigl(1+\rho(x,y)\bigr)S_0(x,y),\qquad I_{\min}(x,y,\theta_{\min})=\tfrac{1}{2}\bigl(1-\rho(x,y)\bigr)S_0(x,y).$$
$$E(\rho_D)=\frac{\operatorname{Cov}\!\left\{\rho_D S_0(x,y)-\Delta I(x,y),\;\dfrac{\rho_D S_0(x,y)-\Delta I(x,y)-A(\rho_D-\rho_A)}{\Delta I(x,y)-\rho_A S_0(x,y)}\right\}}{\bigl[(\rho_D-\rho_A)A\bigr]^{2}}.$$
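The decorrelation principle behind this objective can be illustrated with a simple grid search: for each candidate ρ_D, split S0 into an airlight component S0^A = (ρ_D·S0 − ΔI)/(ρ_D − ρ_A) and a direct component S0^D = S0 − S0^A, and keep the candidate that makes the two components least correlated. This is only a sketch of the idea under that splitting, not the paper's exact estimator:

```python
import numpy as np

def estimate_rho_d(delta_i, s0, rho_a, candidates=np.linspace(0.0, 0.3, 61)):
    """Grid-search sketch of decorrelation-based estimation of the object
    DoLP rho_d: minimize |cov(airlight component, direct component)|."""
    best, best_score = candidates[0], np.inf
    for rho_d in candidates:
        if abs(rho_d - rho_a) < 1e-6:
            continue  # the split degenerates when the two DoLPs coincide
        s0_a = (rho_d * s0 - delta_i) / (rho_d - rho_a)  # airlight component
        s0_d = s0 - s0_a                                 # direct transmission
        score = abs(np.mean((s0_a - s0_a.mean()) * (s0_d - s0_d.mean())))
        if score < best_score:
            best, best_score = rho_d, score
    return best
```

On synthetic data where the two components are statistically independent, the search recovers the true ρ_D, because any other candidate leaks part of the direct component into the airlight estimate and re-introduces correlation.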
$$J(x,y)=\frac{\Delta I(x,y)-\rho_A(x,y)\,S_0(x,y)}{\Delta I(x,y)/A-\rho_A(x,y)}.$$
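This closed form for the haze-free radiance J follows directly from the model when ΔI, S0, A, and ρ_A are known. A direct implementation (names assumed), guarding the denominator against division by zero near the sky, where ΔI/A → ρ_A:

```python
import numpy as np

def dehaze(delta_i, s0, airlight, rho_a, eps=1e-6):
    """Recover the haze-free radiance J from the PD image delta_i, the total
    intensity s0, the airlight at infinity, and the airlight DoLP rho_a."""
    denom = delta_i / airlight - rho_a
    safe = np.where(np.abs(denom) < eps, eps, denom)  # avoid blow-up at the sky
    return (delta_i - rho_a * s0) / safe
```

As a sanity check, a pixel with J = 2, t = 0.5, A = 1, ρ_A = 0.4 under the only-airlight-polarized model gives S0 = 1.5 and ΔI = 0.2, and the formula returns J = 2 exactly.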
$$D(\nu_1,\nu_2,\Sigma_1,\Sigma_2)=(\nu_1-\nu_2)^{T}\left(\frac{\Sigma_1+\Sigma_2}{2}\right)^{-1}(\nu_1-\nu_2).$$
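This last expression is a Mahalanobis-type distance between two feature distributions, weighting the difference of means by the inverse of the averaged covariance. A direct implementation (names assumed):

```python
import numpy as np

def pooled_distance(v1, v2, sigma1, sigma2):
    """Mahalanobis-type distance between two distributions with means
    v1, v2 and covariances sigma1, sigma2, using the pooled covariance."""
    d = np.asarray(v1, float) - np.asarray(v2, float)
    pooled = 0.5 * (np.asarray(sigma1, float) + np.asarray(sigma2, float))
    return float(d @ np.linalg.inv(pooled) @ d)
```

With identity covariances this reduces to the squared Euclidean distance between the two mean vectors.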
