Optica Publishing Group

Simple and effective calculations about spectral power distributions of outdoor light sources for computer vision

Open Access

Abstract

The spectral power distributions (SPDs) of outdoor light sources are not constant over time and atmospheric conditions, which causes the appearance of a scene to vary and gives rise to common natural illumination phenomena such as twilight, shadow, and haze/fog. Calculating the SPD of outdoor light sources at different times (or zenith angles) and under different atmospheric conditions is therefore of interest to physically-based vision. In this paper, for computer vision and its applications, we propose a feasible, simple, and effective SPD calculation method based on analyzing the transmittance functions of absorption and scattering along the path of solar radiation through the atmosphere in the visible spectrum. Compared with previous SPD calculation methods, our model has fewer parameters and is accurate enough to be applied directly in computer vision. It can be used in tasks including spectral inverse calculation, lighting conversion, and shadowed image processing. The experimental results of these applications demonstrate that our calculation method has practical value in computer vision: it establishes a bridge between images and physical environmental information, e.g., time, location, and weather conditions.

© 2016 Optical Society of America

1. Introduction

Solar irradiance is changed by atmospheric transmittance effects including absorption, reflection, and scattering, which cause the spectral power distribution (SPD) of the light that ultimately reaches the Earth’s surface to vary with time and air conditions. As shown in Fig. 1, atmospheric transmittance effects give rise to many natural phenomena relevant to computer vision applications, such as the variation of sun and sky appearance, twilight, shadow, haze/fog, and cloud. Therefore, an SPD calculation method for outdoor light sources is useful for dealing with the problems caused by different times (or zenith angles) and weather conditions.

Fig. 1 Outdoor appearances vary with time and air conditions.

Natural light modeling and image illumination processing have received a lot of attention in the computer vision and computer graphics communities. Based on the fact that luminance varies across different parts of the sky, a physically-based sky luminance model is employed to infer the camera azimuth in [1] and to recover the camera focal length in [2]. Lalonde et al. [3] showed how to geolocate a camera using the sky appearance and the sun position annotated in images. Liu et al. [4] proposed a method to estimate the lighting condition of outdoor scenes by learning from a set of images captured at the same sun position. Sunkavalli et al. [5] showed that sunlight and skylight SPDs can be recovered by analyzing a time-lapse video of outdoor scenes. Haber et al. [6] presented an approach to compute the colors of the sky during the twilight periods before sunrise and after sunset based on the theory of light scattering by small particles. Perez et al. [7] proposed a popular sky model that has been widely used in computer vision and graphics. In general, most of these methods focus on modeling or applying the sky’s spatial radiance distribution rather than spectral information. Spectral information, i.e., knowledge of the SPDs of light sources, is useful in computer vision tasks such as reflectance recovery [8], camera spectral sensitivity estimation [9], color constancy [10], relighting [11], image rendering [12], and augmented reality [13].

In the computer vision community, two approaches are usually applied to estimate illumination SPDs. The first employs Planck’s blackbody radiation law to approximate the SPDs of outdoor light sources: Finlayson et al. [14] and Tian et al. [15] applied it to estimate the SPDs of daylight and skylight for deriving shadow-invariant images and for detecting shadows, respectively. The second employs daylight characteristic vectors to recover the SPDs of outdoor light sources. Judd et al. [16] performed characteristic vector analysis on 622 samples of daylight measured at 10 nm intervals over the visible range; their results suggested that most daylight samples can be approximated accurately by a linear combination of three fixed characteristic vectors. Based on the Perez sky model [7], Preetham et al. [17] proposed an improved sky model. It approximates the five distribution coefficients of the Perez model as linear functions of a single parameter, turbidity, by fitting the Perez formulas to reference images. With one additional parameter, the sun angle, the absolute sky luminance Y as well as the CIE chromaticities x and y of a sky element can be calculated, and the outputs can be converted to spectral radiance data (see Appendix 5 in [17]). This analytic model of spectral sky-dome radiance is widely used. Using images captured during a single day, Jung et al. [18] presented an outdoor photometric stereo method via skylight estimation according to the Preetham sky model. Given sky images and viewing geometry as input, Kawakami et al. [19] proposed a method to estimate the turbidity of the sky by fitting image intensities to the Preetham sky model, from which the sky spectrum can be calculated; having the sky RGB values and their corresponding spectra, the method estimates the camera spectral sensitivities and the white balance setting. In the experimental section, we will analyze and compare against Planck’s blackbody radiation and the Preetham sky model.

In meteorology, there are SPD calculation methods based on analyzing the transmittance functions of absorption and scattering along the path of solar radiation through the atmosphere. The latest versions of two representative wavelength-resolved SPD calculation models are SMARTS2 [20] and MODTRAN [21]. The SPDs calculated by these two methods approximate real measured ones well. However, they may be too complex to use in computer vision, because they contain many parameters to be determined, and unfortunately most of these meteorological parameters are not readily available in computer vision applications. Bird et al. [22] proposed a simpler method to calculate the SPD of outdoor light sources. Though Bird’s method is much simpler than those in [20, 21], it is still difficult to use in computer vision. In general, the existing methods were not developed for computer vision: they consider the full spectrum and involve many factors and parameters that have no effect in the visible range, and none of them takes the characteristics of human vision into account. They are therefore better suited to meteorological research than to computer vision. Because most computer vision tasks concentrate on analyzing images captured in the visible spectrum, it is possible to develop a simpler yet effective SPD calculation method that can be easily applied in computer vision tasks such as illumination processing, shadow removal, and image relighting.

The main contribution of this paper is a very simple method to calculate the SPDs of sunlight and skylight in the visible spectrum. Different from previous calculations, we first calculate the atmosphere’s absorption, since some of the light emitted from the sun never reaches the Earth’s surface, either as direct sunlight or as diffuse skylight, because of the atmosphere’s absorption. We propose a simple new absorption calculation method and provide a new “total absorption coefficient”. During the development of our method, we pay particular attention to the wavelengths near the peaks of the human color matching functions (CMFs), to guarantee the accuracy of color reproduction when our calculated SPDs substitute for the real ones.

2. A simple method for computing the spectral power distribution of sunlight and skylight

2.1. Light and image

Because our SPD calculations are intended for imaging and computer vision, we develop our method based on image formation theory. Figure 2 shows the image formation procedure in outdoor environments. Since both our eyes and conventional cameras are sensitive only to the visible spectrum, all calculations and analysis in this paper are carried out within 400–700 nm. Given an illumination SPD E(λ) and an object’s reflectance S(λ), the tristimulus values are,

$F_H = \int_{400}^{700} E(\lambda)\, S(\lambda)\, Q_H(\lambda)\, d\lambda \qquad (1)$
where $Q_H(\lambda)$ denotes the XYZ or sRGB color matching functions in the three channels. In this paper, tristimulus values in XYZ are used in Section 2.4, and those in sRGB are used in the experiments and applications of Sections 3 and 4. Details of the XYZ to sRGB conversion can be found in [23].
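As a concrete illustration of Eq. (1), the integral can be evaluated numerically on a 1 nm grid. Everything in the sketch below is placeholder data (a flat illuminant, a gray reflectance, and Gaussian lobes standing in for the CMFs), not the CIE tables; real applications would load tabulated color matching functions.

```python
import numpy as np

# Illustrative numerical evaluation of Eq. (1) over 400-700 nm.
lam = np.arange(400.0, 701.0, 1.0)               # wavelength grid, 1 nm step

E = np.ones_like(lam)                            # flat illuminant SPD (toy)
S = np.full_like(lam, 0.5)                       # gray reflectance (toy)

def lobe(mu, sigma):
    # Gaussian stand-in for one CMF channel (an assumption, not CIE data).
    return np.exp(-0.5 * ((lam - mu) / sigma) ** 2)

Q = {"X": lobe(600, 40), "Y": lobe(550, 45), "Z": lobe(450, 25)}

# Riemann-sum approximation of the integral (step Δλ = 1 nm).
F = {H: float(np.sum(E * S * Q[H]) * 1.0) for H in Q}
```

With real CMF tables the same three lines at the end yield the actual XYZ tristimulus values of the illuminated surface.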

Fig. 2 Illustration of the image formation procedure.

2.2. Absorption of extraterrestrial irradiance passing through the atmosphere

When the extraterrestrial irradiance passes through the atmosphere, it is attenuated by absorption, reflection, and scattering processes, and its spectral composition is thereby changed considerably. Some solar radiation never reaches the Earth’s surface, either as direct sunlight or as diffuse skylight, because of absorption by ozone, nitrogen dioxide, mixed gases, and water vapor. We denote $E_{o\lambda}$ as the extraterrestrial irradiance at the mean earth-sun distance for wavelength λ. After absorption, the irradiance becomes,

$E_{in\lambda} = E_{o\lambda}\, T_{o\lambda}\, T_{N\lambda}\, T_{w\lambda}\, T_{u\lambda} \qquad (2)$
where $T_{o\lambda}$, $T_{N\lambda}$, $T_{w\lambda}$, and $T_{u\lambda}$ denote the transmittance functions for molecular ozone, nitrogen dioxide, water vapor, and mixed gases, respectively. Extraterrestrial irradiance also varies with the sun-earth distance, which is determined by the day number of the year. Since this factor is independent of wavelength, it can be taken outside the integration and treated as exposure time in Eq. (1); therefore it can be neglected for computer vision applications. $T_{o\lambda}$ and $T_{N\lambda}$ can be represented by,
$T_{o\lambda} = \exp(-0.35\, a_{o\lambda}\, m) \qquad (3)$ (refer to [24])
$T_{N\lambda} = \exp(-0.0016\, a_{n\lambda}\, m) \qquad (4)$ (refer to [20, 25])
where $a_{o\lambda}$ and $a_{n\lambda}$ are the absorption coefficients of molecular ozone and nitrogen dioxide, respectively. In [22, 24], $T_{w\lambda}$ and $T_{u\lambda}$ are given by,
$T_{w\lambda} = \exp[-0.2385\, a_{w\lambda} W m / (1 + 20.07\, a_{w\lambda} W m)^{0.45}] \qquad (5)$
$T_{u\lambda} = \exp[-1.41\, a_{u\lambda} m / (1 + 118.93\, a_{u\lambda} m)^{0.45}] \qquad (6)$
where $a_{w\lambda}$ and $a_{u\lambda}$ are the absorption coefficients of water vapor and mixed gases, respectively. $W$ is the water vapor amount, and its typical value is 1.6. $m$ is the air mass, calculated by,

$m = \sec(\theta) \qquad (7)$

where θ denotes the zenith angle in this paper. Unlike $T_{o\lambda}$ and $T_{N\lambda}$, which take noticeable effect in the visible spectrum, $T_{w\lambda}$ mainly takes effect at wavelengths longer than 1000 nm, and in the visible spectrum only within 687–691 nm [26]; $T_{u\lambda}$ mainly takes effect at wavelengths longer than 800 nm, and in the visible spectrum only within 690–700 nm [27]. $T_{w\lambda}$ and $T_{u\lambda}$ thus have little effect in the visible spectrum, yet their computations are more complex than those of $T_{o\lambda}$ and $T_{N\lambda}$. More importantly, $T_{w\lambda}$ and $T_{u\lambda}$ have different forms from $T_{o\lambda}$ and $T_{N\lambda}$, which makes it difficult to merge the four terms. Therefore, we seek simple and effective expressions for $T_{w\lambda}$ and $T_{u\lambda}$. Since Eq. (5) and Eq. (6) have similar formulas, we first merge $T_{u\lambda}$ into $T_{w\lambda}$, i.e.,

$T_{w\lambda}(a_{w\lambda} + \varepsilon a_{u\lambda} \mid m) = T_{w\lambda}(a_{w\lambda} \mid m)\, T_{u\lambda}(a_{u\lambda} \mid m) \qquad (8)$

where ε can be determined by,

$\varepsilon = \arg\min_{\varepsilon} \sum_m \left\| T_{w\lambda} T_{u\lambda} - T_{w\lambda}(a_{w\lambda} + \varepsilon a_{u\lambda}) \right\|^2 \qquad (9)$

The result is ε = 4.3. We have,

$T_{w\lambda}(a_{w\lambda} + 4.3\, a_{u\lambda}) \approx T_{w\lambda} T_{u\lambda} \qquad (10)$

The two attenuations caused by water vapor and mixed gases are thus merged into a single attenuation coefficient. We define the new term as $T'_{w\lambda}$,

$T'_{w\lambda} = \exp[-0.2385\, a'_{w\lambda} W m / (1 + 20.07\, a'_{w\lambda} W m)^{0.45}] \qquad (11)$
where $a'_{w\lambda} = a_{w\lambda} + 4.3\, a_{u\lambda}$.

For different $m$, we find that the data pairs $[a'_{w\lambda}, -\log(T'_{w\lambda})]$ satisfy a power function. So in the following we write $T'_{w\lambda}$ in the same form as $T_{o\lambda}$ and $T_{N\lambda}$. We define a new transmittance function of water vapor and mixed gases as,

$T^{new}_{w\lambda} = \exp[-p\,(a'_{w\lambda})^{q}\, m] \qquad (12)$
with,
$p, q = \arg\min_{p,q} \sum_m \left\| -p\,(a'_{w\lambda})^{q}\, m - \log(T'_{w\lambda}) \right\|^2 \qquad (13)$

The results of Eq. (13) are p = 0.055 and q = 0.56. We apply a simple exhaustive search to solve the optimization problems in Eq. (9) and Eq. (13), with a sampling step size of 1 nm for λ and 0.01 for ε, p, and q. The new attenuation expression of water vapor and mixed gases is thus approximated as a decreasing exponential function of the combined coefficient.

$T^{new}_{w\lambda} = \exp[-0.055\,(a'_{w\lambda})^{0.56}\, m] \qquad (14)$
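The exhaustive search for ε described above (step size 0.01) can be sketched as follows. The absorption-coefficient vectors here are synthetic placeholders rather than the tabulated $a_{w\lambda}$, $a_{u\lambda}$ values from [20], so the fitted ε will not reproduce the paper's value of 4.3; the sketch only shows the structure of the search.

```python
import numpy as np

W = 1.6                                      # water vapor amount (typical)
m_vals = np.arange(1.0, 5.1, 0.5)            # air masses sampled in the fit

def T_w(a, m):                               # water-vapor form of Eq. (5)
    return np.exp(-0.2385 * a * W * m / (1.0 + 20.07 * a * W * m) ** 0.45)

def T_u(a, m):                               # mixed-gas form of Eq. (6)
    return np.exp(-1.41 * a * m / (1.0 + 118.93 * a * m) ** 0.45)

# Synthetic stand-ins for the tabulated coefficient vectors (assumption).
a_w = np.linspace(0.0, 0.1, 301)
a_u = np.linspace(0.0, 0.02, 301)

# Exhaustive search for the merge factor eps, mirroring Eq. (9); with the
# real coefficient tables the paper obtains eps = 4.3.
best_eps, best_err = 0.0, np.inf
for eps in np.arange(0.0, 10.0, 0.01):
    err = sum(float(np.sum((T_w(a_w, m) * T_u(a_u, m)
                            - T_w(a_w + eps * a_u, m)) ** 2)) for m in m_vals)
    if err < best_err:
        best_eps, best_err = eps, err
```

The same grid-search skeleton applies to the two-parameter fit of Eq. (13), with `p` and `q` looping in place of `eps`.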

Then, since all the absorption factors can be expressed as decreasing exponential functions, we rewrite Eq. (2) as,

$E_{in\lambda} = E_{o\lambda}\, T'_{\lambda} \qquad (15)$
where,
$T'_{\lambda} = \exp(-\tau'(\lambda)\, m) \qquad (16)$

The results in Fig. 3 show that our new simple expression produces results similar to the combined effects of the original two terms in the visible spectrum. We name the proposed $T'_{\lambda}$ the “total absorption transmittance function” and $\tau'(\lambda)$ the “total absorption coefficient”, represented by,

$\tau'(\lambda) = 0.35\, a_{o\lambda} + 0.0016\, a_{n\lambda} + 0.055\,(a'_{w\lambda})^{0.56} \qquad (17)$

Fig. 3 The new attenuation and transmittance expressions of water vapor and uniformly mixed gases produce results as good as the original expressions.

The total absorption coefficient can be found in the Appendix (Table 4) and is plotted in Fig. 4, in which $a_{o\lambda}$, $a_{n\lambda}$, $a_{w\lambda}$, and $a_{u\lambda}$ are derived from [20] with $a_{o\lambda} = A_{o\lambda}$, $a_{n\lambda} = A_{n\lambda}$, $a_{w\lambda} = 100\,A_{w\lambda}$, and $a_{u\lambda} = 100\,A_{u\lambda}$. The SPD of extraterrestrial irradiance and an example of its result after absorption at m = 2 are shown in Fig. 5.

Fig. 4 Total absorption coefficient.

Fig. 5 The SPD of extraterrestrial irradiance reported by the World Radiation Center, and the SPD after absorption at an air mass of 2.

Using our proposed total absorption coefficient and transmittance function, the absorption by ozone, nitrogen dioxide, water vapor, and mixed gases can be handled all together, and Eq. (16) is much simpler to use.

2.3. Computing direct sunlight

The direct irradiance at the Earth’s surface, normal to the direction of the sun, can be calculated by,

$E_{d\lambda} = E_{in\lambda}\, T_{r\lambda}\, T_{a\lambda} \qquad (18)$
where $E_{in\lambda}$ is the SPD of the extraterrestrial irradiance after absorption, and $T_{r\lambda}$ and $T_{a\lambda}$ denote the transmittance functions of Rayleigh scattering and aerosol scattering, respectively. The transmittance function for Rayleigh scattering in [28] is adopted in this paper,
$T_{r\lambda} = \exp(-0.008735\, \lambda^{-4.08}\, m) \qquad (19)$

The transmittance function for the aerosol scattering in [28] is adopted in this paper.

$T_{a\lambda} = \exp(-\beta\, \lambda^{-\alpha}\, m) \qquad (20)$
where α is the wavelength exponent, for which the value 1.3 is commonly employed, and β is the cleanliness index (turbidity), which varies from 0.0 to 0.4. For clear, turbid, and very turbid weather, its value is about 0.1, 0.2, and 0.3, respectively [28].
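Eqs. (18)–(20) can be sketched directly. We assume λ is expressed in micrometers inside the scattering formulas (consistent with Bird's formulation in [28]); the absorbed SPD $E_{in\lambda}$ is a flat placeholder here.

```python
import numpy as np

lam_um = np.arange(0.400, 0.7005, 0.001)     # 400-700 nm, in micrometers

def T_rayleigh(lam_um, m):                   # Eq. (19)
    return np.exp(-0.008735 * lam_um ** -4.08 * m)

def T_aerosol(lam_um, m, beta, alpha=1.3):   # Eq. (20)
    return np.exp(-beta * lam_um ** -alpha * m)

def direct_sunlight(E_in, lam_um, theta_deg, beta):
    m = 1.0 / np.cos(np.radians(theta_deg))  # air mass
    return E_in * T_rayleigh(lam_um, m) * T_aerosol(lam_um, m, beta)

# Clear weather (beta = 0.1), 30-degree zenith angle, flat E_in placeholder.
E_d = direct_sunlight(np.ones_like(lam_um), lam_um, 30.0, beta=0.1)
```

As expected from the λ⁻⁴·⁰⁸ and λ⁻¹·³ terms, short (blue) wavelengths are scattered out of the direct beam more strongly than long (red) ones.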

2.4. Computing diffuse skylight

The calculations of diffuse skylight in other works are complicated; here we propose a very simple method. From Eq. (19) and Eq. (20), we know that $T_{r\lambda}$ and $T_{a\lambda}$ describe the direct transmittance functions of Rayleigh and aerosol scattering, respectively. Thus $(1 - T_{r\lambda} T_{a\lambda})$ describes the sum of the scattered light that reaches the Earth’s surface and the scattered light lost in the atmosphere. The key point then becomes how to determine the proportion of the scattered light that reaches the ground. The diffuse skylight on a horizontal surface can be calculated by,

$E_{s\lambda} = E_{in\lambda} \cos(\theta)\,(1 - T_{r\lambda} T_{a\lambda})\, \kappa \qquad (21)$
where κ accounts for the proportion of the scattered light that can reach the ground.

As shown in Fig. 6, κ should be zenith-dependent: the longer the light path through the atmosphere, the more light is lost, i.e., the less scattered light reaches the ground. We determine κ by fitting the CIE $\Delta E_{Lab}$ between the SPDs calculated by Eq. (21) and true measurements from a SOC710 spectrometer. In detail, the ideal white reflectance S(λ) = 1 is applied in the calculation. For the SPD calculated by our method, the XYZ tristimulus values are,

$F_H = \int_{400}^{700} E_{s\lambda}\, S(\lambda)\, Q_H(\lambda)\, d\lambda \qquad (22)$

Fig. 6 Schematic diagram of the solar radiation onto the Earth’s surface.

For the measured SPD $E^{real}_{s\lambda}$, the XYZ tristimulus values are,

$F^{real}_H = \int_{400}^{700} E^{real}_{s\lambda}\, S(\lambda)\, Q_H(\lambda)\, d\lambda \qquad (23)$

The XYZ tristimulus values are converted to the CIELAB color space, and $\Delta E_{Lab}$ is applied to evaluate the error between $E^{real}_{s\lambda}$ and $E_{s\lambda}$,

$\Delta E_{Lab} = \sqrt{\Delta L^2 + \Delta a^2 + \Delta b^2} \qquad (24)$

κ can be determined by,

$\kappa = \arg\min_{\kappa} \Delta E_{Lab} \qquad (25)$

The reason that we apply XYZ values and $\Delta E_{Lab}$ to determine κ is to keep higher accuracy near the peaks of the CMFs. The values of κ for different zenith angles are tabulated in Table 1.
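Once κ is tabulated, Eq. (21) is a one-line computation. The κ values in the sketch below are made-up placeholders; the fitted, zenith-dependent proportions are the ones in Table 1. As before, λ is assumed to be in micrometers inside the scattering transmittances.

```python
import numpy as np

kappa_table = {20: 0.70, 40: 0.65, 60: 0.55, 80: 0.40}   # hypothetical kappa

def diffuse_skylight(E_in, lam_um, theta_deg, beta):
    m = 1.0 / np.cos(np.radians(theta_deg))
    T_r = np.exp(-0.008735 * lam_um ** -4.08 * m)        # Rayleigh, Eq. (19)
    T_a = np.exp(-beta * lam_um ** -1.3 * m)             # aerosol, Eq. (20)
    # look up kappa at the nearest tabulated zenith angle
    kappa = kappa_table[min(kappa_table, key=lambda t: abs(t - theta_deg))]
    # Eq. (21): E_sλ = E_inλ cos(θ) (1 - T_rλ T_aλ) kappa
    return E_in * np.cos(np.radians(theta_deg)) * (1.0 - T_r * T_a) * kappa

lam_um = np.arange(0.400, 0.7005, 0.001)
E_s = diffuse_skylight(np.ones_like(lam_um), lam_um, 60.0, beta=0.1)
```

Because the scattered fraction $(1 - T_{r\lambda} T_{a\lambda})$ grows toward short wavelengths, the computed skylight SPD is blue-weighted, as expected of a clear sky.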

Table 1. Proportion of the scattered light that can reach the ground with different zenith angles

3. Experiments and comparisons

Figure 7 shows comparisons of the SPDs calculated by our method with those by the SMARTS2 method [20] and Bird’s method [22] in clear weather, i.e., β = 0.1. The results indicate no significant discrepancies between the SPDs calculated by our simple method and by the two more complex meteorological ones, especially for direct sunlight. To evaluate the influence of the discrepancies on imagery, we simulate sRGB values (see the 2nd and 4th columns in Fig. 7) of the Xrite ColorChecker, which contains 24 colors common in daily life. In the simulated images under the SPDs calculated by the three methods, it is hard to see differences with the naked eye. The last column in Fig. 7 shows the quantitative discrepancies of the pixel values of the simulated images: for each color patch, in each of the three channels, the maximum differences relative to the maximum values are plotted as percent errors.

Fig. 7 Comparison of the SPDs calculated by our method, Bird’s method [22], and the SMARTS2 method [20] at different zenith angles for β = 0.1. From top to bottom, the zenith angle equals 20, 30, …, 80 degrees. The x-coordinate denotes wavelength (nm) and the y-coordinate denotes irradiance power (W·m⁻²·nm⁻¹). The 2nd column shows simulated Xrite ColorChecker appearances under sunlight and the 4th column under skylight (upper: ours; middle: Bird’s; lower: SMARTS2). The last column shows pixel value differences (percent). A: direct sunlight compared with Bird’s method. B: direct sunlight compared with the SMARTS2 method. C: diffuse skylight compared with Bird’s method. D: diffuse skylight compared with the SMARTS2 method.

The results in Fig. 7 also show that the SPDs do not vary significantly from 20 to 40 degrees, which indicates that the outdoor light sources are quite stable around noon. The variation of intensity is larger than that of the spectrum; because the intensity variation can be taken outside the integration and absorbed into the exposure time in Eq. (1), it has no effect on imagery.

Figure 8 shows the comparison of the SPDs calculated by our method with those by SMARTS2 and Bird’s method under different turbidities. The consistency of the three methods on sunlight is better than that on skylight. For very clear weather, β = 0, the outputs of the three methods are quite similar for skylight; the similarity degrades with increasing β, and for very turbid weather, β = 0.3, the three outputs are obviously different. Overall, our results are closer to those of the SMARTS2 method in the skylight calculation.

Fig. 8 Comparisons of the calculated SPDs by our method and those by Bird’s method [22] and SMARTS2 method [20] under different turbidities. From top to bottom are β = 0, β = 0.2, and β = 0.3, respectively.

The formula expressions of our method are compared with those of SMARTS2 and Bird’s method in Table 2. From the table, we can see that our expressions are much simpler than those of the other two methods. More importantly, the other two methods involve many parameters that may be hard to determine in computer vision.

Table 2. Comparison of the formula expressions of our method with those of SMARTS2 and Bird’s method

3.1. Comparison with blackbody radiation

Planck’s blackbody radiation law is usually applied to calculate the SPDs of outdoor light sources. It is well known that the extraterrestrial solar irradiance can be approximated by blackbody radiation at 5777 K. However, from Fig. 9, we can see that the extraterrestrial solar irradiance roughly follows the blackbody radiation at 5777 K in the near infrared, while it does not follow the blackbody law well in the visible region. Figure 10 shows the chromaticities of the blackbody radiation and of sunlight, skylight, and daylight. The chromaticity of daylight lies very near the Planckian locus, so daylight can be accurately approximated by blackbody radiation. In contrast, the chromaticities of sunlight and skylight deviate noticeably from the Planckian locus, which indicates that noticeable errors occur when blackbody radiation is employed to model the SPDs of sunlight and skylight.

Fig. 9 Using blackbody radiation to approximate the true extraterrestrial data. The right image is the close-up view of the left one.

Fig. 10 CIE chromaticities of sunlight, skylight, and daylight compared with the Planckian locus formed by color temperatures from 1000 K to 500000 K.

3.2. Comparison with Preetham sky model

The Preetham model [17] also requires only the zenith angle and turbidity as varying parameters. Using Judd’s characteristic vectors, this model can produce the radiance spectra of all points on the sky dome; details on recovering spectral radiance from sky chromaticity values via Judd’s characteristic vectors can be found in Appendix 5 of [17]. After integrating the incoming light over all directions in the hemisphere, the SPD of skylight irradiance can be calculated. Figure 11 shows the comparison between our method, SMARTS2, Bird’s method, and the Preetham method. Compared with our method, the Preetham method is not accurate enough to match the two meteorological methods. This is because errors arise when recovering full spectra from sky X, Y, Z values, which uses only three principal bases. Another disadvantage of using the Preetham sky model to recover SPDs is that it can only recover the skylight SPD; it cannot recover the SPDs of direct sunlight and daylight.

Fig. 11 Comparison with the Preetham model. Left: 30 degree zenith angle. Right: 60 degree zenith angle.

4. Applications

4.1. The recovery of camera capture information

We first demonstrate estimating the illumination, zenith angle, turbidity, and camera spectral sensitivities from a single 24-patch color checker image. Rewriting E(λ) in Eq. (1), we have,

$F_H = \int_{400}^{700} E(\theta, T, \lambda)\, S(\lambda)\, Q_H(\lambda)\, d\lambda, \quad H = R, G, B \qquad (26)$

Jiang et al. [29] show that camera spectral sensitivities (CSS) can be decomposed as $Q_H = \sigma_H \cdot C_H$, where $\sigma_H = [\sigma_{H,1}, \sigma_{H,2}]$ and $C_H = [c_{H,1}, c_{H,2}]^T$ is the eigenvector matrix precomputed from a dataset of different CSS. Given a 24-patch color checker image and the reflectance spectra shown in Fig. 12, the parameters $\sigma_H$, θ, and T can be recovered by iteratively minimizing the following RMS error, after which the illumination SPDs and the CSS can be recovered.

$\sum_{n=1}^{24} \left\| F^n_H - \sum_{\lambda} E(\theta, T)\, S^n\, [\sigma_H C_H] \right\|, \quad H = R, G, B \qquad (27)$

Fig. 12 A 24 color checker image captured by a Canon 5D Mark II camera under skylight and its corresponding spectral reflectance.

Compared with Jiang’s method, which recovers the CSS and a single illumination SPD, our method can recover the CSS and the SPDs of three lights, i.e., sunlight, daylight, and skylight, simultaneously, as shown in Fig. 13. More importantly, our method can recover the sun angle and turbidity from the image, while Jiang’s method can only recover the color temperature.
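One way to read the minimization behind Eq. (27) is as a grid search over (θ, T) with a closed-form linear solve for $\sigma_H$ at each candidate, since the CSS coefficients enter linearly. The sketch below illustrates only this structure: the reflectances, eigenvectors, and illuminant model `E()` are toy placeholders, not the paper's data or its actual SPD model.

```python
import numpy as np

lam = np.arange(400.0, 701.0, 10.0)
rng = np.random.default_rng(1)
S = rng.uniform(0.1, 0.9, (24, lam.size))    # 24 patch reflectances (toy)
C = rng.uniform(0.0, 1.0, (2, lam.size))     # 2 CSS eigenvectors (toy)

def E(theta_deg, turbidity):                 # stand-in illuminant model
    m = 1.0 / np.cos(np.radians(theta_deg))
    return np.exp(-turbidity * (lam / 550.0) ** -1.3 * m)

def fit(F_obs):
    best = (np.inf, None)
    for theta in range(10, 81, 5):
        for T in np.arange(0.05, 0.35, 0.05):
            A = (E(theta, T) * S) @ C.T      # A[n, k] = sum_lam E S_n c_k
            sigma = np.linalg.lstsq(A, F_obs, rcond=None)[0]
            err = float(np.linalg.norm(A @ sigma - F_obs))
            if err < best[0]:
                best = (err, (theta, float(T), sigma))
    return best[1]

# Synthesize observations at theta = 30, T = 0.1 and recover them.
true_sigma = np.array([0.7, 0.3])
F_obs = (E(30, 0.1) * S) @ C.T @ true_sigma
theta_hat, T_hat, sigma_hat = fit(F_obs)
```

On this noise-free toy problem the grid search recovers the generating parameters exactly; the paper's iterative minimization plays the same role with its real SPD model and CSS eigenvectors.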

Fig. 13 The recovered skylight spectrum and the recovered CSS of the Canon 5D Mark II. In the second figure, M and E are the abbreviations of “Measured” and “Estimated”, respectively.

4.2. Shadow features for shadow detection

Shadow is an active research topic in computer vision, and its generation and characteristics are closely related to direct sunlight and diffuse skylight. Tian et al. [30] show that the pixel values of a surface illuminated by daylight (in the non-shadow region) are proportional to those of the same surface illuminated by skylight (in the shadow region). If $f_H$ denotes the pixel values of a surface in the shadow area and $F_H$ those of the same surface in the non-shadow area, [30] shows,

$\dfrac{F_H}{f_H} = \dfrac{\int_{400}^{700} E_{day}(\theta, T, \lambda)\, S(\lambda)\, Q_H(\lambda)\, d\lambda}{\int_{400}^{700} E_{sky}(\theta, T, \lambda)\, S(\lambda)\, Q_H(\lambda)\, d\lambda} = K_H \qquad (28)$
where
$E_{day}(\theta, T, \lambda) = E_{sun}(\theta, T, \lambda)\cos\theta + E_{sky}(\theta, T, \lambda) \qquad (29)$

Our method can simultaneously calculate the SPDs of daylight and skylight, so based on the calculated SPDs we can easily obtain $K_H$ for any sun angle and turbidity. As shown in Fig. 14, we captured three images at a 43 degree sun angle in clear weather with turbidity 0.15 (estimated by the method in Section 4.1). We calculated the normalized $K_H$ from Canon 5D Mark II RAW images without gamma correction. These values are quite near the $K_H$ = [0.69, 0.57, 0.45] computed from our calculated SPDs, indicating that our SPD calculation method can predict $K_H$ in shadowed images.
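Given daylight and skylight SPDs, each $K_H$ in Eqs. (28)–(29) is a ratio of two integrals. The sketch below uses toy SPDs (sunlight ramping toward red, skylight toward blue) and Gaussian stand-ins for the channel sensitivities; note that the raw ratios computed this way exceed 1, whereas the values quoted in the text are the normalized $K_H$.

```python
import numpy as np

lam = np.arange(400.0, 701.0, 1.0)

def K(E_sun, E_sky, S, Q, theta_deg):
    # Eq. (29): E_day = E_sun cos(theta) + E_sky
    # Eq. (28): K_H = integral(E_day S Q) / integral(E_sky S Q)
    E_day = E_sun * np.cos(np.radians(theta_deg)) + E_sky
    return float(np.sum(E_day * S * Q) / np.sum(E_sky * S * Q))

# Toy SPDs and sensitivities (placeholders, not the calculated spectra).
E_sun = 1.0 + 0.002 * (lam - 400.0)              # red-weighted sunlight
E_sky = 1.0 + 0.002 * (700.0 - lam)              # blue-weighted skylight
S = np.ones_like(lam)                            # white surface

def lobe(mu, sigma):
    return np.exp(-0.5 * ((lam - mu) / sigma) ** 2)

K_R = K(E_sun, E_sky, S, lobe(600, 40), 43.0)    # stand-in red channel
K_B = K(E_sun, E_sky, S, lobe(450, 25), 43.0)    # stand-in blue channel
```

Even with these crude inputs, the ordering $K_R > K_B$ of property 1 emerges, because daylight adds red-weighted sunlight on top of blue-weighted skylight.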

Fig. 14 Our SPD calculation method can predict $K_H$ in shadowed images.

We also calculated $K_H$ under different sun angles and turbidities to find rules useful for shadow detection. We list these $K_H$ values in Table 3. From the table, we can find three properties of shadows.

  • Property 1. $K_H$ satisfy $K_R > K_G > K_B$.
  • Property 2. $K_H$ decrease as turbidity increases.
  • Property 3. $K_H$ decrease as sun angle increases.

Table 3. The calculated $K_H$ at different sun angles and under different turbidities

To test property 1, we collected 563 real-world images, either downloaded from the web or captured by ourselves. In particular, we carefully captured images at different sun angles and under different weather conditions (including clear sky, light overcast, and heavy clouds that did not cover the sun). For each image, shadow regions were manually labeled. We obtained 3672 shadow/non-shadow counterparts and calculated $K_H$ for all of them. We find that 90.50% of the $K_H$ satisfy property 1. Preliminary analysis shows that overexposure, thin shadows, and overly strong contrast enhancement in post-processing usually cause the calculated $K_H$ to violate property 1.

Since we usually do not know the sun angle and turbidity under which an image was captured, properties 2 and 3 are not convenient to use directly. They may, however, be useful for inverse problems, e.g., using detected shadows to infer the sun angle and turbidity. In detail, if we know the CSS of a camera, we can build a table like Table 3 and coarsely infer the sun angle and turbidity by comparing the $K_H$ in the table with the $K_H$ calculated from detected shadows. This application shows that our SPD calculation is valuable for finding novel shadow features, which may be hard to accomplish with other SPD calculation methods since they cannot compute Eq. (28) and Table 3.

4.3. Deriving intrinsic images

Our SPD calculation method can also be used to derive intrinsic images. If we convert the linear RGB values to gamma-corrected sRGB values (according to [23]), Eq. (28) becomes

$\log(F_H + 14) = \dfrac{\log(K_H)}{2.4} + \log(f_H + 14) \qquad (30)$

From Eq. (30), the following equation holds.

$\log(F_R + 14) + \log(F_G + 14) - \beta_1 \log(F_B + 14) = \log(f_R + 14) + \log(f_G + 14) - \beta_1 \log(f_B + 14) \qquad (31)$
where
$\beta_1 = \dfrac{\log(K_R) + \log(K_G)}{\log(K_B)} \qquad (32)$

For a pixel in an image, Eq. (31) represents a shadow invariant: for an arbitrary pixel with RGB value vector $(v_R, v_G, v_B)^T$, the value $\log(v_R + 14) + \log(v_G + 14) - \beta_1 \log(v_B + 14)$ stays the same whether or not the pixel is in a shadow region. A grayscale intrinsic image is thereby obtained; one result is shown in the middle of Fig. 15.
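The invariant of Eqs. (30)–(32) can be checked in a few lines, using the example values $K_H$ = [0.69, 0.57, 0.45] from Section 4.2 (pixel values are 8-bit gamma-corrected sRGB, hence the +14 offset):

```python
import numpy as np

K_R, K_G, K_B = 0.69, 0.57, 0.45
beta1 = (np.log(K_R) + np.log(K_G)) / np.log(K_B)    # Eq. (32)

def grayscale_intrinsic(rgb):
    """rgb: (..., 3) array of gamma-corrected sRGB values in [0, 255]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.log(r + 14.0) + np.log(g + 14.0) - beta1 * np.log(b + 14.0)

# Per Eq. (30), the two pixel values of one surface under the two
# illuminations are related by v'_H + 14 = K_H^(1/2.4) (v_H + 14);
# the invariant is identical for both.
v = np.array([200.0, 180.0, 160.0])                  # example pixel
v_prime = np.array([K_R, K_G, K_B]) ** (1 / 2.4) * (v + 14.0) - 14.0
I_a = float(grayscale_intrinsic(v))
I_b = float(grayscale_intrinsic(v_prime))
```

Applied to every pixel, `grayscale_intrinsic` produces the middle image of Fig. 15; the equality of `I_a` and `I_b` is exactly the cancellation that β₁ is constructed to achieve.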

Fig. 15 Intrinsic image result. Left: original image; middle: grayscale intrinsic image; right: color intrinsic image.

To obtain a color intrinsic image, for an RGB value vector $(v_R, v_G, v_B)^T$, we first define the following grayscale intrinsic value $I_1$ according to Eq. (31).

$I_1 = \log(F_R + 14) + \log(F_G + 14) - \beta_1 \log(F_B + 14) = \log(f_R + 14) + \log(f_G + 14) - \beta_1 \log(f_B + 14) = \log(v_R + 14) + \log(v_G + 14) - \beta_1 \log(v_B + 14) \qquad (33)$

Similar to Eq. (33), we can obtain two other grayscale intrinsic values I2 and I3,

$I_2 = \log(F_R + 14) - \beta_2 \log(F_G + 14) + \log(F_B + 14) = \log(f_R + 14) - \beta_2 \log(f_G + 14) + \log(f_B + 14) = \log(v_R + 14) - \beta_2 \log(v_G + 14) + \log(v_B + 14) \qquad (34)$
$I_3 = -\beta_3 \log(F_R + 14) + \log(F_G + 14) + \log(F_B + 14) = -\beta_3 \log(f_R + 14) + \log(f_G + 14) + \log(f_B + 14) = -\beta_3 \log(v_R + 14) + \log(v_G + 14) + \log(v_B + 14) \qquad (35)$
where
$\beta_2 = \dfrac{\log(K_R) + \log(K_B)}{\log(K_G)}, \qquad \beta_3 = \dfrac{\log(K_G) + \log(K_B)}{\log(K_R)} \qquad (36)$

A color intrinsic image can be obtained from the unique particular solution of the equation set formed by Eqs. (31), (34), and (35) (more details can be found in [31]). One color intrinsic image result is shown on the right of Fig. 15. When deriving intrinsic images, the $K_H$ are the key parameters, and more accurate estimation of $K_H$ produces better intrinsic image results. As shown in Fig. 16, better results are obtained using $K_H$ computed by the SPD calculation method proposed in this paper; in [31], $K_H$ is only approximated from the mean of some real measured SPDs.

Fig. 16 More accurate estimation of $K_H$ produces better intrinsic image results. Left: original images; middle: intrinsic image results using $K_H$ calculated by Eq. (29) and the SPD calculation method proposed in this paper; right: intrinsic image results using the $K_H$ introduced in [31], calculated from the mean of some real measured SPDs.

4.4. Lighting conversion

This application converts an image from one illumination to another. In detail, using our SPD calculation method, we first recover the reflectance from the original image using the method in [32], and then relight the same scene according to Eq. (1) and convert it to the sRGB color space. As shown in Fig. 17, we first convert the illumination from skylight to daylight at a zenith angle of 60 degrees. Both the color and the intensity of the converted image are quite similar to the real captured one to the naked eye. The errors of the conversion are shown in Fig. 18: the mean pixel value error is 9, and the maximum error is 36, which occurs at patch No. 15 in the red channel. The error mainly arises because commercial cameras apply post-processing such as tone mapping and gamut mapping to their images; these factors are not considered when we convert RAW images to sRGB JPG images in our calculation.

Fig. 17 Converting illumination from skylight to daylight. Left: image illuminated by skylight; middle: the image with illumination converted to daylight; right: the real image captured in daylight.

Fig. 18 Errors between the converted image and the real captured one.

Figure 19 shows two more results of image lighting conversion. The first row of Fig. 19 shows lighting conversion from afternoon to noon; the color of the converted image is closer to the scene at noon than the original. The second row shows lighting conversion from sunset to afternoon; both the color and the clarity of the converted image are better than those of the original.

Fig. 19 Two more results of image lighting conversions. Left: original images; Right: converted images.

5. Conclusion

Illumination is one of the key components of imaging. Understanding the spectral distributions of outdoor light and their dynamic changes at different times and under different atmospheric conditions is of interest to computer vision. In this paper, we proposed a simple method for computing the SPDs of sunlight and skylight for a given zenith angle and turbidity. In the computer vision community, researchers usually apply blackbody radiation or the eigenvectors of daylight [16] to approximate real SPDs. Compared with these two kinds of methods, our method has two advantages: first, the SPDs calculated by our method approximate those calculated by meteorological models; second, our method can simultaneously calculate the SPDs of daylight, sunlight, and skylight. These advantages, especially the second one, are important for our applications, which are difficult to accomplish with the blackbody or eigenvector models since they cannot simultaneously calculate the corresponding SPDs of daylight, sunlight, and skylight. It is therefore difficult to employ those models to simultaneously recover the SPDs of daylight and skylight in Section 4.1, estimate $K_H$ in Sections 4.2 and 4.3, or relight the ColorChecker image from skylight to daylight under the same sun angle in Section 4.4.

Our method is much simpler than existing atmospheric methods and shows its advantages in a number of computer vision applications. Unlike other models that need many parameters, such as ozone, nitrogen dioxide, mixed gases, and water vapor, which have little relation to computer vision and are not easily obtained there, our model requires only the two parameters most relevant to physically-based vision: the sun angle (geometry) and the turbidity (weather). It thus establishes a bridge between an image and physical environmental information, e.g., time, location, and weather conditions. Since we focus on computer vision applications rather than atmospheric physics or meteorology, the basic calculations, e.g., the transmittance functions of scattering, are still based on the previous atmospheric-science literature.
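As a minimal illustration of the model's two-parameter form, the sketch below composes the total absorption transmittance T_λ = exp(−τ(λ)m) with the Rayleigh and aerosol scattering terms to obtain a direct-beam transmittance and SPD. The function names are ours, the absorption coefficients a_oλ, a_nλ, a_wλ are assumed to be supplied by the caller from Table 4, and the default Ångström exponent α = 1.3 is an assumed typical value, not a quantity fixed by the paper.

```python
import numpy as np

def direct_transmittance(wl_nm, theta, beta, a_o, a_n, a_w, alpha=1.3):
    """Direct-beam spectral transmittance of the two-parameter model.

    wl_nm : wavelengths in nm (array).
    theta : solar zenith angle in radians.
    beta  : Angstrom turbidity coefficient.
    a_o, a_n, a_w : ozone, NO2, and water-vapor absorption coefficients
        at wl_nm (tabulated in Table 4 of the paper; supplied by the caller).
    alpha : Angstrom wavelength exponent (assumed value, typically ~1.3).
    """
    m = 1.0 / np.cos(theta)                       # air mass, m = sec(theta)
    # total absorption: tau(lambda) = 0.35 a_o + 0.0016 a_n + 0.055 a_w^0.56
    tau = 0.35 * np.asarray(a_o) + 0.0016 * np.asarray(a_n) \
        + 0.055 * np.asarray(a_w) ** 0.56
    T_abs = np.exp(-tau * m)                      # T_lambda = exp(-tau(lambda) m)
    wl_um = np.asarray(wl_nm) / 1000.0            # scattering terms use micrometers
    T_r = np.exp(-0.008735 * wl_um ** -4.08 * m)  # Rayleigh scattering transmittance
    T_a = np.exp(-beta * wl_um ** -alpha * m)     # aerosol (Angstrom) transmittance
    return T_abs * T_r * T_a

def direct_spd(E0, wl_nm, theta, beta, a_o, a_n, a_w):
    """Ground-level direct-beam SPD from the extraterrestrial SPD E0."""
    return np.asarray(E0) * direct_transmittance(wl_nm, theta, beta, a_o, a_n, a_w)
```

With absorption and turbidity switched off, only Rayleigh scattering remains, so the transmittance grows with wavelength, reproducing the reddening of low-sun light.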

Appendix


Table 4. Our proposed total absorption coefficient.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61473280 and 61102116.

References and links

1. N. Jacobs, N. Roman, and R. Pless, “Toward fully automatic geo-location and geo-orientation of static outdoor cameras,” in IEEE Workshop on Applications of Computer Vision, (IEEE, 2008), pp. 1–6.

2. J.-F. Lalonde, S. G. Narasimhan, and A. A. Efros, “What does the sky tell us about the camera?” in Proc. ECCV, (Springer, 2008), pp. 354–367.

3. J.-F. Lalonde, S. G. Narasimhan, and A. A. Efros, “What do the sun and the sky tell us about the camera?” Int. J. Comput. Vis. 88, 24–51 (2010). [CrossRef]  

4. Y. Liu, X. Qin, S. Xu, E. Nakamae, and Q. Peng, “Light source estimation of outdoor scenes for mixed reality,” The Visual Computer 25, 637–646 (2009). [CrossRef]  

5. K. Sunkavalli, F. Romeiro, W. Matusik, T. Zickler, and H. Pfister, “What do color changes reveal about an outdoor scene?” in Proc. CVPR (IEEE, 2008), pp. 1–8.

6. J. Haber, M. Magnor, and H.-P. Seidel, “Physically-based simulation of twilight phenomena,” ACM Trans. Graph. 24, 1353–1373 (2005). [CrossRef]  

7. R. Perez, R. Seals, and J. Michalsky, “All-weather model for sky luminance distribution—preliminary configuration and validation,” Sol. Energy 50, 235–245 (1993). [CrossRef]  

8. K.-J. Yoon, E. Prados, and P. Sturm, “Joint estimation of shape and reflectance using multiple images with known illumination conditions,” Int. J. Comput. Vis. 86, 192–210 (2010). [CrossRef]  

9. D. Wu, J. Tian, B. Li, Y. Wang, and Y. Tang, “Recovering sensor spectral sensitivity from raw data,” J. Electron. Imaging 22, 023032 (2013). [CrossRef]  

10. G. D. Finlayson and S. D. Hordley, “Color constancy at a pixel,” J. Opt. Soc. Am. A 18, 253–264 (2001). [CrossRef]  

11. X. Xing, W. Dong, X. Zhang, and J.-C. Paul, “Spectrally-based single image relighting,” in Entertainment for Education. Digital Techniques and Systems (Springer, 2010), pp. 509–517. [CrossRef]  

12. T. Gierlinger, D. Danch, and A. Stork, “Rendering techniques for mixed reality,” J. Real-Time Image Processing 5, 109–120 (2010). [CrossRef]  

13. J. Wither, S. DiVerdi, and T. Höllerer, “Annotation in outdoor augmented reality,” Comp. Graphics 33, 679–689 (2009). [CrossRef]  

14. G. D. Finlayson, M. S. Drew, and C. Lu, “Entropy minimization for shadow removal,” Int. J. Comput. Vis. 85, 35–57 (2009). [CrossRef]  

15. J. Tian, J. Sun, and Y. Tang, “Tricolor attenuation model for shadow detection,” IEEE Trans. Image Process. 18, 2355–2363 (2009). [CrossRef]   [PubMed]  

16. D. B. Judd, D. L. MacAdam, G. Wyszecki, H. Budde, H. Condit, S. Henderson, and J. Simonds, “Spectral distribution of typical daylight as a function of correlated color temperature,” J. Opt. Soc. Am. 54, 1031–1040 (1964). [CrossRef]  

17. A. J. Preetham, P. Shirley, and B. Smits, “A practical analytic model for daylight,” in Proceedings of the 26th annual conference on Computer graphics and interactive techniques, (ACM Press/Addison-Wesley Publishing Co., 1999), pp. 91–100.

18. J. Jung, J. Lee, and I. Kweon, “One-day outdoor photometric stereo via skylight estimation,” in Proc. CVPR (IEEE, 2015), pp. 4521–4529.

19. R. Kawakami, H. Zhao, R. T. Tan, and K. Ikeuchi, “Camera spectral sensitivity and white balance estimation from sky images,” Int. J. Comput. Vis. 105, 187–204 (2013). [CrossRef]  

20. C. Gueymard, SMARTS2: A Simple Model of the Atmospheric Radiative Transfer of Sunshine: Algorithms and Performance Assessment (Florida Solar Energy Center Cocoa, FL, 1995).

21. A. Berk, L. S. Bernstein, and D. C. Robertson, MODTRAN: A Moderate Resolution Model for LOWTRAN, Tech. Rep. (DTIC Document, 1987).

22. R. E. Bird and C. Riordan, “Simple solar spectral model for direct and diffuse irradiance on horizontal and tilted planes at the earth’s surface for cloudless atmospheres,” J. Climate Appl. Meteor. 25, 87–97 (1986). [CrossRef]  

23. International Electrotechnical Commission, Multimedia systems and equipment - Colour measurement and management - Part 2-1: Colour management - Default RGB colour space - sRGB, Tech. Rep. IEC 61966-2-1 (1999).

24. B. Leckner, “The spectral distribution of solar radiation at the earth’s surface—elements of a model,” Sol. Energy 20, 143–150 (1978). [CrossRef]  

25. R. Schroeder and J. Davies, “Significance of nitrogen dioxide absorption in estimating aerosol optical depth and size distributions,” Atmosphere-Ocean 25, 107–114 (1987). [CrossRef]  

26. J. H. Pierluissi and C.-M. Tsai, “New lowtran models for the uniformly mixed gases,” Appl. Opt. 26, 616–618 (1987). [CrossRef]  

27. L. Zhou, P. Guo, and Y. Tan, “A new way to study water-vapor absorption coefficient,” Marine Science Bulletin 7 (2005).

28. M. Iqbal, An Introduction to Solar Radiation (Elsevier, 2012).

29. J. Jiang, D. Liu, J. Gu, and S. Susstrunk, “What is the space of spectral sensitivity functions for digital color cameras?,” in IEEE Workshop on Applications of Computer Vision (IEEE, 2013), pp. 168–179.

30. J. Tian and Y. Tang, “Linearity of each channel pixel values from a surface in and out of shadows and its applications,” in Proc. CVPR (IEEE, 2011), pp. 985–992.

31. L. Qu, J. Tian, Z. Han, and Y. Tang, “Pixel-wise orthogonal decomposition for color illumination invariant and shadow-free image,” Opt. Express 23, 2220–2239 (2015). [CrossRef]   [PubMed]  

32. J. Tian and Y. Tang, “Wavelength-sensitive-function controlled reflectance reconstruction,” Opt. Lett. 38, 2818–2820 (2013). [CrossRef]   [PubMed]  



Figures (19)

Fig. 1 Outdoor appearances vary with time and air conditions.
Fig. 2 Illustration of the image formation procedure.
Fig. 3 The new attenuation and transmittance expressions of water vapor and uniformly mixed gases produce results as good as the original expressions.
Fig. 4 Total absorption coefficient.
Fig. 5 The SPD of extraterrestrial irradiance reported by the World Radiation Center and that after absorption at an air mass of 2.
Fig. 6 Schematic diagram of the solar radiation onto the Earth's surface.
Fig. 7 Comparison of the SPDs calculated by our method with those by Bird's method [22] and the SMARTS2 method [20] at different zenith angles for β = 0.1. From top to bottom, the zenith angle equals 20, 30, …, 80 degrees. The x-coordinate denotes wavelength (nm) and the y-coordinate denotes irradiance power (W·m⁻²·nm⁻¹). The 2nd column shows simulated X-Rite ColorChecker appearances under sunlight and the 4th column shows those under skylight. Upper: ours; middle: Bird's; lower: SMARTS2. The last column shows pixel value differences (percent). A: direct sunlight compared with Bird's method. B: direct sunlight compared with the SMARTS2 method. C: diffuse skylight compared with Bird's method. D: diffuse skylight compared with the SMARTS2 method.
Fig. 8 Comparison of the SPDs calculated by our method with those by Bird's method [22] and the SMARTS2 method [20] under different turbidities. From top to bottom: β = 0, β = 0.2, and β = 0.3.
Fig. 9 Using blackbody radiation to approximate the true extraterrestrial data. The right image is a close-up view of the left one.
Fig. 10 CIE chromaticities of sunlight, skylight, and daylight compared with the Planckian locus formed by color temperatures from 1000 K to 500000 K.
Fig. 11 Comparison with the Preetham model. Left: at a 30-degree zenith angle; right: at a 60-degree zenith angle.
Fig. 12 A 24-patch ColorChecker image captured by a Canon 5D Mark II camera under skylight and its corresponding spectral reflectance.
Fig. 13 The recovered skylight spectrum and the recovered CSS of the Canon 5D Mark II. In the second figure, M and E abbreviate “Measured” and “Estimated”, respectively.
Fig. 14 Our SPD calculation method can predict K_H in shadowed images.
Fig. 15 Intrinsic image result. Left: original image; middle: grayscale intrinsic image; right: color intrinsic image.
Fig. 16 A more accurate estimate of K_H produces better intrinsic image results. Left: original images; middle: intrinsic image results using K_H calculated by Eq. (29) and the SPD calculation method proposed in this paper; right: intrinsic image results using the K_H introduced in [31], calculated from the mean value of some real measured SPDs.
Fig. 17 Converting illumination from skylight to daylight. Left: image illuminated by skylight; middle: the image with illumination converted to daylight; right: the real image captured in daylight.
Fig. 18 Errors between the converted image and the real captured one.
Fig. 19 Two more results of image lighting conversion. Left: original images; right: converted images.

Tables (4)

Table 1 Proportion of the scattered light that can reach the ground at different zenith angles.
Table 2 Comparison of the formula expressions of our method with those of SMARTS2 and Bird's method.
Table 3 The calculated K_H at different sun angles and under different turbidities.
Table 4 Our proposed total absorption coefficient.

Equations (36)

F_H = \int_{400}^{700} E(\lambda) S(\lambda) Q_H(\lambda)\,d\lambda
E_{in} = E_{o\lambda} T_{o\lambda} T_{N\lambda} T_{w\lambda} T_{u\lambda}
T_{o\lambda} = \exp(-0.35\,a_{o\lambda} m) \quad (refer to [24])
T_{N\lambda} = \exp(-0.0016\,a_{n\lambda} m) \quad (refer to [20, 25])
T_{w\lambda} = \exp[-0.2385\,a_{w\lambda} W m / (1 + 20.07\,a_{w\lambda} W m)^{0.45}]
T_{u\lambda} = \exp[-1.41\,a_{u\lambda} m / (1 + 118.93\,a_{u\lambda} m)^{0.45}]
m = \sec(\theta)
T_{w\lambda}(a_{w\lambda} + \varepsilon a_{u\lambda} \mid m) = T_{w\lambda}(a_{w\lambda} \mid m)\, T_{u\lambda}(a_{u\lambda} \mid m)
\varepsilon = \arg\min_{\varepsilon} \sum_m \left\| T_{w\lambda} T_{u\lambda} - T_{w\lambda}(a_{w\lambda} + \varepsilon a_{u\lambda}) \right\|^2
T_{w\lambda}(a_{w\lambda} + 4.3\,a_{u\lambda}) \approx T_{w\lambda} T_{u\lambda}
T_{w\lambda} = \exp[-0.2385\,a_{w\lambda} W m / (1 + 20.07\,a_{w\lambda} W m)^{0.45}]
T_{w\lambda}^{new} = \exp[-p\,(a_{w\lambda})^{q}\, m]
p, q = \arg\min_{p,q} \sum_m \left\| -p\,(a_{w\lambda})^{q} m - \log(T_{w\lambda}) \right\|^2
T_{w\lambda}^{new} = \exp[-0.055\,(a_{w\lambda})^{0.56} m]
E_{in} = E_{o\lambda} T_{\lambda}
T_{\lambda} = \exp(-\tau(\lambda) m)
\tau(\lambda) = 0.35\,a_{o\lambda} + 0.0016\,a_{n\lambda} + 0.055\,a_{w\lambda}^{0.56}
E_{d\lambda} = E_{in} T_{r\lambda} T_{a\lambda}
T_{r\lambda} = \exp(-0.008735\,\lambda^{-4.08} m)
T_{a\lambda} = \exp(-\beta \lambda^{-\alpha} m)
E_{s\lambda} = E_{in} \cos(\theta)\,(1 - T_{r\lambda} T_{a\lambda})\,\kappa
F_H = \int_{400}^{700} E_{s\lambda} S(\lambda) Q_H(\lambda)\,d\lambda
F_H = \int_{400}^{700} E_{s\lambda} S(\lambda) Q_H(\lambda)\,d\lambda
\Delta E_{Lab} = \sqrt{\Delta L^2 + \Delta a^2 + \Delta b^2}
\kappa = \arg\min \Delta E_{Lab}
F_H = \int_{400}^{700} E(\theta, T, \lambda) S(\lambda) Q_H(\lambda)\,d\lambda, \quad H = R, G, B
\sum_{n=1}^{24} \left\| F_H^n - \sum_{\lambda} E(\theta, T) S^n [\sigma_H C_H] \right\|, \quad H = R, G, B
\frac{F_H}{f_H} = \frac{\int_{400}^{700} E_{day}(\theta, T, \lambda) S(\lambda) Q_H(\lambda)\,d\lambda}{\int_{400}^{700} E_{sky}(\theta, T, \lambda) S(\lambda) Q_H(\lambda)\,d\lambda} = K_H
E_{day}(\theta, T, \lambda) = E_{sun}(\theta, T, \lambda) \cos\theta + E_{sky}(\theta, T, \lambda)
\log(F_H + 14) = \frac{\log(K_H)}{2.4} + \log(f_H + 14)
\log(F_R + 14) + \log(F_G + 14) - \beta_1 \log(F_B + 14) = \log(f_R + 14) + \log(f_G + 14) - \beta_1 \log(f_B + 14)
\beta_1 = \frac{\log(K_R) + \log(K_G)}{\log(K_B)}
I_1 = \log(F_R + 14) + \log(F_G + 14) - \beta_1 \log(F_B + 14) = \log(f_R + 14) + \log(f_G + 14) - \beta_1 \log(f_B + 14) = \log(v_R + 14) + \log(v_G + 14) - \beta_1 \log(v_B + 14)
I_2 = \log(F_R + 14) - \beta_2 \log(F_G + 14) + \log(F_B + 14) = \log(f_R + 14) - \beta_2 \log(f_G + 14) + \log(f_B + 14) = \log(v_R + 14) - \beta_2 \log(v_G + 14) + \log(v_B + 14)
I_3 = -\beta_3 \log(F_R + 14) + \log(F_G + 14) + \log(F_B + 14) = -\beta_3 \log(f_R + 14) + \log(f_G + 14) + \log(f_B + 14) = -\beta_3 \log(v_R + 14) + \log(v_G + 14) + \log(v_B + 14)
\beta_2 = \frac{\log(K_R) + \log(K_B)}{\log(K_G)}, \quad \beta_3 = \frac{\log(K_G) + \log(K_B)}{\log(K_R)}
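To make the shadow-invariant relations concrete, here is a small sketch (our illustration, not the authors' released code) of the I_1 invariant: assuming the ratios K_H (daylight over skylight) are greater than 1, I_1 computed on a shadowed pixel f and on the sunlit pixel F implied by log(F_H + 14) = log(K_H)/2.4 + log(f_H + 14) coincide. The function names `beta1`, `I1`, and `lit_from_shadow` are ours.

```python
import math

def beta1(K_R, K_G, K_B):
    # beta_1 = (log K_R + log K_G) / log K_B; requires K_B != 1
    return (math.log(K_R) + math.log(K_G)) / math.log(K_B)

def I1(rgb, b1):
    # I_1 = log(R + 14) + log(G + 14) - beta_1 * log(B + 14)
    R, G, B = rgb
    return math.log(R + 14) + math.log(G + 14) - b1 * math.log(B + 14)

def lit_from_shadow(rgb_shadow, K):
    # Shadow-to-sunlit mapping implied by log(F_H + 14) = log(K_H)/2.4 + log(f_H + 14),
    # i.e. F_H + 14 = K_H^(1/2.4) (f_H + 14)
    return tuple(K[i] ** (1 / 2.4) * (rgb_shadow[i] + 14) - 14 for i in range(3))
```

Because beta_1 cancels the log(K_H)/2.4 offsets exactly, I_1 is the same inside and outside the shadow, which is what makes it usable for shadow-free intrinsic images.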