Illuminant direction estimation is an important research issue in the field of image processing. Because texture information can be obtained from a single image at low cost, it is worthwhile to estimate the illuminant direction from scene texture. This paper proposes a novel computational method to estimate the illuminant direction on both color outdoor images and the extended Yale face database B. First, the luminance component is separated from the resized YCbCr image, and its edges are detected with the Canny edge detector. Then, we divide the binary edge image into 16 local regions and calculate the edge level percentage in each of them. Afterward, we use the edge level percentage to analyze the complexity of each local region of the luminance component. Finally, according to the error function between the measured and calculated intensities, and the constraint function of an infinite-light-source model, we calculate the illuminant directions of three local regions of the luminance component, selected for their lower complexity and larger average gray value, and synthesize them into the final illuminant direction. Unlike previous works, the proposed method requires neither the entire image nor textures included in a training set. Experimental results show that the proposed method outperforms existing methods in both correct rate and execution time.
© 2014 Optical Society of America
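The region-selection step described in the abstract (divide the binary edge image into 16 local regions, compute each region's edge level percentage, and prefer regions with lower complexity and larger average gray value) can be sketched as follows. This is a minimal illustration, not the authors' implementation: a simple Sobel-magnitude threshold stands in for the Canny detector, and the function names, the 4×4 grid layout, and the tie-breaking rule are assumptions.

```python
import numpy as np

def edge_level_percentages(gray, grid=4, thresh=0.2):
    """Split a grayscale image into grid x grid local regions and return the
    fraction of edge pixels in each region. A Sobel-magnitude threshold is
    used here as a stand-in for the Canny detector from the paper."""
    g = gray.astype(float)
    # Central-difference gradients on interior pixels.
    gx = g[1:-1, 2:] - g[1:-1, :-2]
    gy = g[2:, 1:-1] - g[:-2, 1:-1]
    mag = np.hypot(gx, gy)
    edges = mag > thresh * (mag.max() + 1e-12)   # binary edge map
    h, w = edges.shape
    rh, rw = h // grid, w // grid
    pct = np.empty((grid, grid))
    for i in range(grid):
        for j in range(grid):
            block = edges[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw]
            pct[i, j] = block.mean()             # edge level percentage
    return pct

def pick_regions(gray, pct, k=3):
    """Pick the k regions with the lowest edge percentage (lowest complexity),
    breaking ties by larger mean gray value. The exact ranking rule is an
    assumption based on the criteria stated in the abstract."""
    grid = pct.shape[0]
    h, w = gray.shape
    rh, rw = h // grid, w // grid
    means = np.array([[gray[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw].mean()
                       for j in range(grid)] for i in range(grid)])
    ranked = sorted(((pct[i, j], -means[i, j], (i, j))
                     for i in range(grid) for j in range(grid)))
    return [idx for _, _, idx in ranked[:k]]
```

On a synthetic image whose bottom-right quadrant is smooth and bright while the rest is textured, the smooth bright regions rank first, matching the low-complexity / high-brightness criterion.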