
High-dynamic imaging system for real-time detection of illuminance by vehicle head lamps

Open Access

Abstract

In this paper, we propose an imaging system for quick detection of high-dynamic illumination with a high-contrast cut-off line. A digital camera is used to capture the image of the light pattern from a vehicle headlamp. After image processing and calibration, we can accurately obtain the illuminance on the measurement plane, while different exposure times are used to increase the dynamic range of the measurement. Two representative regulations are discussed: the K-mark and ECE R113 Class B. Results show that the measurement errors of our imaging system are within ±5% when the measured values are more than 2 lx, and within a difference of 0.1 lx when the values are less than 2 lx.

© 2021 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Various applications have emerged as lighting technology has matured, including indoor and outdoor lighting [1–6]. One of the most important outdoor applications is roadway illumination, such as street lights and automotive headlamps [7–13]. With increasing public attention to environmental and health issues [14], bicycle sales have also increased year by year, so the design of bicycle headlamps is highly valued. A well-designed headlamp provides comfortable lighting performance, while a poorly designed one causes safety issues. To increase road safety, countries around the world have formulated rigorous regulations. For bicycles, one representative regulation is the K-mark; for e-bikes, it is ECE Class A, B, or C [15], and ECE has stricter requirements on the cut-off line than the K-mark, especially Class B [12,13]. A detection system is necessary to check whether a headlamp meets these regulations. The most common approach is to measure the headlamp with a goniophotometer in a dark room [16,17]: the headlamp is placed on a rotatable electric stage, and a photometer located 25 meters away measures the luminous intensity at various angles. A goniophotometer system provides high accuracy but requires a large space and a long measurement time, so methods based on image analysis have been developed to optimize the measurement [17–22]. In this paper, we propose a high-dynamic imaging system for real-time detection of illuminance. The main equipment of the system is a digital camera and a standardized test site. Compared to a monochrome camera, a color camera costs less and is more common, and it might become a basic function of a cell phone in the future. After the light pattern from the vehicle headlamp is captured by the camera, the system accurately judges within a short time, after a series of image processing and calibration steps, whether this light pattern meets the regulation.

2. Image processing and calibration

When an image sensor is exposed, it detects only the total flux received by each pixel, without wavelength information, which is why an image sensor alone cannot distinguish colors. Many methods have been proposed to enable image sensors to capture color information; the best known is the Bayer array, proposed by Bryce Bayer in 1976 [23]. A color filter array is added above the image sensor, and the number of green pixels in this structure is twice the number of red or blue pixels, to mimic the physiological response of the human eye, which is most sensitive to green light under photopic vision [24]. In this paper, a Canon 650D camera is used as the image sensor of our system. The most common image file generated by a camera is the JPG file, a standard lossy compression format widely used to process images; the method was proposed by the Joint Photographic Experts Group (JPEG) in 1992 and certified by the International Organization for Standardization (ISO) in 1994 [25]. However, JPG files lose much information after compression, so to avoid any distortion, the raw image file is used in our research. Figure 1 shows the image processing flow after obtaining the raw image file. The first step is normalization, which linearly maps values between the full-dark value and the saturation value to the range 0 to 1, avoiding values lower than the full-dark value or higher than the saturation value. The second step is white balance correction [26]: when the image sensor receives the light reflected from a white object under different light sources, the red, green, and blue values differ, so the red, green, and blue pixels in the raw image file are multiplied by their respective gains to make the values of the white object equal within the same photo. The third step is demosaicing [27]: after white balance correction, the single-channel intensity image is used to reconstruct the three-channel color image through interpolation algorithms; MATLAB's built-in demosaicing function is used to rebuild the color image. The final step is color space conversion, in which the camera color space is converted to CIE 1931 XYZ; the formula can be expressed as

$$\left[ \begin{array}{c} R\\ G\\ B \end{array} \right]_{XYZ} = [A_{Cam \to XYZ}] \left[ \begin{array}{c} R\\ G\\ B \end{array} \right]_{Camera},$$
where matrix A is the conversion matrix from the camera color space to the CIE XYZ color space. A conversion matrix from the CIE XYZ color space to the camera color space can be found in the EXIF file of the Canon 650D [28] and is expressed as
$$[B_{XYZ \to Cam}] = \left[ \begin{array}{ccc} 0.6602 & -0.0841 & -0.0939\\ -0.4472 & 1.2458 & 0.2247\\ -0.0975 & 0.2039 & 0.6148 \end{array} \right],$$
which is the inverse of matrix A, so the formula can be rewritten as
$$\left[ \begin{array}{c} R\\ G\\ B \end{array} \right]_{XYZ} = [B_{XYZ \to Cam}]^{-1} \left[ \begin{array}{c} R\\ G\\ B \end{array} \right]_{Camera},$$
from which we finally obtain the CIE 1931 XYZ values. The Y value is adopted as the brightness signal of the image file and corresponds to the grayscale.
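As a minimal sketch of this final step, assuming the camera RGB values have already been normalized, white balanced, and demosaiced (only the matrix entries come from the text; the function name is ours):

```python
import numpy as np

# B: conversion matrix from CIE XYZ to the camera color space,
# quoted from the Canon 650D EXIF data in the text (second equation).
B_xyz_to_cam = np.array([
    [ 0.6602, -0.0841, -0.0939],
    [-0.4472,  1.2458,  0.2247],
    [-0.0975,  0.2039,  0.6148],
])

# A (camera -> XYZ) is the inverse of B, as in the third equation.
A_cam_to_xyz = np.linalg.inv(B_xyz_to_cam)

def camera_rgb_to_xyz(rgb):
    """Map an H x W x 3 camera-RGB image (already normalized,
    white balanced, and demosaiced) to CIE 1931 XYZ. The Y
    channel (index 1) is then used as the grayscale signal."""
    return rgb @ A_cam_to_xyz.T

xyz = camera_rgb_to_xyz(np.full((4, 4, 3), 0.5))
gray = xyz[..., 1]  # luminance channel Y
```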

Fig. 1. The flow chart of image processing.

The calibration of the measurement environment is also important. As shown in Fig. 2, a projection screen is set 10 meters away from the camera, onto which the light pattern of the headlamp can be clearly projected. One of the essential calibrations is to compensate for the off-axis response of the camera. Due to vignetting and the cosine-fourth law [24], the intensity received by the image sensor decreases from the center to the edge, which causes uneven brightness in the raw image file. Therefore, 69 points were placed on the projection screen, corresponding to 8 directions radiating outward from the center of the optical axis of the camera, with an off-axis angle of 1 degree between adjacent points. By projecting a spotlight pattern with the same central illuminance at each of the 69 points, as shown in Fig. 3(a), we obtain the off-axis response of the camera, shown in Fig. 3(b), and complete the compensation.
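A minimal sketch of how this compensation could be applied, assuming the 69 measured relative responses are interpolated into a full-frame correction map (the interpolation method and function names are our own, not specified in the paper):

```python
import numpy as np
from scipy.interpolate import griddata

def build_correction_map(points_xy, responses, shape):
    """Interpolate the relative off-axis responses measured at the
    69 calibration points into a full-frame map (center response
    normalized to 1). points_xy: N x 2 pixel coordinates;
    responses: N relative response values."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    return griddata(points_xy, responses, (xx, yy),
                    method='cubic', fill_value=1.0)

def compensate_off_axis(gray_image, correction_map):
    # Dividing by the relative response flattens the vignetting
    # and cosine-fourth falloff from frame center to edge.
    return gray_image / correction_map
```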

Fig. 2. The measurement environment, including a camera, a stage for placing the headlamp, and a projection screen.

Fig. 3. (a) 69 points on the projection screen, corresponding to 8 directions radiating outward from the center of the optical axis of the camera; (b) the off-axis response of the camera.

A power-tunable spotlight is needed to produce different illuminances on the projection screen by changing the driving current; the CCT of the light source is around 6500 K, corresponding to the common CCT of a white-light LED headlamp. Grayscales obtained by the camera correspond to illuminance results measured by an illuminance meter, so the conversion factor between grayscale and illuminance can then be calculated. Another problem of the image sensor is noise, which cannot be ignored when the grayscale is small. To suppress it, multiple photos were taken and averaged so that the result is more accurate. As shown in Fig. 4, we averaged 1, 5, 10, 15, 20, and 30 photos, respectively; with 20 or 30 photos, the overall trend no longer changes. Figure 5 shows the fitting equations; the results show that the second-order fitting is much more accurate than the first-order fitting, especially at low illuminance. With almost all the parameters determined, one more issue needs to be clarified. When the exposure time is fixed, the grayscale of the camera saturates as the illuminance keeps increasing, which defines the dynamic range of the system. Since few bicycle headlamps on the market can reach an illuminance of 300 lx, we took this condition as the standard for finding the exposure time for higher illuminance. As shown in Fig. 6, at an exposure time of 0.25 seconds, the grayscale captured by the camera is far from saturation.
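The averaging and fitting steps might look like the following sketch; the calibration sample values here are hypothetical placeholders, and the paper does not specify which fitting routine it uses:

```python
import numpy as np

def average_frames(frames):
    """Average repeated captures of the same scene to suppress
    sensor noise; the paper finds the trend stabilizes once about
    20 to 30 frames are averaged."""
    return np.mean(np.stack(frames), axis=0)

# Second-order fit of illuminance against grayscale, which the paper
# finds more accurate than a first-order fit at low illuminance.
# These calibration samples are hypothetical placeholders.
gray_samples = np.array([500.0, 1500.0, 3000.0, 7000.0, 12000.0])
lux_samples = np.array([3.0, 10.0, 21.0, 50.0, 90.0])
coeffs = np.polyfit(gray_samples, lux_samples, deg=2)

def gray_to_lux(gray):
    """Convert grayscale to illuminance (lx) with the fitted curve."""
    return np.polyval(coeffs, gray)
```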

Fig. 4. The relation between grayscale and illuminance: (a) by averaging one picture, (b) by averaging five pictures, (c) by averaging ten pictures, (d) by averaging fifteen pictures, (e) by averaging twenty pictures, (f) by averaging thirty pictures.

Fig. 5. (a) The result by the first-order fitting function, (b) the error of the first-order fitting result. (c) The result by the second-order fitting function, (d) the error of the second-order fitting result.

Fig. 6. Higher exposure time leads to grayscale saturation.

The conversion curves between grayscale and illuminance for exposure times of 1 second and 0.25 seconds partially overlap. Therefore, for the overlapping part, we have to decide which exposure time's conversion factor should be used to obtain the accurate illuminance value. The input power of the spotlight is changed to generate different illuminances on the projection screen, which are measured by an illuminance meter; the measurement results are then compared with the illuminance calculated by the imaging system at the two exposure times, as shown in Fig. 7. In Fig. 7(a), the conversion factor for the 1 second exposure has higher accuracy at low illuminance, while that for 0.25 seconds has higher accuracy at high illuminance. In Fig. 7(b), the differences in the overlapping part between the two conversion factors are nearly negligible, except below 20 lx; but since the 0.25 second conversion factor is mainly used for high illuminance, the accuracy at low illuminance depends only on the 1 second conversion factor. Based on the above results, 50 lx is set as the reference illuminance for deciding which conversion factor to use, so a threshold must be set in the program of our imaging system. As shown in Fig. 8, for the 1 second conversion factor we set the threshold at a grayscale of 7000, which corresponds to an illuminance of 50.1 lx: when the grayscale captured by the system is lower than 7000, the value is kept and converted into illuminance, while the part with grayscale higher than 7000 is removed. For the 0.25 second conversion factor we set the threshold at a grayscale of 3000, which corresponds to an illuminance of 49.7 lx: when the grayscale captured by the system is higher than 3000, the value is kept and converted into illuminance, while the part with grayscale lower than 3000 is removed. For the overlapping part between the two conversion factors, the system takes the average as the value.
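A sketch of the threshold-and-merge logic described above, using the grayscale thresholds of 7000 (1 second) and 3000 (0.25 seconds) from the text; the function names and mask structure are our own:

```python
import numpy as np

# Grayscale thresholds from the text: 7000 counts at 1 s exposure
# (about 50.1 lx) and 3000 counts at 0.25 s (about 49.7 lx).
T_LONG, T_SHORT = 7000, 3000

def merge_exposures(gray_long, gray_short, lux_long_fn, lux_short_fn):
    """Merge the 1 s and 0.25 s captures into one illuminance map.
    lux_long_fn / lux_short_fn are the fitted grayscale-to-lux
    conversions for each exposure (hypothetical callables)."""
    lux_long = lux_long_fn(gray_long)
    lux_short = lux_short_fn(gray_short)

    keep_long = gray_long <= T_LONG     # long exposure not saturated
    keep_short = gray_short >= T_SHORT  # short exposure above noise

    out = np.where(keep_long, lux_long, lux_short)
    # Overlapping region, where both exposures are valid:
    # average the two conversions, as in the paper.
    both = keep_long & keep_short
    return np.where(both, 0.5 * (lux_long + lux_short), out)
```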

Fig. 7. (a) The illuminance difference between two different exposure times, (b) the illuminance error between two different exposure times.

Fig. 8. The threshold illuminance is determined to increase the dynamic range of the system.

3. Experiment and verification

The experiment used a headlamp designed to meet both K-mark and ECE Class B [5]. Figure 9 shows an analysis result with a contour map generated by the imaging system; the illuminance distribution can be easily seen in the figure, and the measurement points of the regulation are also marked. The measurement values for K-mark are shown in Fig. 10(a), and those for ECE Class B in Fig. 10(b). In Fig. 10, the left value is measured by an illuminance meter, the right value is measured by the imaging system, and the discriminant value is given in parentheses. When the measured value is greater than 2 lx, the discriminant value is expressed as an error percentage; when the measured value is less than 2 lx, the discriminant value is expressed as a value difference instead, because a small variation in illuminance below this range leads to a huge fluctuation in the error percentage, which would make the analysis unfair compared to the high-illuminance range. In the verification results, the difference between the illuminance meter and the imaging system is within ±0.1 lx below 2 lx, and the error percentage is within ±5% above 2 lx.
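The discriminant reported in Fig. 10 can be written as a small helper (our naming; the 2 lx split comes from the paper):

```python
def discriminant(meter_lux, system_lux):
    """Discriminant as reported in Fig. 10: percentage error above
    2 lx, absolute difference in lx below 2 lx."""
    if meter_lux > 2.0:
        return 100.0 * (system_lux - meter_lux) / meter_lux  # percent
    return system_lux - meter_lux  # lx
```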

Fig. 9. Illumination figures generated by the imaging system after capturing the image of the headlamp light pattern: (a) low beam for K-mark, (b) low beam with high beam for K-mark, (c) low beam for ECE Class B, (d) low beam with high beam for ECE Class B.

Fig. 10. Values on the left are measured by an illuminance meter and values on the right are measured by the imaging system: (a) K-mark regulation points with high beam and low beam, (b) ECE Class B regulation points with high beam and low beam.

4. Conclusion

In this paper, a high-dynamic imaging system is proposed. The raw image file generated by a digital camera is successfully restored from a Bayer array to a color image through a series of image processing steps. The off-axis response of the camera is corrected, and the dynamic range measured by the system is expanded through different exposure times. Finally, the illuminance on the target surface can be obtained by taking just a few images. In the experiment, a headlamp designed to meet both K-mark and ECE is used, and the imaging system determines whether the headlamp meets these regulations. The ECE regulation is divided into several classes, of which Class B is the most stringent: it places higher requirements on both the flatness and the sharpness of the cut-off line than the other classes, so this article chooses Class B as the requirement. The experimental results show that the imaging system can accurately determine whether the headlamp meets the regulations, and a contour map with regulatory marks and the measurement data are generated simultaneously. The results show that the measurement errors of the proposed imaging system under high-dynamic illumination are within ±5% when the measured illuminance is more than 2 lx, and within a difference of 0.1 lx when it is less than 2 lx.

Funding

Ministry of Science and Technology, Taiwan (109-2221-E-008-087-MY2, 109-2622-E-008-014-CC2, 109-2622-E-008-026).

Acknowledgments

The authors would like to thank Breault Research Organization (BRO), Inc. for sponsoring the ASAP software program.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. P. Schlotter, R. Schmidt, and J. Schneider, “Luminescence conversion of blue light emitting diodes,” Appl. Phys. A 64(4), 417–418 (1997).

2. S. Nakamura, S. Pearton, and G. Fasol, The Blue Laser Diode: The Complete Story (Springer-Verlag, 2000).

3. A. Zukauskas, M. S. Shur, and R. Gaska, Introduction to Solid-State Lighting (John Wiley & Sons, 2002).

4. E. F. Schubert and J. K. Kim, “Solid-state light sources getting smart,” Science 308(5726), 1274–1278 (2005).

5. L. Venema, “The art of illumination,” Nature 450(7173), 1175 (2007).

6. R. Karlicek, C. C. Sun, G. Zissis, and R. Ma, Handbook of Advanced Lighting Technology (Springer, 2017).

7. J. F. V. Derlofske and M. McColgan, “White LED sources for vehicle forward lighting,” Proc. SPIE 4776, 195–205 (2002).

8. P. Brick and T. Schmid, “Automotive headlamp concepts with low-beam and high-beam out of a single LED,” Proc. SPIE 8170, 817008 (2011).

9. W. S. Sun, C. L. Tien, W. C. Lo, and P. Y. Chu, “Optical design of an LED motorcycle headlamp with compound reflectors and a toric lens,” Appl. Opt. 54(28), E102–E108 (2015).

10. C. C. Sun, C. S. Wu, Y. S. Lin, Y. J. Lin, C. Y. Hsieh, S. K. Lin, T. H. Yang, and Y. W. Yu, “Review of optical design for vehicle forward lighting based on white LEDs,” Opt. Eng. 60(9), 091501 (2021).

11. C. C. Sun, C. S. Wu, C. Y. Hsieh, Y. H. Lee, S. K. Lin, T. X. Lee, T. H. Yang, and Y. W. Yu, “Single reflector design for integrated low/high beam meeting multiple regulations with light field management,” Opt. Express 29(12), 18865–18875 (2021).

12. X. H. Lee, I. Moreno, and C. C. Sun, “High-performance LED street lighting using microlens arrays,” Opt. Express 21(9), 10612–10621 (2013).

13. C. C. Sun, X. H. Lee, I. Moreno, C. H. Lee, Y. W. Yu, T. H. Yang, and T. Y. Chung, “Design of LED street lighting adapted for free-form roads,” IEEE Photonics J. 9, 1–13 (2017).

14. S. R. Chekroud, R. Gueorguieva, A. B. Zheutlin, M. Paulus, H. M. Krumholz, J. H. Krystal, and A. M. Chekroud, “Association between physical exercise and mental health in 1.2 million individuals in the USA between 2011 and 2015: a cross-sectional study,” The Lancet Psychiatry 5(9), 739–746 (2018).

15. UNECE website, http://www.unece.org/trans/main/wp29/wp29regs101-120.html.

16. P. Csuti, F. Szabó, and R. Dubnicka, “Comparison of luminous intensity distributions measured on luminaire turning and mirror goniophotometer,” IEEE Lighting Conference of the Visegrád Countries 4, 1–4 (2016).

17. S. Royo, M. J. Arranz, J. Arasa, M. Cattoen, and T. M. Bosch, “Compact low-cost unit for photometric testing of automotive headlamps,” Opt. Eng. 45(6), 063602 (2006).

18. M. S. Rea and I. Jeffrey, “A new luminance and image analysis system for lighting and vision I. Equipment and calibration,” J. Illum. Eng. Soc. 19(1), 64–72 (1990).

19. H. H. Wu, Y. P. Lee, and S. H. Chang, “Fast measurement of automotive headlamps based on high dynamic range imaging,” Appl. Opt. 51(28), 6870–6880 (2012).

20. Y. Lin, Y. Gao, Y. Sun, S. Zhang, and W. Wang, “An automatic evaluation system for the photometric performance of vehicle headlamps using image processing algorithms,” Conf. IEEE ICM 14, 559–604 (2014).

21. W. Wang, S. To, C. Cheung, J. Jiang, and B. Wang, “A new method to test the photometric characteristics of lamps for motor vehicles,” Optik 120(11), 549–552 (2009).

22. D. Wüller and H. Gabele, “The usage of digital cameras as luminance meters,” Proc. SPIE 6502, 65020U (2007).

23. B. E. Bayer, “Color imaging array,” U.S. patent US3971065A (1976).

24. V. N. Mahajan, Optical Imaging and Aberrations: Part I. Ray Geometrical Optics (SPIE, 1998).

25. G. K. Wallace, “The JPEG still picture compression standard,” IEEE Trans. Consumer Electron. 38(1), xviii–xxxiv (1992).

26. D. Qian, J. Toker, and S. Bencuya, “An automatic light spectrum compensation method for CCD white balance measurement,” IEEE Trans. Consumer Electron. 43(2), 216–220 (1997).

27. B. K. Gunturk, J. Glotzbach, Y. Altunbasak, R. W. Schafer, and R. M. Mersereau, “Demosaicking: color filter array interpolation,” IEEE Signal Process. Mag. 22(1), 44–54 (2005).

28. “EXIF tags for Canon raw image,” https://exiftool.org/TagNames/Canon.html.
