Optica Publishing Group

Optical range-finding system using a single-image sensor with liquid crystal display aperture

Open Access

Abstract

This Letter presents an optical range-finding camera using a liquid crystal display (LCD) to generate multiple, off-axis color-filtered apertures in a flexible manner. The disparity between the different color channels is measured from a pair of stereo images acquired by two off-axis apertures, and the distance of a scene point from the camera is then estimated from the pre-specified relationship between the color disparity and distance.

© 2016 Optical Society of America

Popular range cameras are classified into stereo and time-of-flight (TOF) cameras. A stereo camera simulates human binocular vision using two or more lenses with a separate image sensor for each lens. However, the stereo camera cannot avoid a number of inherent limitations: (1) multiple lenses and the corresponding sensors increase the cost and volume of the system; (2) computationally expensive pre-processing algorithms are needed to align and match the stereo images; and (3) a large disparity between the stereo images results in occlusion or nonmatched holes. A TOF camera consists of an illumination unit that emits the light source and an image sensor that measures the time for the light to travel to the object and be reflected back to the focal plane array [1–3]. Although the TOF camera can quickly estimate the distance, its distance estimation range is limited, and it cannot estimate the distance to a transparent object.

To overcome the drawbacks of the stereo and TOF cameras, a light field camera is equipped with multiple microlenses, and a dual-aperture camera consists of an image sensor with infrared pickup pixels and a special aperture that blocks infrared [4–6]. These two cameras can reconstruct a 3D input scene using depth information with a single image sensor. However, a very high-precision manufacturing process is needed.

An aperture is an important optical device that controls the amount of incoming light to the camera and determines the depth of field. A circle of confusion or point spread function (PSF) generally depends on the size and shape of the aperture. In addition, if the center of the aperture deviates from the optical axis, the position of the image on the sensor moves according to the distance of an object. To exploit the characteristics of a special aperture, a coded aperture and color-filtered apertures were proposed in the literature [7,8]. Among the various approaches using a special aperture, an off-axis aperture can easily generate disparity using a single camera and, as a result, can estimate the distance without additional optical devices [9,10]. Since a fixed off-axis aperture can neither control the amount of incoming light nor change the shape of the PSF, flexible generation of the aperture shape and location is needed.

A liquid crystal display (LCD) consists of a polarizing filter, a thin-film transistor, and liquid crystal. For this reason, it is a good candidate for a flexible aperture that can simultaneously control the amount of incoming light, the shape of the PSF, and the color. Coded aperture and light field cameras have been produced using LCDs [11–13]. The major issue of the LCD aperture is the diffraction caused by the pixel grid of the LCD. To estimate the distance using a single camera, this Letter presents a computational camera system that can easily change the color, size, and location of the aperture using an LCD. We also analyze the diffraction of the LCD to formulate the image degradation model of the camera. Using this LCD imaging property, we can realize a special type of aperture with variable size, location, and color.

As shown in Fig. 1(a), a cell of the color LCD consists of a pair of orthogonally polarizing filters, a pair of glass plates, and three pairs of color filters and liquid crystal cells on the thin-film transistors (TFT). The upper and lower glass plates protect the liquid crystal cells, and the directions of the two polarizing filters are orthogonal to each other. Since the inactive liquid crystal has a codirectional molecular structure, light cannot pass through the color LCD cell with an orthogonal pair of polarizing filters, as shown in Fig. 1(a). However, when the liquid crystal is activated by the electric signal generated by the thin-film transistor, the upper and lower molecular structures become orthogonal to each other, and the light rotated in the liquid crystal can pass through the pair of orthogonal polarizing filters.

Fig. 1. Structure of an LCD and diffraction phenomenon: (a) cell of a color LCD and (b) light trajectories in the LCD aperture imaging.

Figure 1(b) shows an imaging system with an LCD aperture, where each cell of the LCD consists of red, green, and blue color filters. However, the light passing through the LCD aperture is diffracted by the narrow grating of an LCD cell.

Assuming that the light source is far enough from the camera, the polarized light wave is considered to be orthogonal to the direction of propagation. Based on Fraunhofer's diffraction grating theory, the angle of diffraction is expressed as

$$\sin\theta_n = n\lambda/d, \tag{1}$$
where $\lambda$ represents the wavelength of the light, $d$ is the grating period of an LCD cell, and $\theta_n$ is the $n$-th diffraction angle. Since $\sin\theta_n \approx \theta_n$ for a sufficiently small $\theta_n$, the diffraction angle is proportional to the wavelength and inversely proportional to the period $d$ of the grating.

The diffracted light rays generate virtual images, as shown in Fig. 1(b), and the distance between the real and n-th virtual images is determined as

$$x_n = Ln\lambda/d, \tag{2}$$
where $x_n$ represents the location of the $n$-th virtual image on the image sensor, and $L$ represents the distance between the aperture and the image sensor.
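Eqs. (1) and (2) can be checked numerically under the small-angle approximation. The numeric values below (550 nm light, a 0.2 mm cell pitch, and L = 50 mm) are illustrative assumptions, not the prototype's parameters:

```python
def diffraction_offsets(wavelength_mm, d_mm, L_mm, orders=3):
    """Offsets x_n = L * n * lambda / d of the first few virtual images on
    the sensor (Eq. (2)), using the small-angle form of Eq. (1).
    All arguments are in millimeters."""
    return [L_mm * n * wavelength_mm / d_mm for n in range(1, orders + 1)]

# Illustrative values: 550 nm green light, 0.2 mm grating period, L = 50 mm.
offs = diffraction_offsets(550e-6, 0.2, 50.0)
print(offs)  # offsets in mm; each order adds one more multiple of L*lambda/d
```

As Eq. (2) predicts, the offsets grow linearly with the diffraction order and shrink as the grating period increases.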

The proposed range-finding camera uses an LCD to build a programmable coded aperture. As shown in Fig. 2(a), the LCD aperture is installed between the front and rear lenses of the camera. The LCD can play the role of multiple color-filtered apertures, since a white pixel passes all of the incoming light, while a pixel with a specific color passes only the corresponding color component. We can generate any color, shape, or position of the LCD aperture by simply generating a desired pattern image on the LCD.
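As a minimal sketch of such pattern generation, the following renders an aperture image as an RGB array that could be sent to the LCD. The `aperture_pattern` helper, the radii, and the offsets are hypothetical illustrations, not the authors' implementation:

```python
import numpy as np

def aperture_pattern(width, height, radius, offset, color=(255, 255, 255)):
    """Render an LCD pattern image with two horizontally symmetric circular
    apertures at +/-offset pixels from the center; all other pixels stay
    black (opaque). Shapes, sizes, and colors here are free parameters."""
    img = np.zeros((height, width, 3), dtype=np.uint8)
    yy, xx = np.mgrid[0:height, 0:width]
    cy, cx = height // 2, width // 2
    for dx in (-offset, offset):
        mask = (xx - (cx + dx)) ** 2 + (yy - cy) ** 2 <= radius ** 2
        img[mask] = color
    return img

# A 160x128 pattern matching the prototype LCD resolution, with two
# 12-pixel-radius white apertures placed 40 pixels off the optical axis.
pattern = aperture_pattern(160, 128, radius=12, offset=40)
```

Changing `color` per aperture would emulate the color-filtered apertures described above; changing `offset` changes the stereo baseline.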

Fig. 2. Structure of the optical range-finding camera system: (a) proposed camera using the LCD aperture, (b) light paths of three different object positions using an off-axis aperture, and (c) light paths of three different object positions using dual color-filtered apertures.

The proposed system uses multiple off-axis-type apertures for distance measurement. The off-axis aperture is located away from the optical axis, as shown in Fig. 2(b). A conventional camera with a single aperture on the optical axis forms the image of any point on the optical axis at the center of the image sensor. On the other hand, the off-axis aperture forms the image of a scene point on the optical axis at different positions on the image sensor depending on the distance of each point from the lens. Figure 2(c) shows the dual color-filtered aperture (DCA) imaging system. Since the DCA can be considered a double extension of the off-axis aperture with complementary color filters, a single point on the optical axis in an unfocused area forms two differently located images in the corresponding color channels. The disparity between the two color channels is determined by the distance of the point from the in-focus position. Therefore, the distance between the camera and the object can be estimated using the relationship between the distance and the disparity. Accurate estimation of the disparity between the two different color channels is challenging because existing image registration methods estimate the disparity between homogeneous color channels. An existing approach to this problem uses a color-shifting model at the cost of nontrivial computational complexity [9,14].

In addition, the diffraction property of the LCD apertures generates a special image degradation model. Figure 3(a) shows an image captured by a conventional camera system. The point spread function (PSF) is very close to an impulse function in the focused region and becomes a Gaussian function in the unfocused regions. However, as shown in Fig. 3(b), the PSF of the proposed system is the conventional camera's PSF convolved with an impulse train. The impulse train is the mathematical representation of the diffraction caused by the pattern of LCD pixels, and it has the highest weight at the zeroth order of the diffracted light.
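This degradation model (a defocus PSF convolved with a diffraction impulse train) can be sketched in one dimension. The window size, spike offsets, and spike weights below are illustrative assumptions; real diffraction weights would peak at the zeroth order, as the text states:

```python
import numpy as np

def degraded_psf(gauss_sigma, spike_offsets, spike_weights, size=65):
    """1D sketch of the degradation model: a Gaussian (defocus) PSF
    convolved with a weighted impulse train representing LCD diffraction.
    Returns a normalized PSF centered at index size // 2."""
    x = np.arange(size) - size // 2
    gauss = np.exp(-x ** 2 / (2 * gauss_sigma ** 2))
    gauss /= gauss.sum()                      # normalized defocus PSF
    train = np.zeros(size)
    for off, w in zip(spike_offsets, spike_weights):
        train[size // 2 + off] = w            # diffraction orders as spikes
    psf = np.convolve(gauss, train, mode="same")
    return psf / psf.sum()

# Zeroth order weighted 1.0, first orders at +/-8 samples weighted 0.2.
psf = degraded_psf(2.0, [-8, 0, 8], [0.2, 1.0, 0.2])
```

The result is a central Gaussian lobe flanked by attenuated replicas, matching the replicated "virtual images" visible in Fig. 3(b).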

Fig. 3. Diffraction property of the proposed camera system: (a) image acquired using a general camera system and (b) image acquired by the proposed camera system. Both (a) and (b) are focused on the triangular object.

The proposed computational camera system uses two horizontally symmetric off-axis apertures to solve this problem. Since a pair of color stereo images is acquired using the two LCD off-axis apertures, additional color registration and rectification are not needed.

The proposed camera system estimates disparities between the two color images using the normalized cross-correlation (NCC) [15]. Since the NCC measures the similarity between two images after mean subtraction and variance normalization, the result is robust to a certain amount of intensity difference. The disparity is measured using the NCC as

$$\Delta x = \arg\max_{x'} \frac{\sum_{n,m}\left[I_R^T(x,y)-\bar{I}_R^T\right]\left[I_L(x+x',y)-\bar{I}_L\right]}{\sqrt{\sum_{n,m}\left[I_R^T(x,y)-\bar{I}_R^T\right]^2 \sum_{n,m}\left[I_L(x+x',y)-\bar{I}_L\right]^2}}, \tag{3}$$
where $I_L(x,y)$ and $I_R(x,y)$, respectively, represent the pair of stereo images generated by the right and left apertures; $I_R^T(x,y)$ is a template that contains an object in $I_R(x,y)$; and $\bar{I}_R^T$ and $\bar{I}_L$ are the average values of $I_R^T(x,y)$ and $I_L(x,y)$, respectively. Since each stereo image acquired by the off-axis apertures is degraded by the same PSF, the diffraction artifact does not affect the disparity measurement.
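Eq. (3) can be sketched as a direct template search. The template location, size, and search range below are caller-supplied assumptions; this is a minimal NumPy illustration, not the authors' implementation:

```python
import numpy as np

def ncc_disparity(right_img, left_img, top, left, th, tw, max_shift):
    """Estimate horizontal disparity by maximizing the NCC of Eq. (3):
    a (th x tw) template cut from right_img at (top, left) is compared
    against horizontally shifted windows of left_img."""
    t = right_img[top:top + th, left:left + tw].astype(float)
    t_zm = t - t.mean()
    t_norm = np.sqrt((t_zm ** 2).sum())
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        x0 = left + s
        if x0 < 0 or x0 + tw > left_img.shape[1]:
            continue  # shifted window falls outside the left image
        w = left_img[top:top + th, x0:x0 + tw].astype(float)
        w_zm = w - w.mean()
        denom = t_norm * np.sqrt((w_zm ** 2).sum())
        if denom == 0:
            continue  # constant window: correlation undefined
        score = (t_zm * w_zm).sum() / denom
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift

# Synthetic check: a bright square shifted by 5 pixels between the views.
right = np.zeros((40, 80)); right[10:20, 30:40] = 100.0
left_view = np.roll(right, 5, axis=1)
print(ncc_disparity(right, left_view, 8, 28, 14, 14, 8))  # prints 5
```

Because the score is mean-subtracted and variance-normalized, a global intensity offset or gain between the two channels leaves the maximizing shift unchanged, which is the robustness property the text relies on.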

Finally, the distance of the object is computed using the relationship with the disparity. Lee et al. proposed a single optical range-finding system using the optical characteristics of off-axis apertures to estimate the distance as [10]

$$z = f\,\frac{f z_0 \Delta c_x \alpha - \alpha_c c_z (z_0 - f)\Delta x}{f^2 \Delta c_x \alpha + \alpha_c (z_0 - f)(f - c_z)\Delta x}, \tag{4}$$
where $f$ denotes the focal length of the lens module; $\alpha_c$ and $\alpha$, respectively, denote the pixel sizes of the image sensor and the LCD; $z_0$ denotes the in-focus object distance; $(c_x, c_y, c_z)$ denotes the coordinate of the aperture center; and $\Delta c_x$ denotes the distance between the centers of the two apertures. The unit of $z$ is millimeters (mm). The pixel size $\alpha$ is estimated as
$$\alpha = \sqrt{WH/N_1 N_2}, \tag{5}$$
where $W$ and $H$, respectively, represent the width and height of the pixel array; $N_1 N_2$ is the total number of pixels; and the unit of $\alpha$ is also mm. All the values in Eq. (4) except $\Delta x$ are defined by the specification of the lens module. As a result, the distance estimated using Eq. (4) depends on the disparity obtained by Eq. (3).
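As a sketch, the distance model translates directly into code. The parameter values used in the check below (focal length, in-focus distance, aperture geometry, pixel sizes) are illustrative assumptions, not the prototype's calibration; the square root in `lcd_pixel_size` converts the per-pixel area $WH/N_1N_2$ (mm²) into a pitch in mm, consistent with the stated unit of $\alpha$:

```python
import math

def estimate_distance(disparity, f, z0, dcx, cz, alpha_c, alpha):
    """Distance z (mm) from disparity via Eq. (4). All lengths in mm."""
    num = f * z0 * dcx * alpha - alpha_c * cz * (z0 - f) * disparity
    den = f * f * dcx * alpha + alpha_c * (z0 - f) * (f - cz) * disparity
    return f * num / den

def lcd_pixel_size(width_mm, height_mm, n_pixels):
    """Pixel pitch alpha (mm) via Eq. (5): sqrt of active area / pixel count."""
    return math.sqrt(width_mm * height_mm / n_pixels)

# Sanity check: zero disparity must return the in-focus distance z0.
print(estimate_distance(0.0, f=50.0, z0=520.0, dcx=10.0, cz=5.0,
                        alpha_c=0.005, alpha=0.28))  # prints 520.0
```

The zero-disparity check follows from Eq. (4): with $\Delta x = 0$ the expression collapses to $z = f \cdot f z_0 \Delta c_x \alpha / (f^2 \Delta c_x \alpha) = z_0$, as the imaging model requires.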

To evaluate the measurement precision, we compare the obtained disparities with those predicted from the optical parameters of the off-axis aperture. The ground truth of the real disparity is computed by modifying Eq. (4) as [10]

$$\Delta x_{\mathrm{GT}} = \frac{f^2}{\alpha_c}\,\frac{(z_0 - z)\,\Delta c_x\,\alpha}{(z_0 - f)\left(fz - (z - f)c_z\right)}, \tag{6}$$
where the unit of $\Delta x_{\mathrm{GT}}$ is pixels.
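A quick way to validate Eq. (6) is to check that the ground-truth disparity vanishes at the in-focus distance and changes sign across the in-focus plane. The parameter values below are illustrative assumptions, not the prototype's calibration:

```python
def ground_truth_disparity(z, f, z0, dcx, cz, alpha_c, alpha):
    """Expected disparity (pixels) via Eq. (6) for an object at distance
    z (mm). All lengths in mm."""
    return (f * f / alpha_c) * ((z0 - z) * dcx * alpha) / (
        (z0 - f) * (f * z - (z - f) * cz))

# Disparity must vanish exactly at the in-focus distance z = z0.
print(ground_truth_disparity(520.0, f=50.0, z0=520.0, dcx=10.0, cz=5.0,
                             alpha_c=0.005, alpha=0.28))  # prints 0.0
```

For objects nearer than $z_0$ the predicted disparity is positive, and it flips sign beyond $z_0$, matching the behavior of the off-axis geometry in Fig. 2(b).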

A prototype of the proposed optical range-finding system is shown in Fig. 4. We inserted a 45.7 mm TFT-LCD with a resolution of 160×128 into a 50 mm fixed-focus lens module after removing the backlight module. A microcontroller unit (MCU) on top of the prototype system controls the LCD to generate the desired aperture images using switches, as shown in Fig. 4(b). As a result, the prototype system can flexibly generate various types of apertures by activating groups of sub-pixels, as shown in Fig. 4(c).

Fig. 4. Prototype of the proposed optical range-finding system: (a) right side of the system with an LCD aperture that is inserted between the lens and the camera body, (b) MCU and switches on top of the system, and (c) front view of the system and the LCD aperture.

To evaluate the performance of the prototype system, we used a test scene with a white triangular object, as shown in Fig. 5(a), and compared the estimated distance with that of an existing off-axis aperture proposed by Lee et al. [10]. We acquired multiple pairs of stereo images by moving the object over distances from 320 to 920 mm at intervals of 25 mm and estimated the disparity and distance using Eqs. (3) and (4), respectively. To measure the disparity without color registration, the white object in Fig. 5(a) is used. Figure 5(b) shows a color-shifted image acquired by the DCA camera system, where the disparity is measured between the R and B channels. A pair of stereo images acquired by the proposed two off-axis apertures is shown in Figs. 5(c) and 5(d).

Fig. 5. Experiment to generate stereo images using the two different types of off-axis apertures: (a) input scene with a white triangular object at the distance of 520 mm, (b) acquired image using the DCA [10], and (c) and (d) pair of acquired stereo images using the proposed left and right off-axis apertures, respectively.

Figure 6(a) shows the estimated disparity using the stereo images shown in Fig. 5, where the x- and y-axes represent the object distance and the estimated disparity, respectively. In spite of the image degradation caused by the diffraction in the LCD aperture, the proposed system correctly estimates a disparity that is almost equal to the actual value, which is referred to as the ground truth in this Letter. The distance estimated using the disparity is shown in Fig. 6(b), where the straight line represents the ground truth. The experimental results show that the average difference from the ground truth is approximately 6 mm.

Fig. 6. Estimated disparity and distance using the test image shown in Fig. 5: (a) ground truth and two estimated disparity curves using the DCA and the proposed system, and (b) corresponding distances computed from the disparity shown in part (a).

To test whether the proposed system can measure distance in a color-invariant manner, we performed another experiment using a colorful object, as shown in Fig. 7(a), under the same conditions as the first experiment. Since each color channel has a different intensity distribution for the same object, as shown in Figs. 7(b) and 7(c), the existing DCA system cannot accurately estimate the disparity, as shown in Fig. 7(f), and, thus, the estimated distance is incorrect, as shown in Fig. 7(g). On the other hand, the proposed system can generate a set of color stereo images using the two off-axis apertures, as shown in Figs. 7(d) and 7(e), and successfully estimate the disparity and distance, as shown in Figs. 7(f) and 7(g), respectively.

Fig. 7. Distance and disparity measurement results using a color object: (a) input scene including the colorful object, (b) and (c) stereo images from the DCA, (d) and (e) stereo images from the proposed two off-axis apertures, and (f) and (g) estimated corresponding disparity and distance curves, respectively.

This Letter presents an optical range-finding system with an LCD that generates a programmable coded aperture. The proposed system can easily change the shape, location, and color of multiple apertures by simply controlling the LCD. In addition, it can measure the distance of an input scene under various environments, since the LCD apertures can adaptively change the exposure, the baseline, and the color of the aperture in real time. The experimental results showed that the proposed camera system could robustly measure the disparity between the intentionally misaligned color channels in spite of the inherent image degradation caused by the LCD's diffraction.

Funding

Ministry of Science, ICT and Future Planning (MSIP) (B0101-16-0525); Technology Innovation Program, Ministry of Trade, Industry and Energy (MOTIE) (10047788).

REFERENCES

1. A. Farooq and C. S. Won, IEIE Trans. Smart Process. Comput. 4, 281 (2015).

2. H. G. Kim, J. K. Kang, and B. C. Song, IEIE Trans. Smart Process. Comput. 3, 110 (2014).

3. S. Foix, G. Alenyà, and C. Torras, IEEE Sens. J. 11, 1 (2011).

4. R. Ng, Ph.D. dissertation (Stanford University, 2006).

5. M. Martinello, A. Wajs, S. Quan, H. Lee, C. Lim, T. Woo, W. Lee, S.-S. Kim, and D. Lee, in IEEE International Conference on Computational Photography (ICCP) (2015), pp. 1–10.

6. B. Yoon, K. Choi, M. Ra, and W.-Y. Kim, IEIE Trans. Smart Process. Comput. 4, 224 (2015).

7. A. Levin, R. Fergus, F. Durand, and W. T. Freeman, ACM Trans. Graphics 26, 70 (2007).

8. A. Chakrabarti and T. Zickler, in Proceedings of the European Conference on Computer Vision (2012), Vol. 7576, p. 1.

9. S. Lee, J. Lee, M. H. Hayes, A. K. Katsaggelos, and J. Paik, in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (2012), pp. 801–804.

10. S. Lee, M. H. Hayes, and J. Paik, Opt. Express 21, 23116 (2013).

11. C.-K. Liang, T.-H. Lin, B.-Y. Wong, C. Liu, and H. H. Chen, ACM Trans. Graphics 27, 1 (2008).

12. H. Nagahara, C. Zhou, T. Watanabe, H. Ishiguro, and S. K. Nayar, in Proceedings of the European Conference on Computer Vision (ECCV) (2010), Vol. 6316, p. 337.

13. T. B. Milnes, Ph.D. dissertation (Massachusetts Institute of Technology, 2013).

14. E. Lee, W. Kang, S. Kim, and J. Paik, IEEE Trans. Consum. Electron. 56, 317 (2010).

15. J. P. Lewis, Vis. Interface 10, 120 (1995).
