Optica Publishing Group

Light-sheet based two-dimensional Scheimpflug lidar system for profile measurements

Open Access

Abstract

This work presents a novel concept for 2D Scheimpflug lidar. A light-sheet based 2D Scheimpflug lidar system is developed and realized for surface profile measurements. The theory of the geometrical relationship underlying the system is developed, and the feasibility of 3D profile measurements is demonstrated for a plastic bowl, a rhombic carton box and a manikin. The sizes of the reconstructed images are consistent with the respective physical objects, with small (~mm) errors at close range. Experimental results show that the 2D Scheimpflug lidar system performs well for 3D surface profiling and has great potential for close-range applications in other fields.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Over the years, 3D surface profilometry has been a strong tool for various industrial applications, and many methods for 3D shape measurement have been proposed [1,2]. Good reviews of 3D laser imaging and model acquisition techniques can be found in [3,4]. Ranging technologies are usually divided into position-based (triangulation) and time-based (time-of-flight) approaches [5]. Time-based methods can provide reasonable precision over much longer distances, but their short-range capabilities are limited. In addition, time-based laser scanners can only scan one point at a time, making the scanning procedure time-consuming. In contrast, the traditional triangulation method can provide much higher precision at close range, and triangulation laser scanners can scan multiple points simultaneously using a light sheet to increase scanning speed [6]. The Scheimpflug lidar method is a novel technique based on high-power continuous-wave (CW) laser diodes; its range resolution is achieved through triangulation rather than the time-of-flight approach used by traditional pulsed lidar systems [7].

The Scheimpflug lidar concept, inspired by the century-old Scheimpflug principle, has been developed over recent years [8]. According to the Scheimpflug principle, an entire plane can be imaged in focus if the image plane and lens plane intersect the object plane on the same line [9]. This implies that a laser beam transmitted into the atmosphere can be imaged onto a straight line using a tilted camera [8]. It is worth noting that the Scheimpflug condition alone does not constrain the focal length of the lens; the Hinge rule or the lens equation must additionally be satisfied to achieve focus [10]. The Scheimpflug lidar technique has been successfully demonstrated for both elastic and inelastic lidar applications, including kHz remote modulation spectroscopy and polarimetric techniques for entomological applications [8,11–13], single- and multi-band elastic lidar for atmospheric aerosol remote sensing [10,14–17], differential absorption lidar for gas sensing [7,18], fine-scale lidar for industrial and combustion processes [19], and inelastic hyperspectral lidar for aquatic and vegetation applications [20–22]. A comprehensive review of the Scheimpflug lidar method is given in [23] and references therein.

However, the Scheimpflug lidar systems presented above are so-called 1D lidar systems, typically based on a point light source, an imaging lens, and a linear CMOS array detector. In this work, we present a novel and compact 2D Scheimpflug lidar system for profile measurements of opaque objects, based on a line light source, an imaging lens and a 2D-CMOS camera. A 3D surface profile can be generated by stacking the continuous scanning images. Compared to previous laser-triangulation work employing the Scheimpflug condition [24–26], the principle of system construction is similar, since the Scheimpflug condition must be satisfied in both cases. However, the detection range is improved dramatically in our lidar system (1.5 m–20 m for the manikin measurement), which leads to some differences in system parameters, such as an increase in laser power and in the angle between the laser beam and the lens, and a decrease in range resolution. In this paper, we demonstrate the feasibility of 3D profile measurements for a plastic bowl, a rhombic carton box and a manikin. Our results show that the 2D Scheimpflug lidar has great potential for other fields, such as particle image velocimetry (PIV), combustion diagnostics, indoor navigation and obstacle location in unmanned vehicles.

In the following sections, the geometrical relations between the object plane and the image plane are discussed first, in particular the relationship between the pixels of the image sensor and the measurement position. The system schematic of the 2D Scheimpflug lidar system is described in detail in Section 3. In Section 4, the experimental results of 3D profile measurements of three opaque objects are shown. Conclusions and prospects for future work are reported at the end.

2. Principles and method

The Scheimpflug principle is a geometric rule which describes how sharp focus can be achieved by tilting the image sensor so that the image plane intersects with both the lens and object planes on the same line when the lens and object planes are not parallel.

The relationship between an image point (Mʹ) and the corresponding object point (M) in the yz plane has been derived [21], given the initial parameters of the Scheimpflug lidar system: d, the distance from the object plane to the lens; α, the tilt angle of the lens plane relative to the object plane; and f, the focal length of the lens, as illustrated in Fig. 1:

Fig. 1 Scheimpflug principle: the image plane intersects both the lens and object planes on the same line when the object plane is nonparallel to the lens plane. O and Oʹ are the origins of the three-dimensional coordinate system and the lens plane, respectively; d is the distance from the object plane to the lens, α is the tilt angle of the lens plane to the object plane, and β is the tilt angle of the image plane to the lens plane. MN is a line on the object plane and MʹNʹ is its image on the image plane, which satisfies the lens equation; u is the corresponding object distance and f is the focal length of the lens.

$$M'_y = \frac{d\,(d\cos\alpha + z\sin\alpha)}{d\cos\alpha + z\sin\alpha - f} \tag{1}$$
$$M'_z = \frac{z f}{d\cos\alpha + z\sin\alpha - f} \tag{2}$$
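As a numerical sanity check, the mapping of Eqs. (1) and (2) can be evaluated directly. The values below (d = 0.1 m and f = 55 mm, matching the instrument in Section 3, and an object distance z = 5 m) are illustrative assumptions, not measured data:

```python
import math

def image_point(z, d=0.1, f=0.055, alpha=math.radians(90)):
    """Map an object point at distance z (on the object plane) to its
    image-plane coordinates (M'_y, M'_z) via Eqs. (1)-(2)."""
    denom = d * math.cos(alpha) + z * math.sin(alpha) - f
    My = d * (d * math.cos(alpha) + z * math.sin(alpha)) / denom
    Mz = z * f / denom
    return My, Mz

My, Mz = image_point(5.0)
print(f"M'_y = {My * 1e3:.2f} mm, M'_z = {Mz * 1e3:.2f} mm")
```

With α = 90° the cosine terms vanish, so both coordinates reduce to the familiar thin-lens forms dz/(z − f) and fz/(z − f).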

Here, we extend this relationship to the 3D space. The point N (x, 0, z) on the object plane is projected onto the image plane Nʹ. The y- and z-coordinates of Nʹ are the same as Mʹ, and the x-coordinate of Nʹ can be deduced from the similar triangle calculation:

$$N'_x = \frac{x f}{d\cos\alpha + z\sin\alpha - f} \tag{3}$$

Thus, the object profile in the real-world coordinate system can be imaged sharply on the image plane after image acquisition by the camera. For the calibration of the Scheimpflug lidar system, the relationship between the image coordinate system and the real-world coordinate system must be established. The position of a reflected laser point on the image plane is Nʹ (xim, zim) in the image coordinate system, where xim and zim are the pixel numbers along the row and column directions, respectively (see Fig. 2). The coordinates of the corresponding laser point on the detected object, N (x, z), in the real-world coordinate system can be derived in the following form when α is 90°:

Fig. 2 The relationship between the image and the real-world coordinate systems when α = 90°. MN is a line on the object plane in the real-world coordinate system and MʹNʹ is the image of the line in the image coordinate system. The center of the camera is placed along the line OOʹʹ. xim and zim are the pixel numbers along the row and column directions, respectively.

$$z = f\left(1 + \frac{\sqrt{f^2 + d^2}}{|M'O''| - (z_{\mathrm{im}} - z_{\mathrm{im},M'})\,w_{\mathrm{col}}}\right) \tag{4}$$
$$x = \frac{z - f}{f}\,(x_{\mathrm{im}} - h_{\mathrm{row}}/2)\,w_{\mathrm{row}} \tag{5}$$

Here, |MʹOʹʹ| is the length of the line segment MʹOʹʹ, which is constant and easily calculated from the known distance zM (|OM|). zim,Mʹ is the pixel number of Mʹ on the camera, and hrow is the total number of pixels along the row direction. wcol and wrow are the pixel sizes along the column and row directions, respectively.
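Eqs. (4) and (5) can be folded into a small pixel-to-world converter. The sketch below assumes α = 90° and the camera and lens parameters of Section 3; the calibration distance z_M, the calibration pixel z_im,M', and the closed form used for |MʹOʹʹ| (chosen so that Eq. (4) returns exactly z_M at the calibration pixel) are our illustrative assumptions, not values from the paper:

```python
import math

# Geometry of the lidar head (Section 3) and sensor parameters.
D = 0.1          # baseline: object plane to lens [m]
F = 0.055        # focal length [m]
W_COL = 5.5e-6   # pixel pitch, column direction [m]
W_ROW = 5.5e-6   # pixel pitch, row direction [m]
H_ROW = 2048     # total pixels along the row direction

# Calibration: a target at known distance Z_M images onto pixel Z_IM_M.
# |M'O''| is derived so that Eq. (4) returns exactly Z_M at that pixel.
Z_M = 6.53       # farthest calibration distance [m] (illustrative)
Z_IM_M = 100     # calibration pixel (illustrative)
MO = F * math.sqrt(F**2 + D**2) / (Z_M - F)  # |M'O''| [m]

def pixel_to_world(x_im, z_im):
    """Convert image pixel (x_im, z_im) to real-world (x, z) via Eqs. (4)-(5)."""
    s = MO - (z_im - Z_IM_M) * W_COL
    z = F * (1 + math.sqrt(F**2 + D**2) / s)
    x = (z - F) / F * (x_im - H_ROW / 2) * W_ROW
    return x, z
```

At the calibration pixel the converter returns z_M exactly, which is a handy self-consistency check before applying it to measurement frames.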

The geometrical relationship between the pixels of the image sensor and the measurement position on the object has now been established. The light sheet transmitted into the atmosphere illuminates the object, so the distance information of the object can be gathered one strip at a time by the tilted camera according to the geometric relationship of the 2D Scheimpflug lidar system. By scanning along the y direction, the 3D surface profile of the detected object can be reconstructed by stacking the backscattered laser lines. Laser line power density and line width are crucial for good reconstruction. A high power density enhances signal contrast under ambient light, especially at longer distances, and a small line width helps to distinguish detailed features in profiling measurements. The effect of laser speckle also has to be considered, since the homogeneity of the laser lines is degraded on optically rough surfaces [24]. However, choosing the right lens aperture setting helps minimize laser speckle: a larger aperture produces a high-frequency speckle pattern that is less disturbing [26].
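The line-by-line reconstruction described above can be sketched as follows. The brightest-pixel-per-column extraction and the `pixel_to_world` callback are simplified stand-ins for the actual processing chain, which is not fully specified in the text:

```python
import numpy as np

def frame_to_points(frame, y, pixel_to_world, min_intensity=50):
    """Extract the laser line from one scan frame and return (x, y, z) points.

    frame: 2D intensity image (rows indexed by z_im, columns by x_im);
    y: translation-stage position for this frame [m].
    """
    points = []
    for x_im in range(frame.shape[1]):
        col = frame[:, x_im]
        z_im = int(np.argmax(col))        # brightest pixel = laser line
        if col[z_im] < min_intensity:     # skip columns with no laser return
            continue
        x, z = pixel_to_world(x_im, z_im)
        points.append((x, y, z))
    return points

def stack_scan(frames, y_positions, pixel_to_world):
    """Stack all scan lines into one 3D point cloud (N x 3 array)."""
    cloud = []
    for frame, y in zip(frames, y_positions):
        cloud.extend(frame_to_points(frame, y, pixel_to_world))
    return np.asarray(cloud)
```

One frame per stage position yields one strip of points; concatenating the strips over the y scan gives the full surface profile.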

3. Instrumentation

The schematic of our light-sheet based 2D Scheimpflug lidar system is shown in Fig. 3. The system incorporates a commercial blue laser diode at 450 nm with 1.5 W maximum output power. A single cylindrical lens is integrated into the illumination module to turn the collimated laser beam into a line generator. The light-sheet beam, with an irradiance of about 0.23 mW/cm2 at 10 m, is transmitted out to profile the opaque target objects. Herein, only one optical section of the sample is illuminated. The backscattered light from the surface of the sample is collected by a receiver with a ø 58 mm, f = 55 mm imaging lens (Canon, Japan) located at a separation of d = 0.1 m from the object plane. The large aperture helps minimize the effect of laser speckle. A 450-nm bandpass optical filter with 18-nm full width at half-maximum (FWHM) is placed right after the receiver to suppress the background signal. The lens plane is designed to be perpendicular to the object plane, that is, the tilt angle α is set to 90°. As illustrated in Fig. 2, the tilt angle β is approximately 29°, calculated by trigonometry (tan β = f/d), so the 2D-CMOS camera (CMOSIS CMV2000, 2048 × 1088 pixels, 5.5 μm × 5.5 μm pixel size, Lumenera, Canada) is tilted 29° off the y axis. The center of the camera lies along the line OOʹʹ and the field of view (FOV) is about 11°. The backscattered light passes through the imaging lens and is captured by the CMOS chip. Thus, an image of the surface profile of one section of the sample can be acquired. The entire Scheimpflug lidar system is mounted on a 12 cm × 12 cm × 1 cm rectangular aluminum plate and scanned by a high-precision motorized linear translation stage with a 300-mm travel range (KSA300-11-x, Zolix, China). Unidirectional scanning is controlled by the motion controller (TMC-USB-4-S257, Zolix, China) and software on the computer. Consequently, 3D surface profiles can be generated by stacking the continuous scanning images processed in MATLAB (The MathWorks Inc., Natick, USA).
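For the α = 90° geometry, the images of object points at different distances lie along a direction with slope f/d, so the sensor tilt satisfies β = arctan(f/d). A one-line check with the stated d = 0.1 m and f = 55 mm reproduces the quoted ~29°:

```python
import math

d, f = 0.1, 0.055  # baseline [m] and focal length [m] from the text
beta = math.degrees(math.atan(f / d))
print(f"beta = {beta:.1f} deg")  # close to the quoted ~29 deg
```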

Fig. 3 Schematic diagram of the light-sheet based 2D Scheimpflug lidar system. The lidar system incorporates a laser diode, a cylindrical lens, an imaging lens, an optical filter and a 2D-CMOS camera. The whole system is mounted on a scanning translation stage.

4. Results and discussion

In this section, an initial test of our light-sheet based 2D Scheimpflug lidar system is discussed, followed by 3D profile measurements of a plastic bowl, a rhombic carton box and a manikin that demonstrate the feasibility of the system.

Seven distinct distances (1.10, 1.40, 2.31, 3.12, 3.80, 4.72, and 6.53 m) are chosen to test the range accuracy of the Scheimpflug lidar system. The backscattered signal from the farthest distance (6.53 m) is used to calibrate the pixel position on the CMOS chip, and the other distances can then be deduced from the corresponding pixel numbers according to Eq. (4). As illustrated in Fig. 4(a), the curve gives the theoretical relationship between pixel number (along the zim-axis) and distance (along the z-axis). The data points marked by * are experimental test data, which coincide well with the theoretical curve. The valid distance range of the system is 1.05–6.53 m, and the range resolution is higher in the near field (~mm) than in the far field (~cm). To examine the range resolution at farther distances (~20 m), a similar experiment is performed, as illustrated in Fig. 4(b). The average range resolution is about 10 cm at 6–20 m, which is satisfactory for manikin profiling measurements. The range resolution can be improved by a longer baseline (d) or a longer focal length (f) [23], although the system size will then increase and the FOV will be further decreased. Thus, the best compromise among these parameters should be chosen according to the specific application.
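The near-field/far-field behavior follows from differentiating Eq. (4): one pixel step w_col corresponds to Δz ≈ w_col (z − f)² / (f √(f² + d²)), so the per-pixel resolution degrades roughly quadratically with distance. The sketch below evaluates this first-order estimate for the close-range system parameters; it is our approximation, not the authors' calibration:

```python
import math

def range_resolution(z, d=0.1, f=0.055, w_col=5.5e-6):
    """Per-pixel range resolution dz [m] at distance z, from Eq. (4)."""
    return w_col * (z - f) ** 2 / (f * math.sqrt(f**2 + d**2))

for z in (1.05, 3.0, 6.53):
    print(f"z = {z:5.2f} m -> dz ~ {range_resolution(z) * 1e3:.1f} mm")
```

The estimate gives sub-millimetre resolution at the near limit and a few centimetres at 6.53 m, consistent with the ~mm/~cm behaviour reported above.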

Fig. 4 The relationship between pixel number and distance. (a) Range: 1.05 m–6.53 m. (b) Range: 1.5 m–20 m.

After the range calibration, a 5 × 3 black and white checkerboard with a 40 mm × 40 mm block size, the edges of which are wrapped with black tape to reduce strong backscattering, is placed coinciding with the object plane illuminated by the laser sheet. The image of the checkerboard captured by the camera is converted to a binary image block by block by extracting feature points of the checkerboard in the image coordinate system. According to Eqs. (4) and (5), the image of the checkerboard in the real-world coordinate system can then be obtained. As shown in Fig. 5, the image of the 5 × 3 checkerboard is successfully restored, which demonstrates the correctness of our scheme. Compared with the standard size of the white block (40 mm × 40 mm), the range error is on the order of millimeters at close range and the height error (along the x-axis) is approximately one order of magnitude smaller. The lines along the z-axis are not strictly horizontal, perhaps owing to imperfect placement of the camera and the range error.

Fig. 5 The 5 × 3 black and white checkerboard with 40 mm × 40 mm block size in the real-world coordinate system. The center of the checkerboard is magnified and shown in the northeast corner.

After the initial test, 3D profile measurements of a plastic bowl and a rhombic carton box (see Fig. 6) are performed with our Scheimpflug lidar system (1.05 m–6.53 m). The two samples are scanned successively under ambient light at close range, and a total of 108 and 123 sectional images are acquired, respectively. The scan time for each object is about 5–10 s. Image smoothing is performed to eliminate speckle noise, and binary conversion is applied with a proper threshold value. The 3D surface profiles are generated by stacking the continuous scanning images processed in MATLAB, as shown in Figs. 7 and 8, where the front view and the top view of the samples are also presented. Note that the samples are fixed by a stick. The measured sizes of the outer and inner edges of the bowl are 19.1 cm × 19.8 cm and 8.4 cm × 8.2 cm with a depth of 8.6 cm, and the measured size of the box is 25.2 cm × 24.8 cm in the diagonal directions with a depth of 10.3 cm. The sizes of the reconstructed images are consistent with the respective physical objects, with small (~mm) errors. In addition, the accuracy in the scanning direction (about 2 mm in the experiment) can be improved by reducing the scanning speed and decreasing the beam width; however, the run time of the program will then increase because of the larger number of sectional images and the subsequent processing.

Fig. 6 Photograph of the real objects: the plastic bowl and the rhombic carton box. The sizes of the outer and inner edges of the bowl are 20 cm × 20 cm and 8.5 cm × 8.5 cm with a depth of 8 cm, and the size of the box is 25 cm × 25 cm in the diagonal directions with a depth of 10 cm.

Fig. 7 (a) The 3D profile of the plastic bowl. The data aspect ratio is (0.112, 1, 0.36) along the z-axis, x-axis, and y-axis, so that unit lengths are equal in all directions in the real-world coordinate system. (b) The front view of the bowl. The measured sizes of the outer and inner edges are 19.1 cm × 19.8 cm and 8.4 cm × 8.2 cm. (c) The top view of the bowl. The measured depth is 8.6 cm.

Fig. 8 (a) The 3D profile of the rhombic carton box. The data aspect ratio is (0.112, 1, 0.32) along the z-axis, x-axis, and y-axis, so that unit lengths are equal in all directions in the real-world coordinate system. (b) The front view of the box. The measured size of the box is 25.2 cm × 24.8 cm in the diagonal directions. (c) The top view of the box. The measured depth is 10.3 cm.

We also present a Scheimpflug lidar system with a farther range (1.5 m–20 m). The range calibration is performed (see Fig. 4(b)), and similar 3D profile measurements of a manikin are carried out at 19.6 m and 5.6 m, as illustrated in Fig. 9. A rough outline of the manikin can be identified at 19.6 m, and detailed facial and body features can be recognized at 5.6 m. Note that the laser line width does not generally restrict the resolution of the height measurement, but it affects the measurement accuracy in the scanning direction.

Fig. 9 (a) The 3D profile of the manikin at 19.6 m. (b) The 3D profile of the manikin at 5.6 m.

In general, this exploratory study shows the feasibility of our light-sheet based 2D Scheimpflug lidar system for 3D profile measurements, and we believe it has great potential for applications such as obstacle location in unmanned vehicles.

5. Conclusion and outlook

We have developed a light-sheet based 2D Scheimpflug lidar system for surface profile measurements. The geometrical relationship underlying the system is first established. A plastic bowl, a rhombic carton box and a manikin are measured, and their 3D reconstructed images are successfully obtained with small (~mm) errors at close range, which indicates that the 2D Scheimpflug lidar has great potential in other fields, such as particle image velocimetry (PIV), combustion diagnostics, indoor navigation and obstacle location in unmanned vehicles. However, some limitations should be noted. The ultimate range is limited by the laser power and beam divergence, so the system is only suitable for close-range applications (<100 m). The sample rate of the 2D lidar system is lower than that of the 1D system, and the compromise between the FOV and the range resolution must be considered for each specific application. Laser eye safety is also an important consideration: for eye-safe operation, high-power LDs emitting below 400 nm or above 1400 nm are required. However, if infrared LDs are used, an infrared camera is much more expensive than a visible-light camera. Overall, an exploratory study has been implemented to show the feasibility of our light-sheet based 2D Scheimpflug lidar system for 3D profile measurements, and it is believed that the system has great potential for close-range applications in other fields.

Funding

National Natural Science Foundation of China (11621101, 91233208); Fundamental Research Funds for the Central Universities (2017FZA5001) from the Science and Technology Department of Zhejiang Province.

Acknowledgments

We appreciate Drs. Jingwei Li and Chunsheng Yan for valuable discussion and help.

References

1. T. Yoshizawa and T. Wakayama, “Compact camera system for 3D profile measurement,” Proc. SPIE 7513, 751304 (2009).

2. T. Yoshizawa, Handbook of Optical Metrology: Principles and Applications (CRC Press, 2015).

3. F. Blais, “Review of 20 years of range sensor development,” J. Electron. Imaging 13(1), 231–244 (2004).

4. F. Bernardini and H. Rushmeier, “The 3D model acquisition pipeline,” Comput. Graph. Forum 21(2), 149–172 (2002).

5. C. English, S. Zhu, C. Smith, S. Ruel, and I. Christie, “TriDAR: a hybrid sensor for exploiting the complementary nature of triangulation and LIDAR technologies,” in Proceedings of iSAIRAS, Munich, Germany (2005).

6. G. Fu, A. Menciassi, and P. Dario, “Development of a low-cost active 3D triangulation laser scanner for indoor navigation of miniature mobile robots,” Robot. Auton. Syst. 60(10), 1317–1326 (2012).

7. L. Mei and M. Brydegaard, “Continuous-wave differential absorption lidar,” Laser Photonics Rev. 9(6), 629–636 (2015).

8. M. Brydegaard, A. Gebru, and S. Svanberg, “Super resolution laser radar with blinking atmospheric particles—application to interacting flying insects,” Prog. Electromagn. Res. 147, 141–151 (2014).

9. T. Scheimpflug, “Improved method and apparatus for the systematic alteration or distortion of plane pictures and images by means of lenses and mirrors for photography and for other purposes,” GB Patent No. 1196 (1904).

10. L. Mei and M. Brydegaard, “Atmospheric aerosol monitoring by an elastic Scheimpflug lidar system,” Opt. Express 23(24), A1613–A1628 (2015).

11. E. Malmqvist, S. Jansson, S. Török, and M. Brydegaard, “Effective parameterization of laser radar observations of atmospheric fauna,” IEEE J. Sel. Top. Quantum Electron. 22(3), 327–334 (2016).

12. C. Kirkeby, M. Wellenreuther, and M. Brydegaard, “Observations of movement dynamics of flying insects using high resolution lidar,” Sci. Rep. 6(1), 29083 (2016).

13. S. Zhu, E. Malmqvist, W. Li, S. Jansson, Y. Li, Z. Duan, K. Svanberg, H. Feng, Z. Song, G. Zhao, M. Brydegaard, and S. Svanberg, “Insect abundance over Chinese rice fields in relation to environmental parameters, studied with a polarization-sensitive CW near-IR lidar system,” Appl. Phys. B 123(7), 211 (2017).

14. M. Brydegaard, J. Larsson, S. Török, E. Malmqvist, G. Zhao, S. Jansson, M. Andersson, S. Svanberg, S. Åkesson, F. Laurell, and J. Bood, “Short-wave infrared atmospheric Scheimpflug lidar,” presented at the 28th International Laser Radar Conference, Bucharest, Romania (2017).

15. L. Mei, P. Guan, Y. Yang, and Z. Kong, “Atmospheric extinction coefficient retrieval and validation for the single-band Mie-scattering Scheimpflug lidar technique,” Opt. Express 25(16), A628–A638 (2017).

16. L. Mei and P. Guan, “Development of an atmospheric polarization Scheimpflug lidar system based on a time-division multiplexing scheme,” Opt. Lett. 42(18), 3562–3565 (2017).

17. L. Mei, Z. Kong, and P. Guan, “Implementation of a violet Scheimpflug lidar system for atmospheric aerosol studies,” Opt. Express 26(6), A260–A274 (2018).

18. L. Mei, P. Guan, and Z. Kong, “Remote sensing of atmospheric NO2 by employing the continuous-wave differential absorption lidar technique,” Opt. Express 25(20), A953–A962 (2017).

19. E. Malmqvist, M. Brydegaard, M. Aldén, and J. Bood, “Scheimpflug lidar for combustion diagnostics,” Opt. Express 26(12), 14842–14858 (2018).

20. G. Zhao, M. Ljungholm, E. Malmqvist, G. Bianco, L. A. Hansson, S. Svanberg, and M. Brydegaard, “Inelastic hyperspectral lidar for profiling aquatic ecosystems,” Laser Photonics Rev. 10(5), 807–813 (2016).

21. F. Gao, J. Li, H. Lin, and S. He, “Oil pollution discrimination by an inelastic hyperspectral Scheimpflug lidar system,” Opt. Express 25(21), 25515–25522 (2017).

22. G. Zhao, E. Malmqvist, K. Rydhmer, A. Strand, G. Bianco, L.-A. Hansson, S. Svanberg, and M. Brydegaard, “Inelastic hyperspectral lidar for aquatic ecosystems monitoring and landscape plant scanning test,” in Proceedings of the 28th International Laser Radar Conference (ILRC28), Bucharest, Romania (2017).

23. M. Brydegaard, E. Malmqvist, S. Jansson, J. Larsson, S. Török, and G. Zhao, “The Scheimpflug lidar method,” Proc. SPIE 10406, 104060I (2017).

24. G. Bickel, G. Häusler, and M. Maul, “Triangulation with expanded range of depth,” Opt. Eng. 24(6), 246975 (1985).

25. G. Häusler and W. Heckel, “Light sectioning with large depth and high resolution,” Appl. Opt. 27(24), 5165–5169 (1988).

26. A. Krischke, C. Knothe, P. Gips, and U. Oechsner, “Laser line generators for light-sectioning in rail inspection: 3D-measurement and process control for research and industrial environments,” Laser Tech. J. 10(1), 41–44 (2013).
