
Telecentric 3D profilometry based on phase-shifting fringe projection

Open Access

Abstract

Three-dimensional shape measurement in the microscopic range is becoming increasingly important with the development of micro-manufacturing technology. Microscopic fringe projection techniques offer fast, robust, full-field measurement for field sizes from approximately 1 mm² to several cm². However, the depth of field of a non-telecentric microscope is very small and often insufficient to cover the complete depth of a 3D object, and the calibration of the phase-to-depth conversion is complicated, requiring a precision translation stage and a reference plane. In this paper, we propose a novel telecentric phase-shifting projected fringe profilometry for small, thick objects. Telecentric imaging extends the depth of field to the millimeter order, which is much larger than that of microscopy. To avoid the complicated phase-to-depth conversion of microscopic fringe projection, we develop a new system calibration method for the camera and the projector based on a telecentric imaging model. On this basis, 3D reconstruction is performed with telecentric stereovision aided by fringe phase maps. Experiments demonstrate the feasibility and high measurement accuracy of the proposed system for thick objects.

© 2014 Optical Society of America

1. Introduction

With the advance of micro-manufacturing technology, ever-improving metrology is required for the production of precision-engineered micro-parts. Precision systems such as micro-electro-mechanical systems (MEMS) and micro-optical devices incorporate assemblies of micro-parts with sizes on the order of micrometers to millimeters. Numerous non-contact optical methods have been utilized to determine the shape of an object, such as confocal microscopy, white-light interferometry, and microscopic fringe projection based on triangulation [1,2]. Although confocal microscopy and white-light interferometry offer resolutions in the nanometer range, microscopic fringe projection offers a fast and robust method for measurement field sizes from approximately 1 mm² to several cm² [3].

Phase-shifting projected fringe profilometry (PSPFP) is notable for its non-contact nature, full-field measurement capability, high profile sampling density, and low environmental vulnerability. With a phase-shifting detection scheme, accuracy better than one part in ten thousand of the field of view can be achieved even with considerable image noise. Traditional microscopic fringe projection methods use a zoom stereo microscope as the base optical system, with one of the camera ports adapted for fringe projection by an LCD, LCOS, or DMD [3–6]. However, the depth of field of a microscope is limited to the sub-millimeter order, which is often not sufficient to measure the complete depth of a 3D object at once. A larger depth of field can be achieved with an additional vertical translation stage for the object, but this results in a complex system and low efficiency. Moreover, the calibration of the phase-to-depth conversion is complicated, requiring a precision translation stage and a reference plane, which directly introduce measurement error [7]. Compared with conventional lenses, telecentric lenses perform orthographic projection and offer many advantages such as low distortion, constant magnification, and increased depth of field [8]. Telecentric lenses have therefore become a key component in high-accuracy gauging applications, and several calibration methods for telecentric lenses and telecentric 3D measurement systems have been developed. We have proposed a simple, accurate calibration method for a camera with a telecentric lens and studied the impact of different distortion models in [9]. Chen proposed a calibration method for a telecentric stereo micro-vision system based on speckle structured-light projection [10]; the calibration procedure uses Zhang's method [11] to solve the orthographic projection model of the telecentric lens. Zhu developed a system for deformation measurement with telecentric lenses based on speckle projection and proposed a simple calibration method based on linear fitting [12]. Kim proposed a telecentric lens system with multiple aperture stops that can acquire 3D information [13]. However, all these methods are limited in resolution and accuracy because they rely on speckle projection or stereo matching. Haskamp proposed a fringe projection system consisting of a camera with an object-side telecentric lens and a projector with a non-telecentric lens, and described the model and calibration strategy for calculating the system parameters [14]. However, that system is not suitable for 3D measurement of small, thick objects because the fields of view of the telecentric camera and the ordinary projector do not match. In addition, neither the 3D reconstruction algorithm nor the 3D measurement accuracy was addressed, and the parameter solution relied on a multi-step nonlinear optimization initialized with zeros, which converges more slowly and less efficiently than the method in [9].

In this paper, we propose a novel telecentric phase-shifting projected fringe profilometry for small objects. Telecentric imaging extends the depth of field to several millimeters, which is much larger than that of microscopy. We build the telecentric fringe projection system from a camera with a telecentric lens and a projector whose projection lens is also replaced by a telecentric lens. Phase-shifted fringe patterns are projected onto the test objects by the modified telecentric projector; the distorted fringe patterns are captured by the telecentric camera, and the phase map is calculated using the phase-shifting technique. To avoid the complicated phase-to-depth conversion of microscopic fringe projection, we treat the projector as a virtual camera. Since telecentric imaging has the unique property of purely orthographic projection of scene points, conventional camera calibration methods based on a pinhole model are not applicable. We therefore calibrate the camera and the projector using the telecentric imaging model and procedure presented in [9]. On this basis, a new telecentric stereovision method for 3D reconstruction, aided by fringe phase maps, is proposed.

The principle of the telecentric imaging system is described in Section 2. The calibration of the telecentric fringe projection system is described in detail in Section 3. The principle of telecentric 3D reconstruction and stereo matching assisted by phase values is explained in Section 4. Section 5 presents experimental results on a calibration target for quantitative evaluation of the measurement accuracy of the proposed system; a 3D measurement of Ball-Grid-Array (BGA) chip solder balls is also shown to demonstrate the validity of the proposed method. Conclusions are presented in Section 6.

2. Principle

2.1 Telecentric imaging

Telecentric lenses have the unique property of purely orthographic projection of scene points and maintain a constant magnification over a specific range of object distances. In a bilateral telecentric lens system, only light rays that are approximately parallel to the optical axis of the lens pass the aperture stop and form the image. A bilateral telecentric lens accurately reproduces dimensional relationships within its telecentric depth, and it is not susceptible to small differences in the distance between the lens and the camera sensor. Telecentric lenses normally show very low distortion and offer a depth of field on the millimeter order, which is much larger than that of a microscope. We therefore introduce telecentric lenses for three-dimensional shape measurement with fringe projection.

2.2 System design

The optical layout of the telecentric fringe projection 3D shape measurement system is shown in Fig. 1. The telecentric imaging arm consists of a DH-HV 1351UC camera with a resolution of 1280 × 1024 and a GCO230105 telecentric lens with a magnification of 0.16. The telecentric projection is realized by an HP AX325AA projector with a resolution of 800 × 600, in which the projection lens is replaced by a GCO230106 telecentric lens with a magnification of 0.12. The measurement field is about 30 mm × 20 mm, and the depth of field reaches 5 mm, which is usually sufficient to cover the depth of a small, complex 3D object.

Fig. 1 The layout of the telecentric fringe projection system.

Compared with traditional microscopic 3D measurement systems, the proposed system uses bilateral telecentric lenses, which reduces lens distortion and extends the depth of field. In addition, the field of view of the system can be changed by adjusting the magnification of the telecentric lenses. However, telecentric imaging performs orthographic projection, which differs from the conventional pinhole model, so new system calibration and 3D reconstruction techniques based on telecentric imaging need to be developed.

2.3 Four-step phase-shifting technique

Phase-shifting algorithms are widely used in optical metrology because of their measurement speed and accuracy [15]. Numerous phase-shifting algorithms have been developed, including three-step, four-step, and five-step variants. In this paper, we use a four-step phase-shifting algorithm with a phase shift of π/2. Four fringe patterns with a π/2 phase shift between them are sequentially projected onto the test objects by the modified telecentric projector, and the deformed fringe patterns are sequentially captured by the telecentric camera. They can be described as

$$I_k(i,j) = I'(i,j) + I''(i,j)\cos\left[\phi(i,j) + (k-1)\pi/2\right] \tag{1}$$
where k = 1, 2, 3, 4; i, j are the pixel indices; $I_k(i,j)$ is the intensity of the kth phase-shifted image; $I'(i,j)$ is the average intensity; $I''(i,j)$ is the intensity modulation; and $\phi(i,j)$ is the phase map of the fringe pattern. By solving Eq. (1), we can obtain $\phi(i,j)$ as
$$\phi(i,j) = \arctan\frac{I_4 - I_2}{I_1 - I_3} \tag{2}$$
This equation provides the phase in the range [−π, +π) with 2π discontinuities. A continuous phase map can be obtained by adopting a spatial or temporal phase unwrapping algorithm; in this research, we used the temporal phase unwrapping framework introduced in [16].
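As a minimal illustration (an assumed Python/NumPy sketch, not the authors' code), the wrapped phase of Eq. (2) can be computed from the four π/2 phase-shifted captures as follows; the function name and inputs are hypothetical.

```python
import numpy as np

def wrapped_phase(I1, I2, I3, I4):
    """Four-step phase shifting, Eq. (2): phi = arctan[(I4 - I2) / (I1 - I3)]."""
    # arctan2 resolves the quadrant and tolerates I1 == I3, returning values in (-pi, pi].
    return np.arctan2(I4.astype(float) - I2, I1.astype(float) - I3)

# I1..I4 are the four captured fringe images as 2D arrays; the returned map still
# contains 2*pi jumps and must be unwrapped (here, temporally, following Ref. [16]).
```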

3. System calibration

3.1 Camera calibration with telecentric lens

The pinhole model of wide-angle cameras is not applicable to a telecentric imaging system. Telecentric lenses perform scaled orthographic projection; the projection of an arbitrary point P onto the computer image coordinates is expressed as Eq. (3).

$$\begin{bmatrix} x_u \\ y_u \\ 1 \end{bmatrix} = \begin{bmatrix} m/d_u & 0 & 0 \\ 0 & m/d_v & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_x \\ r_{21} & r_{22} & r_{23} & t_y \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{3}$$
Taking the radial and thin prism distortion into account, we have
$$\left\{\begin{aligned} \delta_x &= k_1 x_u (x_u^2 + y_u^2) + s_1 (x_u^2 + y_u^2) \\ \delta_y &= k_1 y_u (x_u^2 + y_u^2) + s_2 (x_u^2 + y_u^2) \end{aligned}\right. \qquad \begin{bmatrix} (u-u_0)d_u \\ (v-v_0)d_v \end{bmatrix} = \begin{bmatrix} x_u \\ y_u \end{bmatrix} + \begin{bmatrix} \delta_x \\ \delta_y \end{bmatrix} \tag{4}$$
where m is the effective magnification of the telecentric lens, which needs to be calibrated; $r_{ij}$ are the elements of the rotation matrix R, and $t_x$, $t_y$ are the elements of the translation vector; $(X_w, Y_w, Z_w)$ are the 3D coordinates of the object point P in the world coordinate system; $(x_u, y_u)$ are the image coordinates under a perfect orthographic projection model; $(u, v)$ are the computer image coordinates in pixels; $(u_0, v_0)$ denotes the pixel position of the principal point O, usually the center of the image plane; and $d_u$ and $d_v$ are the pixel sizes in the x and y directions, respectively.
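For clarity, here is a small sketch of the telecentric forward projection with radial and thin-prism distortion. It is our assumption, using one consistent reading of Eqs. (3)–(4) in which $(x_u, y_u)$ are metric sensor coordinates, and is not the authors' implementation.

```python
import numpy as np

def project_telecentric(Pw, R, t, m, du, dv, u0, v0, k1, s1, s2):
    """Map a world point Pw = (Xw, Yw, Zw) to distorted pixel coordinates (u, v)."""
    # Orthographic projection: only the first two rows of [R | t] are needed.
    xu = m * (R[0] @ Pw + t[0])          # ideal image point on the sensor (metric)
    yu = m * (R[1] @ Pw + t[1])
    r2 = xu**2 + yu**2
    dx = k1 * xu * r2 + s1 * r2          # radial + thin-prism distortion, Eq. (4)
    dy = k1 * yu * r2 + s2 * r2
    u = (xu + dx) / du + u0              # metric sensor coordinates -> pixels
    v = (yu + dy) / dv + v0
    return u, v

# Pw is a length-3 NumPy array, R a 3x3 rotation matrix, and t = (tx, ty);
# m, du, dv, u0, v0, k1, s1, s2 stand for the calibrated parameters of Tables 1-2.
```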

To obtain the intrinsic parameters of the camera, a ceramic plate with a precise planar circle pattern is used. The calibration follows the two-step procedure we proposed in [9]. The calibration results are shown in Table 1, where R is given in the axis-angle representation via Rodrigues' rotation formula.

Table 1. Calibration results for the camera with telecentric lens

3.2 Projector calibration

A projector can be regarded as the inverse of a camera; thus, the calibration of a projector can be carried out in almost the same way as that of a camera [17]. Since a projector obviously cannot capture images directly, we use a camera to capture images for the projector and then transform them into projector images, as if they were captured directly by the projector. The key problem is to establish a highly accurate correspondence between camera pixels and projector pixels. In this research, we use the phase-shifting method of Section 2.3 to accomplish this task.

We project a series of sinusoidal phase-shifted fringe patterns in the vertical direction and another series in the horizontal direction onto the calibration plane and capture them. The captured images thus consist of a calibration plane image and several sinusoidal phase-shifted fringe patterns. Using the four-step phase-shifting algorithm, the phase at each pixel in both the vertical and horizontal directions can be calculated and then unwrapped with the temporal phase unwrapping method. Therefore, we can establish a one-to-one mapping between a camera image and a projector image based on phase equality.

We extract the subpixel position of each circle center $O_i^C$ from the calibration plane image, as shown in Fig. 2. The absolute phases $\varphi_h$, $\varphi_v$ of the point $O_i^C$ in the two directions are obtained by the phase-shifting algorithm and linear interpolation. The coordinates $(u_i^P, v_i^P)$ of the corresponding point $O_i^P$, which has the same absolute phases in the two directions, can then be calculated as follows:

$$u_i^P = \frac{\varphi_v(O_i^C)}{N_v \times 2\pi} \times W, \qquad v_i^P = \frac{\varphi_h(O_i^C)}{N_h \times 2\pi} \times H \tag{5}$$
where $N_v$ and $N_h$ are the numbers of fringe periods in the vertical and horizontal patterns, respectively, and W and H are the width and height of the fringe patterns in pixels.
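The mapping of Eq. (5) might be coded as follows (an assumed sketch; the interpolation routine and function names are ours, not the authors'):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def camera_to_projector(phi_v, phi_h, xc, yc, Nv, Nh, W, H):
    """Map a subpixel camera point (xc, yc) to projector coordinates via Eq. (5)."""
    # Bilinearly interpolate the unwrapped phase maps at the circle-center location.
    pv = map_coordinates(phi_v, [[yc], [xc]], order=1)[0]
    ph = map_coordinates(phi_h, [[yc], [xc]], order=1)[0]
    uP = pv / (Nv * 2 * np.pi) * W       # projector column
    vP = ph / (Nh * 2 * np.pi) * H       # projector row
    return uP, vP
```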

Fig. 2 Data set generation of projector calibration flow chart.

With this method, we can estimate the corresponding points of all the circle centers. The data set generation flow chart is shown in Fig. 2. The calibration of the projector is then converted into the calibration of a camera as in Section 3.1. The calibration results of the projector are shown in Table 2.

Table 2. Calibration results for the projector with telecentric lens

4. Telecentric 3D reconstruction

According to the telecentric imaging model, a 3D point P with world coordinates $[X_w, Y_w, Z_w]^T$ is mapped to the camera image coordinates by the following transformation:

$$\begin{bmatrix} u^C \\ v^C \\ 1 \end{bmatrix} = \begin{bmatrix} m^C/d_u^C & 0 & u_0^C \\ 0 & m^C/d_v^C & v_0^C \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_{11}^C & r_{12}^C & r_{13}^C & t_x^C \\ r_{14}^C & r_{15}^C & r_{16}^C & t_y^C \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{6}$$
where the intrinsic and extrinsic parameters are calibrated for the camera.

Similarly, we have the coordinate transformation equation for the projector,

$$\begin{bmatrix} u^P \\ v^P \\ 1 \end{bmatrix} = \begin{bmatrix} m^P/d_u^P & 0 & u_0^P \\ 0 & m^P/d_v^P & v_0^P \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_{11}^P & r_{12}^P & r_{13}^P & t_x^P \\ r_{14}^P & r_{15}^P & r_{16}^P & t_y^P \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{7}$$
where the intrinsic and extrinsic parameters are calibrated for the projector.

From Eqs. (6) and (7), we can obtain the following equations:

$$\begin{cases} u^C = u_0^C + m^C\,(r_{11}^C X_w + r_{12}^C Y_w + r_{13}^C Z_w + t_x^C)/d_u^C \\ v^C = v_0^C + m^C\,(r_{14}^C X_w + r_{15}^C Y_w + r_{16}^C Z_w + t_y^C)/d_v^C \\ u^P = u_0^P + m^P\,(r_{11}^P X_w + r_{12}^P Y_w + r_{13}^P Z_w + t_x^P)/d_u^P \end{cases} \tag{8}$$
Since vertical sinusoidal phase-shifted fringe patterns are projected for measurement, the phase of the projector image increases only in the horizontal direction. Therefore, an arbitrary point $(u^C, v^C)$ on the camera image plane with phase value $\varphi(u^C, v^C)$ identifies a vertical line in the projector image with the same absolute phase. The position $u^P$ of this corresponding vertical line in the projector image is calculated by
$$u^P = \frac{\varphi(u^C, v^C)}{N \times 2\pi} \times W \tag{9}$$
where N is the number of periods in the vertical fringe patterns and W is the width of the fringe patterns in pixels.

Therefore, for each camera point $(u^C, v^C)$, the corresponding 3D world coordinates can be calculated from Eqs. (8) and (9), which form a linear system of three equations in the three unknowns $(X_w, Y_w, Z_w)$.
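A compact sketch of this reconstruction step (assumed helper names; the parameter dictionaries stand in for the calibrated quantities of Tables 1 and 2):

```python
import numpy as np

def reconstruct_point(uC, vC, phase, cam, proj, N, W):
    """Solve Eqs. (8)-(9) for (Xw, Yw, Zw) at one camera pixel.

    cam / proj are dicts with keys m, du, dv, u0, v0, R (the top 2x3 block of the
    rotation matrix) and t = (tx, ty), for the camera and projector respectively.
    """
    uP = phase / (N * 2 * np.pi) * W                          # Eq. (9)
    A = np.vstack([
        cam['m']  / cam['du']  * cam['R'][0],                 # u^C row of Eq. (8)
        cam['m']  / cam['dv']  * cam['R'][1],                 # v^C row
        proj['m'] / proj['du'] * proj['R'][0],                # u^P row
    ])
    b = np.array([
        uC - cam['u0']  - cam['m']  / cam['du']  * cam['t'][0],
        vC - cam['v0']  - cam['m']  / cam['dv']  * cam['t'][1],
        uP - proj['u0'] - proj['m'] / proj['du'] * proj['t'][0],
    ])
    return np.linalg.solve(A, b)                              # world point (Xw, Yw, Zw)
```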

5. Experiment and discussions

A ceramic plate with a scattering surface was used for quantitative evaluation of the proposed system. There are 9 × 11 discrete black solid circle markers on the surface, as illustrated in Fig. 3. The separation of neighboring markers along the row and column directions is 3 mm with an accuracy of 1 μm. We placed the plate on an accurate translation stage with a resolution of 1 μm for depth evaluation. The plate was positioned at −1 mm, −0.5 mm, 0.5 mm, and 1 mm with respect to the reference plane M (i.e., the plate positioned at 0 mm). At each position, the 3D point cloud was calculated from Eq. (8), as shown in Fig. 3.

Fig. 3 Measurement of ceramic plate with precise circle pattern for accuracy evaluation. (a) Ceramic plate with 9 × 11 circles, with separation of 3 mm. (b) 3D reconstructed model of the ceramic plate.

We extracted the 3D coordinates of the centers of all the solid circle markers on the plate and calculated the distance between neighboring markers. The distribution of the measured distances between neighboring markers at the 1 mm position is shown in Fig. 4 (90 values along the X direction and 88 along the Y direction). Table 3 lists the average distance between markers along the X and Y directions at the four positions. The absolute errors of the measured distances are listed in the 5th and 6th columns, and the 8th and 9th columns show the standard deviations of the measured distances along the X and Y directions. To evaluate the depth measurement accuracy along the Z direction, we fitted the point cloud at each position to a plane and measured the distance of the four positions to the reference plane. The measured values and absolute errors (the absolute difference between the measured distance and the distance set by the stage) for the four positions are listed in the 4th and 7th columns of Table 3, respectively. The measurement error is less than 10 micrometers.
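A minimal sketch of the evaluation steps described above (assumed, not the authors' analysis code): a least-squares plane fit for the depth evaluation and Euclidean distances between neighboring marker centers.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to an (N, 3) point cloud."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs                                  # (a, b, c)

def neighbor_distances(centers):
    """Euclidean distances between consecutive 3D marker centers along one row."""
    return np.linalg.norm(np.diff(centers, axis=0), axis=1)
```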

Fig. 4 Measured distances between neighboring circle markers of the ceramic plate at the 1 mm position. (a) 90 measured distances along the X direction. (b) 88 measured distances along the Y direction.

Table 3. Experimental results on the accurately positioned plate (Unit: mm)

The experimental results at the four plate positions show that the proposed telecentric phase-shifting projected fringe profilometry is capable of micron-level accuracy in 3D shape measurement over a depth of several millimeters. This gives the system a clear advantage over existing microscopic fringe projection methods for thick objects.

Another evaluation experiment was carried out by measuring an electronic chip: a BGA chip containing 548 solder balls with an interval of 0.8 mm, as shown in Fig. 5(a). The area of the BGA chip is 20 mm × 20 mm, and the diameter of a solder ball is 0.45 mm.

Fig. 5 3D shape measurement of BGA solder balls. (a) BGA chip with 548 solder balls, with interval of 0.8 mm. (b) Captured fringe pattern of BGA solder balls. (c) Unwrapped phase map. (d) 3D reconstructed model of BGA solder balls. (e) Partially enlarged view of red circle in (d).

The four phase-shifted sinusoidal fringe patterns with a phase shift of π/2 are projected onto the BGA solder balls by the telecentric projector and captured by the camera with the telecentric lens. To overcome the highly varying reflectance of the BGA chip, a high dynamic range scanning (HDRS) technique is used [18]. The fringe image produced with the HDRS technique is shown in Fig. 5(b). The unwrapped phase map retrieved by the temporal phase unwrapping algorithm is shown in Fig. 5(c), and the 3D result reconstructed by the telecentric fringe projection system is shown in Fig. 5(d). The partially enlarged view in Fig. 5(e) shows that the solder balls are smooth and well reconstructed.

We selected five groups of two adjacent solder balls from different parts of the BGA chip for interval measurement. In each group, the ball centers were fitted and the center-to-center distance was calculated. We compared the calculated distances with the nominal interval of 0.8 mm; the results are shown in Table 4.
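The ball-center fitting could be done, for example, with an algebraic least-squares sphere fit (an assumed sketch, not necessarily the authors' fitting routine):

```python
import numpy as np

def fit_sphere_center(pts):
    """Algebraic least-squares sphere fit; pts is an (N, 3) patch of one solder ball."""
    # Sphere model: x^2 + y^2 + z^2 = 2*x0*x + 2*y0*y + 2*z0*z + (r^2 - |c|^2)
    A = np.c_[2 * pts, np.ones(len(pts))]
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]                                 # fitted center (x0, y0, z0)

def ball_interval(pts_a, pts_b):
    """Center-to-center distance between two adjacent solder balls."""
    return np.linalg.norm(fit_sphere_center(pts_a) - fit_sphere_center(pts_b))
```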

Table 4. Measurement results of the solder ball interval

The results show that the 3D surface profile of the BGA solder bumps is well reproduced by this system, which is much better than the results in [19,20]. From Table 4, it is clear that the measured distance between two adjacent solder balls is close to the nominal interval of 0.8 mm.

6. Conclusion

We have proposed a novel telecentric phase-shifting projected fringe profilometry for small objects. The depth of field is significantly extended to the millimeter order by telecentric imaging, which is much larger than that of traditional microscopic 3D measurement systems. We built the telecentric fringe projection system from a camera with a telecentric lens and a projector whose projection lens is replaced by a telecentric lens. We treat the projector as a camera and calibrate the camera and the projector independently using the telecentric imaging model, which differs substantially from traditional pinhole camera calibration. A phase-shifting method is used to establish the correspondence between camera pixels and projector pixels, and 3D reconstruction is realized with telecentric stereovision aided by fringe phase maps. A ceramic plate with a precision circle pattern and an accurate translation stage were used for quantitative evaluation of the measurement accuracy of the proposed system, and a 3D measurement of BGA chip solder balls was conducted. The experimental results demonstrate the feasibility of the proposed technique and show that the measurement error of the proposed system is less than 10 micrometers over a depth of several millimeters. The proposed method therefore has a clear advantage over existing microscopic fringe projection methods for thick objects.

Acknowledgments

The authors gratefully acknowledge the financial support of the Natural Science Foundation of China (No. 51305273), the Special Fund for Talent of Guangdong Province (No. 2011125), and the Science and Technology Plan Project of Shenzhen (No. JCYJ20140418091413498).

References and links

1. F. Chen, G. M. Brown, and M. Song, “Overview of three-dimensional shape measurement using optical methods,” Opt. Eng. 39(1), 10–22 (2000). [CrossRef]  

2. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Lasers Eng. 48(2), 133–140 (2010). [CrossRef]  

3. K. P. Proll, J. M. Nivet, K. Körner, and H. J. Tiziani, “Microscopic three-dimensional topometry with ferroelectric liquid-crystal-on-silicon displays,” Appl. Opt. 42(10), 1773–1778 (2003). [CrossRef]   [PubMed]  

4. S. Ri, T. Muramatsu, M. Saka, and H. Tanaka, “Fast and accurate shape measurement system utilizing the fringe projection method with a ferroelectric liquid-crystal-on-silicon microdisplay,” Opt. Eng. 51(8), 081506 (2012). [CrossRef]  

5. K. P. Proll, J. M. Nivet, C. Voland, and H. J. Tiziani, “Application of a liquid-crystal spatial light modulator for brightness adaptation in microscopic topometry,” Appl. Opt. 39(34), 6430–6435 (2000). [CrossRef]   [PubMed]  

6. C. Zhang, P. S. Huang, and F. P. Chiang, “Microscopic phase-shifting profilometry based on digital micromirror device technology,” Appl. Opt. 41(28), 5896–5904 (2002). [CrossRef]   [PubMed]  

7. H. Liu, W. H. Su, K. Reichard, and S. Yin, “Calibration-based phase-shifting projected fringe profilometry for accurate absolute 3D surface profile measurement,” Opt. Commun. 216(1-3), 65–80 (2003). [CrossRef]  

8. http://www.opto-engineering.com/resources/telecentric-lenses-tutorial.

9. D. Li and J. Tian, “An accurate calibration method for a camera with telecentric lenses,” Opt. Lasers Eng. 51(5), 538–541 (2013). [CrossRef]  

10. Z. Chen, H. Liao, and X. Zhang, “Telecentric stereo micro-vision system: Calibration method and experiments,” Opt. Lasers Eng. 57, 82–92 (2014). [CrossRef]  

11. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000). [CrossRef]  

12. F. Zhu, W. Liu, H. Shi, and X. He, “Accurate 3D measurement system and calibration for speckle projection method,” Opt. Lasers Eng. 48(11), 1132–1139 (2010). [CrossRef]  

13. J. S. Kim and T. Kanade, “Multiaperture telecentric lens for 3D reconstruction,” Opt. Lett. 36(7), 1050–1052 (2011). [CrossRef]   [PubMed]  

14. K. Haskamp, M. Kästner, and E. Reithmeier, “Accurate calibration of a fringe projection system by considering telecentricity,” Proc. SPIE Optical Metrology, Munich, paper 80821B (2011).

15. S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48(2), 149–158 (2010). [CrossRef]  

16. J. M. Huntley and H. O. Saldner, “Temporal phase-unwrapping algorithm for automated interferogram analysis,” Appl. Opt. 32(17), 3047–3052 (1993). [CrossRef]   [PubMed]  

17. S. Zhang and P. S. Huang, “Novel method for structured light system calibration,” Opt. Eng. 45(8), 083601 (2006). [CrossRef]  

18. S. Zhang and S. T. Yau, “High dynamic range scanning technique,” Opt. Eng. 48(3), 033604 (2009). [CrossRef]  

19. H. N. Yen, D. M. Tsai, and S. K. Feng, “Full-Field 3D flip-chip solder bumps measurement using DLP-based phase shifting technique,” IEEE Trans. Adv. Packag. 31(4), 830–840 (2008). [CrossRef]  

20. H. N. Yen and D. M. Tsai, “A fast full-field 3D measurement system for BGA coplanarity inspection,” Int. J. Adv. Manuf. Technol. 24, 132–139 (2004).
