
Focal plane coincidence method for a multi-view telecentric 3D imaging system

Open Access

Abstract

Multi-view microscopic fringe projection systems, which use high-resolution telecentric lenses and the Scheimpflug condition, face challenges in coinciding focal planes accurately, resulting in inconsistent measurements between views. In this Letter, we developed a sharpness evaluation function based on the total power of the line-spread function, which was subsequently used to generate a full-field sharpness distribution map. Then we employed the correlation between the sharpness map and orientation of the focal plane to precisely coincide the focal planes. Experimental results validate the proposed method and demonstrate its improved consistency in 3D reconstruction.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

Microscopic fringe projection profilometry (MFPP) is a three-dimensional (3D) imaging technique suitable for microscale objects [1]. Compared to the traditional single-view system (one projector, one camera), a multi-view system (one projector, multiple cameras) offers significantly higher measurement accuracy [2], a larger measurable depth range [3], a higher dynamic range [4], and fewer occluding shadows [5]. However, the problem of accurately coinciding the focal planes of different views remains unresolved. In recent years, the increasing use of high-resolution telecentric lenses in MFPP systems [2,3,6] has made this problem more prominent and more difficult. While the Scheimpflug principle aids in coinciding the focal planes to form a common focus area [3], maximizing image sharpness from all views across the entire field of view (FOV) remains a challenge [6].

Non-coincidence of focal planes leads to inconsistent 3D measurement results between different views, with the degree of inconsistency depending on the position in the FOV. Two mechanisms contribute to this inconsistency. The first is non-parallelism of the focal planes, which results in varying depth references across the FOV. The second is non-uniform defocus across the entire FOV, which introduces ambiguity in detailed features of the surface profile. The impact of the first mechanism can be reduced by coarse-to-fine point cloud registration, at the cost of complicated data processing. However, compensating for the deviation of detailed features due to the second mechanism in post-processing is challenging and unquantifiable. Therefore, a method to quantitatively evaluate and control the quality of focal plane coincidence is necessary.

Accurate coincidence of focal planes across the FOV not only extends the lateral measurable range but also ensures the consistency of the measurement, aspects previously overlooked. Consistent measurements are crucial in industry, especially for micro-defect inspection and microscale 3D metrology. Although studies [1,3,6] have noted the coincidence issue, effective methods for precise quality control of coincidence are still lacking, to the best of our knowledge. Typically, mechanical components position the imaging modules at specific locations, and the orientation of each imaging module is then fine-tuned until the image from that view is clearest. Obviously, this empirical method falls short of precisely controlling the quality of coincidence. The study by Wang et al. [3] inspires the idea that equal clarity of a common patterned target imaged from different views indicates coincidence of the focal planes. Image clarity is characterized by sharpness, which is measured with gradient operators in autofocus applications. However, gradient operators struggle to discern minor clarity differences. This difficulty arises from the mismatch between the cutoff frequency of the gradient operator's transfer function and the spatial-spectrum variation caused by minimal defocus. Consequently, some signal components linked to defocus cannot contribute to sharpness, reducing the sensitivity of sharpness to defocus. Additionally, pixel-size-induced sampling bandwidth limitations lead to the loss of defocus-related spectral components in the original image, further weakening the sharpness–defocus correlation. Therefore, using gradient operators to differentiate clarity demonstrates poor robustness when the non-coincidence of the focal planes is small.

In this Letter, we propose a method for coinciding the focal planes of multi-view telecentric 3D imaging systems, especially those employing the Scheimpflug principle. A sharpness evaluation function was developed to maximize the contribution of defocus-related spectral components to sharpness. This function enables visualization of minor changes in focal plane orientation through a sharpness distribution map. On this basis, it is possible to quantitatively control the quality of the coincidence between the focal plane and the common patterned target plane. The focal planes of the vertical and oblique views were brought into coincidence using the proposed method. Experimental results show a maximum axial deviation of under 25 μm between the focal planes within the FOV of the telecentric lens. Furthermore, another experiment demonstrated the positive impact of focal plane coincidence on the consistency of 3D reconstructed details.

First, we define the sharpness evaluation function based on the relationship between defocus and the spatial frequency response of the imaging system. The spatial frequency response is characterized by the modulation transfer function (MTF). Theoretically, defocus alters the MTF globally [7]: even minor defocus shifts the cutoff frequency of the MTF to lower frequencies and attenuates each frequency component to a different degree. Therefore, the MTF can potentially detect slight defocus with high sensitivity. We used the edge-spread function (ESF) to measure the MTF of the imaging system [8,9]. The ideal 2D step signal is given by

$$f({x,y} )= u(x )\cdot 1(y ),\qquad u(x )= \left\{ \begin{array}{@{}ll@{}} 0, & {x \ge 0} \\ 1, & x < 0 \end{array} \right..$$

The ESF results from convolving the point-spread function (PSF) with the step signal:

$$I({x,y} )\equiv ESF(x )= PSF({x,y} )\ast f({x,y} ).$$

By means of a one-dimensional Fourier transform, the MTF of the optical system is given by

$$MTF(\nu )= |{\mathrm{{\cal F}}\{{LSF(x )} \}} |= \left|{\mathrm{{\cal F}}\left\{ {\frac{d}{{dx}}ESF(x )} \right\}} \right|,$$
where LSF(x) is the line-spread function. To ensure intuitive physical significance in subsequent definitions, the total power of LSF(x) based on discrete sampling is given by
$$\textrm{Total power} = \sum\limits_{{\nu _{\textrm{low}}}}^{{\nu _{\textrm{high}}}} {\frac{{MT{F^2}(\nu )}}{{{N^2}}}\Delta \nu } .$$
$MT{F^2}(\nu )/{N^2}$ represents the power spectrum of LSF(x), and N is the sample number.

To maximize the correlation between the total power and defocus effects, it is necessary to further define the bandwidth range. The super-resolution sampling strategy used to extract the 1D ESF from the 2D edge image results in redundancy within the sampling bandwidth: the portion of the sampling bandwidth above the spatial frequency corresponding to the diffraction limit is unaffected by defocus. In addition, the low-frequency components of the MTF are not only insensitive to defocus but also susceptible to disturbance by non-uniform illumination. Therefore, we set the bandwidth limits as follows:

$${\nu _{\textrm{low}}} = 0.1{f_{\textrm{diff}\textrm{.lim}}},\qquad {\nu _{\textrm{high}}} = 0.9{f_{\textrm{diff}\textrm{.lim}}},$$
where ${f_{\textrm{diff}\textrm{.lim}}}$ is the cutoff frequency corresponding to the diffraction limit satisfying the Rayleigh criterion.

In this study, we use the bandwidth-limited total power to characterize sharpness. It is important to note that this “sharpness” reflects the spatial frequency response of the imaging system, not a characteristic of the image itself. For convenience, we refer to the total power as “sharpness.” Besides its potential sensitivity benefits, this sharpness evaluation method is also generalizable. This generalizability arises from a high-quality edge pattern having sufficient bandwidth and a consistent spectrum, so that the bandwidth-limited total power can be used to evaluate the sharpness of telecentric imaging modules with different specifications.
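As a minimal sketch of this sharpness definition (assuming a uniformly sampled, noise-free 1D ESF and NumPy; the function name `lsf_total_power` and its arguments are illustrative, not from the paper), the bandwidth-limited total power of the LSF can be computed as:

```python
import numpy as np

def lsf_total_power(esf, dx, f_diff_lim):
    """Bandwidth-limited total power of the LSF from an ESF sample.

    esf        : 1D edge-spread function samples (assumed uniformly spaced)
    dx         : sample spacing of the (super-resolved) ESF
    f_diff_lim : diffraction-limited cutoff frequency, in units of 1/dx
    """
    lsf = np.gradient(esf, dx)              # LSF(x) = d/dx ESF(x)
    n = lsf.size
    mtf = np.abs(np.fft.rfft(lsf))          # MTF(nu) = |F{LSF(x)}|
    freqs = np.fft.rfftfreq(n, d=dx)
    dnu = freqs[1] - freqs[0]
    # Keep only 0.1 f_diff.lim <= nu <= 0.9 f_diff.lim
    band = (freqs >= 0.1 * f_diff_lim) & (freqs <= 0.9 * f_diff_lim)
    # Total power = sum over band of MTF^2(nu) / N^2 * d(nu)
    return np.sum(mtf[band] ** 2) / n ** 2 * dnu
```

A sharper edge should yield a larger value than a more defocused one, which is what makes this quantity usable as a sharpness score.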

Second, to measure the distribution of sharpness across the FOV, we used a checkerboard target as an array edge-signal generator. A sharpness distribution map is obtained by calculating the sharpness of each edge. The procedure for creating a sharpness map can be summarized as follows:

  • Step 1: Segment the checkerboard image into an array of ROIs, each containing an edge pattern, as shown in Fig. 1(a).
  • Step 2: Convert the 2D edge patterns into 1D ESFs; Fig. 1(b) shows the ESFs for the red and blue ROIs in Fig. 1(a). The conversion algorithm is detailed in Appendix C of [9].
  • Step 3: Calculate the sharpness using Eqs. (3) and (4), as shown in Fig. 1(c).
  • Step 4: Associate the center coordinate of each ROI with its sharpness to create the sharpness map, as shown in Fig. 1(d).
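Steps 1–4 can be sketched as follows (a simplified tiling sketch: real segmentation would follow detected checkerboard corners, and `edge_sharpness` stands in for the ESF extraction and sharpness calculation of Steps 2–3; both names are illustrative):

```python
import numpy as np

def sharpness_map(image, roi_size, edge_sharpness):
    """Tile `image` into square ROIs and associate each ROI center
    with the sharpness of the edge pattern it contains.

    edge_sharpness : caller-supplied function mapping a 2D ROI to a
                     scalar sharpness (e.g. the bandwidth-limited
                     total power of the ROI's ESF).
    Returns an (n_rois, 3) array with columns (x_center, y_center, s).
    """
    h, w = image.shape
    points = []
    for r in range(0, h - roi_size + 1, roi_size):      # Step 1: segment
        for c in range(0, w - roi_size + 1, roi_size):
            roi = image[r:r + roi_size, c:c + roi_size]
            s = edge_sharpness(roi)                     # Steps 2-3
            points.append((c + roi_size / 2, r + roi_size / 2, s))
    return np.array(points)                             # Step 4: map
```

Any scalar ROI statistic can be plugged in during testing; the paper's ESF-based metric is what gives the map its sensitivity to defocus.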

Fig. 1. Procedure for generating a sharpness map from a checkerboard image. (a) Segment the checkerboard into ROIs. (b) ESFs corresponding to the red and blue ROIs of (a). (c) Sharpness determined by integral region. (d) Sharpness map corresponding to (a).

Fig. 2. Centroid estimation for a sharpness map. (a) ROI segmentation across FOV. (b) Sharpness spot is divided into four quadrants.

Advantages of this full-field sharpness detection method are: (1) the checkerboard target can be procured off the shelf or fabricated by lithography; (2) a single-shot measurement produces a full-field sharpness map. In addition, in the MATLAB 2022b environment with an i5-12600K processor, steps 1–4 required 298, 232, 0.48, and 121 ms of processing time, respectively.

As shown in Fig. 2(a), a checkerboard image covering the entire FOV (14.6 × 8.6 mm) is segmented into 47 × 32 ROIs (yellow rectangles), and the sharpness map created from the ROIs is shown in Fig. 2(b). The sharpness distribution resembles an eccentric circle due to the field curvature, which shapes the focal plane into a paraboloid [10]. Ideally, if the focal plane were parallel to the target plane and intersected it at the center of the FOV, the sharpness map would form a perfect circular spot. This implies that the focal and target planes are parallel when the “centroid” of the sharpness spot is centered in the FOV. To determine the centroid, the sharpness map is divided into four quadrants, as shown in Fig. 2(b). The offset of the centroid from the FOV center in the x and y directions is quantified by

$$x = \frac{{({S1 + S2} )- ({S3 + S4} )}}{{S1 + S2 + S3 + S4}},\qquad y = \frac{{({S1 + S4} )- ({S2 + S3} )}}{{S1 + S2 + S3 + S4}},$$
where S1, S2, S3, and S4 represent the sum of sharpness from each of the four quadrants.
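The quadrant sums and normalized offsets above reduce to a few array operations (a sketch; the quadrant numbering below — S1 upper right, S2 lower right, S3 lower left, S4 upper left — is an assumption consistent with the offset definitions, since S1 and S2 must share a horizontal half and S1 and S4 a vertical half):

```python
import numpy as np

def centroid_offset(smap):
    """Normalized x/y offset of the sharpness-spot centroid from the
    FOV center, given a 2D sharpness map `smap`."""
    h, w = smap.shape
    top, bottom = smap[: h // 2], smap[h // 2:]
    s1 = top[:, w // 2:].sum()      # upper right (assumed numbering)
    s2 = bottom[:, w // 2:].sum()   # lower right
    s3 = bottom[:, : w // 2].sum()  # lower left
    s4 = top[:, : w // 2].sum()     # upper left
    total = s1 + s2 + s3 + s4
    x = ((s1 + s2) - (s3 + s4)) / total   # right half minus left half
    y = ((s1 + s4) - (s2 + s3)) / total   # top half minus bottom half
    return x, y
```

A perfectly centered spot gives (0, 0); a spot displaced toward the right half of the FOV gives a positive x offset.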

The offset is related to the orientation of the focal plane. Figure 3(a) shows that the focal plane is parallel to the target plane when x and y are zero. When the two planes are parallel, the sharpness at the center of the FOV indicates the perpendicular distance between them. Figure 3(b) illustrates that, as the working distance is adjusted around the optimal focus, the focal plane and the target plane become tangent at the center of the FOV at the sharpness peak. In this case, the two planes are regarded as coincident. Notably, thanks to the field curvature, the sharpness map also distinguishes whether the focal plane is in front of or behind the target plane.

Fig. 3. Orientation and position estimation of the focal plane using the sharpness distribution map. (a) Sharpness maps corresponding to orientation of focal plane. (b) Sharpness maps with axial displacements.

The experimental setup is shown in Fig. 4(a). To demonstrate the effectiveness and generalizability of the proposed method, the MFPP system comprises a vertical view and an oblique view that satisfies the Scheimpflug condition. Each view used a telecentric lens (#15-873, Edmund Optics) and a CMOS sensor (IMX 183, Sony) with identical specifications. In the oblique view, the C-mount component of the camera was detached. An off-the-shelf checkerboard target (#12-198, Edmund Optics) served as a common target for both views.

Fig. 4. Measurement of the position and orientation of the focal plane using focal stack. (a) Experimental system. (b) Focal plane fitting. (c) Focal planes from two views in one coordinate system. (d) Perpendicular distance distribution by subtracting the two focal planes in (b).

Using the proposed method, the focal plane of each view was brought into coincidence with the common target plane, and the focal planes are therefore coincident with each other. To evaluate the quality of the coincidence, we employed the focal stack method to precisely measure the positions of the focal planes [11,12]. A motorized stage (KMTS25E/M, Thorlabs) was used to incrementally move the checkerboard target along the optical axis. The step length was set to 0.5 μm based on a black-box model of the telecentric lens provided by Edmund Optics, which indicated a field curvature of 40.5 μm. After each step of the target, each view captured an image, followed by the calculation of a sharpness map set using the proposed sharpness evaluation function. From the sharpness map set, the focal position of each pixel is determined by

$${f^\ast }(x,y) = WD({i^\ast }) = \mathop {\arg \max }\limits_i \textrm{ }S(x,y,i),$$
where $S(x,y,i)$ represents the sharpness map of the ith image and $WD({i^\ast })$ is the working distance corresponding to the index. Figure 4(b) shows the focal plane with field curvature, derived by fitting the focal-position scatter data. The field curvature matches the black-box model well. Due to the field curvature, the two focal planes cannot perfectly coincide when aligned within one coordinate system, as shown in Fig. 4(c). The focal planes intersect at the center of the FOV and are essentially parallel. A perpendicular distance distribution between the two focal planes was determined by subtracting them pixel by pixel, as shown in Fig. 4(d). This distribution appears roughly circular due to the field curvature, with a maximum distance of about 25 μm in the corners. Considering that the depth of field is 1.2 mm (f-number of 6), the profile-smoothing difference due to the perpendicular distance between the two planes is negligible in most scenarios. It is worth noting that the two telecentric lenses exhibit different field curvatures in Fig. 4(c), despite identical specifications. We hypothesize that this results from additional field curvature aberration induced by the oblique view.
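The per-pixel argmax over the focal stack reduces to a few lines (a sketch; the array names are illustrative, not from the paper's code):

```python
import numpy as np

def focal_positions(stack, working_distances):
    """Per-pixel focal position from a focal stack of sharpness maps.

    stack             : array of shape (n_steps, H, W), one sharpness
                        map per axial step of the stage
    working_distances : length-n_steps sequence of stage positions
    Returns an (H, W) map of f*(x, y) = WD(i*), where i* maximizes
    S(x, y, i) along the stack axis.
    """
    i_star = np.argmax(stack, axis=0)                # index of peak sharpness
    return np.asarray(working_distances)[i_star]     # f*(x, y) = WD(i*)
```

Fitting a smooth surface (e.g. a paraboloid, to match the field curvature) to the resulting scatter then yields the focal plane, as in Fig. 4(b).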

To validate the positive effect of the proposed method on the consistency of measurement results, a 3D reconstruction of a microscale rough surface was performed. The surface roughness Sa [13] is used to statistically assess the ability of the profilometer to retain detailed features; higher roughness indicates more detailed profile features. Phase demodulation employed a composite-frequency fringe projection method [14] with a high-frequency fringe pitch of 76.14 μm and 10 phase steps. Figure 5 shows the 3D reconstruction results with and without the proposed method. With the proposed method, the point clouds from both views are nearly coincident throughout the FOV, with a slight deviation of about 2 μm, as demonstrated in the cross-section plots of Fig. 5(b). This deviation arises from the noise-induced ambiguity of the sharpness peak. As shown in Fig. 5(c), the coincidence of the two focal planes significantly improves the roughness consistency across five areas (A0–A4). In addition, with the focal planes coincident, the roughness in the center area exceeds that in the corners due to the field curvature. The experimental results indicate that the proposed method effectively reduces the measurement inconsistency caused by the two mechanisms discussed above.

Fig. 5. 3D reconstruction results for microscale rough surface. (a) 3D reconstruction without focal plane coincidence. (b) 3D reconstruction with focal plane coincidence. (c) Roughness of areas A0–A4.

In conclusion, we proposed a focal plane coincidence method for multi-view MFPP. The method uses the bandwidth-limited total power of the line-spread function to evaluate sharpness, creating a sharpness distribution map across the entire FOV by analyzing edge patterns on a checkerboard target. Focal plane coincidence is achieved by exploiting the effect of the telecentric lens's inherent field curvature on the sharpness map. The experimental results indicate that, with the proposed method, the focal planes of the oblique and vertical views are coincident: the maximum axial deviation across the FOV is under 25 μm, approximately 2% of the depth of field. Additionally, the consistency of the 3D reconstruction results between the views is improved after focal plane coincidence. By scaling the checkerboard target, the proposed method can be applied to other multi-view MFPP systems.

Funding

National Key Research and Development Program of China (2021YFB3200202).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

REFERENCES

1. Y. Hu, Q. Chen, S. Feng, et al., Opt. Lasers Eng. 135, 106192 (2020). [CrossRef]  

2. Y. Hu, K. Zheng, Z. Liang, et al., Surf. Topogr.: Metrol. Prop. 10, 024004 (2022). [CrossRef]  

3. M. Wang, Y. Yin, D. Deng, et al., Opt. Express 25, 19408 (2017). [CrossRef]  

4. P. Zhou, H. Wang, Y. Wang, et al., Meas. Sci. Technol. 34, 075021 (2023). [CrossRef]  

5. A. Dickins, T. Widjanarko, D. Sims-Waterhouse, et al., J. Opt. Soc. Am. A 37, B93 (2020). [CrossRef]  

6. Y. Hu, Z. Liang, S. Feng, et al., Opt. Lasers Eng. 149, 106793 (2022). [CrossRef]  

7. H. H. Hopkins, Proc. R. Soc. Lond. A 231, 91 (1955). [CrossRef]  

8. G. D. Boreman, Modulation Transfer Function in Optical and Electro-Optical Systems, 2nd ed., Tutorial Texts in Optical Engineering Vol. TT121 (SPIE Press, 2021).

9. “Photography—electronic still picture imaging — resolution and spatial frequency responses,” ISO 12233:2023(E).

10. V. N. Mahajan, Aberration Theory Made Simple, 2nd ed., Tutorial Texts Vol. TT93 (SPIE Press, 2011).

11. S. Matsunaga and S. K. Nayar, IEEE Trans. Comput. Imaging 1, 259 (2015). [CrossRef]  

12. X. Hu, G. Wang, J.-S. Hyun, et al., Opt. Lett. 45, 375 (2020). [CrossRef]  

13. R. Windecker, S. Franz, and H. J. Tiziani, Appl. Opt. 38, 2837 (1999). [CrossRef]  

14. D. Wang, W. Zhou, Z. Zhang, et al., Opt. Express 31, 39528 (2023). [CrossRef]  
