## Abstract

A 3D sensing method to retrieve an entire shape from many segmented profiles is described. Image registration is not required in this method. Advantages of this method include (1) very high integration accuracy, (2) improved robustness, (3) reduced computation time, (4) very low computational cost for the data fusion, and (5) the capability to compensate distortions of the optical system at every pixel location.

© 2008 Optical Society of America

## 1. Introduction

Fringe projection [1–5] is one of the best-known 3D sensing methods because of its non-scanning, full-field nature. A fringe pattern is projected onto the inspected object, and the phase distribution is evaluated from another viewpoint. The distorted fringes contain depth information and can therefore be used to retrieve the inspected shape. However, when the object of interest is too complicated, shading and obstruction appear, and the data points in those areas are lost. It is difficult to inspect the entire surface from one viewpoint, so segmented measurements from different partial views are generally required to form the entire shape.

The entire inspected surface is divided into regions that overlap with each other. Each segmented region is then inspected by a 3D sensor from a partial view. The overlapped regions provide the information needed to establish the correspondence between two segmented coordinate systems. Once the correspondence in the overlapped region is known, the results obtained at different perspective angles can be transformed to a common coordinate system for correct fusion of the partial data sets.

Searching for matched points between the associated data sets in the overlapped regions is called image registration. It plays a key role in identifying the correspondence between two coordinate systems. There is extensive research on registration schemes; most can be categorized as area-based algorithms [6] or feature-based algorithms [7]. Another well-known scheme, so-called active stereo vision [8, 9], utilizes structured light projected onto the inspected surface to perform the registration. Searching for two matched surface points is thereby simplified to the task of identifying the matched features created by the projection of the structured light.

Even though all the above registration schemes perform well and can be employed to integrate the data sets, limitations remain. All of these methods require subpixel accuracy to ensure the integration accuracy, which makes the computation cost relatively high. Large errors also arise when the overlapped region is not large enough. In addition, an inaccurate transformation from the segmented coordinate systems to a reference system degrades the integration performance. The overall measurement becomes time consuming, making real-time operation impractical.

In this paper, an integration method to reconstruct the 3D shape of a complicated object is presented. Image registration is not required in our setup, so errors from registration are eliminated. Since there is no need to identify the correspondence between two segmented systems, the computation cost is reduced as well. Each segmented 3D shape sensor is calibrated with the same calibration tools; thus, the systematic accuracy is determined only by the accuracy of each segmented measurement. In addition, this calibration scheme makes it possible to compensate perspective errors and lens distortions at every pixel location.

## 2. Principle

#### 2.1 Photogrammetry of a projected fringe system

Figure 1 shows the optical setup of the 3D shape sensor, which is used for the segmented 3D measurements. A one-dimensional fringe pattern that lies in the *x_{g}*-*y_{g}* plane, with fringes normal to the *x_{g}*-axis, is projected onto the inspected object. The phase distribution can be mathematically expressed as

$$\varphi(x_g) = \frac{2\pi}{d}\, x_g, \tag{1}$$

where *d* is the period of the fringes.

The shape of the tested object is identified in the world coordinate system (*x*, *y*, *z*). The relationship between the fringe coordinates and the world coordinates is given by

$$x_g = r_{11}^{(p)}\,(x+\Delta x) + r_{12}^{(p)}\,(y+\Delta y) + r_{13}^{(p)}\,(z+\Delta z) + t_{1}^{(p)}, \tag{2}$$

where *r_{ij}* are coefficients accounting for magnification and coordinate rotation, and *t_{i}* are shifting parameters. The superscript (*p*) denotes a projector parameter. Distortion from the projection lens is addressed with Δ*x*, Δ*y*, and Δ*z*.

Fringe deformation on the object is then imaged onto the detection plane, which is located in the coordinate system (*x_{d}*, *y_{d}*). A similar relationship between the detector coordinates and the world coordinates is represented as

$$\begin{aligned} x_d &= r_{11}^{(c)}\,(x+\Delta x) + r_{12}^{(c)}\,(y+\Delta y) + r_{13}^{(c)}\,(z+\Delta z) + t_{1}^{(c)},\\ y_d &= r_{21}^{(c)}\,(x+\Delta x) + r_{22}^{(c)}\,(y+\Delta y) + r_{23}^{(c)}\,(z+\Delta z) + t_{2}^{(c)}. \end{aligned} \tag{3}$$

The superscript (*c*) denotes a camera parameter. For a fixed detection location (*x_{d}*, *y_{d}*), the terms on the left side of Eq. (3) are two constants. Equation (3) therefore simplifies to

$$x = \sum_{i} a_i\, z^{\,i}, \qquad y = \sum_{i} b_i\, z^{\,i}, \tag{4}$$

where *a_{i}* and *b_{i}* are constants that can be determined with a proper calibration scheme. Substituting Eq. (4) into Eq. (2), the variable *z* can be represented as a function of phase. Even with the distortions of the projection lens taken into account, it can still be approximated as

$$z = \sum_{i=0}^{N} c_i\, \varphi^{\,i}, \tag{5}$$

where the parameters *c_{i}* can be determined with a proper calibration scheme. A calibration procedure is described in Section 2.2.
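As a concrete sketch of this per-pixel calibration, the following Python/NumPy snippet fits the *φ*-to-*z* polynomial of Eq. (5) independently at every pixel from a stack of absolute phase maps recorded at known depths. The function and variable names are illustrative, not from the paper:

```python
import numpy as np

def fit_phase_to_z(phases, z_positions, order=4):
    """Fit z = sum_i c_i * phi**i at every pixel location.

    phases: (K, H, W) absolute phase maps recorded at K known z positions.
    z_positions: (K,) translation-stage depths.
    Returns coeffs: (order + 1, H, W), highest power first (np.polyfit order).
    """
    K, H, W = phases.shape
    coeffs = np.empty((order + 1, H, W))
    for r in range(H):
        for c in range(W):
            # One independent least-squares fit per pixel.
            coeffs[:, r, c] = np.polyfit(phases[:, r, c], z_positions, order)
    return coeffs

def phase_to_z(coeffs, phase_map):
    """Evaluate the fitted polynomial pixel-wise (Eq. (5))."""
    z = np.zeros_like(phase_map)
    for c in coeffs:  # Horner evaluation, highest power first
        z = z * phase_map + c
    return z
```

Because the fit is done per pixel, lens distortion and perspective effects are absorbed into each pixel's own coefficients, which is the basis of the compensation capability claimed in the abstract.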

#### 2.2 Parameter identification

As shown in Fig. 2(a), a standard diffusive flat plate is used to identify the parameters *c_{i}* of a 3D sensing system. A sinusoidal pattern is projected onto the flat surface, which is oriented perpendicular to the *z* direction. Fringes on the flat surface are then captured by a CCD detector. The phase measurement is repeated as the flat is successively translated to different *z* positions. With the Fourier transform method [1] or the phase-shifting technique [2], the wrapped phases can be found. Unwrapping is then performed to recover the true phase from the modulo-2*π* data. Thus, a set of absolute phases and the associated *z* positions is obtained at each pixel location. With a subsequent curve-fitting process, the *φ*-to-*z* relation at each pixel is determined, and the parameters *c_{i}* are thereby obtained from the curve-fitting algorithm.

To identify the parameters *a_{i}* and *b_{i}*, a 2D fringe pattern is used as a calibration tool. The reflectivity of this pattern is sinusoidal in both directions and can be expressed as

$$R(x, y) = R_0 + R_1 \cos\!\left(\frac{2\pi x}{d_x}\right) + R_2 \cos\!\left(\frac{2\pi y}{d_y}\right), \tag{6}$$

where *d_{x}* and *d_{y}* are the known periods in the *x*- and *y*-direction, respectively.

This pattern is sequentially placed at known planes *z* = *z_{i}* in front of the CCD camera, as shown in Fig. 2(b). Images of the fringe pattern at the various *z* positions are captured by the CCD camera. Thus a series of 2D absolute phases and the corresponding *z* values is obtained at each pixel location. Since the periods of the 2D fringes are known, the absolute phases can be used to represent the associated transverse positions. Thus, the relationship between *x* and *z* (or *y* and *z*) can be found. With simple algebraic manipulation, the parameters *a_{i}* and *b_{i}* can be evaluated.

Note that the detection plane does not need to be perpendicular to the optical axis of the imaging lens. This makes it more flexible to retrieve shapes from arbitrary viewpoints.
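To make the *z*-to-*x* step concrete: since the period *d_{x}* is known, the absolute phase of the 2D pattern converts directly to a transverse position, and a per-pixel fit of *x* against *z* yields the *a_{i}* (the same code applies to *y* and *b_{i}*). A minimal sketch under the paper's model, with illustrative names:

```python
import numpy as np

def transverse_from_phase(phi, period):
    """Absolute phase of a sinusoidal pattern of known period -> position."""
    return period * phi / (2.0 * np.pi)

def fit_z_to_x(phi_x_stack, z_positions, d_x, order=1):
    """Fit x = sum_i a_i * z**i at every pixel (x part of Eq. (4)).

    phi_x_stack: (K, H, W) absolute x-phases recorded at K known depths.
    Returns a: (order + 1, H, W) coefficients, highest power first.
    """
    x_stack = transverse_from_phase(phi_x_stack, d_x)
    K, H, W = x_stack.shape
    a = np.empty((order + 1, H, W))
    for r in range(H):
        for c in range(W):
            # Independent least-squares fit for each pixel's line of sight.
            a[:, r, c] = np.polyfit(z_positions, x_stack[:, r, c], order)
    return a
```

With `order=1` each pixel is modeled as a straight line of sight; a higher order would absorb imaging-lens distortion, mirroring the remark about Eq. (4).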

#### 2.3 Calibration of multiple measurements

Figure 3 illustrates a setup for the entire-shape measurement. Each sensing system performs segmented measurements from its partial view. As described in Section 2.1, the shape retrieved from a partial view can be expressed as

$$z^{(l)} = \sum_{i=0}^{N} c_i^{(l)} \left[ \varphi^{(l)} \right]^{i}, \tag{7}$$

where the superscript (*l*) denotes a parameter from the *l*-th sensing system.

Calibration procedures described in Section 2.2 are employed to identify the parameters of each sensing system. Setups to determine the *φ*-to-*z* conversion and the *z*-to-*x* (or *y*) conversion are shown in Fig. 4 and Fig. 5, respectively. The calibration tool (either the optical flat or the 2D pattern) is placed at the known plane *z* = *z_{i}*. The distribution of fringes on the calibration tool is recorded by all the CCD cameras at the same time. The phase measurement is repeated at different *z* positions and stops when enough measurements have been obtained to execute the curve-fitting algorithm. Thus, for each sensor system, a series of absolute phases and the associated depths is obtained at each pixel location. The relationships between *φ*^{(l)} and *z*, *z* and *x*, and *z* and *y* in each sensor system are determined. The parameters *a_{i}*^{(l)}, *b_{i}*^{(l)}, and *c_{i}*^{(l)} are then identified by the curve-fitting algorithm.

All the sensing systems are calibrated with the same calibration tools, which are placed in the world coordinate system (*x*, *y*, *z*). Thus, all the segmented profiles are retrieved in the same coordinate system, and integration reduces to displaying all the point clouds in that coordinate system. Once all the sensing systems have been calibrated, all the optical elements must remain rigid to ensure the repeatability of the data fusion.
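Because every calibrated sensor already reports its points in the common world frame, the fusion step amounts to concatenating the point clouds; no transform has to be estimated. A sketch of this (deliberately trivial) step:

```python
import numpy as np

def fuse_point_clouds(clouds):
    """Fuse segmented measurements from pre-calibrated sensors.

    clouds: list of (N_l, 3) arrays of (x, y, z) points, each already
    expressed in the common world coordinate system by its own sensor
    calibration. No registration or transform estimation is required.
    """
    return np.vstack(clouds)
```

This is where the low fusion cost claimed in the abstract comes from: the per-point work is constant-time copying, in contrast to the iterative correspondence search of registration-based schemes.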

## 3. Experiments

In our setup, a bowl approximately 120 *mm* in diameter and 40 *mm* in depth was chosen as the inspected sample. Two 3D sensing systems located at different viewpoints were used to perform the segmented measurements. In each system, the projected fringes were observed by a CCD camera with 1024×1024 pixels at 12-bit resolution. The recorded images from the two viewpoints are shown in Figs. 6(a) and 6(b), respectively. Data on the edge of the bowl were lost because of shading.

Phases were evaluated using the phase-shifting technique. Figures 7(a) and 7(b) show the computed phases, which were wrapped within the interval between −*π* and *π*. Phase unwrapping was then performed to eliminate the discontinuities; in our experiment, Goldstein's algorithm [10] was used to restore the absolute phases. Errors occur during the unwrapping procedure because of phase noise, surface anomalies, and insufficient sampling density, and these errors may propagate over a large area and spoil many pixels. Shown in Figs. 8(a) and 8(b) are the unwrapped results. Unwrapping errors appear on the beveled edge around the shading area.
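For reference, a common way to obtain such wrapped phases is the four-step phase-shifting formula; the paper does not state how many shift steps were used, so the following is only an illustrative sketch:

```python
import numpy as np

def wrapped_phase_4step(I1, I2, I3, I4):
    """Four-step phase shifting with shifts 0, pi/2, pi, 3*pi/2.

    Each frame is I_k = A + B*cos(phi + (k-1)*pi/2). The DC term A and
    the modulation B cancel in the differences, leaving the wrapped
    phase in (-pi, pi].
    """
    return np.arctan2(I4 - I2, I1 - I3)
```

The arctangent output is inherently wrapped, which is why the unwrapping step discussed above is needed before the phase can be mapped to depth.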

With Eq. (7), the absolute phase at each pixel can be used to determine the *z* position; the *z* value is then used to determine the *x* and *y* locations through Eq. (4). The relationship between *φ*^{(l)} and *z* becomes more accurate as the order of the polynomial, *N*, is increased. Our experiments showed that coefficient accuracy on the order of microns could be achieved with *N* = 4. A degree-4 fit requires at least 5 measurements at different *z* positions to identify the parameters; in our setup, 20 measurements were employed for the curve-fitting algorithm. The calibration range along the *z*-axis was 50 *mm*, which ensured that all data points fell within the calibrated volume. With the phase-shifting technique, the depth accuracy of each segmented system was approximately 5 *µm*. The transverse accuracy, however, was only 127 *µm*, limited mainly by the sampling density of the CCD camera.
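The per-pixel reconstruction chain just described, phase to depth via Eq. (7) and depth to transverse position via Eq. (4), can be sketched as follows (the coefficient values in the example are placeholders, not calibration results from the paper):

```python
import numpy as np

def reconstruct_pixel(phi, c, a, b):
    """Per-pixel reconstruction: phi -> z (Eq. (7)), then z -> (x, y) (Eq. (4)).

    c, a, b: 1-D coefficient arrays for this pixel, highest power first
    (np.polyval convention). A degree-N phase polynomial needs at least
    N + 1 calibration planes to be identifiable.
    """
    z = np.polyval(c, phi)   # depth from absolute phase
    x = np.polyval(a, z)     # transverse x from depth
    y = np.polyval(b, z)     # transverse y from depth
    return x, y, z
```

Applying this at every valid pixel of each sensor directly yields the point clouds shown in Fig. 9, already in world coordinates.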

Retrieved profiles from the two partial views, in rectangular coordinates, are shown in Figs. 9(a) and 9(b). Note that Figs. 9(a) and 9(b) share the same coordinate system and dimensions; thus, there was no need to identify any transformation between the segmented systems, and the systematic accuracy was determined only by the accuracy of each segmented measurement. Figure 9(c) shows the meshed data sets, from which the incorrect points had been discarded.

## 4. Conclusion

We have presented a method to integrate segmented measurements performed by calibration-based projected fringe profilometry. Unlike conventional algorithms, image registration is not required in this setup. All the segmented sensing systems are calibrated with the same calibration tools, so all the segmented profiles are retrieved in the same coordinate system and integration reduces to displaying all the point clouds in that coordinate system. Compared with conventional integration schemes, this method offers several major advantages, including (1) very low computational cost for the data fusion, (2) reduced computation time, and (3) very high integration accuracy.

## References and links

**1. **M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. **22**, 3977–3982 (1983). [CrossRef] [PubMed]

**2. **V. Srinivasan, H. C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Appl. Opt. **23**, 3105–3108 (1984). [CrossRef] [PubMed]

**3. **D. R. Burton and M. J. Lalor, “Multichannel Fourier fringe analysis as an aid to automatic phase unwrapping,” Appl. Opt. **33**, 2939–2948 (1994). [CrossRef] [PubMed]

**4. **W. H. Su and H. Liu, “Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities,” Opt. Express **14**, 9178–9187 (2006). [CrossRef] [PubMed]

**5. **W. H. Su, “Color-encoded fringe projection for 3D shape measurements,” Opt. Express **15**, 13167–13181 (2007). [CrossRef] [PubMed]

**6. **A. W. Gruen, “Geometrically constrained multiphoto matching,” Photogramm. Eng. Remote Sens. **54**, 633–641 (1988).

**7. **L. G. Brown, “A survey of image registration techniques,” ACM Comput. Surv. **24**, 325–376 (1992). [CrossRef]

**8. **C. Reich, R. Ritter, and J. Thesing, “3-D shape measurement of complex objects by combining photogrammetry and fringe projection,” Opt. Eng. **39**, 224–231 (2000). [CrossRef]

**9. **A. Dipanda, S. Woo, F. Marzani, and J. M. Bilbault, “3-D shape reconstruction in an active stereo vision system using genetic algorithms,” Pattern Recogn. **36**, 2143–2159 (2003). [CrossRef]

**10. **E. Zappa and G. Busca, “Comparison of eight unwrapping algorithms applied to Fourier-transform profilometry,” Opt. Lasers Eng. **46**, 106–116 (2008). [CrossRef]