Abstract

A three-dimensional (3D) profilometry method without phase unwrapping is proposed. The key factors of the proposed profilometry are the use of a composite projection of multi-frequency and four-step phase-shift sinusoidal fringes and its geometric analysis, which enable the proposed method to extract the depth information of even largely separated discontinuous objects as well as lumped continuous objects. In particular, the geometric analysis of the multi-frequency sinusoidal fringe projection identifies the shape and position of target objects in the absolute coordinate system. In this paper, the depth extraction resolution of the proposed method is analyzed and experimental results are presented.

©2009 Optical Society of America

1. Introduction

3D profilometry is a technology for extracting the position and depth information of 3D objects and has been researched intensively due to its importance in space recognition [1–4]. In profilometry, specific patterns are designed and projected on the target object surfaces (this process is referred to as space coding), and then, from the images of the projected patterns on 3D object surfaces, the shapes and other information related to depth and position are analyzed (this process is referred to as space decoding).

In general, profilometry is comparable to coherent interferometry. Some coherent interferometric techniques use multi-frequency patterns synthesized with a few laser sources with different wavelengths or several different carrier frequencies in order to reduce the phase ambiguity that usually appears in single-frequency interferometric techniques [5, 6]. One disadvantage of coherent interferometric techniques is that it is difficult to freely change or control the carrier frequencies. In contrast, profilometry uses a combination of a projector with an incoherent light source and a charge-coupled device (CCD) camera. The fringe patterns for 3D profiling of target objects are directly generated by the projector, and the projected images of the fringe patterns on the target objects captured by the CCD camera are used as measurement fringe patterns. Thus, profilometry has an advantage over coherent interferometric techniques in easily controlling and changing the spatial frequencies of fringe patterns.

In some profilometry methods, the phase-shifting technique is used for reducing background noise and solving phase ambiguity problems [7, 8]. With the phase-shifting technique, at least three phase-shift steps are necessary, which gives two relative phase bases. However, four steps of phase shifting are usually preferred for accuracy and reliability [9–11]. Thus, in this paper, we use the four-step phase-shifting method for synthesizing multi-frequency composite fringe patterns and performing the depth extraction.

On the other hand, conventional phase-shifting profilometry methods use a fringe pattern with a single spatial frequency [12, 13]. The spatial frequency is smaller than the maximum frequency that can be recorded by a CCD camera. In this case, phase unwrapping is indispensable for depth extraction of target objects [14]. Single-frequency profilometry also has a limited dynamic range of depth extraction. If separations or discontinuities in the target objects are larger than the values that can be managed by the spatial frequency of the fringe pattern used, the depth extraction of the objects cannot be successful. Also, if the spatial frequency is much higher than that of the surface relief of the target object, it is hard to extract a fine depth map of the target object surface.

Therefore, profilometry techniques using multi-frequency fringe patterns have been actively researched. In multi-frequency profilometry techniques, relatively low and relatively high frequency fringes are used to analyze relatively large separations or discontinuities and the detailed depth map, respectively [15–18].

However, even in previous multi-frequency profilometry techniques, the need for phase unwrapping still remains, although a multi-frequency technique has been applied successfully in some applications to profiling surface morphology without phase unwrapping [19].

In this paper, a novel 3D profilometry method without phase unwrapping is proposed. The key factors of the proposed method are the use of a composite projection of multi-frequency and four-step phase-shift sinusoidal fringe patterns and its geometric analysis. In most conventional profilometry techniques, only the relative depth between separate objects can be measured [20, 21]. In contrast, without a phase unwrapping process, the proposed method provides the depth and position information of target objects in the absolute coordinate system through a geometric analysis of the positions and fields of view of a projector and a CCD camera.

In Section 2, the proposed system geometry for depth extraction in the absolute coordinate system is described. The resolution of depth extraction of the system with a projector and a CCD camera with finite resolution is analyzed. In Section 3, the proposed depth extraction method is elucidated. The construction of multi-frequency and four-step phase-shift sinusoidal fringe patterns and its use for practical depth extraction are addressed in detail. In Section 4, experimental results are presented. The feasibility of extracting the depth of multiple objects with large discontinuity is shown. In Section 5, concluding remarks are provided.

2. System geometry for depth extraction in the absolute coordinate system

In this section, the system geometry for depth extraction in the absolute coordinate system is addressed. Figure 1 shows the geometry of the profilometry system represented in the two-dimensional x-z coordinate system. The profilometry system is basically composed of a projector, a CCD camera, and target objects. A sinusoidal fringe pattern with a single frequency is projected onto the target objects by the projector, and the CCD camera captures and saves the image of the target objects with the fringe pattern on their surfaces. This projection and capture process is repeated for several fringe patterns with different frequencies. The meaning and objective of this measurement process will be detailed in the next section.

 

Fig. 1. Geometry of the proposed profilometry system

The centers of the imaging lenses of the projector and the CCD camera are referred to as the reference points of the projector and CCD camera, respectively. In Fig. 1, the reference points of the projector and CCD camera are located at (xPRJ, zPRJ) and (xCCD, zCCD), respectively. The projector and the CCD camera have finite angular fields of view, denoted by ΘPRJ and ΘCCD, respectively. As indicated in Fig. 1, the image planes of the projector and the CCD camera need not be placed on the same plane. The normal vectors of the image planes of the projector and the CCD camera are respectively rotated by specific angles so that the regions inside the fields of view of the projector and the CCD camera have an intersection region, and this intersection region covers the target objects to be observed. In Fig. 1, the tilt angles of the projector and the CCD camera are indicated by φPRJ and φCCD, respectively. φPRJ (φCCD) is the angle between the left boundary of the field of view of the projector (CCD camera) and the z-axis.

Let us assume that the projector illuminates a single bright line image focused on a point (xOBJ, zOBJ) on the object surface. The position of the line in the projection image is straightforwardly converted to the local illumination angle, θLocal, and the position of the line in the image captured by the CCD camera likewise corresponds to the detection angle, θDetect.

These geometric parameters are used in extracting the depth of objects in the absolute coordinate system. In the rotated coordinates, where the z′ axis is parallel to the left boundary of the viewing region of the CCD camera, the bright line position (xOBJ, zOBJ) on the object surface, seen by the CCD camera, is located on the line given by

$$z' - z'_{CCD} = \cot\theta_{Detect}\,(x' - x'_{CCD}).$$

A point (x′, z′) in the rotated coordinate system is transformed into (x, z) in the absolute coordinate system by a rotation transform by the tilt angle φCCD as

$$\begin{pmatrix} x - x_{CCD} \\ z - z_{CCD} \end{pmatrix} = \begin{pmatrix} \cos\varphi_{CCD} & \sin\varphi_{CCD} \\ -\sin\varphi_{CCD} & \cos\varphi_{CCD} \end{pmatrix} \begin{pmatrix} x' - x'_{CCD} \\ z' - z'_{CCD} \end{pmatrix}.$$

With all geometric parameters, including the reference points (xCCD, zCCD) and (xPRJ, zPRJ), known, the position (xOBJ, zOBJ) can be calculated. xOBJ and zOBJ satisfy the relations

$$z_{OBJ} - z_{CCD} = \cot(\varphi_{CCD} + \theta_{Detect})\,(x_{OBJ} - x_{CCD}),$$
$$z_{OBJ} - z_{PRJ} = \cot(\varphi_{PRJ} + \theta_{Local})\,(x_{OBJ} - x_{PRJ}).$$

From Eqs. (3a) and (3b), xOBJ and zOBJ are obtained as

$$z_{OBJ} = \frac{(x_{CCD} - x_{PRJ}) - \tan(\varphi_{PRJ} + \theta_{Local})\,(z_{CCD} - z_{PRJ})}{\tan(\varphi_{PRJ} + \theta_{Local}) - \tan(\varphi_{CCD} + \theta_{Detect})} + z_{CCD},$$
$$x_{OBJ} = \frac{(z_{CCD} - z_{PRJ}) - \cot(\varphi_{PRJ} + \theta_{Local})\,(x_{CCD} - x_{PRJ})}{\cot(\varphi_{PRJ} + \theta_{Local}) - \cot(\varphi_{CCD} + \theta_{Detect})} + x_{CCD}.$$

The absolute coordinate position of a point, (xOBJ, zOBJ), on the surface of the target object is evaluated from the geometric parameters in absolute coordinates. In the above analysis, it is assumed that the point on the object surface is indicated by the bright line produced by the projector. In principle, if we scan the line image over the surfaces of the target objects and analyze the absolute coordinates of all points simultaneously, we obtain the total depth information of the target objects without any auxiliary process such as a phase unwrapping algorithm.
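
As a concrete illustration, the two ray equations above can be solved numerically. The following sketch (the function name triangulate and its argument order are illustrative, not from the paper) assumes all angles are in radians and measured from the z-axis, as in Fig. 1:

```python
import math

def triangulate(x_prj, z_prj, phi_prj, theta_local,
                x_ccd, z_ccd, phi_ccd, theta_detect):
    """Intersect the projector ray with the CCD viewing ray to recover
    (x_OBJ, z_OBJ) in the absolute coordinate system, following the two
    line equations above."""
    tan_p = math.tan(phi_prj + theta_local)
    tan_c = math.tan(phi_ccd + theta_detect)
    z_obj = z_ccd + ((x_ccd - x_prj) - tan_p * (z_ccd - z_prj)) / (tan_p - tan_c)
    x_obj = x_ccd + tan_c * (z_obj - z_ccd)  # back-substitute into the CCD ray
    return x_obj, z_obj
```

For instance, with the camera reference point at the origin, the projector at (2, 0), and the tilt-plus-ray angle sums pointing at (1, 3) from each device, the function recovers (1, 3) to machine precision.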

 

Fig. 2. Intersectional region of the fields of view of the CCD camera and the projector.

 

Fig. 3. Spatial resolution in the intersectional region: (a) the relation between the spatial resolution and the angular resolutions of the CCD camera and the projector, and (b) the contour map of the resolvable voxel in the intersectional region.

The resolution of depth extraction in this profilometry scheme is now analyzed. Figure 2 shows the intersectional region of the fields of view of a projector and a CCD camera. The depth of objects placed in this region can be extracted with the stated method. In practice, the projector and the CCD camera have finite resolutions of 1024×768 pixels and 1280×960 pixels, respectively. Accordingly, the field of view is quantized as shown in Fig. 2. In Fig. 3(a), the relation between the spatial resolution and the angular resolutions of both the CCD camera and the projector is shown. Since the projector is positioned apart from the CCD camera, the minimum resolvable voxel (volume pixel) has an approximately rhombic shape. Let the vertices of the (m,n)th voxel be indexed by (x_{m,n}, z_{m,n}), (x_{m,n+1}, z_{m,n+1}), (x_{m+1,n}, z_{m+1,n}), and (x_{m+1,n+1}, z_{m+1,n+1}). The area of this voxel is given by

$$S_{resolution} = S(x_{m,n}, z_{m,n};\; x_{m+1,n}, z_{m+1,n};\; x_{m,n+1}, z_{m,n+1}) + S(x_{m+1,n+1}, z_{m+1,n+1};\; x_{m+1,n}, z_{m+1,n};\; x_{m,n+1}, z_{m,n+1}).$$

Here, S(·) represents the area of the triangle with the three given vertices, computed in coordinate (shoelace) form as

$$S(x_1, y_1; x_2, y_2; x_3, y_3) = \tfrac{1}{2}\,\bigl|\,(x_1 y_2 + x_2 y_3 + x_3 y_1) - (x_2 y_1 + x_3 y_2 + x_1 y_3)\,\bigr|.$$

The contour map of a resolvable voxel in the intersectional region is shown in Fig. 3(b). The minimum resolution is about 0.7 mm² around a point 1500 mm away from the reference point of the projector. The 3D discontinuous objects used in the first experiment are placed from 2300 mm to 2800 mm from the reference point of the projector. Hence, the resolutions in the first experiment range from approximately 4 mm² to 8 mm².
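
A minimal sketch of this voxel-area computation (function names are illustrative; vertices are passed as (x, z) pairs, and the rhombic voxel is split into two triangles sharing the diagonal between the (m+1,n) and (m,n+1) vertices):

```python
def tri_area(x1, y1, x2, y2, x3, y3):
    """Area of the triangle with the three given vertices
    (coordinate/shoelace form of S above)."""
    return 0.5 * abs((x1*y2 + x2*y3 + x3*y1) - (x2*y1 + x3*y2 + x1*y3))

def voxel_area(p_mn, p_m1n, p_mn1, p_m1n1):
    """Area of the rhombic voxel as the sum of the two triangles
    sharing the diagonal p_{m+1,n} -- p_{m,n+1}."""
    (xa, za), (xb, zb), (xc, zc), (xd, zd) = p_mn, p_m1n, p_mn1, p_m1n1
    return (tri_area(xa, za, xb, zb, xc, zc)
            + tri_area(xd, zd, xb, zb, xc, zc))
```

As a sanity check, a unit square fed in as a degenerate "voxel" yields an area of 1.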

3. Depth extraction using multi-frequency and four-step phase-shift sinusoidal fringe composite

In Section 2, it was shown that if 3D objects can be scanned by a bright line, we can conduct 3D profiling of the target objects with simultaneous analysis of depth and position. In practice, directly scanning a bright line over the target objects is inefficient, since a moving picture of the line image scanning over the target objects would be necessary.

In this paper, an efficient method for realizing the above principle with multi-frequency sinusoidal fringe patterns is devised. In this section, the devised depth extraction method, which is effectively equivalent to the line image scanning stated above but more efficient, is described.

Figure 4 shows the schematic of the devised 3D profilometry method. Within the setup depicted in Fig. 1, a sinusoidal fringe pattern, Pn,p(ξ), with a specific spatial frequency, fn, and a specific phase shift, p, is generated as given by

$$P_{n,p}(\xi) = 1 + \cos(2\pi f_n \xi + p),$$

where ξ is the lateral axis of the projector image plane. The phase shift, p, is given by one of four quadrature phase-shift values,

$$p = 0,\; \pi/2,\; \pi,\; \text{and}\; 3\pi/2,$$

and the spatial frequency, fn, is chosen as a negative integer power of 2,

$$f_n = 2^{-n}.$$

A sinusoidal fringe pattern, Pn,p(ξ), is projected onto the target objects located in the intersectional region of the fields of view of the projector and CCD camera. The CCD camera captures the image of the distorted fringe patterns formed on the object surface. The captured image is denoted by In,p and saved for depth analysis through post-processing.
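
The fringe generation above can be sketched as follows, assuming a 1024-pixel lateral projector resolution (taken from the resolution stated in Section 2) and NumPy for array arithmetic; names are illustrative:

```python
import numpy as np

WIDTH = 1024  # lateral resolution of the projector assumed in the text

def fringe(n, p, width=WIDTH):
    """Four-step phase-shift sinusoidal fringe
    P_{n,p}(xi) = 1 + cos(2*pi*f_n*xi + p), with f_n = 2**(-n),
    sampled at integer pixel positions xi."""
    xi = np.arange(width)
    return 1.0 + np.cos(2 * np.pi * 2.0 ** (-n) * xi + p)
```

Note that the pattern is everywhere non-negative, reflecting the fact that the projector can only emit positive intensities; the difference of two patterns shifted by π recovers a pure cosine.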

 

Fig. 4. Schematic of 3D profilometry with multi-frequency and four-step phase shift sinusoidal fringe projection.

In the proposed method, we take a bundle of images of distorted sinusoidal fringes with several spatial frequencies fn. For each spatial frequency fn, four distorted fringe images corresponding to the four sinusoidal fringe patterns with quadrature phase-shift values of p = 0, π/2, π, and 3π/2 have to be recorded. Thus, if we use N spatial frequencies, we have to take 4N pictures. The next step is the numerical depth extraction of the target objects from the obtained 4N pictures.

Before this process, let us address a mathematical property of the superposition of the raw data pictures In,p. Basically, the objective of using several multi-frequency fringe patterns is to equivalently realize the bright line scanning stated in Section 2. According to Fourier analysis, a sharp bright line image can be represented by a superposition of several weighted sinusoidal fringe patterns. The reason four fringe patterns with quadrature phase shifts are needed is to construct an exponential Fourier basis. Since the sinusoidal fringe patterns that can be generated by the projector are inevitably positive-valued functions, the exponential Fourier basis for a spatial frequency fn is constructed as follows. cos(2πfnξ) and sin(2πfnξ) are given, respectively, as

$$\cos(2\pi f_n \xi) = [P_{n,0}(\xi) - P_{n,\pi}(\xi)]/2,$$
$$\sin(2\pi f_n \xi) = [P_{n,3\pi/2}(\xi) - P_{n,\pi/2}(\xi)]/2.$$

As a result, the exponential Fourier basis can be obtained as

$$\exp(j 2\pi f_n \xi) = [P_{n,0}(\xi) - P_{n,\pi}(\xi)]/2 + j\,[P_{n,3\pi/2}(\xi) - P_{n,\pi/2}(\xi)]/2.$$
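
This identity is easy to check numerically. The sketch below (an inline, illustrative definition of Pn,p under the same 1024-pixel assumption) rebuilds the complex basis from the four phase-shifted fringes and the expected complex exponential for comparison:

```python
import numpy as np

xi = np.arange(1024)
# Illustrative fringe P_{n,p}(xi) = 1 + cos(2*pi*2**(-n)*xi + p)
P = lambda n, p: 1.0 + np.cos(2 * np.pi * 2.0 ** (-n) * xi + p)

n = 5
# Real part from the (0, pi) pair, imaginary part from the (3pi/2, pi/2) pair
basis = ((P(n, 0.0) - P(n, np.pi)) / 2
         + 1j * (P(n, 3 * np.pi / 2) - P(n, np.pi / 2)) / 2)
expected = np.exp(1j * 2 * np.pi * 2.0 ** (-n) * xi)
```

The constant offsets of the positive-valued fringes cancel in each difference, leaving exactly the complex exponential.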

Then we set a shifted pulse-shaped function h(ξ − s) representing the bright line image, where s is the lateral translation of the centered pulse-shaped function h(ξ), and find the Fourier coefficients of h(ξ − s), Fr[h(ξ − s)](fξ). If we use a finite number of Fourier harmonics, the Fourier series cannot represent the original shifted pulse-shaped function, h(ξ − s), exactly, but it is enough to represent a local illumination pattern wherein the brightness around the center of the original pulse-shaped function is relatively stronger than that of the other parts. Let this partially truncated Fourier series of the original pulse-shaped function h(ξ − s) be denoted by PLocal(ξ; s). At this stage, PLocal(ξ; s) is represented as

PLocal(ξ;s)=real{nFr[h(ξs)]fξ=fnexp(j2πfnξ)},

where real(t) is the real part of a complex number t.

For convenience, we take the pulse-shaped function to be a delta function with peak position s:

$$h(\xi - s) = \delta(\xi - s).$$

As a result, the local illumination image function, PLocal (ξ; s), is represented as

PLocal(ξ;s)=real{nexp[j2πfn(ξs)]}
=real{nexp(j2πfns){[Pn,0(ξ)Pn,π(ξ)]+j[Pn,3π/2(ξ)Pn,π/2(ξ)]}/2}.

For a projector with a finite resolution of 1024 × 768 pixels, the lowest expressible spatial frequency is 2^{-10}, corresponding to one period over the 1024-pixel width. Thus, the available spatial frequencies for the projector with 1024×768 resolution are in the range 2^{-10} ≤ fn ≤ 2^{-1}, the upper bound being set by the sampling limit of two pixels per period. The important property of Eq. (12) for profilometry is that the peak position of the local illumination function can be simply controlled by changing s, where s is in the range 0 ≤ s ≤ 1023. Therefore, we can effectively perform the bright line scanning action with just the raw images of multi-frequency sinusoidal fringes by repeatedly evaluating Eq. (12) while varying s. Without any actual scanning of a bright line illumination over the target objects, we can numerically calculate the pictures of the object surface illuminated by shifting bright line images, ILocal(s), according to the superposition principle, as

ILocal(s)=real{nexp(j2πfns){[In,0In,π]+j[In,3π/2In,π/2]}/2}.

As a result, without any actual scanning action, 1024 line scanning images of the target objects are obtained from just 4N pictures. This is the main benefit of using multi-frequency and four-step phase-shift sinusoidal fringe projection in the proposed profilometry method.
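
The synthesis of ILocal(s) translates directly into code. The sketch below uses illustrative names and simulates a flat target, so that each "captured" image In,p simply equals the projected fringe itself; under that assumption the synthesized line should peak exactly at pixel s:

```python
import numpy as np

def local_illumination(images, s):
    """Numerically synthesize the bright-line picture I_Local(s) from the 4N
    captured fringe images.  `images[n][p]` holds the captured image I_{n,p}
    for frequency index n and phase shift p in {0, pi/2, pi, 3pi/2}."""
    total = 0.0
    for n, I in images.items():
        f_n = 2.0 ** (-n)
        # Exponential Fourier basis built from the two quadrature pairs
        basis = (I[0.0] - I[np.pi]) + 1j * (I[3 * np.pi / 2] - I[np.pi / 2])
        total = total + np.real(np.exp(-1j * 2 * np.pi * f_n * s) * basis / 2.0)
    return total

# Hypothetical flat-target simulation: each image equals the projected fringe.
xi = np.arange(1024)
shifts = (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)
images = {n: {p: 1.0 + np.cos(2 * np.pi * 2.0 ** (-n) * xi + p) for p in shifts}
          for n in range(2, 11)}
line = local_illumination(images, s=512)
```

With the nine frequencies n = 2, 3,…, 10, every cosine term in the superposition equals 1 only at ξ = s within the 1024-pixel range, so the synthesized line attains its unique maximum at the commanded pixel.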

Figure 5 shows some examples of the local illumination function, PLocal(ξ; s), of Eq. (12). In every chart, the ξ-axis indicates the lateral pixel index of the 1024×768 projector, and the vertical axis is the light intensity of the composite local illumination image.

 

Fig. 5. Composite local illumination functions of multi-frequency sinusoidal fringe patterns of (a) a single spatial frequency of n = 10, (b) two spatial frequencies of n = 5 and 10, (c) two spatial frequencies of n = 9 and 10, and (d) nine spatial frequencies of n = 2, 3,…,and 10. The shift parameter s is set to s = 512.

 

Fig. 6. The peak-shaped response by the overlapped patterns with the composition of multi-frequency sine waves at (a) s = 256, (b) s = 512, and (c) s = 768.

In Fig. 5(a), the local illumination function composed of a single spatial frequency fringe of n = 10 is shown. In Figs. 5(b) and 5(c), the local illumination functions composed of two sinusoidal fringes with spatial frequencies of n = 5 and n = 10 and with n = 9 and n = 10 are presented. When nine fringes with spatial frequencies of n = 2, 3,…, and 10 are used, as indicated in Fig. 5(d), a discernible sharp peak appears around the center, ξ = 512. Although the differences between the light intensities at the center pixel (ξ = 512) and the other parts are small, they are large enough to be used for local illumination and its scanning.

Next, the scanning action of the local illumination function with varying shift parameter s is demonstrated. Figure 6 shows the translation of a few fringe patterns and of the composite local illumination functions as the shift parameter s varies. As indicated in Eq. (12), the fringe pattern of spatial frequency fn is phase-shifted by the amount of 2πfns = 2^{1-n}πs. In Fig. 6, the first, second, and third row figures show the translation of single-frequency fringe patterns for n = 10, n = 9, and n = 5, respectively. We can see the peak scanning action in the first, second, and third column figures (indicated by (a), (b), and (c), respectively).

With the combination of the effective local illumination scanning method stated in this section and the geometric analysis stated in Section 2, we can perform the depth extraction of target objects in the absolute coordinate system.

4. Experimental results

Figure 7 shows the experimental setup composed of a projector (Mitsubishi SCD-SX90), a CCD camera (SONY XCD-SX90), and target objects. The first target is a pair of Greek head-shaped statues (Venus on the left and Julian on the right), placed at distances of 1800 mm and 2300 mm from the projector. In Fig. 7, blue solid lines and red dashed lines indicate the fields of view of the projector and the CCD camera, respectively.

 

Fig. 7. Experiment setup of the proposed profilometry.

The pictures of the objects illuminated by the fringe patterns with the lowest and the highest spatial frequency are shown in Figs. 8(a) and 8(b), respectively. In total, 36 pictures are taken for the sinusoidal fringe patterns of nine spatial frequencies. In Fig. 8(c), the image of the locally illuminated objects synthesized according to Eq. (13) is presented. In this figure, the red-colored bright line illumination is perceived on the side facet of Venus. With the geometric analysis, the absolute coordinates of the bright points can be obtained. In Fig. 8(d), the movie of scanning the local illumination is attached.

Figure 9 exhibits the depth extraction results obtained in the experiment. The white parts in the images indicate parts with incorrect depth information to be avoided or improved. Since this experiment is set for the depth range from 1720 mm to 2300 mm, depth values beyond this range are rendered as white parts indicating non-measurable depth. Figure 9(a) shows the depth profile obtained using a single spatial frequency fringe pattern of n = 10. It can be seen that several parts, especially on the back of Venus's head and the front of Julian's head, are white-colored. In Figs. 9(b) and 9(c), the depth profiles extracted using two fringe patterns of n = 5, 10 and n = 9, 10, respectively, are shown. Figure 9(c) shows a more successful measurement of the depth profile of almost the whole object, compared with the result in Fig. 9(b), with respect to the amount of the white-colored incorrect depth region.

 

Fig. 8. Images of the target objects (a) illuminated by the low spatial frequency fringe pattern (n = 10) and (b) the high spatial frequency fringe pattern (n = 2). (c) Synthesized image of the object illuminated by the local illumination function and (d) the movie of scanning the local illumination (Media 1).

 

Fig. 9. Depth profiles obtained using multi-frequency fringe composite (a) with a single frequency n = 10, (b) with two frequencies in the case of n = 5 and 10, (c) with n = 9 and 10, and (d) with nine frequencies n = 2,3,…, and 10.

Comparing Figs. 9(b) and 9(c), the local illumination synthesized with n = 9 and n = 10 has a more apparent difference between the sharpest peak at the 512th pixel and the signal level at the other parts than that synthesized with n = 5 and n = 10. Hence the local illumination with n = 9 and n = 10 works better than the one with n = 5 and n = 10. In Fig. 9(d), the result obtained using the nine sinusoidal fringe patterns of n = 2, 3,…, and 10 is presented. It demonstrates that the depth of the discontinuous objects is measured accurately within the whole image, without any white parts (incorrect depth extraction), using the local illumination synthesized from just nine sinusoidal fringe patterns.

To demonstrate the accuracy of the proposed method more vividly, another sample, composed of 20 spatially separated balls with a radius of 30 mm, is chosen, as shown in Fig. 10(a). The depths of the 20 balls are set within the range from 1600 mm to 2300 mm. This target sample clearly shows the accuracy of the proposed profilometry method, since the measured depth values can be compared to the actual exact depth values.

 

Fig. 10. Experimental results of depth extraction of spatially separated 20 balls; (a) camera image, (b) measured depth profile, and (c) perspective view of measured depth profile.

Figure 10(a) shows the picture of the 20 balls taken by the CCD camera. Figures 10(b) and 10(c) show the experimentally measured depth extraction of the 20 balls using the nine multi-frequency fringe patterns; Fig. 10(c) is a perspective view of Fig. 10(b). In Table 1, the comparison between the real exact depth values and the measured depth values is presented. It successfully demonstrates that the proposed profilometry method performs the depth extraction of the separated objects in the absolute coordinate system without a phase unwrapping algorithm.

Table 1. Comparison between the actual depth of the center points of 20 balls and the extracted depth of the center points.

5. Conclusion

In conclusion, 3D profilometry without phase unwrapping using multi-frequency and four-step phase-shift sinusoidal fringe projection has been developed. The scanning of the local illumination pattern necessary for depth extraction without phase unwrapping is performed by the composition of multi-frequency sinusoidal fringes with the four-step phase-shifting method. In the proposed profilometry, the ambiguity in depth extraction associated with phase unwrapping in conventional profilometry methods is removed, and the feasibility is demonstrated with experimental results.

Acknowledgments

This work was supported by the Korea Science and Engineering Foundation and the Ministry of Education, Science and Engineering of Korea through the National Creative Research Initiative Program (#R16-2007-030-01001-0).

References and links

1. E. Stoykova, J. Harizanova, and V. Sainov, “Pattern projection profilometry for 3D coordinates measurement of dynamic scenes,” in Three-Dimensional Television: Capture, Transmission, Display, H. M. Ozaktas and L. Onural, eds. (Springer, 2008), pp. 85–164.

2. J. Salvi, J. Pagès, and J. Batlle, “Pattern codification strategies in structured light systems,” Pattern Recogn. 37, 827–849 (2004).

3. D. M. Meadows, W. O. Johnson, and J. B. Allen, “Generation of surface contours by Moiré patterns,” Appl. Opt. 9, 942–947 (1970).

4. H. Takasaki, “Moiré topography,” Appl. Opt. 9, 1467–1472 (1970).

5. P. Picart, E. Moisson, and D. Mounier, “Twin sensitivity measurement by spatial multiplexing of digitally recorded holograms,” Appl. Opt. 42, 1947–1957 (2003).

6. E. A. Barbosa, E. A. Lima, M. R. R. Gesualdi, and M. Muramatsu, “Enhanced multiwavelength holographic profilometry by laser mode selection,” Opt. Eng. 46, 075601–075607 (2007).

7. G. Mauvoisin, F. Brémand, and A. Lagarde, “Three-dimensional shape reconstruction by phase-shifting shadow Moiré,” Appl. Opt. 33, 2163–2169 (1994).

8. X. F. Meng, L. Z. Cai, X. F. Xu, X. L. Yang, X. X. Shen, G. Y. Dong, and Y. R. Wang, “Two-step phase-shifting interferometry and its application in image encryption,” Opt. Lett. 31, 1414–1416 (2006).

9. I. Yamaguchi, “Phase-shifting digital holography,” in Digital Holography and Three-Dimensional Display, T.-C. Poon, ed. (Springer, 2007), pp. 145–172.

10. J. Hahn, H. Kim, S.-W. Cho, and B. Lee, “Phase-shifting interferometry with genetic algorithm-based twin image noise elimination,” Appl. Opt. 47, 4068–4076 (2008).

11. D. Kim and Y. J. Cho, “3-D surface profile measurement using an acousto-optic tunable filter based spectral phase shifting technique,” J. Opt. Soc. Korea 12, 281–287 (2008).

12. M. Chang and C. S. Ho, “Phase measuring profilometry using sinusoidal grating,” Exp. Mech. 33, 117–122 (1993).

13. R. Zheng, Y. Wang, X. Zhang, and Y. Song, “Two-dimensional phase-measuring profilometry,” Appl. Opt. 44, 954–958 (2005).

14. M. Takeda, Q. Gu, M. Kinoshita, H. Takai, and Y. Takahashi, “Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations,” Appl. Opt. 36, 5347–5354 (1997).

15. W. Su and H. Liu, “Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities,” Opt. Express 14, 9178–9187 (2006).

16. J.-L. Li, H.-J. Su, and X.-Y. Su, “Two-frequency grating used in phase-measuring profilometry,” Appl. Opt. 36, 277–280 (1997).

17. C. E. Towers, D. P. Towers, and J. D. C. Jones, “Optimum frequency selection in multifrequency interferometry,” Opt. Lett. 28, 887–889 (2003).

18. J. Li, L. G. Hassebrook, and C. Guan, “Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity,” J. Opt. Soc. Am. A 20, 106–115 (2003).

19. J. Ryu, S. S. Hong, B. K. P. Horn, D. M. Freeman, and M. S. Mermelstein, “Multibeam interferometric illumination as the primary source of resolution in optical microscopy,” Appl. Phys. Lett. 88, 171112 (2006).

20. W.-J. Ryu, Y.-J. Kang, S.-H. Baik, and S.-J. Kang, “A study on the 3-D measurement by using digital projection Moiré method,” Opt. Int. J. Light Electron. Opt. 119, 453–458 (2007).

21. J. A. N. Buytaert and J. J. J. Dirckx, “Moiré profilometry using liquid crystals for projection and demodulation,” Opt. Express 16, 179–193 (2008).

References

  • View by:
  • |
  • |
  • |

  1. E. Stoykova, J. Harizanova, and V. Sainov, “Pattern projection profilometry for 3D coordinates measurement of dynamic scenes,” in Three-Dimensional Television: Capture, Transmission, Display, H. M. Ozaktas and L. Onural, ed. (Springer, 2008), pp. 85–164.
    [Crossref]
  2. J. Salvi, J. Pagès, and J. Batlle, “Pattern codification strategies in structured light systems,” Pattern Recogn. 37, 827–849 (2004).
    [Crossref]
  3. D. M. Meadows, W. O. Johnson, and J. B. Allen, “Generation of surface contours by Moiré patterns,” Appl. Opt. 9, 942–947 (1970).
    [Crossref] [PubMed]
  4. H. Takasaki, “Moiré topography,” Appl. Opt. 9, 1467–1472 (1970).
    [Crossref] [PubMed]
  5. P. Picart, E. Moisson, and D. Mounier, “Twin sensitivity measurement by spatial multiplexing of digitally recorded holograms,” Appl. Opt. 42, 1947–1957 (2003).
    [Crossref] [PubMed]
  6. E. A. Barbosa, E. A. Lima, M. R. R. Gesualdi, and M. Muramatsu, “Enhanced multiwavelength holographic profilometry by laser mode selection,” Opt. Eng. 46, 075601–075607 (2007).
    [Crossref]
  7. G. Mauvoisin, F. Brémand, and A. Lagarde, “Three-dimensional shape reconstruction by phase-shifting shadow Moiré,” Appl. Opt. 33, 2163–2169 (1994).
  8. X. F. Meng, L. Z. Cai, X. F. Xu, X. L. Yang, X. X. Shen, G. Y. Dong, and Y. R. Wang, “Two-step phase-shifting interferometry and its application in image encryption,” Opt. Lett. 31, 1414–1416 (2006).
  9. I. Yamaguchi, “Phase-shifting digital holography,” in Digital Holography and Three-Dimensional Display, T.-C. Poon, ed. (Springer, 2007), pp. 145–172.
  10. J. Hahn, H. Kim, S.-W. Cho, and B. Lee, “Phase-shifting interferometry with genetic algorithm-based twin image noise elimination,” Appl. Opt. 47, 4068–4076 (2008).
  11. D. Kim and Y. J. Cho, “3-D surface profile measurement using an acousto-optic tunable filter based spectral phase shifting technique,” J. Opt. Soc. Korea 12, 281–287 (2008).
  12. M. Chang and C. S. Ho, “Phase measuring profilometry using sinusoidal grating,” Exp. Mech. 33, 117–122 (1993).
  13. R. Zheng, Y. Wang, X. Zhang, and Y. Song, “Two-dimensional phase-measuring profilometry,” Appl. Opt. 44, 954–958 (2005).
  14. M. Takeda, Q. Gu, M. Kinoshita, H. Takai, and Y. Takahashi, “Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations,” Appl. Opt. 36, 5347–5354 (1997).
  15. W. Su and H. Liu, “Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities,” Opt. Express 14, 9178–9187 (2006).
  16. J.-L. Li, H.-J. Su, and X.-Y. Su, “Two-frequency grating used in phase-measuring profilometry,” Appl. Opt. 36, 277–280 (1997).
  17. C. E. Towers, D. P. Towers, and J. D. C. Jones, “Optimum frequency selection in multifrequency interferometry,” Opt. Lett. 28, 887–889 (2003).
  18. J. Li, L. G. Hassebrook, and C. Guan, “Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity,” J. Opt. Soc. Am. A 20, 106–115 (2003).
  19. J. Ryu, S. S. Hong, B. K. P. Horn, D. M. Freeman, and M. S. Mermelstein, “Multibeam interferometric illumination as the primary source of resolution in optical microscopy,” Appl. Phys. Lett. 88, 171112 (2006).
  20. W.-J. Ryu, Y.-J. Kang, S.-H. Baik, and S.-J. Kang, “A study on the 3-D measurement by using digital projection Moiré method,” Opt. Int. J. Light Electron. Opt. 119, 453–458 (2007).
  21. J. A. N. Buytaert and J. J. J. Dirckx, “Moiré profilometry using liquid crystals for projection and demodulation,” Opt. Express 16, 179–193 (2008).

Supplementary Material (1)

Media 1: AVI (2321 KB)



Figures (10)

Fig. 1. Geometry of the proposed profilometry system.
Fig. 2. Intersection region of the fields of view of the CCD camera and the projector.
Fig. 3. Spatial resolution in the intersection region: (a) the relation between the spatial resolution and the angular resolutions of the CCD camera and the projector, and (b) the contour map of resolvable voxels in the intersection region.
Fig. 4. Schematic of 3D profilometry with multi-frequency and four-step phase-shift sinusoidal fringe projection.
Fig. 5. Composite local illumination functions of multi-frequency sinusoidal fringe patterns with (a) a single spatial frequency of n = 10, (b) two spatial frequencies of n = 5 and 10, (c) two spatial frequencies of n = 9 and 10, and (d) nine spatial frequencies of n = 2, 3, …, and 10. The shift parameter s is set to s = 512.
Fig. 6. The peak-shaped response of the overlapped patterns with the composition of multi-frequency sine waves at (a) s = 256, (b) s = 512, and (c) s = 768.
Fig. 7. Experimental setup of the proposed profilometry.
Fig. 8. Images of the target objects illuminated by (a) a low spatial frequency fringe pattern and (b) a high spatial frequency fringe pattern (n = 10). (c) Synthesized image of the object illuminated by the local illumination function and (d) a movie of scanning the local illumination (Media 1).
Fig. 9. Depth profiles obtained using the multi-frequency fringe composite (a) with a single frequency n = 10, (b) with two frequencies n = 5 and 10, (c) with two frequencies n = 9 and 10, and (d) with nine frequencies n = 2, 3, …, and 10.
Fig. 10. Experimental results of depth extraction of 20 spatially separated balls: (a) camera image, (b) measured depth profile, and (c) perspective view of the measured depth profile.

Tables (1)


Table 1. Comparison between the actual and extracted depths of the center points of the 20 balls.

Equations (20)


\[ z' - z'_{\mathrm{CCD}} = \cot\theta_{\mathrm{Detect}}\,\left(x' - x'_{\mathrm{CCD}}\right). \]
\[ \begin{pmatrix} x - x_{\mathrm{CCD}} \\ z - z_{\mathrm{CCD}} \end{pmatrix} = \begin{pmatrix} \cos\varphi_{\mathrm{CCD}} & -\sin\varphi_{\mathrm{CCD}} \\ \sin\varphi_{\mathrm{CCD}} & \cos\varphi_{\mathrm{CCD}} \end{pmatrix} \begin{pmatrix} x' - x'_{\mathrm{CCD}} \\ z' - z'_{\mathrm{CCD}} \end{pmatrix}. \]
\[ z_{\mathrm{OBJ}} - z_{\mathrm{CCD}} = \cot\left(\varphi_{\mathrm{CCD}} + \theta_{\mathrm{Detect}}\right)\left(x_{\mathrm{OBJ}} - x_{\mathrm{CCD}}\right), \]
\[ z_{\mathrm{OBJ}} - z_{\mathrm{PRJ}} = \cot\left(\varphi_{\mathrm{PRJ}} + \theta_{\mathrm{Local}}\right)\left(x_{\mathrm{OBJ}} - x_{\mathrm{PRJ}}\right). \]
\[ z_{\mathrm{OBJ}} = \frac{\left(x_{\mathrm{CCD}} - x_{\mathrm{PRJ}}\right) - \left(z_{\mathrm{CCD}} - z_{\mathrm{PRJ}}\right)\tan\left(\varphi_{\mathrm{PRJ}} + \theta_{\mathrm{Local}}\right)}{\tan\left(\varphi_{\mathrm{PRJ}} + \theta_{\mathrm{Local}}\right) - \tan\left(\varphi_{\mathrm{CCD}} + \theta_{\mathrm{Detect}}\right)} + z_{\mathrm{CCD}}, \]
\[ x_{\mathrm{OBJ}} = \frac{\left(z_{\mathrm{CCD}} - z_{\mathrm{PRJ}}\right) - \left(x_{\mathrm{CCD}} - x_{\mathrm{PRJ}}\right)\cot\left(\varphi_{\mathrm{PRJ}} + \theta_{\mathrm{Local}}\right)}{\cot\left(\varphi_{\mathrm{PRJ}} + \theta_{\mathrm{Local}}\right) - \cot\left(\varphi_{\mathrm{CCD}} + \theta_{\mathrm{Detect}}\right)} + x_{\mathrm{CCD}}. \]
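The closed-form intersection of the camera ray and the projector ray can be checked numerically. The sketch below is an illustration only (all coordinate and angle values are hypothetical, and the function name `triangulate` is ours, not the paper's); it solves for the object point in the global (x, z) plane, recovering x_OBJ by back-substituting z_OBJ into the camera-ray equation, which is algebraically equivalent to the cotangent form and avoids its singularity at a vertical ray.

```python
import math

def triangulate(x_ccd, z_ccd, phi_ccd, theta_detect,
                x_prj, z_prj, phi_prj, theta_local):
    """Intersect the camera ray and the projector ray in the (x, z) plane.

    Each ray satisfies z - z_0 = cot(phi + theta) * (x - x_0); the return
    value is the object point (x_obj, z_obj) where the two rays meet.
    """
    tan_c = math.tan(phi_ccd + theta_detect)
    tan_p = math.tan(phi_prj + theta_local)
    # Closed-form depth from the two ray equations (the z_OBJ formula above).
    z_obj = ((x_ccd - x_prj) - (z_ccd - z_prj) * tan_p) / (tan_p - tan_c) + z_ccd
    # Back-substitute into the camera ray to get the lateral coordinate.
    x_obj = x_ccd + (z_obj - z_ccd) * tan_c
    return x_obj, z_obj
```

For example, a camera at the origin viewing along atan(0.2) and a projector at (300, 0) illuminating along atan(-0.4) intersect at the point (100, 500), which lies on both rays.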
\[ S_{\mathrm{resolution}} = S\!\left(x_m^n, z_m^n;\; x_{m+1}^n, z_{m+1}^n;\; x_m^{n+1}, z_m^{n+1}\right) + S\!\left(x_{m+1}^{n+1}, z_{m+1}^{n+1};\; x_{m+1}^n, z_{m+1}^n;\; x_m^{n+1}, z_m^{n+1}\right), \]
\[ S\!\left(x_1, y_1; x_2, y_2; x_3, y_3\right) = \tfrac{1}{2}\left|\left(x_1 y_2 + x_2 y_3 + x_3 y_1\right) - \left(x_2 y_1 + x_3 y_2 + x_1 y_3\right)\right|. \]
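The function S is the standard shoelace formula for a triangle's area, and the resolution cell is the quadrilateral formed by four neighboring ray intersections, split into two triangles as above. A minimal sketch (the corner coordinates are hypothetical, chosen only to exercise the formulas):

```python
def tri_area(x1, y1, x2, y2, x3, y3):
    """Shoelace formula: area of the triangle with the three given vertices."""
    return 0.5 * abs((x1 * y2 + x2 * y3 + x3 * y1)
                     - (x2 * y1 + x3 * y2 + x1 * y3))

def cell_area(p_mn, p_m1n, p_mn1, p_m1n1):
    """Area of one resolvable cell, as the sum of the two triangles in
    the S_resolution equation; each argument is an (x, z) corner."""
    return (tri_area(*p_mn, *p_m1n, *p_mn1)
            + tri_area(*p_m1n1, *p_m1n, *p_mn1))
```

For instance, a unit-square cell with corners (0, 0), (1, 0), (0, 1), and (1, 1) yields an area of 1, and each constituent triangle contributes 1/2.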
\[ P_{n,p}(\xi) = 1 + \cos\left(2\pi f_n \xi + p\right), \qquad p = 0,\ \pi/2,\ \pi,\ \mathrm{and}\ 3\pi/2, \qquad f_n = 2n. \]
\[ \cos\left(2\pi f_n \xi\right) = \left[P_{n,0}(\xi) - P_{n,\pi}(\xi)\right]/2, \]
\[ \sin\left(2\pi f_n \xi\right) = \left[P_{n,3\pi/2}(\xi) - P_{n,\pi/2}(\xi)\right]/2, \]
\[ \exp\left(j 2\pi f_n \xi\right) = \left[P_{n,0}(\xi) - P_{n,\pi}(\xi)\right]/2 + j\left[P_{n,3\pi/2}(\xi) - P_{n,\pi/2}(\xi)\right]/2. \]
\[ P_{\mathrm{Local}}(\xi; s) = \mathrm{real}\!\left\{\sum_n \mathcal{F}\!\left[h(\xi - s)\right]\Big|_{f_\xi = f_n} \exp\left(j 2\pi f_n \xi\right)\right\}, \]
\[ h(\xi - s) = \delta(\xi - s). \]
\[ P_{\mathrm{Local}}(\xi; s) = \mathrm{real}\!\left\{\sum_n \exp\left[j 2\pi f_n (\xi - s)\right]\right\} = \mathrm{real}\!\left\{\sum_n \exp\left(-j 2\pi f_n s\right)\left\{\left[P_{n,0}(\xi) - P_{n,\pi}(\xi)\right] + j\left[P_{n,3\pi/2}(\xi) - P_{n,\pi/2}(\xi)\right]\right\}/2\right\}. \]
\[ I_{\mathrm{Local}}(s) = \mathrm{real}\!\left\{\sum_n \exp\left(-j 2\pi f_n s\right)\left\{\left[I_{n,0} - I_{n,\pi}\right] + j\left[I_{n,3\pi/2} - I_{n,\pi/2}\right]\right\}/2\right\}. \]
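The decoding chain above — four-step differences recovering the cosine and sine of the fringe phase, followed by the multi-frequency synthesis of I_Local(s) — can be sketched as follows. This is an illustration, not the paper's implementation: the frequency set f_n = n for n = 2, …, 10 over a normalized projector coordinate and all function names are our assumptions, chosen so that the synthesized response has a single unambiguous peak over the full range.

```python
import numpy as np

# Illustrative frequency set: f_n = n cycles over a normalized
# projector coordinate xi in [0, 1), for n = 2, 3, ..., 10.
N_SET = range(2, 11)
PHASES = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]  # four-step shifts p

def captured_intensities(xi0):
    """Ideal four-step intensities I_{n,p} = 1 + cos(2 pi f_n xi0 + p)
    recorded by a camera pixel imaging the projector coordinate xi0."""
    return {n: [1.0 + np.cos(2 * np.pi * n * xi0 + p) for p in PHASES]
            for n in N_SET}

def local_illumination(I, s):
    """I_Local(s): the synthesized multi-frequency response, which peaks
    when the scan parameter s coincides with the coded coordinate xi0."""
    total = 0.0
    for n in N_SET:
        c = (I[n][0] - I[n][2]) / 2  # recovers cos(2 pi f_n xi0)
        q = (I[n][3] - I[n][1]) / 2  # recovers sin(2 pi f_n xi0)
        total += float(np.real(np.exp(-2j * np.pi * n * s) * (c + 1j * q)))
    return total

I = captured_intensities(0.300)
s_grid = np.linspace(0.0, 1.0, 1024, endpoint=False)
responses = [local_illumination(I, s) for s in s_grid]
peak = s_grid[int(np.argmax(responses))]  # scan for the peak, near 0.300
```

Scanning s over the grid locates the peak at the coded coordinate without any phase unwrapping; at s = xi0 all nine complex terms align in phase, so the response equals the number of frequencies used.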
