## Abstract

Fringe patterns are the raw output data of many measurement systems, including laser interferometers and moiré systems. For instruments with a range of zoom levels that measure an object at different scales, an algorithm is needed to combine and/or compare data to obtain information at different levels of detail. A technique that preserves the continuity of output images both across different zoom levels and within the same zoom level is developed and demonstrated. Image registration is used to correlate images, find relative zoom values, and obtain the shift between images in the lateral plane. Fringe patterns from a moiré system and a laser interferometer are used as the images to be stitched and to demonstrate the technique. Interferometric fringes are used to find the parameters required to inter-relate the locations and scales of the fringe patterns at different zoom levels. The calculated parameters are scale and translation in both directions; these parameters make it possible to locate the coordinates of the region that the measurement system is zoomed in on relative to the area at lower magnification, as well as the relative locations of images within the same zoom level. Results show that this technique can find the scale and shift parameters within the resolution of one pixel and can therefore restore continuity between images at different zoom levels.

© 2011 OSA

## 1. Introduction

Fringe patterns are essential for many precision measurements including laser interferometry [1], Moiré techniques [2] and holography [3]. Fringe patterns contain phase information that corresponds to height information [4], stress [5], strain [6], or other measurands [7]. To extract information a phase map is typically calculated first from the fringe pattern and then the required measurands and parameters are calculated [8].

For measuring large objects, different measurements are taken at different locations on the object and stitched together to form one image, which typically requires large overlap regions among the measurements. Information such as height, stress, strain, etc. can be obtained from this large stitched image without discontinuity. In cases where the regions of interest (ROIs) are small, discrete areas of the object, different strategies might be chosen. One strategy is based on scanning the whole object at the highest resolution, followed by processing and stitching the acquired data. In this case, processed data outside the ROIs are discarded, wasting measurement time and effort. For example, suppose the part to be measured is an array of solder bumps deposited on a chip pad, and the solder bumps have to be carefully inspected. If there is a defect on one of the solder bumps, the location and details of the defect need to be identified. With this strategy, the profiles of all solder bumps have to be acquired at the highest resolution, part by part, and then the profiles from the different parts stitched together to obtain one continuous profile. This approach is very time consuming and inefficient.

The other strategy is to take high-resolution measurements of the ROIs only. The difficulty here is that, first, the ROIs have to be identified and, second, the relative locations of the ROIs with respect to a reference point on the object are unknown in general (if there is no overlap between ROIs).

To address these issues, a combination of multiscale measurement and stitching is proposed: a measurement with low resolution is taken, the ROIs are identified, and then high-resolution measurements are taken of the ROIs. This requires a strategy for inter-relating locations and magnification scales at different levels of zoom. The proposed technique therefore offers measurements at different levels of zoom while stitching data at all zoom levels. Stitching means that the exact coordinates of all measurements at different magnifications are established relative to a common reference point.

To stitch data from different measurements, we use the fringe patterns from each measurement, which avoids error contributions that may originate from the phase map calculations. Also this technique speeds up the stitching process by avoiding phase map calculations prior to stitching. Hence, with this technique we can get detailed information (high resolution measurements) about the ROIs and their relative locations while saving time and calculation cost.

We have applied this technique to simulated moiré fringes [9]. In this paper, we report the application of this technique for fringe patterns from a laser interferometer and a projection moiré system, and show that this technique works for both fringe patterns and is capable of locating different ROIs over a range of magnifications.

## 2. Background

Laser interferometry is one of the techniques used for precision metrology [10]. It is used to measure a wide range of objects and surfaces [2]. There are different configurations for interferometers including the Michelson interferometer, the Mach-Zehnder interferometer, the Fizeau interferometer, etc. In all interferometers two or more waves interfere and the resultant interference pattern is used to calculate a wavefront change that results from transmission through or reflection from a test part of interest. The wavefront change, in turn, corresponds to the parameters and/or features of interest concerning the object under measurement. With laser interferometry, very precise measurements can be achieved down to sub-nanometer levels [11, 12].

The Moiré technique is another optical method for form and surface measurements [2,13–15]. It can be used to measure in-plane deformation and strain [16], as well as out-of-plane deformation and profilometry [17,18]. In-plane measurements are usually performed by attaching a grating to the object and then measuring the deformation of the grating. Out-of-plane measurements are based on illuminating the object with structured light, usually a straight-line grating, and measuring the deformation of the light reflected from the object. In both cases the reflected light can be summed with or multiplied by another grating to form a Moiré fringe pattern, or the deformation can be measured directly. Moiré techniques are most common for form measurement, and they can realize measurements with uncertainties down to the micrometer level [19,20].

Raw or intermediate output data from Moiré systems or laser interferometers is a fringe pattern that is processed to extract the desired measurand. For large objects these measurement systems usually have variable magnification capabilities, typically accomplished with a zoom lens or different objective lenses. When reconfiguring the instrument for a measurement at a new magnification, the field of view does not necessarily stay centered, and the value of the magnification (the new field of view) is only known approximately from the instrument settings. To quantitatively relate fine and coarse measurements, it is necessary to find the relative coordinates and the value of the magnification with respect to a reference image, which can conveniently be taken to be the measurement at the lowest magnification. This can be accomplished using image registration.

Image registration is a technique to align two images of one scene [21]. These images can be taken with two different instruments, from two different angles, at two different levels of zoom, or at different times. One of the images is referred to as the reference image and the other as the target image [21]. Image registration aligns the target image with the reference image. Mathematically speaking, the reference image (*R*) and the target image (*T*) belong to the set of d-dimensional images with *d = 2*; i.e. $R,T\in \mathrm{Img}\left(d=2\right):=\left\{b:{\mathbb{R}}^{2}\to \mathbb{R}\;|\;b\text{ is a 2-dimensional image}\right\}$. Image registration minimizes the difference (or maximizes the similarity) between the target and reference images. To quantify the difference (or similarity) of two images, a distance measure (similarity measure) has to be defined. The similarity measure used in this work is pattern intensity [22]. The pattern intensity measure for two images *R* and *T* is defined as [22]

$$P_{r,\sigma }\left(R,T\right)=\frac{1}{N}\sum _{x,y}\;\sum _{{\left(u-x\right)}^{2}+{\left(v-y\right)}^{2}\le {r}^{2}}\frac{{\sigma }^{2}}{{\sigma }^{2}+{\left[{I}_{dif}\left(x,y\right)-{I}_{dif}\left(u,v\right)\right]}^{2}},\qquad {I}_{dif}=R-\lambda T,$$

where *N* is the number of terms in the double sum.

The pattern intensity measure operates on the difference between images. The average intensities of the captured fringe patterns at different zoom levels differ, and this is compensated for by multiplying one fringe pattern by the parameter $\lambda $ so that the average intensities of both images are equal. The parameter *r* determines the radius inside of which the difference calculation is made. The parameter $\sigma $ determines the sharpness of the similarity measure: a smaller $\sigma $ means more sensitivity to pixel differences and therefore a sharper similarity curve versus the difference between pixels. In our calculations, values of $\sigma =10$ and *r = 5* provide the best results for image registration of fringe patterns. The function $\phi $ is the transformation of the image coordinates. Different transformations are used to implement image registration [23–25]. If the transformation includes only translation, scale, rotation, and shear, it is called an affine transformation; otherwise it is called a non-rigid transformation. When changing the zoom level, the transformation is dominated by translation and scaling, so an affine transformation is sufficient to register the images.
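As an illustrative sketch (not the authors' code, which was written in MATLAB), the pattern intensity measure described above can be computed as follows; the function name `pattern_intensity` and the wrap-around handling of image borders via `np.roll` are our own assumptions:

```python
import numpy as np

def pattern_intensity(ref, tgt, r=5, sigma=10.0):
    """Pattern-intensity similarity between two same-size images.

    Operates on the difference image d = ref - lam * tgt, where lam
    equalizes the average intensities of the two images. Returns a value
    in (0, 1]; identical images give 1.
    """
    lam = ref.mean() / tgt.mean()      # intensity-equalization factor
    d = ref - lam * tgt
    sigma2 = sigma ** 2
    total = 0.0
    count = 0
    # Sum over all offsets (dx, dy) inside a disc of radius r, excluding (0, 0).
    # np.roll wraps at the borders -- an approximation for this sketch.
    for dx in range(-r, r + 1):
        for dy in range(-r, r + 1):
            if (dx == 0 and dy == 0) or dx * dx + dy * dy > r * r:
                continue
            shifted = np.roll(np.roll(d, dx, axis=0), dy, axis=1)
            total += np.sum(sigma2 / (sigma2 + (d - shifted) ** 2))
            count += d.size
    return total / count
```

With a smaller $\sigma$, each term falls off faster as neighboring pixel differences grow, which is the sharpness behavior described above.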

The affine transformation used here is defined by $\phi \left(x\right)=Ax+b,A\in {\mathbb{R}}^{d\times d}$ with $\mathrm{det}\left(A\right)>0,b\in {\mathbb{R}}^{d}$ [26], where $\mathrm{det}\left(A\right)$ is the determinant of matrix *A*. For a 2-d image transformation that includes translation and scaling only, the transformation can be defined as

$$\phi \left(\begin{array}{c}x\\ y\end{array}\right)=\left[\begin{array}{cc}{s}_{x}& 0\\ 0& {s}_{y}\end{array}\right]\left(\begin{array}{c}x\\ y\end{array}\right)+\left(\begin{array}{c}{t}_{x}\\ {t}_{y}\end{array}\right),$$

where ${s}_{x}$ and ${s}_{y}$ are the scale factors and ${t}_{x}$ and ${t}_{y}$ the translations in the X and Y directions.
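A minimal sketch of such a scale-plus-translation transformation applied to an image follows; the function name, the nearest-neighbour interpolation, and zero-filling outside the frame are our assumptions, not the authors' implementation:

```python
import numpy as np

def affine_scale_translate(img, sx, sy, tx, ty):
    """Resample img under phi(x) = A x + b with A = diag(sx, sy), b = (tx, ty).

    Output pixel (i, j) is pulled from input location (sx*i + tx, sy*j + ty)
    using nearest-neighbour rounding; locations outside the image map to 0.
    """
    h, w = img.shape
    ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_i = np.rint(sx * ii + tx).astype(int)
    src_j = np.rint(sy * jj + ty).astype(int)
    inside = (src_i >= 0) & (src_i < h) & (src_j >= 0) & (src_j < w)
    out = np.zeros_like(img)
    out[inside] = img[src_i[inside], src_j[inside]]
    return out
```

For example, `affine_scale_translate(img, 1, 1, 1, 0)` shifts the image up by one row, and `sx = sy = 0.5` magnifies it by a factor of two about the origin.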

To summarize, the problem of 2-d image registration can be stated as follows [26]: given a distance measure $\mathcal{D}:Img{\left(d=2\right)}^{2}\to \mathbb{R}$ and two images $R,T\in \mathrm{Img}\left(d=2\right)$, find a mapping $\phi :{\mathbb{R}}^{d}\to {\mathbb{R}}^{d}$ and a mapping $g:\mathbb{R}\to \mathbb{R}$ such that $\mathcal{D}\left(R,g\left(T\left(\phi \left(x,y\right)\right)\right)\right)$ is minimized.

In this work, the distance measure (similarity measure) is pattern intensity, transformation is an affine transformation with 2-d images of fringe patterns, and $g\left(x\right)=x$.
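The minimization stated above can be sketched as follows. For brevity this sketch substitutes a sum-of-squared-differences distance for the pattern intensity measure, assumes an isotropic scale, and uses SciPy's Nelder-Mead optimizer; all of these are illustrative choices, not the authors' method:

```python
import numpy as np
from scipy.optimize import minimize

def warp(img, s, tx, ty):
    """Nearest-neighbour resampling under phi(x) = s*x + (tx, ty)."""
    h, w = img.shape
    ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    si = np.rint(s * ii + tx).astype(int)
    sj = np.rint(s * jj + ty).astype(int)
    ok = (si >= 0) & (si < h) & (sj >= 0) & (sj < w)
    out = np.zeros(img.shape)
    out[ok] = img[si[ok], sj[ok]]
    return out

def register(ref, tgt, x0=(1.0, 0.0, 0.0)):
    """Find (s, tx, ty) minimizing the SSD distance D(R, T(phi(x)))."""
    def cost(p):
        s, tx, ty = p
        return np.mean((ref - warp(tgt, s, tx, ty)) ** 2)
    return minimize(cost, x0, method="Nelder-Mead").x
```

In practice a derivative-free optimizer is a natural fit here because nearest-neighbour resampling makes the cost piecewise constant in the parameters.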

Now we present the simulation of the proposed technique, using image registration with the affine transformation and the pattern intensity distance measure.

Projection Moiré and a laser interferometer are the two systems simulated to evaluate the proposed technique. Figure 1 shows the schematics for the simulated systems. The projection Moiré system (Fig. 1.a) consists of a light source and a grating to generate and project a series of parallel lines with a specific spatial wavelength (the fringe pattern), a lens system to form an image of the parallel fringe pattern on the object, and a second lens system to image the distorted pattern reflected from the object onto a camera. In the laser interferometer (Fig. 1.b), the wavefront reflected from the object interferes with an undistorted reference wavefront, and the interference pattern is captured by a camera. The simulation was done in MATLAB. Parameters for the projection Moiré are as follows: the angle between the camera and the normal to the object is 20°, and the angle between the light source and the normal to the object is also 20°. The camera is 512 by 512 pixels. The pitch of the grating is 0.1 mm and the number of fringes at the lowest zoom level is 51.2.

The electronic noise is modeled as additive white Gaussian noise with an SNR of 20 dB. The optics for projecting and capturing the reflected light are telecentric lenses, i.e. the illuminating and reflected rays are parallel.
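A sketch of how additive white Gaussian noise at a prescribed SNR can be added to a simulated parallel-line fringe pattern (the original simulation was in MATLAB; the `awgn` helper and the cosine fringe model below are our own constructions using the stated parameters of 512 pixels and 51.2 fringes):

```python
import numpy as np

def awgn(signal, snr_db, rng=None):
    """Add white Gaussian noise at the requested SNR (in dB).

    Noise power is set from the mean signal power: SNR = P_sig / P_noise.
    """
    rng = np.random.default_rng(rng)
    p_sig = np.mean(signal ** 2)
    p_noise = p_sig / (10 ** (snr_db / 10))
    return signal + rng.normal(0.0, np.sqrt(p_noise), signal.shape)

# A parallel-line fringe pattern: 51.2 fringes across 512 pixels,
# as at the lowest zoom level of the simulation.
x = np.arange(512)
fringes = 0.5 * (1 + np.cos(2 * np.pi * 51.2 * x / 512))
pattern = np.tile(fringes, (512, 1))
noisy = awgn(pattern, snr_db=20, rng=0)
```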

The laser interferometer is simulated with the same camera as mentioned before. In addition, averaged exponential multiplicative speckle noise is added to the signal with a mean value equal to half the signal power. The laser wavelength is taken to be 633 nm, and additive white Gaussian noise with an SNR of 20 dB is added to simulate electronic noise.

The simulated object is taken to be a superposition of a rotationally symmetric cosine function with a period of 2 mm and an amplitude of 0.1 mm (representing the large-scale form of the object) and a higher-frequency structure with a peak-and-valley shape with a period of 0.01 mm and an amplitude of 0.001 mm (representing ROI surface features). Figure 2 shows the object at zoom levels of 4 and 128. Note that at the lowest magnification (zoom level of 4), only the cosine shape of the object is recognizable. When the magnification is increased (zoom level of 128), the fine surface features are clearly seen while the low-spatial-frequency cosine form is not.
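One possible realization of such a test object as a height map is sketched below; the text describes the fine structure only as a peak-and-valley shape, so modeling it as a second radial cosine is an assumption for illustration:

```python
import numpy as np

def object_height(x, y):
    """Height (mm) at lateral position (x, y) in mm: a 2 mm-period,
    0.1 mm-amplitude rotationally symmetric cosine form plus a
    0.01 mm-period, 0.001 mm-amplitude fine structure."""
    r = np.hypot(x, y)
    form = 0.1 * np.cos(2 * np.pi * r / 2.0)        # large-scale form
    feature = 0.001 * np.cos(2 * np.pi * r / 0.01)  # fine ROI features
    return form + feature
```

Evaluating this on a coarse grid shows only the 2 mm form, while a grid fine enough to resolve the 0.01 mm period reveals the ROI features, mirroring the two zoom levels of Fig. 2.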

With this defined object, simulations have been carried out for both the projection Moiré and the laser interferometer at different levels of zoom. Figure 3 shows the simulated fringe patterns from the Moiré system at zoom levels of 1, 2, 8 and 16 and interferograms from the laser interferometer at zoom levels of 64, 128, 256 and 512.

For all simulated fringe patterns, we have assumed no translation error between fringe patterns at different zoom levels, without loss of generality; i.e. all fringe patterns have the same center in both the X and Y directions at each zoom level (there is no drift of the system's optical axis relative to the object). The goal is to find the amount of zoom (scale) and the relative coordinates (translation) in both the X and Y directions; the calculated translation should therefore nominally be zero.

Figure 3 shows the results of image registration for the Moiré system and the laser interferometer, respectively. Simulation results are summarized in Table 1. The error observed in the final results is dominated by the noise present in the fringe patterns, either electronic noise (projection Moiré) or both electronic and speckle noise (laser interferometer).

Since the camera has 512 pixels in each direction, 0.1% error in scale equals approximately half a pixel while 0.2% error corresponds to approximately one pixel error. Therefore, for fringes from the laser interferometer, the error is within one pixel and for the Moiré system the error is about one pixel. Hence, based on simulation results the proposed technique can find the scale and translation of different fringe patterns within one pixel.
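The pixel equivalents of the scale errors quoted above follow from multiplying the fractional scale error by the frame size:

```python
# Relation used above: a fractional scale error eps corresponds to a
# positional error of eps * N pixels across an N-pixel camera frame.
N = 512
err_half = 0.001 * N  # 0.1 % scale error -> about half a pixel (0.512)
err_one = 0.002 * N   # 0.2 % scale error -> about one pixel (1.024)
```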

## 3. Experiment

Figure 4 shows the experimental setup for the measurement system. The system is a combined projection Moiré and laser interferometer and is the combination of schematics of Fig. 1.a and 1.b. The zoom level is determined by the combination of two zoom lenses. For the projection Moiré, a projector is used as a combined light source and grating. The laser interferometer is a Michelson interferometer with a HeNe laser source.

Projection Moiré is used for the coarse (form) measurements, i.e. low levels of zoom, while the system switches to the laser interferometer for fine measurements, i.e. high levels of zoom.

The object is a coated array of solder bumps on a substrate. As mentioned before, projection Moiré is used to measure the form of the object, and the laser interferometer is used to measure the fine surface features. The fine-feature measurement covers only the ROIs, not the whole surface of the object; in this case, the ROIs are selected individual solder bumps. The measurement procedure is therefore as follows: first, the form of the object is acquired; second, the form measurement is used to identify the ROIs; and finally, the relative locations and features of the ROIs are measured with high resolution. The details of converting the fringe patterns into phase maps and object profiles are not discussed here, as they are not the focus of this work; these procedures are well documented [2,4,8]. Our focus is to show that the proposed technique can be used to find the relative location and magnification of the fringe patterns at different levels of zoom.

Figure 5 shows the fringe patterns acquired from the projection Moiré and the laser interferometer at different levels of zoom. Our image registration technique has been applied to these fringe patterns to calculate the parameters required to find the relative scale and the relative coordinates at each zoom level. Figure 6 shows the results of image registration. The results show that the technique can relate the location of the fringe pattern at the highest level of zoom to that at the lowest level of zoom, both within the fringe patterns of the projection Moiré and the laser interferometer individually and spanning the range from the Moiré to the laser interferometer measurements (spanning from the projection Moiré to the laser interferometer is explained later in this section). Since this can be done for all fringe patterns at the highest level of zoom, the relative location of the fringe pattern at the highest level of zoom can be determined with respect to a reference point at the lowest level of zoom. Table 2 shows the parameters calculated with the proposed technique.

Since the change of zoom level is not automated, the experimentally known zoom values are only approximate. Our technique determines the magnifications precisely, as presented in Table 2. The shifts in X and Y between the different fringe patterns are also determined by the proposed technique; the nonzero shifts in both directions are dominated by errors in the optical setup and the alignment of the optical elements and the object. The relationship between the fringe pattern location from the projection Moiré and the interferogram location from the laser interferometer can be determined if, first, a single camera is used for both the projection Moiré and the laser interferometer and, second, the zoom level does not change when switching from the projection Moiré to the laser interferometer, i.e. the camera captures the same part of the object. Otherwise, the relative location can be found by first converting the fringe patterns into height profiles and then finding the relative location from the height profiles calculated from the projection Moiré and the laser interferometer.
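Once the per-level scale and shift parameters are known, a pixel at the highest zoom can be expressed in the coordinates of the lowest-zoom reference image by composing the registrations level by level. The chain convention below (divide by the relative scale, then add the window offset) is a hypothetical sketch of this bookkeeping, not the authors' formulation:

```python
def to_reference(px, py, chain):
    """Map pixel (px, py) at the highest zoom to reference coordinates.

    `chain` lists (scale, tx, ty) for each registration step from the
    highest zoom down to the reference image; each step maps coordinates
    in the finer frame into the next coarser frame via
    x_coarse = x_fine / scale + t.
    """
    x, y = float(px), float(py)
    for scale, tx, ty in chain:
        x = x / scale + tx
        y = y / scale + ty
    return x, y
```

As a sanity check under this convention, a 2x zoom centered on a 512-pixel frame has offset (128, 128), and the frame center (256, 256) maps back onto itself.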

Figure 7 shows the overall process for a solder bump array when there is an overlap between the projection Moiré and laser interferometer measurements. Since there is overlap between the fringe patterns acquired from the projection Moiré and the laser interferometer at zoom level 8, those two sets of registered fringe patterns can be combined to produce a continuous image from zoom level 1 to 48 (Fig. 7(c)).

As mentioned before, applying image registration to the fringe patterns, rather than to the phase map or object profile, yields more precise scale and location estimates. The primary reason is that registration based on the fringe patterns does not include errors that might arise from the phase map or object profile calculation.

## 4. Conclusion

We have proposed a technique to generate a continuous image from different fringe patterns at different levels of magnification (zoom) and within the same level of zoom. The technique is a combination of multiscale measurement and stitching, namely measuring data at different levels of zoom for regions of interest (ROIs) and then stitching the data onto a common surface coordinate system based on image registration techniques. Simulation results show that the proposed technique is capable of stitching and registering data to within one pixel. Experiments have been conducted to demonstrate the applicability to fringe patterns obtained from Moiré systems and laser interferometers. The proposed technique can be applied to other multiscale measurement instruments. Calculations for the error analysis of the proposed technique have been presented.

Results can be improved by using noise reduction techniques before applying the proposed technique and using adaptive image registration appropriate for a specific measurement instrument. The authors are investigating different transformations for fast image registration for different fringe patterns and instruments.

## Acknowledgments

This project is supported by a grant from the Industrial Affiliates Program of the Center for Precision Metrology, University of North Carolina at Charlotte.

## References and links

**1. **L. M. Barker and R. E. Hollenbach, “Laser interferometer for measuring high velocities of any reflecting surface,” J. Appl. Phys. **43**(11), 4669–4675 (1972). [CrossRef]

**2. **K. Creath, J. Schmit, and C. Wyant, “Optical Metrology of Diffuse Surfaces,” in *Optical Shop Testing* D. Malacara (John Wiley & Sons, Inc. Publications, 2007), pp. 756–807.

**3. **I. Yamaguchi and T. Zhang, “Phase-shifting digital holography,” Opt. Lett. **22**(16), 1268–1270 (1997). [CrossRef]

**4. **M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. **72**(1), 156–160 (1982). [CrossRef]

**5. **R. J. Sanford and J. W. Dally, “A general method for determining mixed-mode stress intensity factors from isochromatic fringe patterns,” Eng. Fract. Mech. **11**(4), 621–633 (1979). [CrossRef]

**6. **A. E. Ennos, “Measurement of in-plane surface strain by hologram interferometry,” J. Phys. E Sci. Instrum. **1**(7), 731–734 (1968). [CrossRef]

**7. **Q. Kemao, “Two-dimensional windowed Fourier transform for fringe pattern analysis: Principles, applications and implementations,” Opt. Lasers Eng. **45**(2), 304–317 (2007). [CrossRef]

**8. **W. W. Macy Jr., “Two-dimensional fringe-pattern analysis,” Appl. Opt. **22**(23), 3898–3901 (1983). [CrossRef]

**9. **M. Abolbashari, A. S. Gerges, A. Davies, and F. Farahi, “Image continuity at different levels of zoom for moiré techniques,” Proc. SPIE **7790**, 77900S (2010). [CrossRef]

**10. **G. E. Sommargren, “A new laser measurement system for precision metrology,” Precis. Eng. **9**(4), 179–184 (1987). [CrossRef]

**11. **M. Tanaka, T. Yamagami, and K. Nakayama, “Linear interpolation of periodic error in a heterodyne laser interferometer at subnanometer levels,” IEEE Trans. Instrum. Meas. **38**(2), 552–554 (1989). [CrossRef]

**12. **C. M. Wu, “Heterodyne interferometric system with subnanometer accuracy for measurement of straightness,” Appl. Opt. **43**(19), 3812–3816 (2004). [CrossRef]

**13. **K. J. Gåsvik, *Optical Metrology* (John Wiley & Sons, Ltd, 2002).

**14. **C. A. Sciammarella, “The Moiré Method—A Review,” Exp. Mech. **19**, 418–422 (1979).

**15. **C. A. Walker, “A historical review of moiré interferometry,” Exp. Mech. **34**(4), 281–299 (1994). [CrossRef]

**16. **F. P. Chiang, “Moiré methods of strain analysis,” Exp. Mech. **22**, 290–308 (1982).

**17. **J. A. N. Buytaert and J. J. J. Dirckx, “Moiré profilometry using liquid crystals for projection and demodulation,” Opt. Express **16**(1), 179–193 (2008). [CrossRef]

**18. **L. H. Jin, Y. Otani, and T. Yoshizawa, “Shadow moiré profilometry by frequency sweeping,” Opt. Eng. **40**(7), 1383 (2001). [CrossRef]

**19. **Z. Xu, H. K. Taylor, D. S. Boning, S. F. Yoon, and K. Youcef-Toumi, “Large-area and high-resolution distortion measurement based on Moiré fringe method for hot embossing process,” Opt. Express **17**(21), 18394–18407 (2009). [CrossRef]

**20. **G. N. de Oliveira, M. E. de Oliveira, and P. A. M. dos Santos, “Dynamic moiré patterns for profilometry applications,” J. Phys.: Conf. Ser. **274**, 012036 (2011). [CrossRef]

**21. **A. A. Goshtasby, *2-D and 3-D Image Registration: for Medical, Remote Sensing, and Industrial Applications* (John Wiley & Sons, Inc. Publications, 2005).

**22. **G. P. Penney, J. Weese, J. A. Little, P. Desmedt, D. L. G. Hill, and D. J. Hawkes, “A comparison of similarity measures for use in 2-D-3-D medical image registration,” IEEE Trans. Med. Imaging **17**(4), 586–595 (1998). [CrossRef]

**23. **S. Periaswamy and H. Farid, “Medical image registration with partial data,” Med. Image Anal. **10**(3), 452–464 (2006). [CrossRef]

**24. **J. Salvi, C. Matabosch, D. Fofi, and J. Forest, “A review of recent range image registration methods with accuracy evaluation,” Image Vis. Comput. **25**(5), 578–596 (2007). [CrossRef]

**25. **B. Zitová and J. Flusser, “Image registration methods: a survey,” Image Vis. Comput. **21**(11), 977–1000 (2003). [CrossRef]

**26. **J. Modersitzki, *Numerical Methods for Image Registration* (Oxford University Press, 2004).