Abstract

As the packaging density of high-power LED (light-emitting diode) chip modules keeps increasing, correspondingly higher demands are placed on the accuracy and speed of visual inspection. Accurate position matching between substrates and flip-chip LEDs is one of the key technologies in the automatic eutectic welding process. In this paper we propose a method based on image features to match the substrates with the flip-chip LEDs. First, the substrate images and the flip-chip images are pre-processed to obtain binary images. Second, the Hough transform is applied to detect straight lines in the binary images, and the main linear directions are computed to trigger the mechanical arms to make an initial adjustment of the substrate and chip positions. Third, an eight-neighborhood connected-domain algorithm is used to locate the salient features of the substrate, and the position information is passed to the control system so the mechanical arm can adjust the substrate a second time; in parallel, a projection algorithm locates the anode of the flip-chip to drive a second adjustment of the chip. Finally, the position information triggers the mechanical arm to complete the matching of the substrate and the flip-chip. The proposed method improves matching speed without sacrificing matching accuracy, meeting the requirements of real-time, high-accuracy applications.

© 2014 Optical Society of America

1. Introduction

With its high luminous efficacy, energy savings, long lifetime and environmental friendliness, the LED is regarded as a new energy-efficient light source for the 21st century. LEDs are now widely used in general lighting, displays, landscape lighting, vehicle headlamps and other fields [1]. Packaging is one of the most important steps in turning an LED from a semiconductor chip into a final product. China is a major LED packaging country, and an estimated 80% of LED products are packaged there. High-power LEDs are regarded as the core of future semiconductor lighting and, owing to their high luminous efficacy and long lifetime, are known as the fourth-generation light source.

According to their structure, LED chips currently fall into three types: horizontal chips, vertical chips and flip chips [2]. For the horizontal chips that dominate the market, packaging mainly consists of two stages: die bonding followed by wire bonding. Most die-bonding and wire-bonding equipment uses machine vision to locate the chip and the substrate, usually relying on template matching to align them. Flip chips offer small size, high luminous efficacy, high density and good thermal performance [2–6]; consequently, they are expected to be widely used in future high-power LED packaging. Both electrodes of a flip-chip LED are on its bottom surface, so the chip can be attached directly to an electrically conductive backing plate without additional wiring. Conventional welding technology is therefore no longer suitable for the flip chip, and eutectic welding is regarded as the new bonding technology for flip chips in the LED packaging industry. At present only a few equipment manufacturers, such as ASM [7,8], can supply eutectic welding machines. ASM provides two types of such machines, neither of which requires wiring: one uses flux for die attachment followed by reflow soldering, while in the other die bonding and welding are completed simultaneously [8]. These machines are expensive, costing up to 300,000 dollars per set. To the best of our knowledge, little has been reported in the literature on eutectic welding for flip-chip LED packaging.

The anode and cathode of a high-power flip-chip LED are extremely close together, as shown in Fig. 1. The chip is a GaN LED, model S-45ABFUP, made by Sanan Corporation. Its physical size is 45 mils by 45 mils, with only 75 μm between the two poles. In the eutectic welding process the LED chips are attached directly to the substrate, so even a slight deviation in the relative position of the ceramic substrate and the LED chip can cause a short-circuit fault. High-precision matching of the ceramic substrate and the flip-chip LED is also essential to achieve the highest thermal efficiency. Meanwhile, in existing automatic matching processes the matching speed and accuracy are still unsatisfactory: for ASM's eutectic die attachment the cycle time is about 450 ms and the positional accuracy of pattern recognition is about 1/4 pixel [7,8]. Both leave room for improvement.

 

Fig. 1 (a) Top surface of the LED chip; (b) Bottom surface of the LED chip; and (c) the physical size of the LED chip.


Generally, image matching methods can be roughly divided into three categories: matching based on cross-correlation [9], matching based on the transform domain [10], and matching based on features [11]. Compared with the first two, feature-based matching requires less computation and is more robust [12]. This paper therefore adopts an approach based on geometric features to match the ceramic substrate and the flip-chip LED. Thanks to its low complexity and easy implementation, the connected-domain algorithm is widely applied in character segmentation [13] and license plate recognition [14]. Exploiting the characteristics of the ceramic substrate, we use the connected-domain method to locate its salient features. Based on the features of the flip-chip LED, we use a gray-projection algorithm to locate the positions of its four vertices. Finally, the position information is passed to the control system to trigger the mechanical arms to complete the matching of the substrate and the flip-chip. The approach performs well in both computational cost and matching speed, and meets the requirements of real-time, high-accuracy applications.

2. Proposed matching methods

2.1 The whole matching process

The proposed approach to matching the substrate and the flip-chip LED can be divided into four steps. First, the substrate image and the flip-chip image are captured by a CCD and pre-processed into binary images. Second, the Hough transform [15] is applied to detect the straight lines in the binary images, and the main directions of these lines are computed to trigger the mechanical arms to make an initial adjustment of the substrate and the chip. Third, the eight-neighborhood connected-domain algorithm is used to locate the two holes on the substrate, and the holes' position information is passed to the control system to trigger the mechanical arms to adjust the substrate again; at the same time, the projection algorithm is applied to locate the anode of the flip-chip and drive a second adjustment of the chip. Finally, the located holes' positions on the substrate and the located positions of the flip-chip's four vertices are sent back to the control system, and the mechanical arms are triggered to complete the matching of the substrate and the chip. The entire matching process is shown in Fig. 2.

 

Fig. 2 The matching flow of the substrate and the flip-chip LED.


2.2 The principles of eight-neighborhood interconnected domain and projection algorithm

(1) Principle of eight-neighborhood interconnected

Considering the trade-off between complexity and accuracy, we choose the eight-neighborhood connected-domain algorithm, defined as follows. As shown in Fig. 3(a), the region consisting of the eight neighbors of a point (x, y) is called its eight-neighborhood; it contains the points (x-1, y-1), (x, y-1), (x+1, y-1), (x-1, y), (x+1, y), (x-1, y+1), (x, y+1) and (x+1, y+1). As shown in Fig. 3(b), any point in the eight-neighborhood whose gray value equals that of the point m is considered part of the eight-connected domain of m.
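The labeling step this principle supports can be sketched as a breadth-first flood fill over the eight neighbors; the code below is a minimal illustration (the function and variable names are ours, not the paper's):

```python
from collections import deque

# The eight neighbors of a point (x, y), as in Fig. 3(a).
OFFSETS = [(-1, -1), (0, -1), (1, -1), (-1, 0),
           (1, 0), (-1, 1), (0, 1), (1, 1)]

def label_components(img):
    """8-connected labeling of a binary image (list of lists of 0/1).
    Returns a label map and the number of foreground components."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] == 1 and labels[y][x] == 0:
                current += 1                      # start a new component
                queue = deque([(x, y)])
                labels[y][x] = current
                while queue:                      # flood fill its domain
                    cx, cy = queue.popleft()
                    for dx, dy in OFFSETS:
                        nx, ny = cx + dx, cy + dy
                        if (0 <= nx < w and 0 <= ny < h
                                and img[ny][nx] == 1 and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((nx, ny))
    return labels, current
```

Note that two pixels touching only diagonally belong to one component under eight-connectivity, which is exactly what distinguishes it from the four-neighborhood variant.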

 

Fig. 3 (a) Eight neighbors; (b) Eight connected domain.


(2) Principle of projection algorithm of binary images

Assume the white points represent the foreground and the black points the background. A projection image is obtained by counting the white points of the binary image along a given direction. Let I(x, y) be the intensity of point (x, y) in a binary image of width W and height H:

$$I(x,y)=\begin{cases}1, & \text{if } (x,y) \text{ is a white point}\\ 0, & \text{if } (x,y) \text{ is a black point.}\end{cases}\tag{1}$$

Projection can be carried out along any direction, but mainly along the horizontal or the vertical one. The horizontal and vertical projection images are obtained as follows.

The horizontal projection H(x) is expressed as

$$H(x)=\sum_{y=1}^{H} I(x,y), \qquad 0\le x\le W. \tag{2}$$

The vertical projection V(y) is expressed as

$$V(y)=\sum_{x=1}^{W} I(x,y), \qquad 0\le y\le H. \tag{3}$$
From Eqs. (2) and (3) we see that projection effectively converts the two-dimensional function of the original image into one-dimensional functions.
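With the image stored as a NumPy array of 0s and 1s, the two projections of Eqs. (2) and (3) reduce to column and row sums; a minimal sketch (names are ours):

```python
import numpy as np

def projections(binary):
    """Return (H, V) for a 0/1 binary image array:
    H[x] counts the white points in column x, V[y] those in row y."""
    binary = np.asarray(binary)
    H = binary.sum(axis=0)   # H(x) = sum over y of I(x, y), Eq. (2)
    V = binary.sum(axis=1)   # V(y) = sum over x of I(x, y), Eq. (3)
    return H, V
```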

2.3 Preprocessing

We put forward a method based on image features to match the substrate with the flip-chip LED. The images of the flip-chip and the substrate are captured by a CCD camera. First, the color substrate and flip-chip images are converted to gray images. The gray images are then filtered to remove noise, using a 3 × 3 window for the substrate and a 5 × 5 window for the flip-chip. To extract the image contours, edge detection is applied to the filtered images, which reduces the amount of computation and facilitates subsequent processing. The Canny operator offers a good signal-to-noise ratio (SNR) and accurate detection, and strikes a good balance between noise removal and edge-detail preservation [16]; we therefore choose it to extract the image edges, with high and low thresholds of 150 and 50 respectively. Finally, the edge image is binarized. The preprocessing flow is shown in Fig. 4.
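As an illustration of this pipeline, the sketch below substitutes a plain gradient-magnitude threshold for the full Canny operator (omitting non-maximum suppression and the 150/50 hysteresis, which do not change the overall flow); all names are ours, and a real implementation would call a proper Canny routine such as OpenCV's:

```python
import numpy as np

def preprocess(rgb, thresh=150):
    """Color -> gray -> 3x3 mean filter -> gradient magnitude -> 0/1 map.
    A simplified stand-in for the paper's filter + Canny + binarize chain."""
    gray = np.asarray(rgb, float) @ np.array([0.299, 0.587, 0.114])
    p = np.pad(gray, 1, mode='edge')           # replicate-pad for filtering
    h, w = gray.shape
    smooth = sum(p[i:i + h, j:j + w]           # 3x3 mean filter
                 for i in range(3) for j in range(3)) / 9.0
    gy, gx = np.gradient(smooth)               # gradients along rows, cols
    mag = np.hypot(gx, gy)                     # gradient magnitude
    # a true Canny would add non-maximum suppression and hysteresis here
    return (mag >= thresh).astype(np.uint8)
```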

 

Fig. 4 The flow chart of preprocessing.


2.4 Calibration of the substrate and the flip-chip LED

Since the images of the substrate and the flip-chip are generally tilted before the correction stage, they must be adjusted before matching. The calibration process consists of the following steps:

  1. Detect the straight lines in the binary substrate image and the binary chip image by the Hough transform; from these lines determine the main linear direction, which triggers the mechanical arms to make the initial adjustment of the substrate and the chip.
  2. Apply eight-neighborhood interconnected domain algorithm to locate the two holes on the substrate, and drive the mechanical arms to correct the position of the substrate again.
  3. Apply projection algorithm to locate the anode of the flip-chip, and trigger the mechanical arm to correct the position of the flip-chip again.

Since the features of the substrate and the flip-chip differ greatly, the filters in step (1) use windows of different sizes. In step (1), determining the main direction of the straight lines involves two procedures. First, calculate the slope of each line L in the binary substrate/chip image and its angle α with the horizontal direction. Second, divide π into n intervals (here n = 120) and assign each line to an interval according to |α|. We count the lines in each interval, identify the interval containing the maximum number of lines, and then compute the average angle ᾱ of the lines in that interval. ᾱ is taken as the main direction of the straight lines and is used to drive the mechanical arm to make the initial adjustment of the substrate or the flip-chip.
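The main-direction procedure just described (bin the Hough line angles into n = 120 intervals over π, pick the densest bin, and average it) can be sketched as follows; the function name is ours:

```python
import math

def main_direction(angles, n=120):
    """Given line angles in radians, fold them into [0, pi), bin into n
    equal intervals, and return the mean angle of the densest bin."""
    width = math.pi / n
    bins = [[] for _ in range(n)]
    for a in angles:
        a = a % math.pi                        # fold angle into [0, pi)
        bins[min(int(a / width), n - 1)].append(a)
    densest = max(bins, key=len)               # interval with most lines
    return sum(densest) / len(densest)         # average angle = main direction
```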

In step 2, to improve the matching rate, we downscale the substrate image to half the size captured by the CCD, which reduces complexity and accelerates matching. The eight-neighborhood connected-domain algorithm is then applied to detect all connected domains in the substrate image and determine their external rectangles. Using the geometric characteristics of the external rectangles, such as the width-to-height ratio and the area, the four vertices and the two holes on the substrate are located and the center positions of the two holes are found. If the vertical coordinates of the two holes are greater than half the image height, the substrate is rotated by 180 degrees to correct an upside-down orientation.
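The geometric filtering of external rectangles might look like the sketch below; the label map is assumed to come from the eight-neighborhood labeling step, and the area and ratio limits are illustrative values, not the paper's:

```python
import numpy as np

def locate_holes(labels, min_area=20, max_area=400, max_ratio=1.5):
    """From a labeled component map, keep the components whose external
    (bounding) rectangle looks hole-like and return their centers."""
    labels = np.asarray(labels)
    centers = []
    for lab in range(1, labels.max() + 1):
        ys, xs = np.nonzero(labels == lab)
        w = xs.max() - xs.min() + 1            # external rectangle width
        h = ys.max() - ys.min() + 1            # external rectangle height
        ratio = max(w, h) / min(w, h)          # near 1 for the round holes
        if min_area <= w * h <= max_area and ratio <= max_ratio:
            centers.append(((xs.min() + xs.max()) / 2.0,
                            (ys.min() + ys.max()) / 2.0))
    return centers
```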

In step 3, the projection algorithm is used to compute the horizontal and the vertical projection of the rotation-corrected flip-chip image. The coordinates of the chip's four vertices are then computed from the two projection images. Find the two peaks nearest the upper and the lower boundary lines in the horizontal projection image and compute their distances d1 and d2; likewise, find the two peaks nearest the left and the right boundary lines in the vertical projection image and compute their distances d3 and d4. Compare the four distances and take the maximum; the corresponding boundary is identified as the anode side of the flip-chip. Finally, the mechanical arm rotates the flip-chip so that the anode is on the lower side.
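A sketch of this anode rule under one plausible reading (d1, d2 measured from the outermost peaks of one projection to the top/bottom ends, d3, d4 from the other projection to the left/right ends); the peak-detection details and side naming are our assumptions:

```python
import numpy as np

def boundary_peak_distances(profile):
    """Distances from the outermost local maxima of a 1-D projection
    profile to its two ends."""
    p = np.asarray(profile)
    peaks = [i for i in range(1, len(p) - 1)
             if p[i] >= p[i - 1] and p[i] > p[i + 1]]
    return peaks[0], len(p) - 1 - peaks[-1]

def anode_side(horiz_proj, vert_proj):
    """Name the boundary whose nearest projection peak is farthest away."""
    d1, d2 = boundary_peak_distances(horiz_proj)   # upper / lower boundary
    d3, d4 = boundary_peak_distances(vert_proj)    # left / right boundary
    return ['top', 'bottom', 'left', 'right'][int(np.argmax([d1, d2, d3, d4]))]
```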

2.5 The matching of the substrate and the flip-chip

The positions of the flip-chip's four vertices are located by the projection algorithm. From the two holes located on the substrate as described above, we obtain their positions with integer-pixel precision. We then fit a surface around each integer-pixel position and differentiate it to obtain positions with sub-pixel precision. The specific position information of the substrate and the chip is passed to the control system, and the mechanical arm moves the flip-chip to the corresponding position on the substrate. The matching of the substrate and the chip is finished when the flip-chip sits on the substrate.
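One standard way to realize such surface-fit refinement is to fit a quadratic surface to the 3 × 3 neighborhood of the integer-pixel peak and solve for its stationary point; the sketch below illustrates that idea and is not necessarily the authors' exact fit:

```python
import numpy as np

def subpixel_peak(patch):
    """patch: 3x3 response values centered on the integer-pixel peak.
    Fit z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 and return the
    (dx, dy) offset of the fitted extremum from the center pixel."""
    ys, xs = np.mgrid[-1:2, -1:2]
    A = np.column_stack([np.ones(9), xs.ravel(), ys.ravel(),
                         xs.ravel() ** 2, (xs * ys).ravel(), ys.ravel() ** 2])
    a, b, c, d, e, f = np.linalg.lstsq(
        A, np.asarray(patch, float).ravel(), rcond=None)[0]
    # setting the surface gradient to zero: [2d e; e 2f][dx dy]^T = -[b c]^T
    dx, dy = np.linalg.solve([[2 * d, e], [e, 2 * f]], [-b, -c])
    return dx, dy
```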

3. Experiment results

We carried out experiments to verify the feasibility of the above methods. Experimental conditions: a computer with a 2.9 GHz Intel(R) Core(TM) i7-3520M CPU and 4 GB of memory. The proposed methods were implemented in C++ under Visual Studio 2010.

The color image of the LED chip is shown in Fig. 1 and that of the substrate in Fig. 5; both have a resolution of 640 × 480 pixels. Following the proposed algorithms, the color images are converted to gray images and filtered, as shown in Figs. 6 and 7 respectively. The Canny operator is used to detect the image edges, as shown in Fig. 8. The Hough transform is then used to detect the straight lines in the binary substrate and LED images; the results are shown in Fig. 9. From these we determine the angles between the main straight line and the horizontal line and first trigger the mechanism to rotate the chip and the substrate, obtaining the results shown in Fig. 10.

 

Fig. 5 Color image of the input substrate.


 

Fig. 6 Gray images of (a) the substrate; (b) the LED chip.


 

Fig. 7 Filtered results of gray images of (a) the substrate; (b) LED chip.


 

Fig. 8 Binarized images of (a) the substrate; (b) the LED chip.


 

Fig. 9 Hough Transformation of (a) binary image of the substrate; (b) binary image of the LED chip.


 

Fig. 10 First rotation correction of (a) the substrate and (b) the LED chip.


Then we calculate the positions of the two holes in the substrate to ensure they are in the top half of the image. If they are in the bottom half, as shown in Fig. 11, the substrate is rotated by 180 degrees so that the two holes end up in the top half.

 

Fig. 11 External rectangle of connected domain in the substrate image.


The projection algorithm is applied to the binarized version of the first rotation-corrected LED chip image (Fig. 10(b)), as shown in Fig. 12. Calculating the distances between the peak positions and their nearest edges gives d1 = 22, d2 = 23, d3 = 52 and d4 = 15. Since d3 is the maximum, the edge nearest to the corresponding peak is the anode side. Finally the LED chip is matched to the substrate, as shown in Fig. 13.

 

Fig. 12 The projection images of the binarized Fig. 10(b): (a) horizontal projection; (b) vertical projection.


 

Fig. 13 The matching result of the substrate and the flip-chip LED.


The average time of the entire matching process is about 110 ms and the matching accuracy is 1/8 pixel, indicating that the proposed method satisfies the requirements of real-time, high-accuracy applications.

4. Conclusions

We have completed the matching of the substrate and the flip-chip LED based on image features. Compared with the conventional matching process, the method reduces complexity and improves matching speed while maintaining matching accuracy. The experiments show that the approach achieves fast and efficient matching of the substrate and the flip-chip in the eutectic welding process, enabling matching in LED eutectic welding to meet the requirements of real-time, high-accuracy applications.

Acknowledgments

This work was supported by the Key Technologies R&D Program of Guangdong Province (Nos. 2009A080301013, 2010A080402009), the Strategic Emerging Industry Special Funds of Guangdong Province (No. 2012A080304015), the Key Technologies R&D Program of Guangzhou City (Nos. 2010U1-D00221, 2011Y5-00006), and the Fundamental Research Funds for the Central Universities of China (SCUT-2013ZM0092).

References and links

1. S. Pimputkar, J. S. Speck, S. P. DenBaars, and S. Nakamura, “Prospects for LED lighting,” Nat. Photonics 3(4), 180–182 (2009). [CrossRef]  

2. J. J. Wierer, D. A. Steigerwald, M. R. Krames, J. J. O’Shea, M. J. Ludowise, G. Christenson, Y.-C. Shen, C. Lowery, P. S. Martin, S. Subramanya, W. Götz, N. F. Gardner, R. S. Kern, and S. A. Stockman, “High-power AlGaInN flip-chip light-emitting diodes,” Appl. Phys. Lett. 78(22), 3379–3381 (2001). [CrossRef]  

3. O. B. Shchekin, J. E. Epler, T. A. Trottier, T. Margalith, D. A. Steigerwald, M. O. Holcomb, P. S. Martin, and M. R. Krames, “High performance thin-film flip-chip InGaN–GaN light-emitting diodes,” Appl. Phys. Lett. 89(7), 071109 (2006). [CrossRef]  

4. S. J. Chang, W. S. Chen, S. C. Shei, C. T. Kuo, T. K. Ko, C. F. Shen, J. M. Tsai, W. C. Lai, J. K. Sheu, and A. J. Lin, “High-brightness InGaN–GaN power flip-chip LEDs,” J. Lightwave Technol. 27(12), 1985–1989 (2009).

5. T. X. Lee, K. F. Gao, W. T. Chien, and C. C. Sun, “Light extraction analysis of GaN-based light-emitting diodes with surface texture and/or patterned substrate,” Opt. Express 15(11), 6670–6676 (2007). [CrossRef]   [PubMed]  

6. C. C. Wang, W. R. Yang, J. J. Chen, W. W. Shih, I. J. Wang, T. Y. Guo, and K. L. Huang, “Optical and Thermal Analysis for a Modified Flip-Chip Light Emitting Diode,” in Asia Optical Fiber Communication and Optoelectronic Exposition and Conference (Shanghai, China, 2008), SAK4.

7. ASM, AD838L, http://www.asmpacific.com/asmpt/products_diebond_ad838l.htm

8. ASM, AD211, http://www.asmpacific.com/asmpt/products_diebond_ad211.html

9. B. Pan and K. Li, “A fast digital image correlation method for deformation measurement,” Opt. Lasers Eng. 49(7), 841–847 (2011). [CrossRef]  

10. B. S. Reddy and B. N. Chatterji, “An FFT-based technique for translation, rotation, and scale-invariant image registration,” IEEE Trans. Image Process. 5(8), 1266–1271 (1996).

11. B. Zitová and J. Flusser, “Image registration methods: a survey,” Image Vis. Comput. 21(11), 977–1000 (2003). [CrossRef]  

12. D. G. Lowe, “Distinctive image features from scale-invariant keypoints,” Int. J. Comput. Vis. 60(2), 91–110 (2004). [CrossRef]  

13. K. Jung, K. I. Kim, and A. K. Jain, “Text information extraction in images and video: a survey,” Pattern Recognit. 37(5), 977–997 (2004). [CrossRef]  

14. C. N. E. Anagnostopoulos, I. E. Anagnostopoulos, I. D. Psoroulas, V. Loumos, and E. Kayafas, “License plate recognition from still images and video sequences: A survey,” IEEE Trans. Intell. Transp. Syst. 9(3), 377–391 (2008). [CrossRef]  

15. R. O. Duda and P. E. Hart, “Use of the Hough transformation to detect lines and curves in pictures,” Commun. ACM 15(1), 11–15 (1972). [CrossRef]  

16. J. Canny, “A computational approach to edge detection,” IEEE Trans. Pattern Anal. Mach. Intell. 8(6), 679–698 (1986). [CrossRef]   [PubMed]  


Shchekin, O. B.

O. B. Shchekin, J. E. Epler, T. A. Trottier, T. Margalith, D. A. Steigerwald, M. O. Holcomb, P. S. Martin, M. R. Krames, “High performance thin-film flip-chip InGaN–GaN light-emitting diodes,” Appl. Phys. Lett. 89(7), 071109(2006).
[CrossRef]

Shei, S. C.

Shen, C. F.

Shen, Y.-C.

J. J. Wierer, D. A. Steigerwald, M. R. Krames, J. J. O’Shea, M. J. Ludowise, G. Christenson, Y.-C. Shen, C. Lowery, P. S. Martin, S. Subramanya, W. Götz, N. F. Gardner, R. S. Kern, S. A. Stockman, “High-power AlGaInN flip-chip light-emitting diodes,” Appl. Phys. Lett. 78(22), 3379–3381 (2001).
[CrossRef]

Sheu, J. K.

Speck, J. S.

S. Pimputkar, J. S. Speck, S. P. DenBaars, S. Nakamura, “Prospects for LED lighting,” Nat. Photonics 3(4), 180–182 (2009).
[CrossRef]

Steigerwald, D. A.

O. B. Shchekin, J. E. Epler, T. A. Trottier, T. Margalith, D. A. Steigerwald, M. O. Holcomb, P. S. Martin, M. R. Krames, “High performance thin-film flip-chip InGaN–GaN light-emitting diodes,” Appl. Phys. Lett. 89(7), 071109(2006).
[CrossRef]

J. J. Wierer, D. A. Steigerwald, M. R. Krames, J. J. O’Shea, M. J. Ludowise, G. Christenson, Y.-C. Shen, C. Lowery, P. S. Martin, S. Subramanya, W. Götz, N. F. Gardner, R. S. Kern, S. A. Stockman, “High-power AlGaInN flip-chip light-emitting diodes,” Appl. Phys. Lett. 78(22), 3379–3381 (2001).
[CrossRef]

Stockman, S. A.

J. J. Wierer, D. A. Steigerwald, M. R. Krames, J. J. O’Shea, M. J. Ludowise, G. Christenson, Y.-C. Shen, C. Lowery, P. S. Martin, S. Subramanya, W. Götz, N. F. Gardner, R. S. Kern, S. A. Stockman, “High-power AlGaInN flip-chip light-emitting diodes,” Appl. Phys. Lett. 78(22), 3379–3381 (2001).
[CrossRef]

Subramanya, S.

J. J. Wierer, D. A. Steigerwald, M. R. Krames, J. J. O’Shea, M. J. Ludowise, G. Christenson, Y.-C. Shen, C. Lowery, P. S. Martin, S. Subramanya, W. Götz, N. F. Gardner, R. S. Kern, S. A. Stockman, “High-power AlGaInN flip-chip light-emitting diodes,” Appl. Phys. Lett. 78(22), 3379–3381 (2001).
[CrossRef]

Sun, C. C.

Trottier, T. A.

O. B. Shchekin, J. E. Epler, T. A. Trottier, T. Margalith, D. A. Steigerwald, M. O. Holcomb, P. S. Martin, M. R. Krames, “High performance thin-film flip-chip InGaN–GaN light-emitting diodes,” Appl. Phys. Lett. 89(7), 071109(2006).
[CrossRef]

Tsai, J. M.

Wierer, J. J.

J. J. Wierer, D. A. Steigerwald, M. R. Krames, J. J. O’Shea, M. J. Ludowise, G. Christenson, Y.-C. Shen, C. Lowery, P. S. Martin, S. Subramanya, W. Götz, N. F. Gardner, R. S. Kern, S. A. Stockman, “High-power AlGaInN flip-chip light-emitting diodes,” Appl. Phys. Lett. 78(22), 3379–3381 (2001).
[CrossRef]

Zitová, B.

B. Zitová, J. Flusser, “Image registration methods: a survey,” Image Vis. Comput. 21(11), 977–1000 (2003).
[CrossRef]

Appl. Phys. Lett. (2)

J. J. Wierer, D. A. Steigerwald, M. R. Krames, J. J. O’Shea, M. J. Ludowise, G. Christenson, Y.-C. Shen, C. Lowery, P. S. Martin, S. Subramanya, W. Götz, N. F. Gardner, R. S. Kern, S. A. Stockman, “High-power AlGaInN flip-chip light-emitting diodes,” Appl. Phys. Lett. 78(22), 3379–3381 (2001).
[CrossRef]

O. B. Shchekin, J. E. Epler, T. A. Trottier, T. Margalith, D. A. Steigerwald, M. O. Holcomb, P. S. Martin, M. R. Krames, “High performance thin-film flip-chip InGaN–GaN light-emitting diodes,” Appl. Phys. Lett. 89(7), 071109(2006).
[CrossRef]

Commun. ACM (1)

R. O. Duda, P. E. Hart, “Use of the Hough transformation to detect lines and curves in pictures,” Commun. ACM 15(1), 11–15 (1972).
[CrossRef]

IEEE T Imag. Process. (1)

B. S. Reddy, B. N. Chatterji, “An FFT-based technique for translation, rotation, and scale-invariant image registration,” IEEE T Imag. Process. 5(8), 1266–1271 (1996).

IEEE T. Intell. Transp. (1)

C. N. E. Anagnostopoulos, I. E. Anagnostopoulos, I. D. Psoroulas, V. Loumos, E. Kayafas, “License plate recognition from still images and video sequences: A survey,” IEEE T. Intell. Transp. 9(9), 377–391 (2008).
[CrossRef]

IEEE Trans. Pattern Anal. Mach. Intell. (1)

J. Canny, “A computational approach to edge detection,” IEEE Trans. Pattern Anal. Mach. Intell. 8(6), 679–698 (1986).
[CrossRef] [PubMed]

Image Vis. Comput. (1)

B. Zitová, J. Flusser, “Image registration methods: a survey,” Image Vis. Comput. 21(11), 977–1000 (2003).
[CrossRef]

Int. J. Comput. Vis. (1)

D. G. Lowe, “Distinctive image features from scale-invariant keypoints,” Int. J. Comput. Vis. 60(2), 91–110 (2004).
[CrossRef]

J. Lightwave Technol. (1)

Nat. Photonics (1)

S. Pimputkar, J. S. Speck, S. P. DenBaars, S. Nakamura, “Prospects for LED lighting,” Nat. Photonics 3(4), 180–182 (2009).
[CrossRef]

Opt. Express (1)

Opt. Lasers Eng. (1)

B. Pan, K. Li, “A fast digital image correlation method for deformation measurement,” Opt. Lasers Eng. 49(7), 841–847 (2011).
[CrossRef]

Pattern Recognit. (1)

K. Jung, K. I. Kim, A. K. Jain, “Text information extraction in images and video: a survey,” Pattern Recognit. 37(5), 977–997 (2004).
[CrossRef]

Other (3)

C. C. Wang, W. R. Yang, J. J. Chen, W. W. Shih, I. J. Wang, T. Y. Guo, and K. L. Huang, Optical and Thermal Analysis for a Modified Flip-Chip Light Emitting Diode,” in Asia Optical Fiber Communication and Optoelectronic Exposition and Conference (Shanghai China,2008), SAK4.

ASM, AD838L, http://www.asmpacific.com/asmpt/products_diebond_ad838l.htm

ASM, AD211, http://www.asmpacific.com/asmpt/products_diebond_ad211.html



Figures (13)

Fig. 1. (a) Top surface of the LED chip; (b) bottom surface of the LED chip; (c) physical size of the LED chip.

Fig. 2. The matching flow of the substrate and the flip-chip LED.

Fig. 3. (a) Eight neighbors; (b) eight-connected domain.

Fig. 4. The flow chart of preprocessing.

Fig. 5. Color image of the input substrate.

Fig. 6. Gray images of (a) the substrate; (b) the LED chip.

Fig. 7. Filtered results of the gray images of (a) the substrate; (b) the LED chip.

Fig. 8. Binarization images of (a) the substrate; (b) the LED chip.

Fig. 9. Hough transformation of (a) the binary image of the substrate; (b) the binary image of the LED chip.

Fig. 10. First rotation correction of (a) the substrate and (b) the LED chip.

Fig. 11. External rectangle of the connected domain in the substrate image.

Fig. 12. Projection images of the binarization of Fig. 10(b): (a) horizontal projection; (b) vertical projection.

Fig. 13. The matching result of the substrate and the flip-chip LED.

Equations (3)

$$I(x,y)=\begin{cases}1, & \text{if } (x,y) \text{ is a white point},\\ 0, & \text{if } (x,y) \text{ is a black point}.\end{cases}$$

$$H(x)=\sum_{y=1}^{H} I(x,y), \qquad (0 \le x \le W).$$

$$V(y)=\sum_{x=1}^{W} I(x,y), \qquad (0 \le y \le H).$$
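As an illustration of the projection formulas above, the short sketch below computes H(x) (column sums) and V(y) (row sums) for a binary image stored as nested lists. This is a minimal example, not the authors' implementation, and the sample image data is invented:

```python
def projection_profiles(binary):
    """Horizontal and vertical projection profiles of a binary image.

    `binary` is a list of rows, indexed as binary[y][x] (H rows, W columns),
    with 1 marking a white point and 0 a black point, as in I(x, y) above.
    Returns (h, v): h[x] = H(x) sums column x over all y;
    v[y] = V(y) sums row y over all x.
    """
    height, width = len(binary), len(binary[0])
    h = [sum(binary[y][x] for y in range(height)) for x in range(width)]
    v = [sum(binary[y][x] for x in range(width)) for y in range(height)]
    return h, v

# A toy 3x4 binary image (hypothetical data, for illustration only).
img = [[1, 0, 0, 1],
       [1, 1, 0, 0],
       [0, 1, 0, 1]]
h, v = projection_profiles(img)
print(h)  # column sums H(x) -> [2, 2, 0, 2]
print(v)  # row sums V(y) -> [2, 2, 2]
```

Peaks in these profiles mark rows and columns dense in white pixels, which is what allows the anode of the flip-chip to be located from the projections.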
