In inter-satellite laser communication systems, accurate positioning of the beacon is essential for establishing a stable laser communication link. For inter-satellite optical communication, the main factor affecting the acquisition and tracking of the beacon is background noise, such as stellar background light. In this study, we considered the effect of background noise on the beacon in inter-satellite optical communication and proposed a new recognition algorithm for the beacon that uses the optical flow vectors obtained from the image data. We verified the feasibility of this method through simulation analysis and experiments. Both the simulations and experiments showed that the new algorithm can accurately locate the centroid of the beacon under the effect of background light. Furthermore, when the identification probability of a light spot against the background light is taken into account, the positioning accuracy of the new algorithm is higher than that of the conventional gray centroid algorithm. Therefore, this new approach would be beneficial for the design of satellite-to-ground optical communication systems.
© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
In recent years, satellite optical communication has become an active international research topic [1–4]. Pointing, acquisition, and tracking (PAT) technology is one of the key technologies in satellite optical communication. In inter-satellite laser communication, the beacon needs to be tracked accurately over time by the two communication terminals. In the tracking phase, stellar background light may enter the optical terminal and form light spots on the receiving surface of the charge-coupled device (CCD). These light spots make it difficult for the acquisition system to determine the accurate location of the beacon, which may cause interruptions in the laser link. Therefore, it is necessary to identify and track the beacon stably in the tracking phase. Numerous studies have been performed to suppress the interference of stellar background light [5–7]. A well-known approach for reducing the effect of stellar background light is to identify the beacon by querying the ephemeris. Another method matches the beacon frame by frame, so that it is identified according to the changes in its centroid coordinates. However, these methods are computationally complex, which makes them unsuitable for rapid acquisition in optical communication. In a previous study, a method was proposed to identify a beacon based on the characteristics of the noise spots; however, that research was mainly aimed at the acquisition stage of the beacon. On the hardware side, increasing the transmission power of the beacon or narrowing the laser beam can make the beacon easier to identify. However, these methods require a high transmission power and a very narrow beam divergence angle, which increases the difficulty of the system design. Another method is to add a narrow-band filter to the receiving system, but this requires strict temperature control, which increases the structural complexity of the terminal.
On the software side, some researchers have proposed suppressing the background light by increasing the CCD sampling frame rate, which can reduce the interference of stellar background light to some extent. However, during the tracking process, stellar background light may still enter the optical terminal and form noise spots on the CCD, which degrades the accuracy of the tracking system. In summary, reports in the existing literature on suppressing noise spots during the tracking process are few. Therefore, in this paper, an algorithm for identifying and tracking a beacon in satellite optical communication is proposed, which recognizes the beacon by its motion features. The remainder of this paper is organized as follows. The theoretical analysis of the algorithm is introduced in Section 2. The experiment established to verify the effectiveness of the algorithm is presented in Section 3. Section 4 summarizes our results.
2. Theoretical analysis of the algorithm
2.1 Extract the target spot
Let f(x,y) denote the gray value of the pixel in the xth row and yth column of a digital image. According to the threshold segmentation method, the image is processed as
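With T denoting the gray threshold, the standard binarizing form of the threshold segmentation rule is:

```latex
g(x,y) =
\begin{cases}
1, & f(x,y) \geq T \\
0, & f(x,y) < T
\end{cases}
```

where g(x,y) is the binarized image on which the connected domains are subsequently extracted.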
A two-dimensional Gaussian function is used in the simulation to generate the light spots, which is expressed as:
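Assuming a circularly symmetric spot, a two-dimensional Gaussian with peak amplitude A, center (x_0, y_0), and radius parameter σ can be written as:

```latex
f(x,y) = A \exp\!\left(-\frac{(x-x_0)^2 + (y-y_0)^2}{2\sigma^2}\right)
```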
By using Eq. (1), the spot image is processed via threshold segmentation, and each connected domain represents a spot. In this paper, the four-neighborhood method is used to obtain the connected domains. A schematic of the connected domains is shown in Fig. 1.
Here, we assume that the size of the collecting window of the CCD is 7 × 7 pixels, in which four connected domains (connected domains A, B, C, and D) can be acquired via the threshold segmentation method. Each circle represents a pixel of the CCD, and each connected domain corresponds to a light spot generated by the background light or the beacon. Each white circle denotes a pixel whose gray value is smaller than the threshold, whereas each cyan circle corresponds to a pixel whose gray value is larger than the threshold.
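As an illustration, four-neighborhood connected-domain labeling can be sketched as a simple flood fill over the binarized image (the 7 × 7 test image below, with two separated spots, is hypothetical):

```python
def label_four_neighborhood(binary):
    """Label connected domains of a binary image using 4-connectivity."""
    rows, cols = len(binary), len(binary[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and labels[r][c] == 0:
                current += 1
                stack = [(r, c)]  # flood fill from this seed pixel
                while stack:
                    i, j = stack.pop()
                    if 0 <= i < rows and 0 <= j < cols and binary[i][j] and labels[i][j] == 0:
                        labels[i][j] = current
                        # visit the four neighbors (up, down, left, right)
                        stack += [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return labels, current

# Two separate spots in a 7 x 7 collecting window
img = [[0] * 7 for _ in range(7)]
img[1][1] = img[1][2] = img[2][1] = 1  # spot A
img[4][5] = img[5][5] = 1              # spot B
labels, n = label_four_neighborhood(img)
print(n)  # 2 connected domains
```

Each returned label then identifies one candidate spot whose centroid can be computed separately.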
Once all the spots are acquired, their centroid coordinates are computed separately by the gray centroid coordinate method, which is given by:
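In standard form, for a connected domain D the gray centroid coordinates are the gray-weighted averages of the pixel coordinates:

```latex
x_c = \frac{\sum_{(x,y)\in D} x\, f(x,y)}{\sum_{(x,y)\in D} f(x,y)}, \qquad
y_c = \frac{\sum_{(x,y)\in D} y\, f(x,y)}{\sum_{(x,y)\in D} f(x,y)}
```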
2.2 Track the beacon by the optical flow method
The previous section describes the procedure for acquiring a spot. After the acquisition stage, the beacon needs to be tracked over time. To improve the real-time capability of the system, windows are used to track the beacon. The purpose of setting the windows is to narrow the spot search area and improve the anti-interference ability of the algorithm. However, one of the difficulties in window tracking is determining an appropriate window size. To solve this problem, this study first sets a large window for each spot in the image based on a priori information and tracks the spot for numerous cycles. Subsequently, the moving distance of each spot between adjacent frames and the size of the spot are recorded; we denote the maximum inter-frame moving distances in the vertical and horizontal directions by xb and yb, respectively. To keep the spot within the window, xb pixels are extended upward and downward from the target spot, whereas yb pixels are extended to the left and right. The flow chart of the window setting is shown in Fig. 2.
The range of the window is:
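A plausible form, assuming the spot occupies rows [x_min, x_max] and columns [y_min, y_max] in the current frame, is:

```latex
W = \left\{ (x,y) \;\middle|\;
x_{\min} - x_b \le x \le x_{\max} + x_b,\;
y_{\min} - y_b \le y \le y_{\max} + y_b \right\}
```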
Subsequently, the features of the target spot are extracted from the kth frame image, and based on these features, the spot is tracked in the (k + 1)th frame. To rapidly track the target (beacon spot), the optical flow method is used in this study. The optical flow field is the projection of the velocity field of the object onto the image, which represents the instantaneous displacement of the object in the image. A large amount of information can be obtained from the optical flow field, including the movement and structure information of the object. Assuming that the gray value at point (x,y) in the image at time t is denoted by I(x,y,t), from the gray-value constancy hypothesis, we can obtain:
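In standard form, the gray-value constancy hypothesis states that the intensity of an image point is conserved over a small displacement (dx, dy) during time dt:

```latex
I(x + \mathrm{d}x,\; y + \mathrm{d}y,\; t + \mathrm{d}t) = I(x, y, t)
```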
By applying the chain rule of differentiation, we obtain:
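The resulting standard optical flow constraint equation, with u = dx/dt and v = dy/dt denoting the two components of the optical flow, is:

```latex
\frac{\partial I}{\partial x}\, u + \frac{\partial I}{\partial y}\, v + \frac{\partial I}{\partial t} = 0
```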
Satellite optical communication exploits the energy concentration in the central part of the Gaussian spot: pixels with high energy are used as the tracking feature block. This approach not only significantly reduces the amount of computation but also enhances the stability of the tracking. Therefore, this study uses the block-matching optical flow method to track the target spot. The principle of this method is introduced below.
The block-matching algorithm divides the kth and (k + 1)th frame images into many small blocks and assumes that the optical flow of all the pixels within each small block of the moving object is consistent. Based on a similarity criterion, the similarity between the small blocks of the kth and (k + 1)th images is calculated, and the displacement that optimizes this criterion is taken as the optical flow of each pixel in the small block. The sum of absolute differences (SAD) is used as the similarity criterion, which is given by:
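In standard form, the SAD between a block B in frame k and the candidate block displaced by (dx, dy) in frame k + 1 is SAD(dx, dy) = Σ_{(x,y)∈B} |I_k(x,y) − I_{k+1}(x+dx, y+dy)|, and the displacement minimizing it is taken as the block's optical flow. A minimal Python sketch of this block matching (the frame data and search radius below are hypothetical):

```python
import numpy as np

def sad_block_match(prev, curr, top, left, size, radius):
    """Return the (row, col) displacement of a size x size block that
    minimizes the sum of absolute differences (SAD) between frames."""
    block = prev[top:top + size, left:left + size].astype(np.int64)
    best, best_sad = (0, 0), None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            r, c = top + dy, left + dx
            if r < 0 or c < 0 or r + size > curr.shape[0] or c + size > curr.shape[1]:
                continue  # candidate block would fall outside the image
            cand = curr[r:r + size, c:c + size].astype(np.int64)
            sad = np.abs(block - cand).sum()  # sum of absolute differences
            if best_sad is None or sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best  # (row shift, column shift) = optical flow of the block

# A bright 3 x 3 block shifted by (1, 2) between two synthetic frames
prev = np.zeros((12, 12), dtype=np.uint8)
prev[4:7, 4:7] = 200
curr = np.zeros((12, 12), dtype=np.uint8)
curr[5:8, 6:9] = 200
print(sad_block_match(prev, curr, 4, 4, 3, 3))  # (1, 2)
```

The cast to `int64` before subtraction avoids unsigned-integer wraparound when pixel differences are negative.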
In the following, we discuss a simulation performed to verify the tracking effect of the new algorithm. The simulation process is as follows: the software generates a sequence of images to simulate the images acquired by the CCD. The simulation parameters are as follows: the image size is 240 × 240 pixels, and the frame rate is 50 Hz. The first collected frame is shown in Fig. 3.
The target spot is extracted from Fig. 3, a 3 × 3 window is set to extract the feature block from the spot, and the optical flow method is used to track the target spot. Figure 4 presents the tracking result for the third frame. The white dotted boxes indicate the windows, and the blue lines denote the optical flow vectors. In the 11th frame, a star background spot enters the CCD; the tracking result at this time is shown in Fig. 5. In Fig. 5, the beacon can be recognized based on the optical flow, and the centroid coordinates of the beacon can be calculated by Eqs. (3) and (4).
When the beacon and a background spot co-exist in the same image, they may appear similar at certain instants. However, the gray distribution and gradient differ between the beacon and the star background spot, which makes it possible to use the optical flow method to track the beacon. To test this possibility, a two-spot tracking simulation analysis is performed, as described below.
First, the two spots are extracted, and a 3 × 3 window is set to acquire the center of each spot as a feature block. Second, suitable windows are set, and the spots are tracked by the optical flow method. The tracking results are displayed in Figs. 6 and 7.
The simulation results show that the spots can be tracked stably by the optical flow method. However, during the tracking process, the beacon spot and star spot may partially overlap. The following part mainly discusses this situation. The simulation parameters are the same as above. In the 31st frame, the two spots still do not coincide. However, in the 32nd frame, the number of connected domains abruptly reduces to one, and the size of the connected domain increases significantly. The collected image of the 32nd frame is shown in Fig. 8, and the corresponding connected domain is exhibited in Fig. 9. The area of the connected region is obviously larger than that of the No. 1 spot or the No. 2 spot alone, but it is less than the sum of the areas of the two spots, which indicates that the No. 1 and No. 2 spots partially overlap at this instant.
The gray values in the overlapping area change compared with the original spots, making it difficult to calculate the centroid of the No. 1 or No. 2 spot using the grayscale centroid method. In this case, the following method is used to calculate the centroid of the beacon spot. We assume that the spots have not yet partially overlapped in the (k−1)th frame. In this frame, the central area of the light spot is extracted as a feature block. Furthermore, the pixels with a large gradient modulus near the edge of the spot are also extracted and then extended by one pixel in the up, down, left, and right directions, forming new feature blocks, which are called auxiliary feature blocks. If the spots partially overlap in the kth frame, the feature block and auxiliary feature blocks are tracked by the optical flow method, and the consistent optical flow vectors are averaged to estimate the movement of the spot. The formula is given as:
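A plausible form of this averaging, assuming n feature and auxiliary feature blocks track successfully with consistent flow vectors (u_i, v_i), is the mean flow:

```latex
\bar{u} = \frac{1}{n}\sum_{i=1}^{n} u_i, \qquad
\bar{v} = \frac{1}{n}\sum_{i=1}^{n} v_i
```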
We extract the feature blocks in the 31st frame and use the optical flow method to track them; the corresponding tracking result is presented in Fig. 10.
When a larger portion of the spots overlaps, a few feature blocks will fail to track, in which case the remaining feature blocks are tracked instead. If most of the spot area coincides, all the feature blocks will fail to track. In this case, the centroid of the light spot needs to be predicted based on the positions of the light spot collected in the previous period. The process flow of the new algorithm is summarized in Fig. 11.
Compared with the gray centroid coordinate method, the computation of this new algorithm is more complex, although it is more effective in reducing the effect of background light. To realize this approach for recognizing and tracking the beacon in practical applications, we use a parallel image processing system, such as a CPLD (complex programmable logic device), to reduce the computational burden. Such a system is also suitable for in-orbit use.
3. Experimental verification
This section presents the experiment established to verify the effectiveness of the new algorithm. In an actual inter-satellite laser link, one, two, or three star spots may appear in the field of view of the CCD. However, previous studies have shown that the probability of a single spot appearing in the CCD field is the highest. Therefore, in this study, the experiments mainly investigate the case of a single star spot.
The experiment configuration is shown in Fig. 12, and the main parameters of the equipment are listed in Table 1. In addition, Fig. 13 displays the photograph of the image processing system belonging to the control system.
The core of the image processing system is a CPLD (complex programmable logic device, model number EPM7256), which is an embedded system for realizing real-time image algorithms by parallel processing. The counter frequency of the EPM7256 is up to 172.4 MHz, and it provides 5000 usable gates. An IEEE 1394 interface is used for communication between the image processing system and the CCD.
As shown in Fig. 12, a laser beam is emitted by a laser source, and laser beam 1 is reflected into the CCD by fast steering mirror 1 (FSM1), which is used to simulate the beacon light. The vibration signal generated by computer 1 is used to control FSM driver 1 to simulate the vibration of the satellite platform. Laser beam 2 is reflected into the CCD by FSM2 to simulate the star background light. The spot images collected by the CCD are delivered to computer 1 to identify and track the beacon. Owing to the strong intensity of the He–Ne laser source, attenuation plates are used to reduce the laser energy. In the experiment, sine signals are used to simulate the motion of the beacon and the star background light. The amplitude of the No. 1 spot in the X and Y directions is 50 μrad, and its vibration frequency is 5 Hz; concurrently, the amplitude of the No. 2 spot in the X and Y directions is set to 30 μrad, and its vibration frequency is 5 Hz. The image frame rate is 100 Hz.
Figure 14(a) displays the tracking result for the 15th frame, where the centroid coordinates of the No. 1 spot are (75.28, 103.48) and those of the No. 2 spot are (126.60, 75.37). Figure 14(b) shows the tracking result for the 71st frame, where the centroid coordinates of the No. 1 spot are (71.34, 106.26) and those of the No. 2 spot are (132.59, 69.38). The No. 1 spot simulates the beacon, whereas the No. 2 spot simulates the star background spot. It can be seen that the algorithm proposed in this paper can identify and track the spots efficiently. For the case of two or three star background spots in the CCD, the analysis method is similar.
To compare with the traditional method for reducing the effect of the star background light, the approach using the reference star calendar table was also employed to track the beacon under the same simulation conditions; its procedure is presented in a previous study. This method first saves an initial star chart, which is taken as a reference. The initial star chart is compared with each subsequently collected image, and the differing region contains the information of the beacon, so that the beacon can be tracked.
The comparative analysis results of the two methods are presented in Table 2. Approaches that reduce the effect of spot noise and decrease the amount of image data to be processed are very important for reducing the recognition time. In this respect, the optical flow tracking method is superior to the traditional method using the reference star calendar table.
To reduce the interference of the star background light, this paper proposes an effective method for identifying and tracking a beacon based on the optical flow method in the presence of a complex background light environment. Theory and experiment show that this method reduces the amount of computation and increases the processing speed of identifying and tracking the spots compared with the traditional method. This research work will be beneficial for maintaining the stability of the laser link in inter-satellite optical communication.
National Natural Science Foundation of China (NSFC) (11404082, 61503096, 11504068); Fundamental Research Funds for the Central Universities of Harbin Institute of Technology (AUGA5710058015).
The authors are grateful to the Free Space Optical Communication Technology Research Center of the Harbin Institute of Technology.
The authors declare that there are no conflicts of interest related to this article.
1. T. Jono, Y. Takayama, K. Shiratama, I. Mase, B. Demelenne, Z. Sodnik, A. Bird, M. Toyoshima, H. Kunimori, D. Giggenbach, N. Perlot, M. Knapek, and K. Arai, “Overview of the inter-orbit and the orbit-to-ground laser communication demonstration by OICETS,” Proc. SPIE 6457, 645702 (2007). [CrossRef]
2. D. M. Boroson and B. S. Robinson, “The lunar laser communication demonstration: NASA’s first step toward very high data rate support of science and exploration missions,” Space Sci. Rev. 185(1), 115–128 (2014). [CrossRef]
3. S. Yu, Z. Ma, J. Ma, F. Wu, and L. Tan, “Far-field correlation of bidirectional tracking beams due to wave-front deformation in inter-satellites optical communication links,” Opt. Express 23(6), 7263–7272 (2015). [CrossRef] [PubMed]
5. X. Chen, Y. Zheng, and Y. Wang, “Influence of spot noise in inter-satellite optical communications and suppression algorithm,” Chin. J. Lasers 37(03), 743–747 (2010). [CrossRef]
6. Q. Xu, J. Yu, and Y. Zhou, “Decreasing earth background radiation in satellite-ground communication,” Infrared Laser Eng. 43(07), 2300–2306 (2014).
7. Q. Han, J. Ma, and L. Tan, “Analysis of stellar background noise and study of restraining methods in satellite optical communication,” Opt. Technol. 31(03), 330–334 (2005).
9. B. K. P. Horn and B. G. Schunck, “Determining optical flow,” Artif. Intell. 17(1), 185–203 (1981). [CrossRef]
10. G. R. Bradski and A. Kaehler, Learning OpenCV (Tsinghua University, 2009), 356–371.
11. Q. Han, J. Ma, and L. Tan, “Influences of stellar background noise on tracking and pointing subsystem of inter-satellite optical communications,” Opt. Technol. 32(03), 444–448 (2006).
12. J. W. Alexander, S. Lee, and C. Chen, “Pointing and tracking concepts for deep-space missions,” Proc. SPIE 3615, 230–249 (1999). [CrossRef]