
Monocular vision aided optical tracking for underwater optical wireless communications

Open Access

Abstract

Underwater optical wireless communication (UOWC) has received increasing attention due to its distinctive characteristics such as high bandwidth and low latency. However, the UOWC link is usually vulnerable to water flow, underwater turbulence and the terminals’ own vibration. Thus, to maintain a stable transmission link, an accurate tracking method is essential for UOWC systems. In this paper, a monocular vision aided optical tracking method is proposed, where the deviation degree of the laser spot is employed to realize autonomous real-time alignment of the receiver with the transmitter. Experimental results verify that with the proposed tracking method, the bit error ratio can be kept below the forward error correction (FEC) limit of 3.8 × 10−3 even under strong disturbance in a practical UOWC system.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Underwater optical wireless communication (UOWC) has become an emerging technology for underwater communication [1]. Traditionally, the performance of underwater acoustic communication is limited by its low bandwidth and long latency. Besides, radio waves only propagate over a short distance in water [2], so their application to underwater communication is restricted. Compared with acoustic and radio frequency communication, UOWC can support high-bandwidth and low-latency communication over a moderate transmission distance, and is suitable for underwater localization, navigation and communication between unmanned underwater vehicles (UUVs), such as remotely operated vehicles (ROVs) and autonomous underwater vehicles (AUVs).

In UOWC systems, a light-emitting diode (LED) or a laser diode (LD) is used as the transmitter. For an LED based UOWC system, the optical signal is usually transmitted over a wide-angle field and detected by photodiodes with a wide field-of-view, resulting in a limited communication range and poor rejection of background light [3]. On the contrary, the laser beam emitted by an LD is collimated and energy-intensive, which can be utilized in UOWC to overcome the aforementioned deficiencies.

In an LD based UOWC system, the transmission distance is restricted by complex underwater conditions [4]. Due to the interactions of photons with water particulates, absorption and scattering restrict the transmission distance to hundreds of meters [5]. Besides, variations of salinity and pressure can cause optical turbulence, which changes the spot shape and degrades the communication performance [6,7]. To maintain a stable transmission link, localization techniques are required to realize the terminals’ pointing or tracking. Since the laser provides a collimated light beam, the tracking accuracy between the transmitter and receiver significantly affects the communication performance. With the aim of improving localization accuracy and efficiency, several studies have investigated underwater localization methods. There are three typical underwater localization technologies: acoustic localization, radio wave based localization and optical localization. To decrease the localization error of acoustic signals, movement prediction modeling [8] and a special sonar receiver [9] have been proposed. However, these methods are usually complicated and lack adaptability to the dynamic underwater environment. As for radio frequency localization, a large number of radio nodes are employed in underwater wireless sensor networks due to the short communication range, so the system performance largely depends on the node topology and network architecture [10]. Moreover, previous work on optical localization technology includes environmental modeling [11] and device design (e.g., laser reflective modules and solar-panel photodetectors) [12–14], but these are costly and difficult to implement because of the strict requirements on optical device layout and the underwater environment. Besides, other work utilizes external markers to improve the localization accuracy, which are difficult to deploy in water [15].

Most unmanned underwater vehicles already carry their own economical and accessible camera. Since the spot generated by the LD can be captured even after a long transmission distance, this optical beacon can be utilized for underwater localization. Thus, unlike previous work, a lightweight accessible camera is utilized for underwater optical tracking in this paper. A monocular vision aided optical tracking method is further proposed, where the deviation degree of the laser spot is utilized to realize the autonomous real-time alignment of the receiver with the transmitter. Compared with traditional methods that require high-cost and complicated equipment or external references, the proposed method is practical and easy to implement since it only uses the lightweight camera attached to the mobile terminal. As a result, the additional cost of tracking in our UOWC system is reduced.

The outline of the paper is as follows: Section 2 reviews the related work on underwater localization. In Section 3, the monocular vision aided optical tracking method is described. Section 4 presents the experimental setup. In Section 5, the performance of the optical tracking method is verified. Finally, a discussion of this work and an outline of the follow-up work are given in Section 6.

2. Related work

Acoustic localization technology is the most widely used localization method for underwater applications. By comparing the relative phase and the time of arrival of the acoustic signals, the movement between underwater terminals and the environment can be obtained for node localization [16]. Nevertheless, the drawbacks of acoustic techniques (e.g., poor accuracy, limited bandwidth and weak robustness in time synchronization) restrict their extensive use for accurate and highly dynamic tracking [17]. In [8], an environment model is further proposed to assist mobile terminal localization, where a multi-step localization method based on the environment model is used to design a movement prediction algorithm. However, the model is established based on the movement law of a specific water flow and is not suitable for other water types.

On the other hand, since high attenuation is the most notable deficiency of underwater radio signals, the traditional radio frequency localization method based on the received signal strength indicator has a large localization error in water [18]. Besides, underwater radio frequency localization accuracy is challenged by the dynamic channel. To overcome this challenge, a magnetic induction (MI) based method has been proposed as an effective alternative [19]. However, the transmission distance and efficiency of MI communication are determined by the specific system layout, which is significantly affected by the dynamics and uncertainty of the underwater environment.

Note that in a UOWC system, the available beacon for localization is the visible light emitted by the transmitter. Benefiting from this collimated laser beam, precise underwater localization can be realized. Up to now, research on LD based localization has mainly focused on decreasing the localization error at the receiver while ignoring the external disturbance imposed on the terminal [11,20]. To maintain the optical link during the terminal’s movement, a compact, low-power and low-cost acquisition, pointing and tracking (APT) system has been proposed [14]. In the APT system, a camera fixed on the transmitter is utilized to identify and track a special mark attached to the receiver. However, a certain range of illumination generated by an additional light source must be provided to ensure the system performance.

3. Monocular vision aided optical tracking method

In an LD based UOWC system, a clear spot can be captured even after travelling a long distance. However, as shown in Fig. 1, the spot shape continuously changes with the variation of the receiver’s position. When the receiver is aligned with the transmitter, the spot is approximately circular. On the contrary, when the receiver deviates from the center of the beam, the spot is elongated.

Fig. 1. (a) Schematic representation of the detected spot at different positions. The illustrations of the terminal at different positions in (b)-(d) correspond with the three states in (a).

Motivated by this observation, a monocular vision aided optical tracking method is proposed in this paper. Specifically, the spot area is first extracted from the picture captured by the camera. Since a circle of halo around the spot could be identified as part of the spot, the Gaussian mixture model (GMM) is adopted to avoid spot extraction errors [21]. Then, the probability distribution of a sample point x in the picture can be expressed as

$$P(x) = \sum\limits_{j = 1}^K \pi_j N(x|\mu_j, \Sigma_j)$$
where K is the number of clusters in the region of interest (ROI), $\pi_j$ is the weight of the j-th cluster, and $N(x|\mu_j, \Sigma_j)$ is the Gaussian density of the sample point with mean $\mu_j$ and covariance $\Sigma_j$, which are expressed as
$$\left\{ \begin{array}{l} \pi_j = \dfrac{\sum\limits_{i = 1}^M \gamma(i,j)}{M}\\[2mm] \mu_j = \dfrac{\sum\limits_{i = 1}^M \gamma(i,j)\, x_i}{\sum\limits_{i = 1}^M \gamma(i,j)}\\[2mm] \Sigma_j = \dfrac{\sum\limits_{i = 1}^M \gamma(i,j)\, (x_i - \mu_j)^2}{\sum\limits_{i = 1}^M \gamma(i,j)} \end{array} \right.$$
where M is the number of sample points.

Here, $\gamma (i,j)$ is the probability of the i-th sample generated by the j-th cluster, which is further expressed as

$$\gamma(i,j) = \frac{\pi_j N(x_i|\mu_j, \Sigma_j)}{\sum\limits_{k = 1}^K \pi_k N(x_i|\mu_k, \Sigma_k)}$$

Since directly calculating P(x) usually suffers from floating-point underflow, the logarithmic form $P'(x)$ is used, which is expressed as

$$P'(x) = \sum\limits_{i = 1}^M \log \left\{ \sum\limits_{j = 1}^K \pi_j N(x_i|\mu_j, \Sigma_j) \right\}$$

To estimate the GMM, the expectation-maximization (EM) algorithm is applied due to its distinctive properties, such as automatic satisfaction of probabilistic constraints, monotonic convergence and low computational overhead [22,23]. Based on $P'(x)$, the EM algorithm is adopted to extract the spot [24], as shown in Algorithm 1. The probability of each point in the ROI being generated by every cluster is estimated based on (3). The parameters of each cluster are then updated according to (2). This process is repeated until (4) converges. Accordingly, the precise spot is the one of the K clusters that contains the centroid sample of the connected region of the ROI.

Algorithm 1. EM based parameter estimation
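For concreteness, a minimal NumPy sketch of the EM iteration described above might look as follows; it assumes a grayscale ROI whose pixel intensities serve as the samples x_i. The number of clusters K, the iteration cap, and the brightest-mean selection rule (a simplified stand-in for the centroid-sample rule above) are illustrative choices, not values taken from the paper.

```python
import numpy as np

def em_gmm_1d(x, K=3, n_iter=50, tol=1e-6, seed=0):
    """Plain EM for a K-component 1-D GMM over samples x (Eqs. (1)-(4))."""
    rng = np.random.default_rng(seed)
    M = x.size
    pi = np.full(K, 1.0 / K)                       # mixing weights pi_j
    mu = rng.choice(x, K)                          # initial means mu_j
    var = np.full(K, x.var() + 1e-6)               # initial variances Sigma_j
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities gamma(i, j) of Eq. (3), in the log domain
        # to avoid the floating-point underflow mentioned for Eq. (4)
        log_pdf = -0.5 * np.log(2 * np.pi * var) - 0.5 * (x[:, None] - mu) ** 2 / var
        log_w = np.log(pi) + log_pdf
        log_norm = np.logaddexp.reduce(log_w, axis=1, keepdims=True)
        gamma = np.exp(log_w - log_norm)
        # M-step: update pi_j, mu_j, Sigma_j according to Eq. (2)
        Nj = gamma.sum(axis=0) + 1e-12
        pi = Nj / M
        mu = gamma.T @ x / Nj
        var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / Nj + 1e-9
        # convergence check on the log-likelihood P'(x) of Eq. (4)
        ll = log_norm.sum()
        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll
    return pi, mu, var, gamma

def extract_spot_mask(roi, K=3):
    """Label each ROI pixel by its most likely cluster and keep the cluster
    with the highest mean intensity as the spot (illustrative rule)."""
    x = roi.astype(float).ravel()
    _, mu, _, gamma = em_gmm_1d(x, K)
    labels = gamma.argmax(axis=1).reshape(roi.shape)
    return labels == mu.argmax()
```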

Since the detected spot presents different shapes at different positions of the receiver, we categorize the extracted spot according to the ratio of the spot area S1 to the area of its minimum circumscribed circle S2 (i.e., S1/S2) in the ROI. When the ratio is above a threshold η, the spot shape is considered circular; otherwise it is considered elliptical, as illustrated in Fig. 2. For each spot shape, the deviation r of the receiver is calculated to decide whether the receiver is aligned with the transmitter and is further used for tracking, which is detailed in the following. A possible implementation of this shape test is sketched below.
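The snippet computes the ratio S1/S2 with OpenCV from a binary spot mask; the threshold value eta = 0.8 is a hypothetical placeholder, since the paper does not report its choice of η.

```python
import cv2
import numpy as np

def classify_spot(mask, eta=0.8):
    """Classify the extracted spot as circular or elliptical via the ratio
    of the spot area S1 to the area S2 of its minimum circumscribed circle.
    `eta` is a hypothetical threshold, not a value from the paper."""
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cnt = max(contours, key=cv2.contourArea)       # largest connected region
    s1 = cv2.contourArea(cnt)                      # spot area S1
    (_, _), radius = cv2.minEnclosingCircle(cnt)   # minimum circumscribed circle
    s2 = np.pi * radius ** 2                       # its area S2
    return "circle" if s1 / s2 > eta else "ellipse"
```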

Fig. 2. Illustration of spot shape for different receiver’s positions.

3.1 Circular spot

When the receiver is aligned with the transmitter, the detected spot is approximately circular, and the spot with deviation r = 0 is regarded as the reference spot. To calculate the centroid ccur of the circular spot detected by the receiver at other positions, the least squares method is used to fit the circle. The general equation of the circle is expressed as

$$f(x,y) = {x^2} + {y^2} + Ax + By + C = 0$$

The coefficients A, B and C are calculated by

$$\left[ \begin{array}{c} A\\ B\\ C \end{array} \right] = - \left[ \begin{array}{ccc} \sum x_i^2 & \sum x_i y_i & \sum x_i\\ \sum x_i y_i & \sum y_i^2 & \sum y_i\\ \sum x_i & \sum y_i & R \end{array} \right]^{-1} \left[ \begin{array}{c} \sum (x_i^3 + x_i y_i^2)\\ \sum (x_i^2 y_i + y_i^3)\\ \sum (x_i^2 + y_i^2) \end{array} \right]$$
where (xi, yi) is a sample edge point extracted from the real-time spot and R is the number of spot edge points. Then, the centroid of the fitted circle ccur = (−A/2, −B/2) is regarded as the beam center. As a result, for the circular spot, the receiver’s deviation is expressed as
$$r = {c_{ref}} - {c_{cur}}$$
where cref is the centroid of the reference spot when the receiver is perfectly aligned with the transmitted laser beam with deviation r = 0.
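A direct NumPy transcription of the least squares fit in Eq. (6) might read as follows; the edge points are assumed to come from any edge detector applied to the extracted spot.

```python
import numpy as np

def fit_circle_centroid(edge_pts):
    """Least-squares circle fit of Eq. (6): solve for A, B, C in
    x^2 + y^2 + Ax + By + C = 0 and return the center (-A/2, -B/2)."""
    x = edge_pts[:, 0].astype(float)
    y = edge_pts[:, 1].astype(float)
    R = len(x)                                     # number of edge points
    G = np.array([[np.sum(x * x), np.sum(x * y), np.sum(x)],
                  [np.sum(x * y), np.sum(y * y), np.sum(y)],
                  [np.sum(x),     np.sum(y),     R]])
    h = np.array([np.sum(x ** 3 + x * y ** 2),
                  np.sum(x ** 2 * y + y ** 3),
                  np.sum(x ** 2 + y ** 2)])
    A, B, C = -np.linalg.solve(G, h)               # note the leading minus in Eq. (6)
    return np.array([-A / 2.0, -B / 2.0])          # fitted beam center c_cur

# Deviation of Eq. (7), with c_ref stored once at perfect alignment (r = 0):
#   r = c_ref - fit_circle_centroid(edge_pts)
```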

Note that since the terminals can still achieve acceptable communication performance (i.e., the bit error ratio (BER) is below the forward error correction (FEC) limit [25]) within a small area where the extracted spot is approximately circular, this area can be regarded as the alignment area, as shown in Fig. 1. Therefore, when r is less than the radius r0 of the practical alignment area, the receiver is considered to be aligned with the transmitter. Otherwise, the transmitter and receiver are considered to be misaligned.

3.2 Elliptical spot

When the camera is not on the optical axis of the transmitted beam, the beam center and the center of the spot do not coincide. When the distance between these two centers becomes large, the detected spot becomes elliptical. In this case, the beam direction, defined as the direction from the beam center to the elliptical spot center, is exploited for tracking to correct the deviation.

As shown in Fig. 3, taking the center of the ellipse as the origin Oe, the major axis of the ellipse can be divided into two parts. Several adjacent points with gray values above 0 along each side of the major axis are sampled to calculate the average gray value, which is expressed as

$$s = \frac{1}{{{L_0}}}\sum\limits_{i = 1}^{{L_0}} {f({x_i},{y_i})}$$
where f(xi, yi) is the gray value of the sample point (xi, yi) and L0 is the number of sample points on each side. Then, two values s1 and s2 are calculated for the two sides of the major axis, and the beam center lies on the side with the larger gray value (e.g., the s1 side if s1 > s2). Accordingly, the deviation r is defined as
$$r = {( - 1)^w}\lambda \frac{a}{b}$$
where a and b are the lengths of the major and minor axes of the ellipse, λ is the compensation coefficient for the elliptical spot, and w is the sign indicator of the beam direction projected on the pixel coordinate system Ouv; w takes the value 0 or 1, which is determined by the values of s1 and s2.
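A possible OpenCV sketch of this elliptical-spot rule is given below; cv2.fitEllipse supplies the axis lengths a and b, while the values of L0 and the compensation coefficient λ, and the assumed axis-angle convention, are illustrative and may need adjustment for a given setup.

```python
import cv2
import numpy as np

def ellipse_deviation(gray, cnt, L0=20, lam=1.0):
    """Deviation of Eq. (9) for an elliptical spot: sample L0 points along each
    half of the major axis, compare the mean gray values s1, s2 (Eq. (8)), and
    sign the deviation toward the brighter side. L0 and lam are illustrative."""
    (cx, cy), (d1, d2), angle = cv2.fitEllipse(cnt)
    a, b = max(d1, d2), min(d1, d2)                # major axis a, minor axis b
    # Assumed convention: the major axis lies at (angle + 90 deg) from the
    # image x-axis; verify against your OpenCV version's fitEllipse output.
    theta = np.deg2rad(angle + 90.0)
    d = np.array([np.cos(theta), np.sin(theta)])
    means = []
    for sign in (+1, -1):                          # the two halves of the major axis
        t = sign * np.linspace(1.0, a / 2.0, L0)
        pts = np.rint(np.array([cx, cy]) + np.outer(t, d)).astype(int)
        rows = pts[:, 1].clip(0, gray.shape[0] - 1)
        cols = pts[:, 0].clip(0, gray.shape[1] - 1)
        vals = gray[rows, cols].astype(float)
        # Eq. (8): average of samples with gray value above 0
        means.append(vals[vals > 0].mean() if (vals > 0).any() else 0.0)
    s1, s2 = means
    w = 0 if s1 > s2 else 1                        # w in {0, 1} from s1 vs s2
    return (-1) ** w * lam * (a / b)               # Eq. (9)
```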

Fig. 3. Simulated beam spot.

Lastly, based on (7) and (9), the deviation r for either spot shape is input to the receiver’s PID controller to obtain a force vector ${\tau _r}$, which is expressed as [26]

$${\tau _r} = {g_r}(r) - {J^T}(r)\left[ {K_p}r + {K_d}\dot{r} + {K_i}\int r \, dt \right]$$
where ${g_r}(r)$ is the vector of hydrostatic forces, $J(r)$ is the transformation matrix from the vehicle body frame to the world reference frame, and Kp, Kd and Ki are the proportional, derivative and integral gains of the PID controller. With the force vector τr, the controller continuously adjusts the receiver’s position and decreases its deviation, thereby ensuring the alignment of the receiver with the transmitter.
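A minimal discrete-time sketch of the controller in Eq. (10) is shown below; the hydrostatic term g_r(r) and the transform J(r) must come from the vehicle model (zero and identity placeholders are used here), the gains are illustrative, and the 0.2 s sample time mirrors the spot-detection period reported in Section 5.

```python
import numpy as np

class SpotPID:
    """Discrete-time version of the controller in Eq. (10). Gains and the
    g_r/J placeholders are illustrative, not values from the paper."""
    def __init__(self, kp=1.0, kd=0.1, ki=0.01, dt=0.2):
        self.kp, self.kd, self.ki, self.dt = kp, kd, ki, dt
        self.integral = 0.0
        self.prev_r = None

    def force(self, r, g_r=None, J=None):
        r = np.atleast_1d(np.asarray(r, dtype=float))
        g_r = np.zeros_like(r) if g_r is None else g_r   # hydrostatic forces g_r(r)
        J = np.eye(r.size) if J is None else J           # body-to-world transform J(r)
        self.integral += r * self.dt                     # Ki * integral of r dt
        r_dot = (np.zeros_like(r) if self.prev_r is None
                 else (r - self.prev_r) / self.dt)       # finite-difference r-dot
        self.prev_r = r
        pid = self.kp * r + self.kd * r_dot + self.ki * self.integral
        return g_r - J.T @ pid                           # tau_r of Eq. (10)
```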

Based on the above analysis, the optical tracking method is shown in Algorithm 2.

Algorithm 2. Optical tracking method
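Putting the pieces together, a hypothetical per-frame tracking step in the spirit of Algorithm 2 could be structured as follows, reusing the sketches above (extract_spot_mask, classify_spot, fit_circle_centroid, ellipse_deviation, SpotPID); c_ref and the pixel-domain alignment radius r0_px are calibration inputs of our own invention.

```python
import cv2
import numpy as np

def track_step(frame, c_ref, pid, r0_px=50.0):
    """One tracking iteration: extract spot, classify shape, compute the
    deviation via Eq. (7) or Eq. (9), and return the control force."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mask = extract_spot_mask(gray)                 # Algorithm 1 (GMM + EM)
    cnt = max(cv2.findContours(mask.astype(np.uint8), cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)[0], key=cv2.contourArea)
    if classify_spot(mask) == "circle":
        r = c_ref - fit_circle_centroid(cnt.reshape(-1, 2))   # Eq. (7)
        aligned = np.linalg.norm(r) < r0_px        # inside the alignment area?
    else:
        r = ellipse_deviation(gray, cnt)           # Eq. (9)
        aligned = False
    tau = pid.force(r)                             # Eq. (10): command to the ROV
    return tau, aligned
```

In practice, the returned force vector would be handed to the ROV’s motion controller, which adjusts the pose far more slowly than the 0.2 s spot-detection rate.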

4. Experimental setup

Figure 4 shows the experimental setup of the proposed UOWC system used to verify the proposed tracking method. The system is composed of two ROVs (BlueROV Heavy [27]), equipped with a UOWC transmitter and a UOWC receiver respectively. The tests are conducted in an experimental pool with dimensions of about 18 m × 5 m × 5 m. The ROVs are deployed about 16 m apart at a depth of 2.5 m. For simplicity, the ROV with the transmitter is fixed in the water. The spot emitted by the transmitter is captured by the other ROV, which is equipped with a receiver and a camera. In our experiments, the LD Thorlabs L520G1 is used as the transmitter, with a wavelength of 520 nm and an emission power of 1 W. The output of the receiver is monitored by a digital storage oscilloscope (DSO) and processed online through a laptop on the ground. Besides, in our tests, the deviation calculated by the optical tracking method is referred to as the measured value dm, while the actual deviation in the physical world is referred to as the practical value dp.

Fig. 4. (a) The developed BlueROV equipped with either a UOWC transmitter or a UOWC receiver. (b) The experimental setup of the proposed UOWC system. (c) The mobile receiver. (d) The experiment pool. (e) The fixed transmitter.

5. Results and discussion

Figure 5 shows the BER performance of the receiver at different positions with different deviation degrees. As expected, the BER degrades from 4 × 10−5 to 3.8 × 10−3 as the practical deviation increases, which is also verified by the eye diagrams. When the practical value dp is less than 0.5 m (red dashed line), the BER is below the FEC limit of 3.8 × 10−3. As a result, 0.5 m is taken as the radius r0 of the practical alignment area in our tests. In this area, the receiver is considered to be aligned with the transmitter.

Fig. 5. Receiver’s BER performance under different positions with different practical deviation degrees.

It should be pointed out that due to camera distortion, environmental interference and other random errors in the ROV, the measured deviation and the practical deviation usually differ. To evaluate the difference between them, measured values dm are calculated at different positions with the practical deviation dp ranging from −2 m to 2 m. As shown in Fig. 6, the difference between the two values is in the range of 0.5 m to 0.7 m across positions. When dp is 0.5 m (corresponding to the FEC limit), dm is about 1 m. Therefore, the measured value dm = 1 m corresponds to the radius r0 of the practical alignment area in UOWC links. This implies that when the measured value dm is less than 1 m, the communication performance is acceptable.

Fig. 6. The difference between the measured deviation and the practical deviation.

Figure 7 shows the anti-disturbance performance of the optical tracking method. In this test, due to water flow and the receiver’s vibration, a small disturbance is always present before 59 s. However, the measured deviation dm of the receiver is less than 1 m during most of this time, which implies that the optical communication can be guaranteed with our tracking method. From 59 s to 83 s, a strong disturbance is imposed on the receiver. In this case, the deviation continuously increases, and the measured value dm climbs up to 2.5 m. Benefiting from the proposed tracking method, the deviation dm then decreases until the receiver is once again aligned with the transmitter at 108 s. In other words, the optical link can be restored to a stable status even when a strong disturbance is imposed. Note that with the high frequency of spot detection, the measured deviation is updated every 0.2 s. However, to overcome the water disturbance and move stably, the ROV requires a much longer time to adjust its pose through its motion controller.

Fig. 7. Measured deviation when a small disturbance is imposed before 59 s and a strong disturbance is imposed from 59 s to 83 s.

It should be mentioned that optical turbulence could cause fast variation of the spot shape. However, benefiting from the high frequency of spot detection (i.e., the measured deviation is updated every 0.2 s), the expected deviation/position of the receiver can be precisely calculated. Thus, our method can work in practical, complex underwater environments.

Figure 8 shows the visual tracking performance of the receiver during the disturbance process. The images are captured every 3 s from 50 s to 110 s. Before 59 s, the spot remains circular with the proposed optical tracking method under the small disturbance. Between 59 s and 83 s, a strong disturbance is imposed and the practical deviation reaches 2 m. Due to inertia, the ROV continues to move forward until its speed decreases to 0 at 86 s. It can be observed that the spot changes from a circle to an ellipse, and the ratio of the major axis to the minor axis of the elliptical spot increases gradually during the disturbance. Thanks to the optical tracking algorithm, the real-time position of the receiver is adjusted and the spot becomes circular again after 108 s. Once the ROV is tracked and aligned again, the deviation is continuously calculated and slightly adjusted to ensure that it remains less than the radius r0 of the practical alignment area.

Fig. 8. Snapshots of the spot at the receiver side.

6. Conclusion

In this paper, an anti-disturbance optical tracking method based on monocular vision is proposed. It paves a practical way for realizing tracking in UOWC systems. The basic idea of this method is to use the detected laser spot and its deviation degree to determine whether the transmitter and the receiver are aligned. Since the spot can be circular or elliptical at different receiver positions, the deviation degree is calculated differently for each spot shape. Experimental results verify that satisfactory communication performance can be achieved with the proposed tracking method in a practical UOWC system. Even under strong disturbance, the BER can be kept below the FEC limit of 3.8 × 10−3.

Funding

National Key Basic Research Program of China (62101291); Major Key Project of PCL (PCL2021A05).

Acknowledgement

The authors would like to thank Tsinghua Shenzhen International Graduate School, Tsinghua University, China, for providing the facilities and maintenance of the experimental pool.

Disclosures

The authors declare that there are no conflicts of interest related to this article.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. M. A. Khalighi and M. Uysal, “Survey on Free Space Optical Communication: A Communication Theory Perspective,” IEEE Commun. Surv. Tutorials 16(4), 2231–2258 (2014).

2. M. F. Ali, D. N. K. Jayakody, Y. A. Chursin, S. Affes, and S. Dmitry, “Recent Advances and Future Directions on Underwater Wireless Communications,” Arch. Comput. Methods Eng. 27(5), 1379–1412 (2020).

3. N. D. Hardy, H. G. Rao, S. D. Conrad, T. R. Howe, M. S. Scheinbart, R. D. Kaminsky, and S. A. Hamilton, “Demonstration of vehicle-to-vehicle optical pointing, acquisition, and tracking for undersea laser communications,” Proc. SPIE 10910, 35 (2019).

4. W. Liu, Z. Xu, and L. Yang, “SIMO detection schemes for underwater optical wireless communication under turbulence,” Photonics Res. 3(3), 48–53 (2015).

5. S. Kumar, S. Prince, J. Venkata Aravind, and S. Kumar G, “Analysis on the effect of salinity in underwater wireless optical communication,” Mar. Georesources Geotechnol. 38(3), 291–301 (2020).

6. R. M. Kraemer, L. M. Pessoa, and H. M. Salgado, “Monte Carlo radiative transfer modeling of underwater channel,” in Wireless Mesh Networks-Security, Architectures and Protocols, M. Khatib, ed. (IntechOpen, 2020).

7. Y. Guo, M. Kong, M. Sait, S. Marie, O. Alkhazragi, T. K. Ng, and B. S. Ooi, “Compact scintillating-fiber/450-nm-laser transceiver for full-duplex underwater wireless optical communication system under turbulence,” Opt. Express 30(1), 53–69 (2022).

8. W. Zhang, G. Han, X. Wang, M. Guizani, K. Fan, and L. Shu, “A Node Location Algorithm Based on Node Movement Prediction in Underwater Acoustic Sensor Networks,” IEEE Trans. Veh. Technol. 69(3), 3166–3178 (2020).

9. M. Kong, J. Lin, Y. Guo, X. Sun, M. Sait, O. Alkhazragi, C. H. Kang, J. A. Holguin-Lerma, M. Kheireddine, M. Ouhssain, B. H. Jones, T. K. Ng, and B. S. Ooi, “AquaE-lite Hybrid-Solar-Cell Receiver-Modality for Energy-Autonomous Terrestrial and Underwater Internet-of-Things,” IEEE Photonics J. 12(4), 1–13 (2020).

10. K. Munasinghe, M. Aseeri, S. Almorqi, M. Hossain, M. Binte Wali, and A. Jamalipour, “EM-Based High Speed Wireless Sensor Networks for Underwater Surveillance and Target Tracking,” J. Sensors 2017, 1–14 (2017).

11. J. Zhang, L. Kou, Y. Yang, F. He, and Z. Duan, “Monte-Carlo-based optical wireless underwater channel modeling with oceanic turbulence,” Opt. Commun. 475, 126214 (2020).

12. S. Wu, P. Zhou, C. Yang, Y. Zhu, and H. Zhi, “A Novel Approach for Underwater Vehicle Localization and Communication Based on Laser Reflection,” Sensors 19(10), 2253 (2019).

13. A. S. Fletcher, C. E. DeVoe, I. D. Gaschits, F. Hakimi, N. D. Hardy, J. G. Ingwersen, R. D. Kaminsky, H. G. Rao, M. S. Scheinbart, T. M. Yarnall, and S. A. Hamilton, “A narrow-beam undersea laser communications field demonstration,” in Proceedings of OCEANS 2016 MTS/IEEE Monterey (IEEE, 2016), pp. 1–5.

14. J. Lin, Z. Du, C. Yu, W. Ge, W. Lyu, H. Deng, C. Zhang, X. Chen, Z. Zhang, and J. Xu, “Machine-vision-based acquisition, pointing, and tracking system for underwater wireless optical communications,” Chin. Opt. Lett. 19(5), 050604 (2021).

15. D. A. Duecker, K. Eusemann, and E. Kreuzer, “Towards an Open-Source Micro Robot Oceanarium: A Low-Cost, Modular, and Mobile Underwater Motion-Capture System,” in Proceedings of 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE, 2019), pp. 8048–8053.

16. J. Yan, D. Guo, X. Luo, and X. Guan, “AUV-aided localization for underwater acoustic sensor networks with current field estimation,” IEEE Trans. Veh. Technol. 69(8), 8855–8870 (2020).

17. J. González-García, A. Gómez-Espinosa, E. Cuan-Urquizo, L. G. García-Valdovinos, T. Salgado-Jiménez, and J. A. E. Cabello, “Autonomous underwater vehicles: Localization, navigation, and communication for collaborative missions,” Appl. Sci. 10(4), 1256 (2020).

18. Z. Tian, X. Gong, F. He, J. He, and X. Wang, “Compressed sensing grid-based target stepwise location method in underground tunnel,” Sensor Rev. 40(4), 397–405 (2020).

19. A. L. Prasanna, V. Kumar, and S. B. Dhok, “Cooperative Communication and Energy-Harvesting-Enabled Energy-Efficient Design of MI-Based Clustered Nonconventional WSNs,” IEEE Syst. J. 14(2), 2293–2302 (2020).

20. X. Huang, F. Yang, and J. Song, “Hybrid LD and LED-based underwater optical communication: state-of-the-art, opportunities, challenges, and trends,” Chin. Opt. Lett. 17(10), 100002 (2019).

21. M. Raitoharju, A. F. García-Fernández, R. Hostettler, R. Piché, and S. Särkkä, “Gaussian mixture models for signal mapping and positioning,” Signal Process. 168, 107330 (2020).

22. A. P. Dempster, N. M. Laird, and D. B. Rubin, “Maximum likelihood from incomplete data via the EM algorithm,” J. R. Stat. Soc., Ser. B Methodol. 39(1), 1–22 (1977).

23. L. Xu and M. I. Jordan, “On convergence properties of the EM algorithm for Gaussian mixtures,” Neural Comput. 8(1), 129–151 (1996).

24. V. Melnykov and I. Melnykov, “Initializing the EM algorithm in Gaussian mixture models with an unknown number of components,” Comput. Stat. Data Anal. 56(6), 1381–1395 (2012).

25. W. Lyu, M. Zhao, X. Chen, X. Yang, Y. Qiu, Z. Tong, and J. Xu, “Experimental demonstration of an underwater wireless optical communication employing spread spectrum technology,” Opt. Express 28(7), 10027–10038 (2020).

26. A. Manzanilla, S. Reyes, M. Garcia, D. Mercado, and R. Lozano, “Autonomous Navigation for Unmanned Underwater Vehicles: Real-Time Experiments Using Computer Vision,” IEEE Robot. Autom. Lett. 4(2), 1351–1356 (2019).

27. Blue Robotics, “Bluerov2 Heavy,” https://bluerobotics.com/store/rov/bluerov2-upgrade-kits/brov2-heavy-retrofit-r1-rp/.
