Phase unwrapping algorithm based on phase edge tracking for dynamic measurement

Abstract

Phase unwrapping is an essential procedure for fringe projection profilometry (FPP). To improve measurement efficiency and reduce phase unwrapping errors (PUEs) in dynamic measurement, a phase unwrapping algorithm based on phase edge tracking is proposed, which unwraps the current wrapped phase map with the aid of the previously unwrapped one. The phase edges are accurately tracked and their trajectories are used to divide the phase map into several regions, each of which is unwrapped either temporally or spatially according to its properties. It doesn’t require extra patterns for phase unwrapping once the initial unwrapped phase map is obtained, thus significantly increasing the frame rate of the 3D result. Meanwhile, it greatly reduces the PUEs caused by noise amplification and motion-induced misalignment of phase edges. Experiments prove that it is capable of retrieving the absolute phase maps of complex dynamic scenes with high unwrapping accuracy and efficiency.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Dynamic three-dimensional (3D) shape measurement is in great demand in many fields. Among various optical 3D shape measurement methods, fringe projection profilometry (FPP) stands out due to its high speed, high accuracy, full-field coverage, and noninvasiveness [1–4]. In FPP, phase unwrapping is an essential procedure to obtain the actual phase. Many phase unwrapping algorithms have been developed in the past few decades [5–17]. They can be mainly divided into two categories: spatial phase unwrapping algorithms (SPUAs) [18,19] and temporal phase unwrapping algorithms (TPUAs) [5,6]. SPUAs do not require projecting and capturing extra patterns for phase unwrapping, but they are not suitable for isolated objects or complex surfaces with large discontinuities [20]. Moreover, SPUAs are usually path-dependent, which makes them time-consuming [16]. TPUAs unwrap each pixel independently to retrieve an absolute phase map using additional temporal information, so they are capable of dealing with complex scenes [12], and they can work fast owing to their few computations and suitability for parallel computing. However, they were initially designed for static measurement, and some shortcomings show up when they are used in dynamic measurement. The need to obtain additional images makes them consume more sampling time, which limits the frame rate of the 3D result. Furthermore, object motion in dynamic measurement brings more serious adverse effects, such as numerous phase unwrapping errors (PUEs) [21,22].

Some improved phase unwrapping algorithms that are more suitable for dynamic measurement have been proposed. Cong et al. [16] embedded a sparse set of markers in the fringe patterns to aid spatial phase unwrapping, obtaining the absolute phases without any other patterns. The algorithm is more efficient and robust than conventional single-path flood-fill algorithms, but the embedded markers need to be carefully designed and processed. Zhang et al. [23] introduced a reference-guided phase unwrapping algorithm that uses the first fitted unwrapped phase map to unwrap all other wrapped phase maps. It consumes no extra patterns after the first unwrapping; however, it requires the height variation of the surface to be very small. An et al. [11] proposed a phase unwrapping algorithm that uses an artificial phase map created from the geometric constraints of the calibrated structured light system to unwrap phases. Although this algorithm doesn’t require additional patterns, its working depth range is very limited. Feng et al. [13] presented two 3D spatial phase unwrapping algorithms that use 3D quality maps to unwrap sequences of wrapped phase maps. By using both spatial and temporal information, these algorithms resist noise and discontinuities more robustly than their 2D versions while requiring no extra patterns. However, only some relatively simple scenes have been tested. Wu et al. [17] proposed a tripartite phase unwrapping algorithm to avoid PUEs near the boundaries of the gray code words. This algorithm also uses a time-overlapping coding strategy to reduce the number of projected patterns, but it still needs one extra pattern to unwrap phases for each 3D frame. An et al. [24] combined SPUA and TPUA to retrieve the absolute phase maps of isolated objects. This algorithm first divides the wrapped phase map into several regions, each of which is unwrapped by SPUA individually. Then a series of reliable points from the fringe order map is selected to determine the absolute fringe order of each region, so this algorithm requires extra patterns to obtain the fringe order map. Deng et al. [25] proposed an edge-preserved PUE correction strategy to correct two types of PUEs. In this algorithm, the edge of the measured object is preserved in the coarse correction process, and the remaining PUEs are corrected by eight-neighbor filtering in the fine correction process. Although this algorithm effectively corrects PUEs, it doesn’t reduce the number of patterns required for phase unwrapping. The above-mentioned algorithms, along with other algorithms [26–32], can achieve satisfactory unwrapping results in specific scenes. However, more robust and efficient algorithms that can be applied in the dynamic measurement of complex objects remain to be studied.

To this end, a novel phase unwrapping algorithm based on phase edge tracking is proposed in this paper. The core idea of the proposed algorithm is to unwrap the current wrapped phase map with the aid of the previously unwrapped phase map. This means that it requires no additional patterns after obtaining the initial unwrapped phase map. Phase edges [15] play an important role in this algorithm because object motion introduces a misalignment of phase edges that causes non-negligible PUEs. The trajectories of the phase edges in two consecutive phase maps are accurately tracked by Gaussian weighted template matching [33] followed by further refinement, and the trajectories are used to divide the phase map into several regions that are unwrapped either temporally or locally spatially, depending on the region properties. During the unwrapping process, most pixels are unwrapped first by extra-pattern-free TPUA, and then the remaining few pixels near the phase edges are unwrapped by SPUA. Compared with the conventional dual-frequency phase unwrapping algorithm (DFPUA), a frequently adopted TPUA in dynamic measurement due to its robustness and efficiency [12,14], the proposed algorithm significantly reduces the PUEs caused by noise amplification and the motion-induced misalignment of phase edges while consuming far fewer patterns. Compared with conventional path-dependent SPUAs, the proposed algorithm is more computationally efficient because only a few pixels are unwrapped spatially while the other pixels can be processed in parallel with fewer calculations. Moreover, this algorithm retrieves absolute phases, enabling it to measure isolated objects.

The rest of this paper is organized as follows. Section 2 explains the limitations of the conventional DFPUA in dynamic measurement. Section 3 describes the proposed phase unwrapping algorithm in detail. In Section 4, the proposed algorithm is validated by three representative experiments. Section 5 discusses the advantages and limitations of the proposed algorithm. Finally, this paper is summarized in Section 6.

2. Limitations of the conventional DFPUA in dynamic measurement

In an FPP system, the captured image can be mathematically described as:

$$I({x,y} )= A({x,y} )+ B({x,y} )\cos [{\phi ({x,y} )} ],$$
where $({x,y} )$ are the pixel coordinates, $A({x,y} )$ is the average intensity of the fringe image, $B({x,y} )$ is the intensity modulation, and $\phi ({x,y} )$ is the shape-related phase. Due to the periodicity of trigonometric functions, the obtained $\phi ({x,y} )$ is wrapped between $- \mathrm{\pi }$ and $\mathrm{\pi }$. To remove the $2\mathrm{\pi }$ discontinuities, phase unwrapping should be performed, which can be described as:
$$\Phi ({x,y} )= \phi ({x,y} )+ 2\pi \times K({x,y} ),$$
where $\mathrm{\Phi }({x,y} )$ is the unwrapped phase map and $K({x,y} )$ is the fringe order map. DFPUA is usually applied to obtain $K({x,y} )$ in dynamic measurement due to its efficiency and robustness. It needs another set of single-period fringe patterns. Then $K({x,y} )$ can be determined by:
$$K({x,y} )= round\left[ {\frac{{R{\Phi _{ref}}({x,y} )- \phi ({x,y} )}}{{2\pi }}} \right],$$
where $round[{\cdot} ]$ rounds its argument to the nearest integer, R is the frequency ratio, ${\mathrm{\Phi }_{ref}}({x,y} )$ is the unwrapped low-frequency phase map, and $\phi ({x,y} )$ is the wrapped high-frequency phase map. $\mathrm{\Phi }({x,y} )$ can be obtained with Eqs. (2) and (3). Then it is used to recover the 3D shape of the measured surfaces in combination with the system parameters [34].
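For concreteness, a minimal NumPy sketch of Eqs. (2) and (3) is given below. The array names and the example frequency ratio are illustrative assumptions, not part of the authors' implementation.

```python
import numpy as np

def dfpua_unwrap(phi_high, Phi_ref, R):
    """Dual-frequency phase unwrapping per Eqs. (2)-(3).

    phi_high: wrapped high-frequency phase map, in (-pi, pi]
    Phi_ref:  unwrapped low-frequency (single-period) reference phase map
    R:        frequency ratio between the two fringe sets
    """
    # Eq. (3): fringe order from the scaled reference phase
    K = np.round((R * Phi_ref - phi_high) / (2 * np.pi))
    # Eq. (2): remove the 2*pi discontinuities
    return phi_high + 2 * np.pi * K

# Illustrative call: 18-pixel-period fringes unwrapped by a
# 900-pixel-period reference, i.e. R = 900 / 18 = 50.
# Phi = dfpua_unwrap(phi_high, Phi_ref, R=50)
```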

Although DFPUA is efficient and robust, it still faces many challenges in dynamic measurement. Firstly, the requirement of extra fringe patterns increases the data acquisition time and reduces the frame rate of the 3D result. If DFPUA is adopted in combination with the three-step phase-shifting technique [35], every set of high-frequency fringe images requires a set of low-frequency fringe images to unwrap the phase, so half of the images are used for phase unwrapping, which reduces the time resolution of the 3D result.

Secondly, the noise of the reference phase is amplified by the frequency ratio R in DFPUA, which may introduce PUEs. Suppose the noise of ${\mathrm{\Phi }_{ref}}({x,y} )$ is $\mathrm{\Delta }{\mathrm{\Phi }_{ref}}({x,y} )$ and the noise of $\phi ({x,y} )$ is $\mathrm{\Delta }\phi ({x,y} )$; the deviation of $K({x,y} )$ will then be:

$$\mathrm{\Delta }K({x,y} )= round\left[ {\frac{{R\mathrm{\Delta }{\Phi _{ref}}({x,y} )- \mathrm{\Delta }\phi ({x,y} )}}{{2\pi }}} \right].$$

As demonstrated in Region I of Fig. 1, a small $\mathrm{\Delta }{\mathrm{\Phi }_{ref}}({x,y} )$ can result in an incorrect $K({x,y} )$ when it is scaled up by R. In precise measurement, the frequency of the high-frequency fringe patterns should stay relatively high, such as 20 or larger, because denser fringes produce higher-quality phases [12]. This brings R to the order of tens in DFPUA, thus causing numerous PUEs.
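The scale of this effect is easy to check numerically with Eq. (4); the noise value below is an arbitrary illustration, not measured data.

```python
import numpy as np

R = 50             # frequency ratio, as in Section 4.1 (900 / 18)
d_Phi_ref = 0.07   # hypothetical reference-phase noise (rad)
d_phi = 0.0        # assume a noiseless high-frequency phase

# Eq. (4): the reference noise is scaled by R before rounding
d_K = np.round((R * d_Phi_ref - d_phi) / (2 * np.pi))
print(d_K)         # 1.0 -> a full 2*pi unwrapping error
```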

Fig. 1. PUEs in the conventional DFPUA.

Thirdly, the misalignment of the phase edges due to object motion also causes PUEs. For example, as shown in Region II of Fig. 1, a phase edge pixel is located at pixel A when the high-frequency fringe images are acquired, but it moves to pixel B when the low-frequency fringe images are acquired. In this case, the pixels between pixel A and pixel B will be unwrapped incorrectly.

To overcome the above drawbacks, we propose a novel phase unwrapping algorithm based on phase edge tracking.

3. Principle

The proposed phase unwrapping algorithm aims to improve measurement efficiency and reduce PUEs. Its core idea is to track the phase edges in consecutive phase maps during the dynamic measurement, divide the wrapped phase map into different regions according to the trajectories of the phase edges, and finally unwrap each region using either extra-pattern-free TPUA or SPUA depending on the region properties. The proposed algorithm does not require projecting and capturing any extra patterns after the initial unwrapped phase map is obtained; instead, it unwraps the current wrapped phase map with the aid of the previously unwrapped one. The key techniques are explained in detail below.

3.1 Phase edge tracking

Phase edges are those pixels or lines with large phase changes that exist at discontinuities. Tracking the phase edges in two consecutive phase maps is an important task in the proposed algorithm. In dynamic measurement, the scene varies relatively slowly, so a phase edge pixel can be tracked using the information of its surrounding pixels. Based on this feature, phase edge tracking can be achieved by the template matching technique, which is widely used to find the part of an image that is most similar to a template image. In the proposed algorithm, the intensity modulation $B({x,y} )$ is chosen as the image for matching because it is not sensitive to ambient light [36]. Because the farther a pixel is from the phase edge pixel, the more its displacement may differ, a Gaussian weighted loss function is used to track the phase edge pixel as follows:

$$L({u,v} )= \mathop \sum \limits_{x^{\prime},y^{\prime}} \{{Gau({x^{\prime},y^{\prime}} ){{[{B({x^{\prime},y^{\prime},t - 1} )- B({x^{\prime} + u,y^{\prime} + v,t} )} ]}^2}} \},$$
where $({u,v} )$ is the displacement of the phase edge pixel, $B({x^{\prime},y^{\prime},t - 1} )$ is the template from $B({x,y,t - 1} )$ centered at the phase edge pixel, $B({x^{\prime} + u,y^{\prime} + v,t} )$ is the target subset from $B({x,y,t} )$, $Gau({x^{\prime},y^{\prime}} )$ is a two-dimensional Gaussian window, and $L({u,v} )$ is the loss map. Because the ideal target is most similar to the template, the displacement of the phase edge pixel should minimize $L({u,v} )$. To avoid unnecessary computations, $L({u,v} )$ is calculated for only a small region of interest (ROI) centered at the phase edge pixel, because each phase edge pixel has only a small displacement between two consecutive phase maps. The process of phase edge tracking is illustrated in Fig. 2. The phase edges at $t - 1$ are labeled as red lines and superimposed on $B({x,y,t - 1} )$ in Fig. 2. The template from $B({x,y,t - 1} )$ is compared with the target from $B({x,y,t} )$ pixel by pixel. The square of their difference is then weighted by the Gaussian window, and the sum of the weighted values is the final loss value. For each candidate pixel in the ROI shown in the red box in Fig. 2, the corresponding target subset goes through the same calculations to yield a loss value. Finally, the pixel with the minimum loss value is regarded as the tracked phase edge pixel at t.
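A direct (unoptimized) sketch of evaluating the loss of Eq. (5) for a single edge pixel follows. The template half-size, search radius, and Gaussian sigma are illustrative assumptions, and the pixel is assumed to lie far enough from the image border.

```python
import numpy as np

def track_edge_pixel(B_prev, B_curr, y0, x0, half=7, search=5):
    """Track one phase edge pixel from t-1 to t by minimizing Eq. (5).

    B_prev, B_curr: modulation maps B(x, y, t-1) and B(x, y, t)
    (y0, x0):       edge pixel location at t-1
    half:           template half-size; the template is (2*half+1)^2
    search:         ROI radius, small because the inter-frame motion is small
    """
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    sigma = half / 2.0  # illustrative window width
    gau = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma ** 2))
    template = B_prev[y0 - half:y0 + half + 1, x0 - half:x0 + half + 1]

    best_loss, best_uv = np.inf, (0, 0)
    for v in range(-search, search + 1):
        for u in range(-search, search + 1):
            target = B_curr[y0 + v - half:y0 + v + half + 1,
                            x0 + u - half:x0 + u + half + 1]
            loss = np.sum(gau * (template - target) ** 2)  # Eq. (5)
            if loss < best_loss:
                best_loss, best_uv = loss, (u, v)
    return best_uv  # displacement (u, v) that minimizes L(u, v)
```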

Fig. 2. Process of phase edge tracking. The red lines represent the phase edges. The red cross represents the phase edge pixel at $t - 1$ and the center of the template, and the white cross represents the center of the target. The red box is the region of interest, within which the phase edge pixel at t is likely to be.

To reduce the influence of accidental factors on the tracking results, two refinement steps are performed. Firstly, the displacements of the tracked phase edges are smoothed, based on the feature that nearby phase edge pixels have similar displacements. Secondly, the connections between adjacent phase edge pixels are maintained: if two phase edge pixels are adjacent at $t - 1$ but their tracked counterparts are not connected at t, they are connected by phase edges at t. This operation avoids interruptions of the phase edge line. After refinement, a fine and comprehensive phase edge map at t is obtained and ready for further use.
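A minimal sketch of the first refinement step is shown below; the paper does not specify the smoothing operator, so a moving median along the edge line is used here as an illustrative choice.

```python
import numpy as np

def smooth_displacements(disps, win=5):
    """Smooth the (u, v) displacements of consecutive pixels along one
    phase edge line, exploiting the fact that nearby edge pixels move
    almost identically.

    disps: (N, 2) array of (u, v) displacements ordered along the edge
    win:   window size of the moving median (illustrative value)
    """
    disps = np.asarray(disps, dtype=float)
    out = disps.copy()
    h = win // 2
    for i in range(len(disps)):
        lo, hi = max(0, i - h), min(len(disps), i + h + 1)
        out[i] = np.median(disps[lo:hi], axis=0)  # per-component median
    return np.round(out).astype(int)
```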

3.2 Extra-pattern-free temporal and spatial phase unwrapping

The misalignment of phase edges results in PUEs, so the regions covered by the phase edge trajectories should be processed separately. To do this, the wrapped phase map is first divided into different regions according to the trajectories of the phase edges, and then each region is unwrapped either temporally or spatially depending on its properties. The process of region division is demonstrated in Fig. 3. Assume that two discontinuous surfaces, Surface 1 and Surface 2, are measured as shown in Fig. 3, where the orange dotted lines and orange solid lines represent the phase edges at $t - 1$ and t respectively. Without loss of generality, suppose Surface 1 moves slightly to the upper right at t as illustrated in Fig. 3(a). The phase edges are tracked as demonstrated in Fig. 3(b). Then the trajectories of the phase edges are used to divide the scene into several regions as shown in Fig. 3(c), where yellow, blue and green represent ${R_T}$, ${R_1}$ and ${R_2}$ respectively. ${R_T}$ is the region that is not affected by the movement of phase edges, so it can be unwrapped by extra-pattern-free TPUA, taking the previously unwrapped phase map as the reference phase map:

$$\Phi ({x,y,t} )= \phi ({x,y,t} )+ 2\pi \times round\left[ {\frac{{\Phi ({x,y,t - 1} )- \phi ({x,y,t} )}}{{2\pi }}} \right],$$
where $\mathrm{\Phi }({x,y,t} )$ is the unwrapped phase map at t, $\phi ({x,y,t} )$ is the wrapped phase map at t, and $\mathrm{\Phi }({x,y,t - 1} )$ is the unwrapped phase map at $t - 1$. Note that Eq. (6) is similar to Eq. (2) combined with Eq. (3), except that the noise of $\mathrm{\Phi }({x,y,t - 1} )$ in Eq. (6) is not amplified, while that of ${\mathrm{\Phi }_{ref}}({x,y} )$ in Eq. (3) is amplified by the factor R. Besides, $\mathrm{\Phi }({x,y,t - 1} )$ in Eq. (6) has a lower noise level than ${\mathrm{\Phi }_{ref}}({x,y} )$ in Eq. (3) because it is calculated from the high-frequency fringe images [12].
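A minimal sketch of Eq. (6) restricted to ${R_T}$ is given below; the boolean mask is an assumed input produced by the region division step.

```python
import numpy as np

def temporal_unwrap(phi_t, Phi_prev, mask_RT):
    """Eq. (6): unwrap region R_T against the previous unwrapped map.

    phi_t:    wrapped phase map at t
    Phi_prev: unwrapped phase map at t-1; unlike Eq. (3), it is not
              scaled, so its noise is not amplified
    mask_RT:  boolean mask of the motion-unaffected region R_T
    """
    Phi_t = np.full_like(phi_t, np.nan)  # NaN marks not-yet-unwrapped
    K = np.round((Phi_prev - phi_t) / (2 * np.pi))
    Phi_t[mask_RT] = phi_t[mask_RT] + 2 * np.pi * K[mask_RT]
    return Phi_t
```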

Fig. 3. Demonstration of region division: (a) motion of phase edges; (b) phase edge tracking; (c) result of region division.

As for ${R_1}$, it cannot be unwrapped by extra-pattern-free TPUA because of the misalignment of phase edges, and ${R_2}$ should be abandoned because it becomes an invalid region. Therefore, after ${R_T}$ is unwrapped temporally, ${R_1}$ is unwrapped spatially. To do this, the quality-guided phase unwrapping algorithm [8] is applied, i.e., the pixels in this region are unwrapped one by one in order of their quality by the following equation:

$$\Phi ({x,y,t} )= \phi ({x,y,t} )+ 2\pi \times round\left[ {\frac{{\Phi ({{x_a},{y_a},t} )- \phi ({x,y,t} )}}{{2\pi }}} \right],$$
where $\mathrm{\Phi }({{x_a},{y_a},t} )$ is the unwrapped phase value of pixel $({{x_a},{y_a}} )$ that is adjacent to $({x,y} )$ at t. During the spatial phase unwrapping, the phase edges are used to block the unwrapping path to prevent error propagation. Notably, the spatially obtained phases here are also absolute because the starting pixel of the unwrapping path is adjacent to a pixel with an absolute phase value obtained by extra-pattern-free TPUA.
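The following is a minimal sketch of this quality-guided spatial unwrapping of ${R_1}$ with Eq. (7); the masks and the per-pixel quality map are assumed inputs, and a max-heap orders the pixels by quality.

```python
import heapq
import numpy as np

def quality_guided_unwrap(phi_t, Phi_t, mask_R1, quality, edge_block):
    """Eq. (7): quality-guided spatial unwrapping of region R_1.

    phi_t:      wrapped phase map at t
    Phi_t:      unwrapped map, already filled over R_T (NaN elsewhere)
    mask_R1:    boolean mask of pixels still to be unwrapped
    quality:    per-pixel quality map guiding the unwrapping order
    edge_block: boolean mask of phase edges that block the path
    """
    h, w = phi_t.shape
    done = ~np.isnan(Phi_t)
    heap = []
    # Seed with R_1 pixels adjacent to already-unwrapped (absolute)
    # pixels, so the spatially obtained phases are absolute as well
    for y, x in zip(*np.nonzero(mask_R1)):
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ya, xa = y + dy, x + dx
            if 0 <= ya < h and 0 <= xa < w and done[ya, xa]:
                heapq.heappush(heap, (-quality[y, x], y, x, ya, xa))
    while heap:
        _, y, x, ya, xa = heapq.heappop(heap)
        if done[y, x] or edge_block[y, x]:
            continue  # already unwrapped, or path blocked by an edge
        # Eq. (7): unwrap against the adjacent unwrapped pixel
        K = np.round((Phi_t[ya, xa] - phi_t[y, x]) / (2 * np.pi))
        Phi_t[y, x] = phi_t[y, x] + 2 * np.pi * K
        done[y, x] = True
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            yn, xn = y + dy, x + dx
            if (0 <= yn < h and 0 <= xn < w
                    and mask_R1[yn, xn] and not done[yn, xn]):
                heapq.heappush(heap, (-quality[yn, xn], yn, xn, y, x))
    return Phi_t
```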

3.3 Workflow of the proposed algorithm

With the aid of the key techniques explained in Sections 3.1 and 3.2, the procedures of the proposed phase unwrapping algorithm can be summarized as follows.

Step 1: phase edge detection. Given the unwrapped phase map at $t - 1$, the background, which has a low signal-to-noise ratio (SNR), is removed by labeling the pixels whose modulation is smaller than a preset threshold as invalid. Pixels with large phase gradients are also labeled as invalid because their phase values are usually inaccurate. After these operations, the phase edges can be detected by searching for the invalid pixels that are adjacent to any valid pixel.

Step 2: phase edge tracking. Applying the Gaussian weighted template matching technique described in Section 3.1, the phase edges at t are located, and the tracking result is further refined.

Step 3: region division. The background of the wrapped phase map at t is also labeled as invalid, then the rest part is divided into different regions, ${R_T}$, ${R_1}$ and ${R_2}$, according to the trajectories of phase edges.

Step 4: temporal and spatial phase unwrapping. ${R_T}$ is unwrapped by extra-pattern-free TPUA firstly, then ${R_1}$ is unwrapped by SPUA while ${R_2}$ is abandoned.

To better illustrate the proposed algorithm, its workflow is presented in Fig. 4. At the beginning of the measurement, an initial unwrapped phase map is needed, so a TPUA such as DFPUA, the multi-frequency phase unwrapping algorithm (MFPUA), or the gray-code phase unwrapping algorithm can be applied. Then, when $t \ge 1$, only high-frequency fringe patterns are required; their wrapped phase map is processed by the above four steps with the aid of the unwrapped phase map at $t - 1$. The unwrapped phase maps are used to recover the 3D shape of the measured objects. The proposed algorithm realizes phase unwrapping for discontinuous surfaces efficiently without the need for extra patterns after the initial unwrapped phase map is obtained. Therefore, the time resolution of the 3D result is greatly improved, and the frame rate of the 3D result can be doubled compared with the conventional DFPUA without changing the hardware. Meanwhile, it is more accurate and robust than the conventional DFPUA because it does not amplify the noise of the reference phase map and it solves the problem of phase edge misalignment.
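The per-frame loop can be summarized as below. The helper names (detect_phase_edges, track_edges, refine, divide_regions) are hypothetical stand-ins for the operations of Sections 3.1 and 3.2, the thresholds are illustrative, and the modulation map is used as a stand-in quality map; this is a sketch under those assumptions, not the authors' implementation.

```python
import numpy as np

def unwrap_frame(phi_t, Phi_prev, B_prev, B_curr,
                 mod_thresh=0.1, grad_thresh=np.pi / 2):
    """One pass of Steps 1-4 for frame t (sketch only)."""
    # Step 1: detect phase edges on the previous map after labeling
    # low-modulation and large-gradient pixels as invalid
    edges_prev = detect_phase_edges(Phi_prev, B_prev,
                                    mod_thresh, grad_thresh)
    # Step 2: track the edges from t-1 to t and refine the result
    edges_t = refine(track_edges(edges_prev, B_prev, B_curr))
    # Step 3: divide the valid area into R_T, R_1, R_2 from the edge
    # trajectories; R_2 is abandoned as invalid
    mask_RT, mask_R1, _ = divide_regions(edges_prev, edges_t,
                                         B_curr, mod_thresh)
    # Step 4: temporal unwrapping of R_T (Eq. (6)), then spatial
    # unwrapping of R_1 (Eq. (7))
    Phi_t = temporal_unwrap(phi_t, Phi_prev, mask_RT)
    Phi_t = quality_guided_unwrap(phi_t, Phi_t, mask_R1,
                                  quality=B_curr, edge_block=edges_t)
    return Phi_t
```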

Fig. 4. Workflow of the proposed algorithm.

In practical measurement, isolated surfaces that are not included in the initial phase map may emerge. Their wrapped phases cannot be unwrapped if their reference phases are not available. To properly retrieve the absolute phases of newly emerging isolated surfaces, we combine the proposed algorithm with DFPUA: DFPUA is performed periodically so that the absolute phases of the newly emerging isolated surfaces are obtained and regarded as the newest reference phases. Besides DFPUA, other temporal phase unwrapping algorithms such as MFPUA and the gray-code phase unwrapping algorithm can also be combined with the proposed algorithm to solve this problem. Compared with the geometric constraint method, the proposed algorithm needs a few extra patterns to obtain the initial phase maps. However, unlike the geometric constraint method, its measurement depth range is not limited, because its reference phases are real while those of the geometric constraint method are virtual.

4. Experiments

To evaluate the performance of the proposed method, three experiments were conducted. A measuring system composed of a DLP projector (LightCrafter 4500) and a CMOS camera (FLIR BFS-U3-19S4M-C) was developed. The camera is equipped with a lens with a focal length of 12 mm. The resolutions of the projector and the camera are $912 \times 1140$ and $1616 \times 1240$ respectively. The three experiments are arranged in order of difficulty. In the first experiment, the motion of the object is controlled for the convenience of quantitative analysis. In the second experiment, two objects in a scene are measured, one of which remains stationary while the other moves at a non-uniform speed. In the last experiment, the measured object also moves non-uniformly and its range of motion is very large. Moreover, the trajectories of the phase edges are complicated because some surfaces appear while others disappear during the movement, and the lack of reference phases on newly emerging surfaces is challenging.

4.1 Measurement of a pseudo-dynamic scene for quantitative analysis

To quantitatively compare the proposed algorithm and the conventional DFPUA, a pseudo-dynamic scene is tested. In this scene, a plaster statue is placed on a translation stage and moved 9 times to the right at 0.5 mm intervals. At each position, four sets of three-step phase-shifting fringe patterns are projected and captured. The periods of the four sets of fringe patterns are 18 pixels (Set 1), 72 pixels (Set 2), 288 pixels (Set 3), and 900 pixels (Set 4) respectively. The wrapped phase map of each set of images is denoted as ${\phi _i}({x,y,t} )$, where i is the set index ($i = 1,2,3,4$) and t is the time or position index ($t = 0,1,2, \ldots ,9$). We use ${\mathrm{\Phi }^{ET}}({x,y,t} )$ and ${\mathrm{\Phi }^{DF}}({x,y,t} )$ to represent the unwrapped phase maps of ${\phi _1}({x,y,t} )$ obtained by the proposed algorithm and DFPUA respectively. The accurate unwrapped phase map at each position is calculated by the four-frequency phase unwrapping algorithm using all four sets of images, and it is used to detect PUEs in ${\mathrm{\Phi }^{ET}}({x,y,t} )$ and ${\mathrm{\Phi }^{DF}}({x,y,t} )$. To simulate the object motion in dynamic measurement, DFPUA uses ${\phi _4}({x,y,t - 1} )$ as the reference phase map to unwrap ${\phi _1}({x,y,t} )$, while the proposed algorithm unwraps ${\phi _1}({x,y,t} )$ with the aid of ${\mathrm{\Phi }^{ET}}({x,y,t - 1} )$. ${\mathrm{\Phi }^{ET}}({x,y,0} )$ is obtained by the conventional MFPUA.

Figure 5 presents the results of the proposed algorithm and DFPUA when $t = 9$. The image of the plaster statue and one of the high-frequency fringe images are presented in Fig. 5(a) and (b) respectively. The wrapped phase map is given in Fig. 5(c). The tracked phase edges are shown as red lines superimposed on $B({x,y,9} )$ in Fig. 5(d), from which it can be seen that the tracked phase edges fit well with the real discontinuities. The result of region division is presented in Fig. 5(e), where yellow represents ${R_T}$, blue represents ${R_1}$, green stands for ${R_2}$, and the background is shown in gray. As shown in Fig. 5(e), ${R_1}$ is concentrated near the phase edges and its proportion is small. Therefore, most pixels are unwrapped by extra-pattern-free TPUA, which is more computationally efficient than SPUAs. For better visibility, a close-up view of the black box in Fig. 5(e) is given in Fig. 5(f). It can be seen from Fig. 5(f) that ${R_1}$ is located on the left side of the right surface while ${R_2}$ is located on the right side of the left surface, because the object moves towards the right. Note that in the lower part of Fig. 5(f), ${R_1}$ and ${R_2}$ are very close because there is a narrow invalid region between the two surfaces. The unwrapped phase maps obtained by the proposed algorithm and DFPUA are presented in Fig. 5(g) and (h) respectively. Figure 5(i) and (j) illustrate their corresponding PUEs, and Fig. 5(k) and (l) are the corresponding 3D results. It can be seen clearly from these results that DFPUA produces a large number of PUEs in the relatively low-SNR region due to the high frequency ratio and near the phase edges due to the motion-induced misalignment of the phase edges, whereas the proposed algorithm effectively solves these problems with almost no PUEs generated. Note that the proposed algorithm does not influence the quality of the high-frequency wrapped phase, although it contributes to improving the unwrapping accuracy.

Fig. 5. Phase unwrapping results of the proposed algorithm and DFPUA on the pseudo-dynamic scene ($t = 9$): (a) image of the plaster statue; (b) one of the high-frequency fringe images; (c) wrapped phase map; (d) phase edge tracking result superimposed on $B({x,y,9} )$; (e) region division result; (f) close-up view of the black box in (e); (g) ${\mathrm{\Phi }^{ET}}({x,y,t} )$; (h) ${\mathrm{\Phi }^{DF}}({x,y,t} )$; (i) PUEs of the proposed algorithm; (j) PUEs of DFPUA; (k) 3D result of the proposed algorithm; (l) 3D result of DFPUA. (The size of the phase edges in (d) is enlarged 4 times for better visibility)

To further compare the unwrapping accuracy of the two algorithms, the PUE rate at each position is calculated and presented in Fig. 6. The PUE rate is calculated by dividing the number of PUEs by the number of all valid pixels. Note that the ordinate axis in Fig. 6 is not continuous for better visibility. When $t \le 4$, the proposed algorithm does not produce any PUEs, and when $t > 4$, the PUE rate of DFPUA is ten thousand times that of the proposed algorithm. The PUE rates of DFPUA are so high because there are large regions whose fringe quality is low, and the phase noise of these regions is amplified by the frequency ratio of 50. However, the frequency ratio cannot be significantly reduced in precision measurement considering the phase quality. On the contrary, the proposed algorithm avoids the noise amplification of the reference phase map while maintaining high phase quality. Moreover, it requires fewer fringe patterns than DFPUA. Notably, the PUE rate of the proposed algorithm gradually increases with time while that of DFPUA does not strictly conform to this trend. This is because the PUEs in the proposed algorithm might propagate over time if they are not corrected. However, even with this PUE propagation, the PUE rate remains at a very low level: only about 27 out of one million pixels are unwrapped incorrectly after the algorithm is performed nine times. And if the PUEs are corrected after each unwrapping, this effect becomes minimal.
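The PUE rate defined above amounts to the following computation; a minimal sketch, with the four-frequency ground-truth map and the validity mask as assumed inputs.

```python
import numpy as np

def pue_rate(Phi_test, Phi_truth, valid):
    """Fraction of valid pixels whose fringe order deviates from the
    four-frequency ground truth, i.e. PUEs / valid pixels."""
    dK = np.round((Phi_test - Phi_truth) / (2 * np.pi))
    pue = (dK != 0) & valid
    return pue.sum() / valid.sum()
```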

Fig. 6. Comparison of PUE rates of the two algorithms. The ordinate axis is not continuous for better visibility.

4.2 Measurement of a deformable dynamic scene

The second experiment measures a complex deformable dynamic scene in which a hand moves in front of a flat board. During the measurement, the five fingers bend gradually. Measuring this kind of scene is challenging for both conventional SPUA and TPUA because it contains discontinuous surfaces and a non-rigid object moving at a non-uniform speed. The projection and capture speed of the fringe patterns is 120 Hz. The pattern sequence starts with a set of low-frequency three-step fringe patterns with a period of 900 pixels, followed by cyclic high-frequency three-step phase-shifting fringe patterns with a period of 18 pixels. The first six fringe images are used to obtain an initial unwrapped phase map by DFPUA, and the remaining images are processed by the proposed algorithm. Therefore, the frame rate of the 3D result in this experiment is almost 40 Hz, which is twice that of the conventional DFPUA.

The measurement results at four moments are presented in Fig. 7. The four columns in Fig. 7 are the results at $50\; ms$, $1250\; ms$, $2500\; ms$, and $3750\; ms$ respectively. Figure 7(a) shows the captured fringe images at each moment. Their corresponding wrapped phase maps with low-SNR regions removed are demonstrated in Fig. 7(b). Figure 7(c) illustrates the results of phase edge tracking, where the phase edges indicated in red are superimposed on $B({x,y,t} )$; they show that the phase edges are tracked successfully. Based on the phase edge tracking results, the regions are divided into different types as shown in Fig. 7(d), where the yellow regions ${R_T}$ are unwrapped by extra-pattern-free TPUA and the blue regions ${R_1}$ are unwrapped by SPUA. It can be seen that ${R_1}$ occupies a small proportion of the total area that needs to be unwrapped. The final unwrapped phase maps demonstrated in Fig. 7(e) show that the wrapped phase maps are unwrapped successfully. The corresponding 3D results are demonstrated in Fig. 7(f) and (g). They prove that the proposed algorithm performs well in this deformable dynamic scene. To better demonstrate the effectiveness of the proposed algorithm, the measurement results are dynamically displayed in Visualization 1.

Fig. 7. Measurement results of the deformable dynamic scene at four moments: (a) fringe images; (b) wrapped phase maps; (c) phase edge tracking results superimposed on $B({x,y,t} )$; (d) region division; (e) unwrapped phase maps; (f) 3D results of the objects; (g) 3D results of the hand (see Visualization 1). (The size of phase edges in (c) is enlarged 4 times for better visibility)

To further evaluate the effectiveness of the proposed algorithm, the trajectories of five points on the fingertips from $50\; ms$ to $3750\; ms$ are obtained through the phase edge tracking technique. Although these pixels are not phase edges, this technique can still track their two-dimensional (2D) trajectories, as shown in Fig. 8(a), in which the five trajectories are superimposed on $A({x,y} )$ at $50\; ms$. The 3D trajectories can be obtained by converting the pixel coordinates and their phase values into 3D coordinates. The 3D trajectories, along with the point cloud of the hand at $50\; ms$, are demonstrated in Fig. 8(b). Figures 8(a) and (b) clearly show how the fingertips move as the fingers bend gradually. These results demonstrate the ability of the proposed algorithm to track pixels and the correctness of the phase unwrapping.

Fig. 8. Trajectories of the five points on the fingertips: (a) 2D trajectories; (b) 3D trajectories.

4.3 Measurements of a complex dynamic scene

To further explore the potential of the proposed algorithm, a more complex experiment is conducted: measuring a hand that is turned from the palm to the back, which means some surfaces appear while others disappear during the measurement. The motion range of the hand is very large and the motion is also non-uniform. To perform the measurement, a higher sampling speed is desirable, so two changes are made to the experimental settings. Firstly, 7-bit instead of 8-bit fringe patterns are projected, which enables the DLP projector to reach a pattern rate of 222.2 Hz while barely reducing fringe quality when the projector is slightly defocused. Secondly, every four camera pixels are combined into one through pixel binning, leading to a maximum frame rate of 227 Hz and a resolution of $808 \times 620$. With these settings, the experiment is carried out at a sampling speed of 222.2 Hz (exposure time of $4.5\; ms$). The projected pattern sequence starts with a set of three-step fringe patterns with a period of 900 pixels, followed by 18 sets of three-step phase-shifting fringe patterns with a period of 18 pixels. This pattern sequence, composed of 57 fringe patterns, is projected and captured repeatedly until the end of the measurement. The first six images of each sequence are used to obtain an initial unwrapped phase map by DFPUA, and the remaining 51 images of each sequence are processed by the proposed algorithm. Therefore, each sequence of images can produce 18 (1 + 17) 3D results, and the frame rate of the 3D result in this experiment reaches 70 Hz. The reason for combining the proposed algorithm with DFPUA is that in such a complex scene, the newly emerging isolated surfaces need to obtain their reference phases with the help of DFPUA. Here, every 222 images can produce 70 3D results, whereas pure DFPUA can produce only 37 results from the same number of images. In summary, the proposed algorithm increases the frame rate of the 3D result by 89.19% in this experiment.

Figure 9 illustrates the measurement results at four moments: $0.0135\; s$, $4.0905\; s$, $8.1945\; s$, and $12.0420\; s$. Notably, these are all results of the last set of fringe images in their corresponding sequences, i.e., results obtained after the proposed algorithm has been performed 17 times continuously. Each row in Fig. 9 presents the same content as the corresponding row in Fig. 7. Figure 9(c) shows that the phase edges are effectively tracked by the phase edge tracking algorithm at each moment. Similar to Fig. 7(d), Fig. 9(d) also indicates that most of the pixels are unwrapped by extra-pattern-free TPUA while the few pixels near the phase edges are unwrapped by SPUA. The unwrapped phase maps and their corresponding 3D results are demonstrated in Fig. 9(e) and (f) respectively. They clearly prove the effectiveness of the proposed algorithm. For a better demonstration of the measurement results, readers can refer to Visualization 2.

Fig. 9. Measurement results of the moving hand at four moments: (a) fringe images; (b) wrapped phase maps; (c) phase edge tracking results superimposed on $B({x,y} )$; (d) region division; (e) unwrapped phase maps; (f) 3D results of the objects (see Visualization 2). (The size of phase edges in (c) is enlarged 3 times for better visibility)

5. Discussion

The above experiments demonstrate the effectiveness of the proposed algorithm. Firstly, the proposed algorithm can significantly increase the frame rate of the 3D result. In the first and second experiments, it requires no extra patterns for phase unwrapping after the initial unwrapped phase map is obtained, so it nearly doubles the frame rate of the 3D result compared with the conventional DFPUA. In the third experiment, it still increases the frame rate of the 3D result by 89.19% even when used in combination with the conventional DFPUA. Secondly, the proposed algorithm can retrieve accurate absolute phase maps for objects with complex shapes and non-uniform motions. The first experiment quantitatively shows that it produces only about one ten-thousandth as many PUEs as the conventional DFPUA, and in the second and third experiments, no obvious PUEs are observed even though the shape and motion of the objects are complex. Thirdly, the proposed algorithm works stably and efficiently. The results of the three experiments confirm that it can track phase edges and divide regions stably. Meanwhile, it is computationally efficient because the proportions of pixels that need to be tracked and pixels that need to be unwrapped by SPUA are very small. Despite these promising results, some details need to be further discussed.

Firstly, the kinematic velocity of the objects in the above experiments is relatively slow due to the limitations of the hardware and the use of the phase-shifting algorithm, rather than a limitation of the proposed algorithm. If the frame rate of the hardware is high enough to make the movement between two consecutive measurements small enough, the proposed method can always work well. If the binary fringe projection technique [37] is applied and a high-speed camera is used, the increased sampling speed will enable the system to measure faster objects. And the motion-induced phase error of the phase-shifting algorithm can be reduced by applying existing algorithms [21,38,39], in which case the performance of the proposed algorithm would be further improved.

Secondly, PUEs might still exist and are at risk of propagating spatially. However, the first experiment has shown that the proportion of PUEs is extremely low, and PUE correction can be performed to prevent PUEs from propagating; some effective algorithms [15,25,29] can detect and correct PUEs automatically.

Thirdly, the phases of newly emerging isolated surfaces cannot be unwrapped properly by the proposed algorithm alone due to the lack of reference phases. By combining the proposed algorithm with DFPUA or other TPUAs such as MFPUA and the gray-code phase unwrapping algorithm, the absolute phases of the newly emerging isolated surfaces are obtained and regarded as the newest reference phases. In this case, the frame rate of the 3D result can still be much higher than that of the conventional DFPUA. Furthermore, other potential strategies for dealing with this problem are being investigated by our team.

Fourthly, the method of obtaining the initial unwrapped phase map depends on the practical situation. If the measured object can start to move from a static state, it is recommended to use MFPUA or another TPUA to retrieve a high-quality unwrapped phase map while the object is static. However, if this condition is difficult to meet, methods with less sampling time, such as DFPUA, are preferred. When DFPUA is applied, especially with the projection strategy of Experiment 3, it is vital to ensure that the phase maps unwrapped by DFPUA are accurate. Therefore, it is necessary to detect and correct the potential PUEs in the unwrapped phase with the above-mentioned PUE correction algorithms.

Lastly, the proposed algorithm processes the data offline at the current stage. Compared with the conventional DFPUA, although the data acquisition time of each 3D frame is shorter, it takes extra time to track the phase edges and perform SPUA. Since the algorithm is implemented in Python and has not been optimized, it takes a few seconds (around $3\; s$ for a megapixel map) to perform unwrapping. If implemented in a more efficient programming language and fully optimized, the proposed algorithm is very likely to be usable for real-time unwrapping.

6. Conclusion

This paper presents a novel phase unwrapping algorithm based on phase edge tracking for dynamic 3D measurement. It unwraps the current wrapped phase map with the aid of the previously unwrapped phase map, so it requires no extra patterns for phase unwrapping. Therefore, it can double the frame rate of the 3D result at no additional cost. Although this algorithm consumes no extra patterns, it avoids the PUEs caused by noise amplification and motion-induced misalignment of phase edges. The presented experimental results reveal that it can retrieve accurate absolute phase maps for objects with complex shapes and non-uniform motions stably and efficiently. Despite its limitations, the proposed algorithm is very promising and will be further researched and developed.

Funding

National Natural Science Foundation of China (61975161, 61905190); Natural Science Foundation of Jiangsu Province (BK20190219); China Postdoctoral Science Foundation (2020M683460).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: Whither we are?” Opt. Lasers Eng. 48(2), 133–140 (2010).

2. X. Su and Q. Zhang, “Dynamic 3-D shape measurement method: A review,” Opt. Lasers Eng. 48(2), 191–204 (2010).

3. S. Zhang, “High-speed 3D shape measurement with structured light methods: A review,” Opt. Lasers Eng. 106, 119–131 (2018).

4. J. Xu and S. Zhang, “Status, challenges, and future perspectives of fringe projection profilometry,” Opt. Lasers Eng. 135, 106193 (2020).

5. J. M. Huntley and H. Saldner, “Temporal phase-unwrapping algorithm for automated interferogram analysis,” Appl. Opt. 32(17), 3047 (1993).

6. H. Zhao, W. Chen, and Y. Tan, “Phase-unwrapping algorithm for the measurement of three-dimensional object shapes,” Appl. Opt. 33(20), 4497 (1994).

7. D. C. Ghiglia and M. D. Pritt, Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software, 1st edition (Wiley-Interscience, 1998).

8. X. Su and W. Chen, “Reliability-guided phase unwrapping algorithm: a review,” Opt. Lasers Eng. 42(3), 245–261 (2004).

9. S. Zhang, X. Li, and S.-T. Yau, “Multilevel quality-guided phase unwrapping algorithm for real-time three-dimensional shape reconstruction,” Appl. Opt. 46(1), 50 (2007).

10. E. Zappa and G. Busca, “Comparison of eight unwrapping algorithms applied to Fourier-transform profilometry,” Opt. Lasers Eng. 46(2), 106–116 (2008).

11. Y. An, J.-S. Hyun, and S. Zhang, “Pixel-wise absolute phase unwrapping using geometric constraints of structured light system,” Opt. Express 24(16), 18445–18459 (2016).

12. C. Zuo, L. Huang, M. Zhang, Q. Chen, and A. Asundi, “Temporal phase unwrapping algorithms for fringe projection profilometry: A comparative review,” Opt. Lasers Eng. 85, 84–103 (2016).

13. Y. Feng, Y. Han, and Q. Zhang, “Two 3D spatial phase unwrapping algorithms for dynamic 3D shape measurement,” J. Mod. Opt. 67(19), 1479–1491 (2020).

14. X. He and Q. Kemao, “A comparative study on temporal phase unwrapping methods in high-speed fringe projection profilometry,” Opt. Lasers Eng. 142, 106613 (2021).

15. C. Zhang, H. Zhao, X. Gao, Z. Zhang, and J. Xi, “Phase unwrapping error correction based on phase edge detection and classification,” Opt. Lasers Eng. 137, 106389 (2021).

16. P. Cong, Z. Xiong, Y. Zhang, S. Zhao, and F. Wu, “Accurate Dynamic 3D Sensing With Fourier-Assisted Phase Shifting,” IEEE J. Sel. Top. Signal Process. 9(3), 396–408 (2015).

17. Z. Wu, W. Guo, Y. Li, Y. Liu, and Q. Zhang, “High-speed and high-efficiency three-dimensional shape measurement based on Gray-coded light,” Photonics Res. 8(6), 819–829 (2020).

18. K. Itoh, “Analysis of the phase unwrapping algorithm,” Appl. Opt. 21(14), 2470 (1982).

19. R. M. Goldstein, H. A. Zebker, and C. L. Werner, “Satellite radar interferometry - Two-dimensional phase unwrapping,” Radio Sci. 23(4), 713–720 (1988).

20. Y. Wang, S. Zhang, and J. H. Oliver, “3D shape measurement technique for multiple rapidly moving objects,” Opt. Express 19(9), 8539–8545 (2011).

21. S. Feng, C. Zuo, T. Tao, Y. Hu, M. Zhang, Q. Chen, and G. Gu, “Robust dynamic 3-D measurements with motion-compensated phase-shifting profilometry,” Opt. Lasers Eng. 103, 127–138 (2018).

22. Z. Wu, W. Guo, L. Lu, and Q. Zhang, “Generalized phase unwrapping method that avoids jump errors for fringe projection profilometry,” Opt. Express 29(17), 27181–27192 (2021).

23. B. Zhang, J. Ziegert, F. Farahi, and A. Davies, “In situ surface topography of laser powder bed fusion using fringe projection,” Addit. Manuf. 12, 100–107 (2016).

24. H. An, Y. Cao, H. Wu, N. Yang, C. Xu, and H. Li, “Spatial-temporal phase unwrapping algorithm for fringe projection profilometry,” Opt. Express 29(13), 20657–20672 (2021).

25. J. Deng, J. Li, H. Feng, S. Ding, Y. Xiao, W. Han, and Z. Zeng, “Edge-preserved fringe-order correction strategy for code-based fringe projection profilometry,” Signal Process. 182, 107959 (2021).

26. B. Li, Z. Liu, and S. Zhang, “Motion-induced error reduction by combining Fourier transform profilometry with phase-shifting profilometry,” Opt. Express 24(20), 23289–23303 (2016).

27. Y. An and S. Zhang, “Three-dimensional absolute shape measurement by combining binary statistical pattern matching with phase-shifting methods,” Appl. Opt. 56(19), 5418–5426 (2017).

28. B. Li, T. Bell, and S. Zhang, “Computer-aided-design-model-assisted absolute three-dimensional shape measurement,” Appl. Opt. 56(24), 6770–6776 (2017).

29. C. Zuo, T. Tao, S. Feng, L. Huang, A. Asundi, and Q. Chen, “Micro Fourier Transform Profilometry (µFTP): 3D shape measurement at 10,000 frames per second,” Opt. Lasers Eng. 102, 70–91 (2018).

30. S. Zhang, “Absolute phase retrieval methods for digital fringe projection profilometry: A review,” Opt. Lasers Eng. 107, 28–37 (2018).

31. J. Qian, T. Tao, S. Feng, Q. Chen, and C. Zuo, “Motion-artifact-free dynamic 3D shape measurement with hybrid Fourier-transform phase-shifting profilometry,” Opt. Express 27(3), 2713–2731 (2019).

32. W. Guo, Z. Wu, Y. Li, Y. Liu, and Q. Zhang, “Real-time 3D shape measurement with dual-frequency composite grating and motion-induced error reduction,” Opt. Express 28(18), 26882–26897 (2020).

33. R. Brunelli, Template Matching Techniques in Computer Vision: Theory and Practice (Wiley, 2009).

34. B. Li, N. Karpinsky, and S. Zhang, “Novel calibration method for structured-light system with an out-of-focus projector,” Appl. Opt. 53(16), 3415–3426 (2014).

35. S. Zhang and S.-T. Yau, “High-resolution, real-time 3D absolute coordinate measurement based on a phase-shifting method,” Opt. Express 14(7), 2644–2649 (2006).

36. Z. Wu, W. Guo, B. Pan, Q. Kemao, and Q. Zhang, “A DIC-assisted fringe projection profilometry for high-speed 3D shape, displacement and deformation measurement of textured surfaces,” Opt. Lasers Eng. 142, 106614 (2021).

37. S. Lei and S. Zhang, “Flexible 3-D shape measurement using projector defocusing,” Opt. Lett. 34(20), 3080–3082 (2009).

38. Z. Liu, P. C. Zibley, and S. Zhang, “Motion-induced error compensation for phase shifting profilometry,” Opt. Express 26(10), 12632–12637 (2018).

39. L. Lu, V. Suresh, Y. Zheng, Y. Wang, J. Xi, and B. Li, “Motion induced error reduction methods for phase shifting profilometry: A review,” Opt. Lasers Eng. 141, 106573 (2021).

Supplementary Material (2)

Visualization 1: Measurement results of the second experiment: a hand moves in front of a flat board.
Visualization 2: Measurement results of the third experiment: a hand turns from the palm to the back of the hand.



