
Resolving time-varying attitude jitter of an optical remote sensing satellite based on a time-frequency analysis


Abstract

Attitude jitter is a crucial factor limiting the imaging quality and geo-positioning accuracy of high-resolution optical satellites, and it has attracted significant research interest in recent years. However, few researchers have attempted to retrieve the dynamic characteristics and time-varying trends of satellite attitude jitter. This paper presents a novel processing framework for detecting, estimating, and investigating time-varying attitude jitter in long strips based on a time-frequency analysis, with the input from either an attitude sensor or an optical imaging sensor. Attitude angle signals containing attitude jitter information are detected from attitude data by generating the Euler angles relative to the orbit coordinate system, or from image data by high-accuracy dense matching between parallax observations, correction of integration time variation, and frequency domain-based deconvolution. Variational mode decomposition is adopted to extract the separate band-limited periodic components, and Hilbert spectral analysis is integrated to estimate the instantaneous attributes for each time sample and the varying trends for the entire duration. Experiments with three sets of ZiYuan-3 long-strip datasets were carried out to test the proposed processing framework. The experimental results indicate that the processing framework can reveal the dynamic jitter characteristics, and the mutual validation of different data sources demonstrates the effectiveness of the proposed method.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

High-resolution optical satellite imagery is an important data source for many remote sensing applications [1–3]. Accurate attitude estimation is crucial to the geometric performance of high-resolution satellite images, which largely determines their application potential [4]. According to the principle of photogrammetric measurement, a slight bias in attitude angles can result in considerable geo-positioning errors due to the high orbital altitude. In addition to attitude measurement errors, one of the significant error sources of sensor orientation is platform instability. The micro-vibration of the satellite platform induces attitude jitter, which is the periodic instability of the satellite attitude [5].

Attitude jitter is a common phenomenon, and many high-resolution optical remote sensing satellites suffer from this issue, such as ALOS, Pléiades-HR, ZiYuan-3 (ZY-3), and the Mars Reconnaissance Orbiter [6–9]. It can easily originate from dynamic mechanical disturbances, attitude control operations, thermal changes, and other impacting factors [10]. The new requirements to design lightweight and agile satellites make the presence of attitude jitter more probable [11]. If unrecorded or unmodelled by attitude measurement sensors, attitude jitter will undermine the photogrammetric relation and lead to a loss of geolocation accuracy. The jitter can also be transmitted to the focal plane and thus result in noticeable distortions in images. Therefore, attitude jitter significantly limits the performance of satellite images in remote sensing applications requiring high accuracies, such as rational polynomial coefficients generation, digital elevation model production, and change detection [12–14]. With the increasing spatial resolution of satellite images, the influence of attitude jitter will become more serious. As a result, estimating and analyzing attitude jitter is an indispensable step in ensuring image quality.

According to previous studies, the estimation and investigation of satellite attitude jitter follow two main strategies. One solution is to analyze the attitude data obtained from on-board attitude sensors directly. In general, satellite attitude can be determined by combining measurements from star sensors and gyros, which are sampled at a relatively low frequency [15]. According to the Nyquist-Shannon theorem, attitude jitter with a frequency below half of the sampling rate can be recognized. For the purpose of acquiring attitude information with higher temporal and angular resolutions, a few recent satellites have been equipped with additional high-performance attitude sensors, such as the angular displacement sensors on ALOS [16] and the Yaogan-26 satellite [17], and the micro-vibration accelerometer on the GaoFen-9 satellite [18]. Based on these external attitude sensors, high-frequency attitude jitter can be measured using Fourier spectral analysis.

Alternatively, attitude jitter can be retrieved from the distorted images. In this category, attitude jitter detection based on the parallax observation configuration of a pushbroom camera is the most commonly used method, which adopts dense matching to obtain the relative distortions caused by attitude jitter between short-time asynchronous images. Such methods have received wide attention for estimating and analyzing the attitude jitter of many satellites and sensors due to their high flexibility and convenience [19–24]. For instance, Teshima and Iwasaki [19] detected attitude jitter of the Terra spacecraft from the short-wave infrared sensors of the Advanced Spaceborne Thermal Emission and Reflection Radiometer and discovered jitter with a frequency of around 1.5 Hz. Tong et al. [20] found a noticeable jitter of the ZY-3 satellite with a frequency of around 0.65 Hz using multispectral imagery. Liu et al. [23] determined the jitter of the Chinese Heavenly Palace-1 satellite at a frequency of 0.12 Hz based on parallax observations and attitude data. Zhu et al. [24] estimated attitude jitter with a frequency of 1.1-1.2 Hz in the cross-track direction for the Gaofen-1 02/03/04 satellites.

Attitude jitter of a remote sensing satellite is a complicated dynamic process, and the frequency and amplitude attributes of jitter vary over time and are not constant even within a short period. However, the existing studies mainly focus on detection and validation of attitude jitter using datasets with a relatively short duration, whereas comprehensive jitter interpretation and stability analysis of satellite platforms call for a longer time span. In addition, most of the related works treat attitude jitter information as stationary signals and calculate fixed attributes of attitude jitter for the entire signal using Fourier spectral analysis or trigonometric function fitting, which is not a practical solution for long-term investigations. In this case, the dynamic characteristics and varying trends of attitude jitter over time cannot be revealed. To address these limitations, a processing framework of attitude jitter using observations of long strips based on time-frequency analysis is developed in this paper. The jitter processing framework in this study integrates the tasks of jitter detection, which extracts attitude jitter information; jitter estimation, which models the attributes of attitude jitter; and jitter analysis, which investigates the varying trends of attitude jitter. The inputs can be either on-board attitude measurement data or distorted images with parallax observation, such as different multispectral bands and staggered arrays. In our framework, attitude jitter information detected from attitude data or optical imagery is regarded as non-stationary signals, and the dominant periodic components and instantaneous attributes for each time sample are extracted based on signal decomposition and the Hilbert transform (HT). Experiments conducted using ZY-3 long-strip datasets demonstrate the performance of the proposed attitude jitter processing framework. The derived components of attitude jitter can be effectively corrected using existing jitter compensation methods when required, and the results of the time-frequency analysis can benefit subsequent stability analysis and source determination. The main contributions of this paper are summarized as follows. 1) A novel processing framework of attitude jitter is proposed for both long-strip attitude data and imagery instead of short-duration datasets. 2) Different from the conventional direct frequency estimation and Fourier-based spectral analysis, which cannot resolve the dynamic characteristics of attitude jitter over time, variational mode decomposition (VMD) is adopted to extract the separate periodic components with narrow-band property, and Hilbert spectral analysis is integrated to retrieve the instantaneous characteristics for each time sample and the varying trends for the whole observation period. 3) Taking the ZY-3 satellite as a case study, long-strip estimation and dynamic analysis of attitude jitter are performed with the proposed processing framework.

The remainder of this paper is organized as follows. The details of the proposed processing framework of attitude jitter, including jitter detection and time-frequency analysis, are introduced in Section 2. Section 3 describes the experimental datasets and presents the experimental results and discussion. Finally, the concluding remarks are given in Section 4.

2. Proposed processing framework of attitude jitter

The objective of this study is to develop a practical processing framework for detection, estimation, and investigation of satellite attitude jitter in long strips using time-frequency analysis. Figure 1 illustrates the overall workflow of the proposed long-strip processing framework of attitude jitter, which can be divided into three major parts: jitter detection from attitude data, jitter detection from optical imagery, and time-frequency analysis. The inputs are long-strip data captured from attitude sensors or optical imaging sensors that record the apparent information of satellite attitude jitter. For attitude sensors, the fused attitude data is represented as Euler angles relative to the orbit coordinate system (OCS). For optical imaging sensors, attitude jitter information is retrieved from the relative disparity between parallax images obtained by high-accuracy dense matching. The outputs are the periodic components of attitude jitter and the corresponding time histories of instantaneous attributes, which are further extracted through VMD and Hilbert spectral analysis. More details of the long-strip processing framework are described in the following sections.

Fig. 1. The overall workflow of the proposed processing framework of attitude jitter.

2.1. Jitter detection from attitude data

For most high-resolution optical remote sensing satellites, attitude estimation is achieved by fusing the measurement data from star sensors and precise inertial attitude sensors using a Kalman filter. Gyroscopes are commonly adopted as the inertial sensors, and high-performance attitude sensors such as angular displacement sensors are additionally mounted on a few high-resolution satellites to provide high-frequency attitude information. After the information fusion, the satellite attitude is typically represented in the form of a quaternion and defines the rotation matrix $R_B^{ECI}$ from the body coordinate system to the earth-centered inertial coordinate system. However, the Euler angles (i.e., roll, pitch, yaw) converted from $R_B^{ECI}$ cannot directly reflect attitude jitter as they include other rotation factors [17]. Therefore, the attitude data should be transformed to the OCS to reduce the magnitude of the Euler angles such that the attitude jitter information becomes prominent. According to the position vector $\overrightarrow P (t)$ and velocity vector $\overrightarrow V (t)$ of the satellite at time t in the inertial coordinate system, the rotation matrix $R_B^O$ from the body coordinate system to the OCS is given by:

$$\begin{aligned} R_B^O &= {({R_O^{ECI}})^T} \cdot R_B^{ECI}\\ R_O^{ECI}(t) &= \left[ {\begin{array}{ccc} {\overrightarrow{X}_O(t)}&{\overrightarrow{Y}_O(t)}&{\overrightarrow{Z}_O(t)} \end{array}} \right]\\ \overrightarrow{Z}_O(t) &= \frac{\overrightarrow{P}(t)}{\|\overrightarrow{P}(t)\|}, \quad \overrightarrow{X}_O(t) = \frac{\overrightarrow{V}(t) \times \overrightarrow{Z}_O(t)}{\|\overrightarrow{V}(t) \times \overrightarrow{Z}_O(t)\|}, \quad \overrightarrow{Y}_O(t) = \overrightarrow{Z}_O(t) \times \overrightarrow{X}_O(t) \end{aligned}$$

The Euler angles around the three axes (i.e., roll(t), pitch(t), yaw(t)) can then be converted from $R_B^O$ for the following jitter analysis. In addition, due to the drift angle compensation accounting for Earth's rotation, the yaw angle relative to the orbit coordinate system contains a global trend besides the periodic attitude jitter. Therefore, we take the difference between the yaw angle and its second-order polynomial fit to preliminarily eliminate this strong global trend. Figure 2 illustrates an instance of the original yaw angle and the angle difference. It can be seen that the periodic attitude jitter dominates the yaw angle after trend removal.

Fig. 2. An instance of the correction in the yaw angle. (a) The original yaw angle relative to the OCS. (b) The difference between the yaw angle and its second-order polynomial fitting.
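
As a concrete illustration of this step, the following Python sketch converts fused attitude quaternions to Euler angles relative to the OCS following Eq. (1) and removes the yaw trend by second-order polynomial fitting. The quaternion convention (scalar-last), the "xyz" rotation sequence, and all array names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def body_to_ocs_angles(q_body_to_eci, P, V):
    """q_body_to_eci: (N, 4) quaternions (x, y, z, w); P, V: (N, 3) ECI position/velocity."""
    Z = P / np.linalg.norm(P, axis=1, keepdims=True)       # radial axis Z_O
    X = np.cross(V, Z)
    X /= np.linalg.norm(X, axis=1, keepdims=True)          # cross-track axis X_O
    Y = np.cross(Z, X)                                     # Y_O completes the triad
    R_O_ECI = np.stack([X, Y, Z], axis=2)                  # columns X_O, Y_O, Z_O
    R_B_ECI = Rotation.from_quat(q_body_to_eci).as_matrix()
    R_B_O = np.transpose(R_O_ECI, (0, 2, 1)) @ R_B_ECI     # Eq. (1)
    # roll, pitch, yaw about the OCS axes; the sequence is an assumption
    return Rotation.from_matrix(R_B_O).as_euler("xyz")

def detrend_yaw(t, yaw):
    """Subtract a second-order polynomial fit to suppress the drift-angle trend."""
    return yaw - np.polyval(np.polyfit(t, yaw, 2), t)
```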

Finally, each attitude angle is fitted to a smoothing spline model in order to filter the attitude noise while maximally preserving the attitude jitter information [25]. The cubic smoothing spline maintains a balance between smoothness of the curve and closeness to the given data; its coefficients are solved by minimizing the following objective [26]:

$$p\sum\limits_i {w_i}{({y_i} - s({x_i}))^2} + (1 - p)\int {(s^{\prime\prime}(x))^2}\,dx$$
where s denotes the smoothing spline function represented by piecewise polynomials, ${s^{\prime\prime}}$ denotes its second derivative, p is the smoothing parameter controlling the trade-off between data fidelity and roughness, ${w_i}$ is the specified weight of each data point, and $({x_i},{y_i})$ are the given inputs, e.g., time and attitude angle samples. A solution of Eq. (2) was proposed in [27].
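
A minimal sketch of this fit, assuming SciPy >= 1.10: make_smoothing_spline minimizes the sum of squared residuals plus lam times the roughness integral, which matches Eq. (2) under unit weights up to the reparameterization lam = (1 - p)/p.

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

def smooth_attitude_angle(t, angle, p=0.999):
    """Cubic smoothing spline of Eq. (2); p near 1 favors data fidelity."""
    lam = (1.0 - p) / p          # lam = (1 - p) / p maps Eq. (2) to SciPy's form
    spline = make_smoothing_spline(t, angle, lam=lam)
    return spline(t)             # evaluate on the original (or any uniform) grid
```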

2.2. Jitter detection from optical imagery

Optical sensors provide an alternative solution to detect and estimate attitude jitter since the distorted images also record information about attitude jitter [28]. In this study, the conventional jitter detection method based on parallax observation [20] is refined and applied to long-strip images. For two parallel linear arrays within a parallax observation configuration of the pushbroom camera, there exists a time difference for imaging the same ground object. In the presence of attitude jitter, the parallax images are warped by the same dynamic process but with different amplitudes, which induces the relative disparity $g(t)$ expressed as [19]:

$$g(t) = f(t + \Delta t) - f(t)$$
where $f(t)$ denotes the image distortion caused by attitude jitter, and $\Delta t$ is the time difference between parallax observations.

Although the image distortion $f(t)$ cannot be directly measured, the relative disparity $g(t)$ can be obtained via dense matching. Precise matching of two parallax images is realized by image correlation on a dense grid [29]. As the matching accuracy is a crucial factor, the phase correlation method using singular value decomposition and unified random sample consensus [30], which operates directly in the frequency domain based on the Fourier shift property, is adopted to calculate subpixel shifts at each grid position. Two disparity maps (i.e., cross-track and along-track) are generated, and mismatches, mainly in decorrelation areas, are filtered out by thresholding the correlation measures. As each pixel on the same image line is captured at the same exposure time, the disparity maps can be averaged along each line with three-sigma limits to obtain the average disparity curve.
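
An illustrative sketch of collapsing a dense disparity map to a per-line average disparity curve with three-sigma clipping follows; the array layout (rows = scan lines) and NaN masking of rejected matches are assumptions about one reasonable implementation.

```python
import numpy as np

def line_average_disparity(disp, n_sigma=3.0):
    """disp: (lines, cols) subpixel shifts with NaN for filtered mismatches."""
    mean = np.nanmean(disp, axis=1, keepdims=True)
    std = np.nanstd(disp, axis=1, keepdims=True)
    # reject samples outside the three-sigma band, then re-average each line
    clipped = np.where(np.abs(disp - mean) <= n_sigma * std, disp, np.nan)
    return np.nanmean(clipped, axis=1)   # one averaged disparity per scan line
```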

A significant factor that causes the obtained along-track subpixel shifts to deviate from the assumption of Eq. (3) is integration time variation. For pushbroom cameras with time delay integration charge-coupled device arrays, the integration time of each scanning line is commonly adjusted to guarantee that the scanning rate matches the object motion [31]. As shown in Fig. 3(a), the along-track average disparity curve obtained from dense matching contains apparent image shifts caused by integration time variation besides the periodic relative disparity caused by attitude jitter. In this study, the influence of integration time variation on long-strip jitter estimation is eliminated by automatically determining the affected intervals according to the integration time records of the scanning lines and correcting the induced image shifts. Figure 3 gives an instance using ZY-3 multispectral imagery. The blue curves in Figs. 3(a) and 3(b) show the along-track average disparity from dense matching and the integration time of each scanning line, respectively. The time intervals affected by integration time variation can be acquired from the nominal time difference between parallax images (see the blue curve in Fig. 3(c)), which is calculated from the integration time records considering the nominal distance between parallax images. The start and end of each affected interval are determined as the inflection points of the nominal time difference (see the red and green circles in Fig. 3(c)). The total image shift $\Delta L$ caused by each integration time variation can be approximated as [31]:

$$\Delta L = d{L_2} - d{L_1} = \frac{{{T_1} - {T_2}}}{{{T_1}}} \cdot d{L_2}$$
where ${T_1}$ is the integration time before the variation, ${T_2}$ is the integration time after it, and $d{L_1}$ and $d{L_2}$ denote the corresponding along-track pixel offsets between the parallax images. Within each affected time interval, the relative image shift varies linearly from 0 to $\Delta L$. Therefore, we can calculate the compensated image shift for each scanning time (see the red curve in Fig. 3(a)) and eliminate the influence of integration time variation by subtracting the calculated image shift.

Fig. 3. An instance of the influence of integration time variation. (a) Along-track average disparity (blue curve) and the calculated image shift caused by integration time variation (red curve). (b) The integration time of each scanning line. (c) Nominal time difference between parallax images. The red and green circles denote the starts and ends of affected intervals, respectively.
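
The correction can be sketched as below; the interval bookkeeping (sorted, non-overlapping intervals and shifts that accumulate between variations) and all parameter names are assumptions about one plausible implementation of Eq. (4), not the authors' code.

```python
import numpy as np

def compensate_integration_time(disparity, intervals, T_pairs, dL2):
    """disparity: per-line along-track disparity; intervals: sorted, non-overlapping
    (start, end) line indices; T_pairs: matching (T1, T2) integration times;
    dL2: nominal along-track pixel offset between the parallax images."""
    shift = np.zeros_like(disparity, dtype=float)
    offset = 0.0
    for (s, e), (T1, T2) in zip(intervals, T_pairs):
        dL = (T1 - T2) / T1 * dL2                           # total shift, Eq. (4)
        shift[s:e] = offset + np.linspace(0.0, dL, e - s)   # linear ramp 0 -> dL
        offset += dL                                        # shifts accumulate
        shift[e:] = offset                                  # constant until next variation
    return disparity - shift
```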

After correcting the influence of integration time variation in the along-track direction, the average disparity measurements in the two directions are each fitted to a smoothing spline model. In addition to further reducing matching artifacts, this smoothing spline interpolation creates uniformly spaced data series, which is necessary for the following spectral analysis. The relative disparity $g(t)$ in each direction is finally obtained by removing the non-periodic direct-current component.

According to the relationship in Eq. (3), the image distortion $f(t)$ caused by attitude jitter can be retrieved from $g(t)$. This deconvolution problem can be realized in the frequency domain based on the Fourier shift property [20], which is expressed as follows:

$$f(t) = {\mathcal{F}^{-1}}\left\{ \frac{\mathcal{F}[g(t)]}{\exp(2\pi ik\Delta t/N) - 1} \right\}$$
where i is the imaginary unit, N is the data length, $k = 0, 1, \ldots, N - 1$, and ${\cal F}$ and ${{\cal F}^{ - 1}}$ denote the Fourier transform (FT) and the inverse FT, respectively. However, due to the periodicity assumption of the discrete FT, $f(t + \Delta t)$ needs to be circularly shifted for practical consideration. Therefore, $g(t)$ in Eq. (5) should be given by [32]:
$$g(t) = \begin{cases} f(t + \Delta t) - f(t) = g_d(t), & 1 \le t \le N - \Delta t\\ f(t - (N - \Delta t)) - f(t) \approx -\sum\limits_{i = 0}^{k - 1} g_d((i + 1)\Delta t + t - N), & N - \Delta t + 1 \le t \le N \end{cases}$$
where $k = \lfloor{N/\Delta t} \rfloor - 1$, and ${g_d}$ denotes the relative disparities obtained from dense matching.
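
The deconvolution of Eqs. (5)–(6) can be sketched in a few lines of NumPy, assuming $\Delta t$ is an integer number of samples (after the uniform resampling of Section 2.2). The zero-frequency bin, where the denominator vanishes, is set to zero, since the constant component of $f(t)$ is unobservable from relative disparities; the function name is illustrative.

```python
import numpy as np

def deconvolve_distortion(g_d, dt):
    """g_d: relative disparity from dense matching (length N); dt: integer samples."""
    N = len(g_d)
    g = np.asarray(g_d, dtype=float).copy()
    k_max = N // dt - 1
    # Eq. (6): rebuild the last dt samples under the circular-shift assumption
    for t in range(N - dt, N):
        idx = (np.arange(k_max) + 1) * dt + t - N
        g[t] = -np.sum(g_d[idx])
    k = np.arange(N)
    denom = np.exp(2j * np.pi * k * dt / N) - 1.0      # Fourier shift factor
    G = np.fft.fft(g)
    F = np.zeros(N, dtype=complex)
    nonzero = np.abs(denom) > 1e-9                     # skip bins where denom = 0
    F[nonzero] = G[nonzero] / denom[nonzero]           # Eq. (5)
    return np.real(np.fft.ifft(F))                     # image distortion f(t)
```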

Image distortions in the cross-track and along-track directions are separately calculated using Eq. (5). In general, the cross-track distortion ${f_c}$ is mainly caused by variations of the roll and yaw angles, and the along-track distortion ${f_a}$ is mainly caused by variation of the pitch angle [33]. However, the yaw angle is not considered in this study since it results in much smaller distortions in the case of a small off-nadir viewing angle. The roll and pitch angle variations can be simply calculated from the corresponding image distortions through the sensor parameters as:

$$\begin{aligned} \Delta \textrm{roll}(t) &= -{f_c}(t)/F\\ \Delta \textrm{pitch}(t) &= -{f_a}(t) \cdot {\cos^2}\beta /F \end{aligned}$$
where F is the focal length in pixels and $\beta$ is the off-nadir viewing angle.
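
Converting the retrieved pixel distortions to angle variations is then a one-liner per axis; a conversion to arc seconds is added here for convenience, and the parameter names are illustrative.

```python
import numpy as np

RAD2ARCSEC = 180.0 / np.pi * 3600.0   # radians -> arc seconds

def distortion_to_angles(f_c, f_a, F_pix, beta):
    """F_pix: focal length in pixels; beta: off-nadir viewing angle in radians."""
    d_roll = -f_c / F_pix * RAD2ARCSEC                       # Eq. (7), roll
    d_pitch = -f_a * np.cos(beta) ** 2 / F_pix * RAD2ARCSEC  # Eq. (7), pitch
    return d_roll, d_pitch
```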

2.3. Time-frequency analysis

Through jitter detection from attitude data or from imagery, attitude angle signals that contain the dynamic attitude jitter information are obtained and can be taken as the inputs for the following processing. In order to extract the periodic components of attitude jitter with narrow-band property and estimate the corresponding time-varying attributes, we resort to time-frequency tools in this study. Specifically, the VMD algorithm is adopted to decompose the attitude angle signals into periodic components with separate central frequencies, and Hilbert spectral analysis is used to calculate the time histories of instantaneous attributes for each component.

2.3.1. Variational mode decomposition

VMD is a recently developed adaptive signal decomposition technique [34]. It non-recursively decomposes a signal into a discrete number K of band-limited sub-signals (modes) ${u_k}$, where each mode is considered to be mostly compact around a central frequency ${\omega _{ck}}$ and its bandwidth is estimated through the H1 Gaussian smoothness of its shifted analytic signal. Compared with the widely used empirical mode decomposition [35] and its extensions, the advantages of VMD are its supporting mathematical foundation and its insensitivity to noise, sampling, and the problem of mode mixing. As a result, it has attracted extensive attention in a variety of applications [36,37].

The constrained variational problem for VMD can be written as:

$$\begin{aligned} &\mathop {\min }\limits_{\{{{u_k}} \},\{{{\omega_{ck}}} \}} \left\{ {\sum\limits_{k = 1}^K {\left\|{{\partial_t}\left[ {\left( {\delta (t) + \frac{j}{{\pi t}}} \right) \ast {u_k}(t)} \right]{e^{ - j{\omega_{ck}}t}}} \right\|_2^2} } \right\}\\ &\textrm{s}.\textrm{t}. \sum\limits_{k = 1}^K {{u_k}(t) = x(t)} \end{aligned}$$
where $\{{u_k}\}$ and $\{{\omega _{ck}}\}$ represent the ensemble of modes and their corresponding center frequencies, respectively, $\delta(t)$ is the Dirac function, $j = \sqrt{-1}$, and ${\ast}$ denotes the convolution operator. x(t) is the attitude angle signal with attitude jitter, which is the Euler angle roll(t), pitch(t), or yaw(t) converted from the rotation matrix $R_B^O$ in Eq. (1) for jitter detection from attitude data, or the roll or pitch angle variation $\Delta \textrm{roll}(t)$ or $\Delta \textrm{pitch}(t)$ in Eq. (7) for jitter detection from imagery. The unconstrained formulation can be derived by adding a quadratic penalty term and a Lagrange multiplier $\lambda$ as:
$$\begin{aligned} L({\{{{u_k}} \},\{{{\omega_{ck}}} \},\lambda } )&= \alpha \sum\limits_{k = 1}^K {\left\|{{\partial_t}\left[ {\left( {\delta (t) + \frac{j}{{\pi t}}} \right) \ast {u_k}(t)} \right]{e^{ - j{\omega_{ck}}t}}} \right\|_2^2} \\ &+ \left\|{x(t) - \sum\limits_{k = 1}^K {{u_k}(t)} } \right\|_2^2 + \left\langle {\lambda (t),x(t) - \sum\limits_{k = 1}^K {{u_k}(t)} } \right\rangle \end{aligned}$$
where $\alpha$ is the balancing parameter of the data-fidelity constraint. The alternating direction method of multipliers algorithm [38] is used to solve the above optimization problem, which generates the band-limited modes and their respective center frequencies. The modes are updated by Wiener filtering directly in the spectral domain, and the center frequencies are updated as the center of gravity of the corresponding mode's power spectrum. More details and the implementation of VMD can be found in [34].
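
As a sketch of this step, the decomposition can be run with the third-party vmdpy package, a Python port of the reference implementation released with [34]. Note that vmdpy only offers zero, uniform, or random initialization of the center frequencies, so the Fourier-peak initialization of the second mode used in the experiments of Section 3.2 is not reproduced here; the parameter values are illustrative.

```python
from vmdpy import VMD   # pip install vmdpy

def decompose(x, K=4, alpha=2000.0):
    """Decompose an attitude angle signal x(t) into K band-limited modes."""
    # DC=True pins the first mode's center frequency at 0, absorbing the
    # residual global trend as done in the experiments of this paper.
    u, u_hat, omega = VMD(x, alpha=alpha, tau=0.0, K=K,
                          DC=True, init=1, tol=1e-7)
    return u, omega[-1]   # modes (K, N) and final normalized center frequencies
```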

2.3.2. Hilbert spectral analysis

The HT can be used to construct an analytic signal from a purely real signal, i.e., a complex signal whose imaginary part is the HT of the real part [39]. For each periodic component of the attitude angle signal extracted by VMD, the instantaneous attributes can be estimated from its analytic signal:

$$z(t) = {u_k}(t) + j{\widetilde u_k}(t) = {A_k}(t){e^{j{\phi _k}(t)}}$$
where ${\widetilde u_k}(t)$ is related to ${u_k}(t)$ by the HT:
$${\widetilde u_k}(t) = \frac{1}{\pi }P\int_{ - \infty }^\infty {\frac{{{u_k}(\tau )}}{{t - \tau }}} d\tau$$
where P denotes the Cauchy principal value of the complex integral. The instantaneous amplitude ${A_k}(t)$, instantaneous phase ${\phi _k}(t)$, and instantaneous frequency ${\omega _k}(t)$ for each mode and each time sample can be accordingly determined as:
$$\begin{aligned} {A_k}(t) &= \sqrt {u_k^2(t) + \widetilde u_k^2(t)} \\ {\phi _k}(t) &= \arctan ({{{\widetilde u}_k}(t)/{u_k}(t)} )\\ {\omega _k}(t) &= \frac{1}{{2\pi }}\frac{{d{\phi _k}(t)}}{{dt}} \end{aligned}$$
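
A compact sketch of Eqs. (10)–(12) with scipy.signal.hilbert, which returns the analytic signal z(t); fs is the sampling rate, so the instantaneous frequency comes out in Hz. The function name is illustrative.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_attributes(u_k, fs):
    """u_k: one VMD mode; fs: sampling rate in Hz."""
    z = hilbert(u_k)                                # analytic signal, Eq. (10)
    A = np.abs(z)                                   # instantaneous amplitude A_k(t)
    phi = np.unwrap(np.angle(z))                    # instantaneous phase phi_k(t)
    omega = np.gradient(phi) * fs / (2.0 * np.pi)   # instantaneous frequency in Hz
    return A, phi, omega
```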

3. Experiment and discussion

3.1. Experimental data

In this paper, the ZY-3 satellite, China's first civilian stereo surveying and mapping satellite, is taken as a case study to validate the performance of the proposed time-varying processing framework based on time-frequency analysis. For attitude data, the attitude measurements post-processed by information fusion of star trackers and a gyro with a sampling rate of 4 Hz [40] were used. For image data, the raw multispectral images with a spatial resolution of 5.8 m, an off-nadir viewing angle of 6°, and a line integration time of approximately 0.8 ms were used.

Three sets of long-strip multispectral imagery and attitude data were tested in this experiment. These datasets were obtained from different areas and captured on different dates. Table 1 gives the essential information regarding the experimental datasets. The acquisition durations of these long-strip datasets are all more than 140 s, corresponding to more than 35,000 image distortion samples calculated from Eq. (5) in each direction and more than 570 attitude samples calculated from Eq. (1) for each angle.

Table 1. Basic information of the experimental datasets.

3.2. Experimental results

The attitude jitter information can be extracted from either long-strip attitude data or long-strip image data. The attitude data of the ZY-3 satellite is provided in the earth-centered inertial reference system and is transformed to the OCS with the aid of orbit information following Eq. (1). The three attitude angles of each dataset are shown in Fig. 4. On the other hand, the relative disparities were calculated from the first two spectral bands, which are parallel on the focal plane with a nominal distance of 152 pixels. In this experiment, the grid sizes of dense matching were set to 10 pixels in the cross-track direction and 5 pixels in the along-track direction to reduce the computational time for long-strip images. The two average disparity curves are shown in Fig. 5; the parts of Data A seriously contaminated by textureless areas such as clouds and water are excluded. Note that the times of these two sources are unified and relative to the same initial time.

Fig. 4. Attitude angles relative to the OCS: roll (upper), pitch (middle), yaw (lower). (a) Data A. (b) Data B. (c) Data C.

Fig. 5. Average disparities in the cross-track (upper) and along-track (lower) direction. (a) Data A. (b) Data B. (c) Data C.

For the attitude data, the transformed attitude angles were corrected in the yaw angle and then smoothed using Eq. (2), with the results given in Fig. 6. For the image data, the distortion resulting from integration time variation and the non-periodic global trend were removed, and the relative disparities were then smoothed and interpolated, as shown in the upper part of Fig. 7. Furthermore, the absolute image distortions caused by attitude jitter were retrieved from the relative disparities and subsequently converted to attitude angle variations following Eqs. (5)–(7) (see the lower part of Fig. 7). Equation (6) was adopted to address the periodicity assumption. A simple visual inspection of Figs. 6 and 7 shows that distinct periodic attitude jitter information exists in the signals obtained from both attitude data and image data for all three datasets. The visual difference between Fig. 6 and the last two rows of Fig. 7 arises because the attitude angles obtained from attitude data still contain slight global trends, which VMD can effectively handle by fixing the center frequency of one mode at 0.

Fig. 6. Attitude angles after correction and smoothing: roll (upper), pitch (middle), yaw (lower). (a) Data A. (b) Data B. (c) Data C.

Fig. 7. The smoothed cross-track and along-track disparities after removing the distortion of integration time variation and the global trend (first two rows), and the attitude angles (roll and pitch) converted from the retrieved image distortions (last two rows). (a) Data A. (b) Data B. (c) Data C.

With the attitude angle signals obtained from attitude data or image data in Figs. 6 and 7, the detailed attitude jitter information can be estimated and investigated. The conventional methods normally use Fourier spectral analysis. Based on the FT, the Fourier spectra with frequencies and amplitudes were obtained, as shown in Fig. 8 using the signals from attitude data and in Fig. 9 using the signals from image data. Although a global frequency representation of attitude jitter is correctly identified, the dynamic jitter characteristics over time cannot be described by the FT. Alternatively, advanced time-frequency analysis tools were employed to resolve the periodic components that reflect the time-varying characteristics of attitude jitter in these signals. VMD and the Hilbert transform were conducted separately on these attitude angle signals. In this study, we focus on the principal periodic component with maximum amplitude according to the results of Fourier spectral analysis and prior knowledge on the attitude jitter of the ZY-3 satellite. VMD processing was performed using the implementation from [34]. For the initialization of the center frequencies of the modes, the center frequency of the first mode was fixed at 0 to deal with the existence of a global trend, the center frequency of the second mode was initialized as the frequency with maximum amplitude in the Fourier spectral analysis, and the initial center frequencies of the other modes were uniformly distributed below 2 Hz, which is the Nyquist frequency of the attitude data.

Fig. 8. Fourier spectra of attitude angles from attitude data. (a) Data A. (b) Data B. (c) Data C.

Fig. 9. Fourier spectra of attitude angles from imagery. (a) Data A. (b) Data B. (c) Data C.

Figures 10–18 present the extracted periodic components of attitude jitter and the corresponding instantaneous frequencies and amplitudes from each attitude angle signal for Data A, Data B, and Data C. Figures 10, 13, and 16 show the results from attitude data (upper) and image data (lower) for the roll angle. Figures 11, 14, and 17 show the results from attitude data (upper) and image data (lower) for the pitch angle. Figures 12, 15, and 18 show the results from attitude data for the yaw angle. It can be seen from the figures for each dataset that the results of the principal periodic component and instantaneous attributes from the two different sources are basically similar for the roll and pitch angles. However, some slight inconsistencies appear in the details for the pitch angle, especially in the case of small jitter amplitude. In order to quantitatively assess the results of the time-frequency analysis, the instantaneous attributes from image data were resampled to the same sampling rate as the attitude data. Table 2 gives the root mean square (RMS) values of the differences between the two results. It can be found that the instantaneous frequencies and amplitudes estimated from the two different data sources are rather close. The RMS values for the roll angle are all below 0.007 Hz and 0.075 arc seconds, while those for the pitch angle are slightly higher, with RMS values below 0.015 Hz and 0.1 arc seconds. A reasonable explanation for the worse performance in the pitch angle is the topographic effect induced by the small-baseline stereo vision of the different multispectral bands. Terrain elevation changes lead to tiny topographic parallax in the along-track direction mixed with the relative disparities caused by attitude jitter, which affects the jitter detection from imagery and biases the following time-frequency analysis [41].

Fig. 10. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data and image data for roll angle of Data A.

Fig. 11. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data and image data for pitch angle of Data A.

Fig. 12. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data for yaw angle of Data A.

Fig. 13. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data and image data for roll angle of Data B.

Fig. 14. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data and image data for pitch angle of Data B.

Fig. 15. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data for yaw angle of Data B.

Fig. 16. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data and image data for roll angle of Data C.

Fig. 17. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data and image data for pitch angle of Data C.

Fig. 18. (a) Principal periodic component, (b) instantaneous frequency and (c) instantaneous amplitude estimated from attitude data for yaw angle of Data C.

Table 2. RMS values of the differences between instantaneous attributes from image data and attitude data.

The time-frequency characteristics and the long-strip variation laws of attitude jitter are successfully revealed by the proposed processing framework. For the tested ZY-3 satellite, apparent attitude jitter components can be detected in all attitude angles. The frequency of attitude jitter is mainly in the range of 0.6-0.7 Hz, except that the main frequency for the pitch and yaw angles of Data C is in the range of 0.2-0.3 Hz. The amplitude of attitude jitter is dominant in the roll angle, with a maximum value up to 3.5 arc seconds, especially for the first two datasets captured during the early period after the satellite launch. It is found that the frequencies and amplitudes of attitude jitter are not fixed values and change smoothly over time, even within a short duration. Interestingly, the frequencies of attitude jitter in the roll and yaw angles tend to change monotonically with imaging time in a long strip, while this relationship is not evident in the pitch angle. Moreover, it is noteworthy that a very low-frequency periodic trend seems to exist, which is a possible reason for fluctuations in the results.

3.3. Discussion

Two sources carrying attitude jitter information, i.e., attitude data and image data, are tested in this paper to resolve the periodic components and instantaneous attributes of attitude jitter, and their results can be mutually validated. Jitter detection from each source has its own pros and cons. Jitter detection from attitude data is more direct and is able to consider all three attitude angles. It is free of other influential factors such as the topographic effect, sensor quality, and matching artifacts. However, its performance is determined by the sampling frequency and measurement accuracy, and a high-quality attitude sensor is technically and economically infeasible for many satellites, especially those with small platforms. In contrast, jitter detection from optical imagery is more flexible, as it needs no high-performance attitude sensors and is convenient to implement with a high sampling rate. The results are acquired with better time resolution and a larger frequency range. Therefore, jitter processing from these two sources is well complementary, and each can play a leading role in different cases. The capability to deal with different data sources is one of the strengths of the proposed processing framework.

The experimental results using the ZY-3 datasets demonstrate the feasibility of our method. The proposed processing framework extends the conventional methods to long-strip and dynamic analysis. The combination of VMD and HT is a suitable choice for attitude jitter processing, where the conventional methods fall short. The FT decomposes a signal into harmonics with constant frequencies but provides a time-independent frequency estimation. The short-time FT and the wavelet transform are still bounded by the Heisenberg uncertainty principle, with a trade-off between time and frequency resolution [42]. Empirical mode decomposition is able to handle nonlinear and non-stationary signals adaptively, but is sensitive to noise and cannot guarantee a band-limited decomposition. Time-frequency analysis based on VMD and HT can achieve both high time and frequency resolution for nonlinear and non-stationary signals, and ensures high robustness and the narrow-band property. Accordingly, the meaningful periodic components and the corresponding instantaneous attributes of attitude jitter can be efficiently extracted and estimated using the inputs from attitude data or optical imagery. The extracted periodic components can be used to correct the negative effects of attitude jitter using the existing jitter compensation methods based on imagery and sensor models. Image compensation directly resamples the affected image through intensity interpolation according to the obtained image distortions [10,19]. Sensor model compensation involves nonlinear bias correction of the rational function model [12] or using corrected attitude data in the rigorous model [15,18]. In addition, the estimated instantaneous attributes reflect the spatiotemporal characteristics of attitude jitter, which will facilitate subsequent stability analysis, generalized jitter modeling, and determination of the sources inducing jitter.

The experimental results also indicate that the frequency estimation is somewhat unstable in the case of small jitter amplitude (e.g., less than 0.1 arc seconds). In this case, the potential of the proposed processing framework is limited by the measurement error of the attitude sensor or the error of image matching. This issue should be specifically considered in future work.

4. Conclusion

In this paper, a novel processing framework for detection, estimation, and investigation of time-varying attitude jitter based on time-frequency analysis has been developed. The proposed processing framework is able to extract the essential periodic components and estimate the instantaneous attributes of attitude jitter in three attitude angles from long-strip attitude data and in two attitude angles from long-strip imagery. We attribute the merits of the proposed framework to the effectiveness of VMD and Hilbert spectral analysis. Experiments were conducted with three sets of ZY-3 long-strip attitude measurements and multispectral images with acquisition durations of more than 140 s. The proposed processing framework successfully revealed the dynamic characteristics and varying trends of attitude jitter over the whole duration, which cannot be retrieved by the conventionally used Fourier spectral analysis and trigonometric function fitting. The results obtained from attitude data accord well with those from image data. The RMS values of the differences between the two data sources are below 0.015 Hz and 0.1 arc seconds for the instantaneous frequencies and amplitudes, respectively, which mutually validates the reliability and feasibility of the proposed framework. The experimental results show that the attitude jitter of the ZY-3 satellite is a complex phenomenon, with frequencies and amplitudes varying over time and date.

The proposed processing framework, with the input of either attitude data or image data, is generic and can be easily adapted to time-varying jitter processing of other high-resolution satellites. The present study concentrates on detection, estimation, and investigation of attitude jitter; jitter compensation is not its main contribution. In future work, more datasets will be tested, and specific jitter compensation according to the estimation results will be further explored.

Funding

National Natural Science Foundation of China (41631178); National Key Research and Development Program of China (2017YFB0502900, 2018YFB0505400); Fundamental Research Funds for the Central Universities.

Acknowledgments

The authors would like to thank the Chinese Land Satellite Remote Sensing Application Center for providing the experimental data.

Disclosures

The authors declare no conflicts of interest.

References

1. X. Huang, H. Chen, and J. Gong, "Angular difference feature extraction for urban scene classification using ZY-3 multi-angle high-resolution satellite imagery," ISPRS J. Photogramm. Remote Sens. 135, 127–141 (2018).

2. X. Wu, D. Hong, J. Tian, J. Chanussot, W. Li, and R. Tao, "ORSIm detector: A novel object detection framework in optical remote sensing imagery using spatial-frequency channel features," IEEE Trans. Geosci. Remote Sens. 57(7), 5146–5158 (2019).

3. Z. Ye, X. Tong, S. Zheng, C. Guo, S. Gao, S. Liu, X. Xu, Y. Jin, H. Xie, S. Liu, and P. Chen, "Illumination-robust subpixel Fourier-based image correlation methods based on phase congruency," IEEE Trans. Geosci. Remote Sens. 57(4), 1995–2008 (2019).

4. Y. Pi, B. Yang, X. Li, and M. Wang, "Study of full-link on-orbit geometric calibration using multi-attitude imaging with linear agile optical satellite," Opt. Express 27(2), 980–998 (2019).

5. A. Iwasaki, "Detection and estimation of satellite attitude jitter using remote sensing imagery," in Advances in Spacecraft Technologies, J. Hall, ed. (InTech, 2011), pp. 257–272.

6. J. Takaku and T. Tadono, "High resolution DSM generation from ALOS PRISM-processing status and influence of attitude fluctuation," in Proceedings of IEEE International Geoscience and Remote Sensing Symposium (IEEE, 2010), pp. 4228–4231.

7. V. Amberg, C. Dechoz, L. Bernard, D. Greslou, F. de Lussy, and L. Lebegue, "In-flight attitude perturbances estimation: application to Pleiades-HR satellites," Proc. SPIE 8866, 886612 (2013).

8. X. Tong, Z. Ye, L. Li, S. Liu, Y. Jin, P. Chen, H. Xie, and S. Zhang, "Detection and estimation of along-track attitude jitter from Ziyuan-3 three-line-array images based on back-projection residuals," IEEE Trans. Geosci. Remote Sens. 55(8), 4272–4284 (2017).

9. R. L. Kirk, E. Howington-Kraus, M. Rosiek, J. Anderson, B. Archinal, K. Becker, D. Cook, D. Galuszka, P. Geissler, and T. Hare, "Ultrahigh resolution topographic mapping of Mars with MRO HiRISE stereo images: Meter-scale slopes of candidate Phoenix landing sites," J. Geophys. Res. 113(E3), E00A24 (2008).

10. X. Tong, Z. Ye, Y. Xu, X. Tang, S. Liu, L. Li, H. Xie, F. Wang, T. Li, and Z. Hong, "Framework of jitter detection and compensation for high resolution satellites," Remote Sens. 6(5), 3944–3964 (2014).

11. R. Perrier, E. Arnaud, P. Sturm, and M. Ortner, "Estimation of an observation satellite's attitude using multimodal pushbroom cameras," IEEE Trans. Pattern Anal. Mach. Intell. 37(5), 987–1000 (2015).

12. X. Tong, L. Li, S. Liu, Y. Xu, Z. Ye, Y. Jin, F. Wang, and H. Xie, "Detection and estimation of ZY-3 three-line array image distortions caused by attitude oscillation," ISPRS J. Photogramm. Remote Sens. 101, 291–309 (2015).

13. M. R. Henriksen, M. R. Manheim, K. N. Burns, P. Seymour, E. J. Speyerer, A. Deran, A. K. Boyd, E. Howington-Kraus, M. R. Rosiek, B. A. Archinal, and M. S. Robinson, "Extracting accurate and precise topography from LROC narrow angle camera stereo observations," Icarus 283, 122–137 (2017).

14. F. Ayoub, S. Leprince, R. Binety, K. W. Lewis, O. Aharonson, and J.-P. Avouac, "Influence of camera distortions on satellite image registration and change detection applications," in Proceedings of IEEE International Geoscience and Remote Sensing Symposium (IEEE, 2008), pp. 1072–1075.

15. M. Wang, Y. Zhu, S. Jin, J. Pan, and Q. Zhu, "Correction of ZY-3 image distortion caused by satellite jitter via virtual steady reimaging using attitude data," ISPRS J. Photogramm. Remote Sens. 119, 108–123 (2016).

16. T. Iwata, T. Kawahara, N. Muranaka, and D. Laughlin, "High-bandwidth attitude determination using jitter measurements and optimal filtering," in AIAA Guidance, Navigation, and Control Conference (American Institute of Aeronautics and Astronautics, 2009), pp. 7349–7369.

17. M. Wang, C. Fan, J. Pan, S. Jin, and X. Chang, "Image jitter detection and compensation using a high-frequency angular displacement method for Yaogan-26 remote sensing satellite," ISPRS J. Photogramm. Remote Sens. 130, 32–43 (2017).

18. G. Zhang and Z. Guan, "High-frequency attitude jitter correction for the Gaofen-9 satellite," Photogramm. Rec. 33(162), 264–282 (2018).

19. Y. Teshima and A. Iwasaki, "Correction of attitude fluctuation of Terra spacecraft using ASTER/SWIR imagery with parallax observation," IEEE Trans. Geosci. Remote Sens. 46(1), 222–227 (2008).

20. X. Tong, Y. Xu, Z. Ye, S. Liu, X. Tang, L. Li, H. Xie, and J. Xie, "Attitude oscillation detection of the ZY-3 satellite by using multispectral parallax images," IEEE Trans. Geosci. Remote Sens. 53(6), 3522–3534 (2015).

21. T. Sun, H. Long, B.-C. Liu, and Y. Li, "Application of attitude jitter detection based on short-time asynchronous images and compensation methods for Chinese mapping satellite-1," Opt. Express 23(2), 1395–1410 (2015).

22. S. S. Sutton, A. K. Boyd, R. L. Kirk, D. Cook, J. W. Backer, A. Fennema, R. Heyd, A. S. McEwen, and S. D. Mirchandani, "Correcting spacecraft jitter in HiRISE images," in International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences (ISPRS, 2017), pp. 141–148.

23. H. Liu, H. Ma, Z. Jiang, and D. Yan, "Jitter detection based on parallax observations and attitude data for Chinese Heavenly Palace-1 satellite," Opt. Express 27(2), 1099–1123 (2019).

24. Y. Zhu, M. Wang, Y. Cheng, L. He, and L. Xue, "An improved jitter detection method based on parallax observation of multispectral sensors for Gaofen-1 02/03/04 satellites," Remote Sens. 11(1), 16 (2018).

25. H. Pan, Z. Zou, G. Zhang, X. Zhu, and X. Tang, "A penalized spline-based attitude model for high-resolution satellite imagery," IEEE Trans. Geosci. Remote Sens. 54(3), 1849–1859 (2016).

26. C. de Boor, A Practical Guide to Splines (Revised Edition) (Springer-Verlag, 2001).

27. C. H. Reinsch, "Smoothing by spline functions," Numer. Math. 10(3), 177–183 (1967).

28. Z. Ye, Y. Xu, F. Wang, S. Liu, X. Tang, L. Li, J. Xie, and X. Tong, "Estimation of the attitude perturbance using parallax imagery-Application to ZY-3 satellite," in ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences (ISPRS, 2015), pp. 279–283.

29. X. Tong, Z. Ye, Y. Xu, S. Gao, H. Xie, Q. Du, S. Liu, X. Xu, S. Liu, K. Luan, and U. Stilla, "Image registration with Fourier-based image correlation: A comprehensive review of developments and applications," IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 12(10), 4062–4081 (2019).

30. X. Tong, Z. Ye, Y. Xu, S. Liu, L. Li, H. Xie, and T. Li, "A novel subpixel phase correlation method using singular value decomposition and unified random sample consensus," IEEE Trans. Geosci. Remote Sens. 53(8), 4143–4156 (2015).

31. B. Cao, Z. Qiu, S. Zhu, W. Meng, D. Mo, and F. Cao, "A solution to RPCs of satellite imagery with variant integration time," Surv. Rev. 48(351), 392–399 (2016).

32. Z. Zhang, L. Dong, G. Xu, and J. Song, "Real-time attitude estimation by using parallax observation system," Neural Process. Lett. 48(3), 1415–1429 (2018).

33. S. Liu, X. Tong, F. Wang, W. Sun, C. Guo, Z. Ye, Y. Jin, H. Xie, and P. Chen, "Attitude jitter detection based on remotely sensed images and dense ground controls: A case study for Chinese ZY-3 satellite," IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 9(12), 5760–5766 (2016).

34. K. Dragomiretskiy and D. Zosso, "Variational mode decomposition," IEEE Trans. Signal Process. 62(3), 531–544 (2014).

35. N. E. Huang, Z. Shen, S. R. Long, M. C. Wu, H. H. Shih, Q. Zheng, N.-C. Yen, C. C. Tung, and H. H. Liu, "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proc. R. Soc. London, Ser. A 454(1971), 903–995 (1998).

36. Y. Wang and R. Markert, "Filter bank property of variational mode decomposition and its applications," Signal Process. 120, 509–521 (2016).

37. S. Cheng, S. Liu, J. Guo, K. Luo, L. Zhang, and X. Tang, "Data processing and interpretation of Antarctic ice-penetrating radar based on variational mode decomposition," Remote Sens. 11(10), 1253 (2019).

38. D. Hong, N. Yokoya, J. Chanussot, and X. X. Zhu, "CoSpace: Common subspace learning from hyperspectral-multispectral correspondences," IEEE Trans. Geosci. Remote Sens. 57(7), 4349–4359 (2019).

39. M. Feldman, "Hilbert transform in vibration analysis," Mech. Syst. Signal Process. 25(3), 735–802 (2011).

40. X. Tang, J. Xie, X. Wang, and W. Jiang, "High-precision attitude post-processing and initial verification for the ZY-3 satellite," Remote Sens. 7(1), 111–134 (2014).

41. Z. Ye, Y. Xu, X. Tong, S. Zheng, H. Zhang, H. Xie, and U. Stilla, "Estimation and analysis of along-track attitude jitter of ZiYuan-3 satellite based on relative residuals of tri-band multispectral imagery," ISPRS J. Photogramm. Remote Sens. 158, 188–200 (2019).

42. S. Yu and J. Ma, "Complex variational mode decomposition for slop-preserving denoising," IEEE Trans. Geosci. Remote Sens. 56(1), 586–597 (2018).
