Friday, September 13, 2024

Precise Motion Compensation Approach for High-Resolution Multirotor UAV SAR in the Presence of Multiple Errors



J. Han et al., "Precise Motion Compensation Approach for High-Resolution Multirotor UAV SAR in the Presence of Multiple Errors," in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 17, pp. 15148-15165, 2024, doi: 10.1109/JSTARS.2024.3449318.


Abstract: As an important supplement to traditional airborne synthetic aperture radar (SAR), multirotor unmanned aerial vehicle (UAV) SAR has the advantages of low cost, high flexibility, and strong survival ability. 

However, due to the complex motion and flight characteristics of the multirotor UAV platform, multirotor UAV SAR faces challenges, including spatially variant low-frequency (LF) errors and severe high-frequency (HF) errors. To address these problems, an improved motion compensation (MoCo) approach is proposed for multirotor UAV SAR imaging, implemented through two processing steps.

  1) The LF errors are eliminated by an improved two-step MoCo approach, which takes into account the spatial variations of both envelope and phase.
  2) The HF errors are estimated and corrected by an extended phase gradient autofocus scheme.

Different from conventional solutions, our approach can effectively remove the complex motion errors of multirotor UAV SAR step-by-step with high robustness even in high-resolution scenarios. 

Computer simulation and experimental results verify the effectiveness of our approach.


Keywords: Autonomous aerial vehicles; Imaging; Synthetic aperture radar; Trajectory; Geometric modeling; Radar polarimetry; History; High-frequency (HF) error; low-frequency (LF) error; motion compensation (MOCO); multirotor unmanned aerial vehicle (UAV); synthetic aperture radar (SAR)
 

URL: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10646452&isnumber=10330207

Summary

 This paper presents an improved motion compensation (MoCo) approach for high-resolution synthetic aperture radar (SAR) imaging using multirotor unmanned aerial vehicles (UAVs). The key points of the paper are:

1. Problem addressed: Multirotor UAV SAR faces challenges due to complex motion errors, including spatially variant low-frequency (LF) errors and severe high-frequency (HF) errors, which degrade image quality.

2. Proposed solution: An improved MoCo approach implemented in two main steps:
   a) LF error compensation using an improved two-step MoCo method with scaling correction
   b) HF error compensation using an extended phase gradient autofocus (PGA) scheme

3. Methodology:

  - Establishes a geometric model of multirotor UAV SAR
  - Analyzes the effects of LF and HF errors on imaging quality
  - Develops a scaling correction method to address spatially variant LF envelope errors
  - Proposes an extended PGA scheme for HF error estimation and correction

4. Key innovations:

  - Addresses both LF and HF errors in a step-by-step approach
  - Considers spatially variant components of LF errors
  - Adapts PGA for multi-component HF errors in UAV SAR

5. Validation: The approach is validated through:

  - Computer simulations
  - Real data experiments using a multirotor UAV SAR system

6. Results:

  - Demonstrates improved focusing of targets across the entire scene
  - Shows effective removal of ghosting artifacts caused by HF errors
  - Achieves better image quality parameters (IRW, PSLR, ISLR) compared to existing methods

7. Significance:

  - Enables high-quality SAR imaging from multirotor UAV platforms
  - Expands the potential applications of UAV SAR in various fields

The paper concludes that the proposed approach effectively compensates for complex motion errors in multirotor UAV SAR, leading to improved high-resolution imaging capabilities.

Figures

Here's a list of the figures in the paper along with their titles and descriptions:

1.       Fig. 1: Multirotor UAV SAR system   - Shows an illustration of a multirotor UAV with SAR equipment

2.       Fig. 2: Multirotor UAV SAR imaging geometry model with motion errors   - Illustrates the 3D geometric model used for analysis

3.       Fig. 3: Side view of multirotor UAV SAR imaging geometry   - Provides a 2D side view of the imaging geometry

4.       Fig. 4: LF errors of multirotor UAV platform   - Shows simulated low-frequency motion errors along y and z axes

(a)    y-axis

(b)    z-axis

5.       Fig. 5: Phase errors induced by spatially variant LF errors with different orders   - Illustrates the phase errors caused by different orders of LF errors

(a)    First-order spatially variant component

(b)    High-order spatially variant components

6.       Fig. 6: Envelope errors induced by spatially variant LF errors with different orders   - Shows the envelope errors caused by different orders of LF errors

(a)    First-order spatially variant component

(b)    High-order spatially variant components

7.       Fig. 7: HF errors of multirotor UAV platform   - Displays simulated high-frequency motion errors along y and z axes

(a)    y-axis

(b)    z-axis

8.       Fig. 8: Impulse responses of desired target after focusing   - Compares target focusing with and without HF errors

(a)    Without HF errors

(b)    With single-component HF errors

(c)     With multiple-component HF errors

9.       Fig. 9: Phase errors induced by HF errors with different components   - Shows phase errors caused by different components of HF errors

(a)    Bulk component

(b)    Spatially variant component

10.   Fig. 10: Flowchart of scaling correction    - Illustrates the process of scaling correction for LF errors

11.   Fig. 11: Flowchart of imaging approach    - Provides an overview of the entire proposed imaging approach

12.   Fig. 12: Additional envelope errors induced by scaling correction    - Shows the envelope errors introduced by the scaling correction process

13.   Fig. 13: Spatially variant components of additional envelope errors    - Illustrates the spatially variant parts of the additional envelope errors

14.   Figs. 14-24: Various simulation results    - These figures show results from computer simulations, including motion trajectories, simulated scenes, and processed image results

15.   Figs. 25-30: Real data experiment results    - These figures display the results from processing real SAR data collected by a multirotor UAV, showing improvements in image quality and ghost suppression

These figures collectively demonstrate the theoretical basis, methodology, and effectiveness of the proposed motion compensation approach for multirotor UAV SAR imaging.

Magnitude and Frequency of the Motion Errors

Upon reviewing the paper, I notice that the authors don't provide explicit quantitative definitions of what constitutes "low-frequency" versus "high-frequency" in terms of specific frequency ranges or acceleration magnitudes. This is a limitation of the paper's presentation.

However, I can provide some quantitative information from the figures and simulations presented in the paper:

Low-Frequency (LF) Errors:
- From Fig. 4, which shows simulated LF errors:
  - Y-axis: Peak-to-peak amplitude of about 0.4 meters
  - Z-axis: Peak-to-peak amplitude of about 0.6 meters
  - The period of these oscillations appears to be around 10-20 seconds

High-Frequency (HF) Errors:
- From Fig. 7, which shows simulated HF errors:
  - Y-axis: Peak-to-peak amplitude of about 0.006 meters (6 mm)
  - Z-axis: Peak-to-peak amplitude of about 0.004 meters (4 mm)
  - The period of these oscillations appears to be much shorter, roughly 0.1-0.2 seconds

These figures suggest that:
1. LF errors have larger amplitudes (tens of centimeters) and longer periods (several seconds)
2. HF errors have smaller amplitudes (few millimeters) and shorter periods (fraction of a second)
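
To make these scales concrete, below is a minimal, hypothetical sketch (not taken from the paper) that synthesizes LF and HF error traces with roughly the amplitudes and periods read off Figs. 4 and 7, and projects them onto the slant range using the -ΔY·sin(θ) + ΔZ·cos(θ) relation that appears in the paper's range model quoted later. The PRF, aperture time, oscillation frequencies, and incidence angle are all assumed values chosen only for illustration.

```python
import numpy as np

# Assumed parameters, chosen only to illustrate the rough scales read off
# Figs. 4 and 7; the paper does not give exact values.
prf = 500.0                         # pulse repetition frequency [Hz] (assumed)
T = 20.0                            # synthetic aperture time [s] (assumed)
eta = np.arange(0.0, T, 1.0 / prf)  # azimuth slow time

# LF errors: ~0.4 m / 0.6 m peak-to-peak, period on the order of 15 s
dy_lf = 0.20 * np.sin(2 * np.pi * eta / 15.0)
dz_lf = 0.30 * np.cos(2 * np.pi * eta / 15.0)

# HF errors: a few mm peak-to-peak, periods of 0.1-0.2 s (roughly 5-10 Hz)
dy_hf = 0.003 * np.sin(2 * np.pi * 7.0 * eta)
dz_hf = 0.002 * np.sin(2 * np.pi * 9.0 * eta)

# Project (ΔY, ΔZ) onto the slant range for incidence angle θ, matching the
# -ΔY·sin(θ) + ΔZ·cos(θ) terms of the slant-range model quoted below.
theta = np.deg2rad(45.0)
dR_lf = -dy_lf * np.sin(theta) + dz_lf * np.cos(theta)
dR_hf = -dy_hf * np.sin(theta) + dz_hf * np.cos(theta)

print(f"LF range error p-p: {np.ptp(dR_lf):.3f} m, "
      f"HF range error p-p: {1e3 * np.ptp(dR_hf):.2f} mm")
```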

The paper mentions that HF errors are related to propeller rotation, which typically occurs at frequencies of tens to hundreds of Hz for multirotor UAVs. However, specific rotational speeds are not provided.

Regarding acceleration magnitudes, the paper doesn't provide this information directly. These could be estimated by taking the second derivative of the position errors shown in the figures, but this would be an approximation based on simulated data.
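
If a rough acceleration estimate were wanted, a simple numerical route is to differentiate a position-error trace twice. The sketch below does this with numpy.gradient on a hypothetical 6 mm peak-to-peak, 7 Hz oscillation (the scale suggested by Fig. 7); all numbers are assumptions, not values from the paper.

```python
import numpy as np

def accel_from_position(pos_err, fs):
    """Approximate acceleration by differentiating a position-error trace
    twice. pos_err is in metres, fs is the sample rate in Hz. On real,
    noisy data a smoothing step before differentiation is advisable."""
    dt = 1.0 / fs
    vel = np.gradient(pos_err, dt)   # first derivative: velocity [m/s]
    return np.gradient(vel, dt)      # second derivative: acceleration [m/s^2]

# Hypothetical 6 mm peak-to-peak (3 mm amplitude), 7 Hz HF oscillation.
fs = 500.0
t = np.arange(0.0, 2.0, 1.0 / fs)
hf = 0.003 * np.sin(2 * np.pi * 7.0 * t)
acc = accel_from_position(hf, fs)
# Analytically the peak is A*(2*pi*f)^2 ≈ 5.8 m/s^2; the numerical value agrees.
print(f"peak acceleration ≈ {np.max(np.abs(acc)):.2f} m/s^2")
```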

It's worth noting that the lack of explicit quantitative definitions for LF and HF errors in terms of frequency ranges is a limitation of the paper. In practice, the distinction between LF and HF might depend on the specific UAV platform, SAR system parameters, and imaging requirements.

Motion Compensation Algorithm

I'll describe the motion compensation algorithm in mathematical terms, following the paper's presentation. The algorithm is divided into two main parts: Low-Frequency (LF) and High-Frequency (HF) error compensation.

1. Low-Frequency Error Compensation:

a) The true instantaneous slant range history R(η) is modeled as:

R(η) ≈ R_q + ((Vη + ΔX(η) - X_q)^2) / (2R_q) - ΔY(η)sin(θ_q) + ΔZ(η)cos(θ_q)

Where:
- η is the azimuth slow time
- R_q is the closest slant range to the target
- V is the average velocity
- ΔX, ΔY, ΔZ are motion errors
- θ_q is the incident angle

b) The LF errors are separated into three components:

ΔR_low(η) = ΔR_low_0(η) + ΔR_low_1(η) + ΔR_low_h(η)

Where:
- ΔR_low_0(η) is the bulk component
- ΔR_low_1(η) is the first-order spatially variant component
- ΔR_low_h(η) is the high-order spatially variant component

c) Scaling correction is applied using Chirp-Z Transform (CZT):

S_0(f'_τ, η) = S_0(f_τ · [1 + α(η)], η)

Where α(η) is the linear coefficient of LF envelope errors.

d) Two-step MoCo is applied (a code sketch of this filtering appears after step e below):

First step: H_3(f_τ, η) = exp[j(4π(f_0 + f_τ)/c)ΔR_low_0(η)]
Second step: H_4(τ, η) = exp{j(4π/λ)[ΔR_low_1(η) + ΔR_low_h(η)]}

e) Additional error correction:

H_add(f_τ, f_η) = exp{j(4πf_τR_p/c)(α(f_η)/[1 + α(f_η)])((8V + λ^2f_η^2)/(8V))}
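
To ground step d in something concrete, here is a minimal sketch of the two-step filtering using the H_3 and H_4 forms listed above. The data layout, axes, and function name are my assumptions rather than the authors' code, and the CZT-based scaling correction of step c and the additional-error term H_add of step e are not reproduced.

```python
import numpy as np

def lf_two_step_moco(data_rfreq, dR0, dR_sv, f_tau, f0, c=3.0e8):
    """Sketch of two-step LF MoCo filtering (the filter forms follow the
    summary above; the layout here is an assumption, not the authors' code).
    data_rfreq : range-compressed data in the (azimuth-time, range-frequency)
                 domain, shape (N_eta, N_f).
    dR0        : bulk LF range error ΔR_low_0(η), length N_eta [m].
    dR_sv      : spatially variant residual ΔR_low_1(η) + ΔR_low_h(η) for the
                 range gate of interest, length N_eta [m].
    f_tau      : range-frequency axis [Hz]; f0 : carrier frequency [Hz]."""
    lam = c / f0

    # First step (bulk): H_3 = exp[j 4π (f0 + f_τ) ΔR_low_0(η) / c], applied in
    # the range-frequency domain so both envelope and phase are corrected.
    H3 = np.exp(1j * 4.0 * np.pi * (f0 + f_tau[None, :]) * dR0[:, None] / c)
    data_rfreq = data_rfreq * H3

    # Back to range time before the target/range-dependent second step.
    data_rtime = np.fft.ifft(data_rfreq, axis=1)

    # Second step (residual): H_4 = exp[j 4π (ΔR_low_1 + ΔR_low_h) / λ],
    # a phase-only correction for the spatially variant part.
    H4 = np.exp(1j * 4.0 * np.pi * dR_sv[:, None] / lam)
    return data_rtime * H4
```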

2. High-Frequency Error Compensation:

a) HF errors are modeled as:

ΔR_high(η) = Σ(i=1 to M) A_i sin(2πf_iη)

Where A_i and f_i are the amplitude and frequency of the i-th HF error component.

b) The corresponding phase error factor (the multiplicative exponential on the azimuth signal):

Δφ_high(η) = exp{-j(4π/λ) · Σ(i=1 to M) A_i sin(2πf_iη)}

c) An extended phase gradient autofocus (PGA) scheme is used to estimate the HF phase errors φ_h(η); a reference sketch of standard PGA follows this list.

d) HF error compensation function:

H_f(τ, η) = exp{jφ_h(η)}
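
The paper's extended PGA handles multi-component HF errors and is not reproduced here. For reference only, the following is a minimal sketch of one iteration of standard (textbook) PGA phase-gradient estimation; the array layout, FFT conventions, and sign conventions are internal assumptions of the sketch, not the paper's formulation.

```python
import numpy as np

def pga_iteration(img, win=64):
    """One iteration of a basic (textbook) PGA estimator, given only as a
    reference point for the extended scheme described above.
    img : complex single-look image, shape (N_range, N_az); azimuth last axis,
          assumed to be the unshifted FFT of the azimuth phase history.
    Returns (estimated azimuth phase error phi [rad], corrected image)."""
    n_rg, n_az = img.shape

    # 1) Circularly shift the strongest scatterer of each range line to bin 0.
    shifted = np.empty_like(img)
    for r in range(n_rg):
        shifted[r] = np.roll(img[r], -int(np.argmax(np.abs(img[r]))))

    # 2) Circular window around bin 0 to isolate the dominant scatterer.
    w = np.zeros(n_az)
    w[: win // 2] = 1.0
    w[-(win // 2):] = 1.0
    shifted *= w

    # 3) Back to the azimuth phase-history domain, where the error is exp(jφ).
    g = np.fft.ifft(shifted, axis=1)

    # 4) Phase-gradient estimate, summed over range lines.
    dg = np.diff(g, axis=1)
    num = np.sum(np.imag(np.conj(g[:, :-1]) * dg), axis=0)
    den = np.sum(np.abs(g[:, :-1]) ** 2, axis=0) + 1e-12

    # 5) Integrate the gradient and remove the constant/linear trend.
    phi = np.concatenate(([0.0], np.cumsum(num / den)))
    n = np.arange(n_az)
    phi -= np.polyval(np.polyfit(n, phi, 1), n)

    # 6) Apply the correction in the phase-history domain and re-focus.
    hist = np.fft.ifft(img, axis=1) * np.exp(-1j * phi)
    return phi, np.fft.fft(hist, axis=1)
```

In practice PGA is iterated with a shrinking window until the estimated phase error converges; the paper's extension additionally accounts for the multi-component, spatially variant nature of the HF errors.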

3. Final Compensated Signal:

After applying all compensation steps, the echo signal becomes:

S_0(τ, η) = w_r{τ - (2R_q/c)}w_a(η)exp{-j(4π/λ)R_0(η)}exp{-j(4π/λ)ΔR_at(η)}

Where ΔR_at(η) represents remaining along-track errors, which can be further compensated using additional techniques like the modified Map Drift algorithm.
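
The paper points to a modified Map Drift algorithm for this residual along-track term but the summary above does not detail it. As a stand-in, here is a sketch of the classic two-look Map Drift estimate of a quadratic phase error for a single range line; the signal layout and the purely quadratic error assumption are simplifications of the sketch, not the paper's method.

```python
import numpy as np

def mapdrift_quadratic(phase_history, fs):
    """Classic two-look map-drift estimate of a residual quadratic phase error.
    phase_history : complex azimuth signal for one range bin (defocused), 1-D.
    fs            : azimuth sampling rate (PRF) [Hz].
    Returns the estimated quadratic coefficient a [rad/s^2] of phi(t) = a*t^2,
    with t measured relative to the aperture centre."""
    n = len(phase_history)
    m = n // 2
    look1, look2 = phase_history[:m], phase_history[m:2 * m]

    # Doppler-domain magnitude of each half-aperture look.
    mag1 = np.abs(np.fft.fft(look1))
    mag2 = np.abs(np.fft.fft(look2))

    # Relative Doppler shift between looks via FFT-based circular correlation.
    xc = np.fft.ifft(np.fft.fft(mag2) * np.conj(np.fft.fft(mag1)))
    d = int(np.argmax(np.abs(xc)))
    if d > m // 2:
        d -= m                      # unwrap the circular lag
    delta_f = d * fs / m            # Doppler shift between looks [Hz]

    # For phi(t) = a*t^2 over an aperture of length T = n/fs, the two looks are
    # offset in Doppler by a*T/(2*pi) Hz, so a = 2*pi*delta_f / T.
    T = n / fs
    return 2.0 * np.pi * delta_f / T

# Usage (hypothetical): a_hat = mapdrift_quadratic(sig, prf); the quadratic
# term is then removed with sig * np.exp(-1j * a_hat * t**2), where t is the
# azimuth time axis centred on the aperture.
```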

This mathematical description outlines the key steps and equations used in the motion compensation algorithm. The actual implementation involves applying these compensations to the SAR data in appropriate domains (range time, azimuth time, range frequency, or 2D frequency) as described in the paper.

Attitude

The paper does not explicitly discuss the attitude reference system, which is a crucial element for accurately resolving motion components in a UAV SAR system. This is a significant omission in the paper's description of the overall system. Let me elaborate on why an attitude reference would be necessary:

1. Attitude Information Requirement:
   To properly resolve azimuth, elevation, and x, y, z components of motion, the system would need to know the UAV's attitude (roll, pitch, and yaw) at each moment during the data collection.

2. Typical Attitude Reference Systems:
   - Inertial Measurement Unit (IMU): Provides angular rates and accelerations
   - Gyroscopes: Measure angular velocity
   - Magnetometers: Provide heading information
   - GPS: Often used in conjunction with IMU for position and velocity

3. Importance in SAR Motion Compensation:
   - Translates body-frame measurements to the earth-fixed coordinate system
   - Essential for accurately interpreting motion sensor data
   - Crucial for relating the SAR antenna's orientation to the ground

4. Integration with Motion Compensation (a hypothetical numerical example follows this list):
   An attitude reference would be necessary to:
   - Correctly interpret the ΔX(η), ΔY(η), and ΔZ(η) motion errors
   - Accurately project these errors onto the slant range direction
   - Properly implement the geometric transformations in the SAR processing

5. Impact on Algorithm:
   The paper's algorithms implicitly assume that attitude information is available and accurate. The quality of attitude estimation would directly affect the performance of both LF and HF error compensation.

6. Potential Sources of Attitude Data:
   - Onboard UAV navigation system
   - Integration of IMU data with GPS measurements
   - Possibly derived from the SAR data itself in some cases (though this would be challenging and is not discussed in the paper)
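
As a concrete illustration of point 4 above, the following hypothetical sketch rotates a body-frame antenna displacement into the navigation frame with a roll/pitch/yaw direction cosine matrix and projects it onto the radar line of sight. The Euler-angle convention, frame definitions, and all numerical values are assumptions chosen for illustration; none come from the paper.

```python
import numpy as np

def rpy_to_dcm(roll, pitch, yaw):
    """Direction cosine matrix from body frame to navigation frame for a
    Z-Y-X (yaw-pitch-roll) Euler sequence, angles in radians. The convention
    is an assumption; the paper does not specify one."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

# Hypothetical example: a 10 cm body-frame antenna displacement, UAV attitude
# of 2° roll, -1° pitch, 5° yaw, and a line-of-sight unit vector for a 45°
# incidence angle in the (cross-track, height) plane.
d_body = np.array([0.0, 0.10, 0.0])                  # [m], body frame
C_nb = rpy_to_dcm(*np.deg2rad([2.0, -1.0, 5.0]))
d_nav = C_nb @ d_body                                # ΔX, ΔY, ΔZ in nav frame
theta = np.deg2rad(45.0)
los = np.array([0.0, -np.sin(theta), np.cos(theta)]) # reproduces -ΔY sinθ + ΔZ cosθ
print(f"slant-range error contribution: {d_nav @ los:.4f} m")
```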

In a practical implementation of this motion compensation approach, the integration of a reliable attitude reference system would be crucial. This would typically involve sensor fusion techniques that combine data from several sources (IMU, GPS, magnetometers) to obtain accurate attitude information throughout the SAR data collection.

The absence of this information in the paper limits the reader's ability to fully understand how the proposed algorithms would be implemented in a real-world UAV SAR system, where attitude estimation errors could significantly impact the motion compensation performance.

Authors

Here is a summary of the authors, their institutional affiliations, and related prior work mentioned in the paper:
  1. Jiahao Han - National Key Laboratory of Radar Signal Processing, Xidian University, China
  2. Shiyang Tang - National Key Laboratory of Radar Signal Processing, Xidian University, China
  3. Zhanye Chen - State Key Laboratory of Millimeter Waves and Institute of Electromagnetic Space, Southeast University, China
  4. Yi Ren - National Key Laboratory of Radar Signal Processing, Xidian University, China  
  5. Zhixin Lian - National Key Laboratory of Radar Signal Processing, Xidian University, China
  6. Ping Guo - College of Communication and Information Engineering, Xi'an University of Science and Technology, China
  7. Yinan Li - National Key Laboratory of Radar Signal Processing, Xidian University and China Academy of Space Technology, China
  8. Linrang Zhang - National Key Laboratory of Radar Signal Processing, Xidian University, China
  9. Hing Cheung So - Department of Electrical Engineering, City University of Hong Kong, Hong Kong
Related Prior Work:

- The authors reference several of their own prior papers on topics related to SAR imaging and motion compensation, including:
  1. Y. Ren et al. on an improved spatially variant MOCO approach for UAV SAR imaging (2022)
  2. S. Tang et al. on acceleration model analyses and imaging algorithms for squinted airborne spotlight-mode SAR (2015)
  3. J. Chen et al. on two-step accuracy improvement of motion compensation for airborne SAR (2019)
  4. Y. Ren, S. Tang et al. on 2-D spatially variant motion error compensation for airborne SAR (2022)
- They also cite work by other researchers on topics like:
  1. SAR imaging principles and processing algorithms
  2. Motion compensation techniques for airborne and UAV SAR  
  3. Autofocus methods for SAR phase error correction
  4. High-frequency vibration compensation for SAR
The authors build on this prior work to develop their new motion compensation approach for multirotor UAV SAR imaging.
Radar can be used for human non-contact monitoring and interaction TMTT CFP Special Issue on Latest Advances on Radar-Based Physiological Se...