Tuesday, February 20, 2024

High Frame-Rate Imaging Using Swarm of UAV-Borne Radars | IEEE Journals & Magazine | IEEE Xplore


Abstract: High frame-rate imaging of synthetic aperture radar (SAR), known as video SAR, has received much research interest in recent years. It usually operates at extremely high frequencies, even in the THz band, as a technical tradeoff between high frame rate and high resolution. As a result, video SAR systems typically suffer from limited functional range due to strong atmospheric attenuation of signals.

This article presents a new high frame-rate collaborative imaging regime in the microwave frequency band based on a swarm of unmanned aerial vehicles (UAVs). The spatial degrees of freedom are exploited to shorten the synthetic time and thus improve the frame rate. More specifically, the long synthetic aperture is split into multiple short subapertures, and each UAV-borne radar performs short-subaperture imaging in a short time.

Then, the accelerated fast back-projection (AFBP) algorithm is employed to fuse the multiple subimages into an image with high azimuth resolution. To enable collaborative operation of the swarm of UAV-borne radars, a suitable orthogonal waveform is selected and a useful spatial configuration of the swarm is designed to compensate for the effect of the orthogonal waveform on imaging. Simulation results highlight the advantages of collaborative imaging using a swarm of UAV-borne radars.

J. Ding, K. Zhang, X. Huang and Z. Xu, "High Frame-Rate Imaging Using Swarm of UAV-Borne Radars," in IEEE Transactions on Geoscience and Remote Sensing, vol. 62, pp. 1-12, 2024, Art no. 5204912, doi: 10.1109/TGRS.2024.3362630.


keywords: {Radar imaging; Radar; Imaging; Synthetic aperture radar; Apertures; Collaboration; Image resolution; Collaborative radar imaging; microwave high frame-rate imaging; radar network; synthetic aperture radar (SAR); video SAR}


URL: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10428066&isnumber=10354519


Published in: IEEE Transactions on Geoscience and Remote Sensing (Volume: 62)

Article Sequence Number: 5204912
Date of Publication: 08 February 2024
Publisher: IEEE

SECTION I. Introduction

Radar mounted on civil unmanned aerial vehicles (UAVs) has recently entered a period of significant technological progress as researchers continue to explore new applications; in particular, UAV-based radar imaging has been extensively researched [1], [2]. Most recently, UAV swarms are expected to open the door to various new radar applications because of their high flexibility and low cost. In the past years, successful UAV-swarm experiments have been demonstrated worldwide with an emphasis on coordinated formation flight [3], [4], [5], which has drawn researchers' attention to swarms of UAV-borne radars. Unfortunately, collaborative operation of a swarm of UAV-borne radars is greatly hampered by a few technical difficulties, such as reliable precise system synchronization and dynamic communication, a situation that differs markedly from the photoelectric sensor payloads already widely mounted on UAVs. The analysis and development of swarms of UAV-borne radars remain relatively underexplored.

It has long been recognized that distributed collaborative radar networks with a higher degree of coordination than conventional multistatic radars could provide an entirely new paradigm for overcoming fundamental limitations of today's radars, which work independently or with poor collaboration [6], [7]. However, previous efforts in collaborative radar networks have mostly been restricted to ground-based systems with limited dynamics, largely because they rely on optical fibers to ensure high-quality, highly coherent signals. This implementation becomes very difficult, if not impossible, in airborne scenarios due to the high dynamics of the platforms. This article presents a useful collaborative imaging regime based on a radar swarm of UAVs that can be realized with state-of-the-art technologies.

It has long been desired in the radar community to develop a high-resolution microwave sensor that operates as effectively as current electro-optical sensors. Video synthetic aperture radar (SAR), an area of great current activity, can provide a persistent view of a scene of interest through high frame-rate imaging [8], [9], [10]. Additionally, a moving target often leaves dynamic shadows at its true instantaneous position in sequential images, which allows an alternative solution for detection, location, and tracking of moving targets [11], [12]. The imaging frame rate determines what kind of target can be tracked. There is always a tradeoff between resolution and frame rate, which pushes the operational frequency of the radar toward the THz band [13], [14]. However, such a high frequency significantly increases the atmospheric attenuation of radar signals; the growing concern regarding video SAR is therefore its limited functional range, which in many cases is not acceptable. It is highly desirable for video SAR systems to work in classical radar bands, for example, the K-band.

When video SAR operates at a relatively low frequency, there are still two approaches to achieving a high frame rate. The first is to increase the speed of the radar platform [15]. As the platform speed increases, the coherent processing time needed for the desired azimuth resolution decreases, and thus the frame rate increases. As a side effect, the azimuth Doppler bandwidth expands, which often results in Doppler ambiguity in the SAR system. Moreover, the platform speed is not adjustable in many applications. The second approach is to increase the aperture overlap ratio in image formation, which has often been used to obtain a high frame rate. However, the drawback is that no-return areas in the SAR scene may quickly be washed out, since the synthetic time of each frame is not reduced by overlapped processing, which complicates the detection of moving-target shadows.

Distributed airborne radar utilizes multiple platforms to create a larger synthetic aperture and can thus obtain high imaging resolution [16]. In other words, high-resolution images become available within a shorter synthetic duration by fusing multiview information from distributed radars. In fact, a swarm of UAV-borne radars has much higher flexibility in configuration design than common airborne counterparts [21], which enables high frame-rate imaging in classical radar bands without relying on aperture overlapping.

This article demonstrates a collaborative radar imaging approach using a swarm of UAV-borne radars, offering an alternative for high frame-rate imaging in classical radar frequency bands rather than video SAR designed for THz frequencies. The proposed approach is verified through simulations. The article is organized as follows. Section II briefly describes video SAR and the accelerated fast back-projection (AFBP) algorithm. Section III presents the developed framework for high frame-rate imaging based on the swarm of UAV-borne radars. The simulation examination is detailed in Section IV, and Section V concludes this article.

SECTION II. Preliminaries

A. Video SAR

Video SAR is a classical high frame-rate imaging system, which can generate sequential SAR images at a frame rate similar to conventional video formats, and the frame rate can be given as

$$ f_{\mathrm{ps}} = \frac{1}{(1-\gamma)T_a} = \frac{v}{(1-\gamma)L} = \frac{2\rho_a f_c v}{(1-\gamma)cR} \tag{1} $$

where Ta is the synthetic aperture time, L is the length of the synthetic aperture, ρa is the azimuth resolution, fc is the center frequency, v is the velocity of the radar platform, c is the speed of light, R is the range from the radar to the scene center, and γ is the data overlap rate.
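As a quick numerical illustration of (1), the sketch below computes the required aperture length and the resulting frame rate for a single platform; all parameter values (K-band carrier, speed, range, resolution) are illustrative assumptions, not the paper's system settings.

```python
# Frame rate of a single video SAR, per (1). Parameter values are
# illustrative assumptions, not taken from the paper.
c = 3e8        # speed of light (m/s)
fc = 24e9      # assumed K-band center frequency (Hz)
v = 20.0       # assumed platform speed (m/s)
R = 5e3        # assumed range to scene center (m)
rho_a = 0.3    # desired azimuth resolution (m)
gamma = 0.0    # data overlap rate (no overlapped processing)

L = c * R / (2 * rho_a * fc)   # aperture length giving resolution rho_a
fps = v / ((1 - gamma) * L)    # equals 2*rho_a*fc*v / ((1 - gamma)*c*R)
print(f"L = {L:.1f} m, frame rate = {fps:.2f} Hz")  # ~104.2 m, ~0.19 Hz
```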

In conventional SAR imagery, moving targets are detected through their Doppler energy. However, the Doppler may be shifted or smeared. In contrast, moving-target shadows precisely indicate target locations and motion [23]. Consequently, moving targets can be found by detecting their dynamic shadows in video SAR image sequences [11], [12]. The performance of shadow-based moving target detection depends on the difference between the shadow and its surrounding background. The shadow-to-background ratio (SBHR) statistically describes the intensity contrast between a moving-target shadow and its surrounding background [12]. The smaller the SBHR, the darker the shadow and the easier the detection. For a moving target of length Lm and velocity vt, the SBHR of a ground point at the shadow center can be expressed as

$$ \mathrm{SBHR} = \frac{\sigma_n + (1 - T_c/T_a)\,\sigma_b}{\sigma_n + \sigma_b},\qquad T_c = \min\!\left(T_a,\; L_m/v_t\right) \tag{2} $$

where σn and σb represent the equivalent backscatter coefficients of noise and background, respectively, and Tc is the occlusion time of the ground point at the shadow center. Obviously, the SBHR is associated with the synthetic time Ta rather than the data overlap rate γ. In other words, improving the frame rate by overlapped processing does not shorten the synthetic time and therefore cannot benefit the detection and tracking of moving-target shadows. Hence, this work aims at high frame-rate imaging without overlapped SAR processing.
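A minimal sketch of (2) as a helper function, assuming linear-scale (not dB) backscatter coefficients and a nonzero target speed; the example values mirror the noise/background levels used later in the article, while the synthetic time is an assumed value.

```python
import numpy as np

# SBHR at the shadow center, per (2); sigma_n and sigma_b are linear-scale
# equivalent backscatter coefficients of noise and background.
def sbhr_db(Ta, Lm, vt, sigma_n, sigma_b):
    Tc = min(Ta, Lm / vt)  # occlusion time of the shadow-center point
    ratio = (sigma_n + (1 - Tc / Ta) * sigma_b) / (sigma_n + sigma_b)
    return 10 * np.log10(ratio)

# Example: -34.5 dB noise, -16.5 dB background (values used in Section III);
# Ta = 2 s is an assumed synthetic time.
print(sbhr_db(Ta=2.0, Lm=5.0, vt=10.0, sigma_n=10**-3.45, sigma_b=10**-1.65))
```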

B. Accelerated Fast Back-Projection Algorithm

The AFBP algorithm [27] has been developed for spotlight SAR imaging. Thanks to the unified pseudo-polar coordinate system used in subimage formation, the AFBP can efficiently implement image fusion via the fast Fourier transform (FFT) and circular shifts, which avoids the 2-D image-domain interpolation of the classical fast back-projection (FBP) algorithm [26]. The subaperture division and the imaging coordinate system are illustrated in Fig. 1, where the platform moves along a straight flight path, generating a synthetic aperture of length L. The trajectory is defined as the x-axis with the origin O at the center of the synthetic aperture. The full synthetic aperture is divided into I subapertures of equal length l = L/I. The imaging coordinate system (ρ, α) is constructed at the full-aperture center, where ρ denotes the distance from the origin O to the point and α denotes the sine of the angle between the y-axis and the radar line of sight.

Fig. 1. Aperture division and pseudo-polar grids used in the AFBP algorithm.

The wavenumber spectrum of the ith subimage can be given by

$$ I_i(k_\rho, k_\alpha) = \mathrm{rect}\!\left[\frac{k_\rho - k_{\rho c}}{\Delta k_\rho}\right] \exp(-j k_\rho \rho)\, \mathrm{rect}\!\left[\frac{k_\alpha - k_{\alpha i}}{\Delta k_{\alpha i}}\right] \exp(-j k_\alpha \alpha) \tag{3} $$

and the angular wavenumber variables are defined as

$$ k_\alpha = k_\rho x,\qquad k_{\alpha i} = k_\rho x_i,\qquad \Delta k_{\alpha i} = k_\rho l \tag{4} $$

where kρ is the radial wavenumber, kρc is the center of the radial wavenumber, Δkρ is the extent of the range wavenumber, kα is the angular wavenumber, kαi is the center of the ith angular wavenumber band, xi is the center of the ith subaperture, and Δkαi is the extent of the angular wavenumber.

Equations (3) and (4) reveal the linear relation between the angular wavenumber kα and the aperture position x. Since the aperture positions are continuous, the spectra of the subimages can be connected without gaps. However, due to the coarse angular sampling interval, the wavenumber spectrum may be ambiguous (folded), which should be eliminated before the spectrum connection. Fig. 2 presents the spectrum fusion process.

Fig. 2. Wavenumber spectrum fusion.

The AFBP procedure can be outlined as follows: 1) forming multiple subimages with coarse angular resolution in the unified pseudo-polar coordinate system; 2) transforming the subimages into the wavenumber domain by FFT; 3) performing spectrum center correction and spectrum connection; and 4) transforming the complete wavenumber spectrum back to image domain to obtain a full-resolution SAR image.
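A schematic 1-D numpy sketch of steps 2)-4), assuming the coarse subimages (step 1) are already formed on the shared pseudo-polar grid and that k_centers gives each subaperture's angular-spectrum center in FFT-bin units; this is purely illustrative, not the paper's implementation.

```python
import numpy as np

# Schematic AFBP fusion along the angular axis (steps 2-4 above).
def afbp_fuse(subimages, k_centers):
    spectra = []
    for img, kc in zip(subimages, k_centers):
        spec = np.fft.fftshift(np.fft.fft(img))  # step 2: to wavenumber domain
        spec = np.roll(spec, kc)                 # step 3: spectrum center correction
        spectra.append(spec)
    full = np.concatenate(spectra)               # step 3: gap-free connection
    return np.fft.ifft(np.fft.ifftshift(full))   # step 4: full-resolution image
```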

SECTION III. Collaborative High Frame-Rate Imaging

For a traditional SAR system with high azimuth resolution, increasing the center frequency to the W-band [17] or even the THz band [13] can significantly reduce the synthetic aperture time, thereby improving the frame rate and making moving-target shadows easier to observe. However, signals at extremely high frequencies suffer from atmospheric attenuation, while the radar platform speed is limited by physical factors. In contrast, the swarm of UAV-borne radars is dispersed in space, which makes it possible to shorten the synthetic time. The swarm can be configured flexibly, enabling multiview observation of the region of interest. Then, by fusing the multiview information of multiple UAV-borne radars, high azimuth resolution can be achieved.

When each UAV-borne radar works in a self-transmitting and self-receiving mode, the collaborative imaging system based on a radar swarm of UAVs can be regarded as a collection of multiple independent UAV-borne SARs. In this case, as illustrated in Fig. 3, the long synthetic aperture needed for high azimuth resolution can be split into multiple short subapertures, and each UAV-borne SAR performs short-subaperture imaging to generate a low-resolution image. Finally, the multiple low-resolution images are fused to generate an image with the desired high azimuth resolution. As a result, the synthetic aperture time is inversely proportional to the number of UAVs. From (1), the frame rate of the imaging system based on the swarm of UAV-borne radars can be expressed as

$$ f_{\mathrm{ps}} = \frac{vN}{(1-\gamma)L} = \frac{2\rho_a f_c v N}{(1-\gamma)cR} \tag{5} $$

where N is the number of UAVs. Obviously, increasing the number of UAV platforms linearly increases the imaging frame rate.
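Per (5), the single-platform frame rate from the earlier sketch simply scales by N; the parameters below repeat those illustrative assumptions.

```python
# Collaborative frame rate, per (5), with the same illustrative parameters
# as the single-platform sketch after (1).
c, fc, v, R, rho_a, gamma = 3e8, 24e9, 20.0, 5e3, 0.3, 0.0
for N in (1, 2, 4, 6):
    fps = 2 * rho_a * fc * v * N / ((1 - gamma) * c * R)
    print(f"N = {N}: frame rate = {fps:.2f} Hz")   # N = 6 gives ~1.15 Hz
```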

Fig. 3. Illustration of high frame-rate imaging using collaborative radar swarm of UAVs, where radar operates at traditional microwave frequency band, for example, K-band.

As opposed to video SAR, collaborative imaging based on the swarm of UAVs reduces the synthetic aperture time from Ta to Ta/N. Hence, from (2), in imagery obtained by collaborative imaging, the SBHR of a ground point at the shadow center is

$$ \mathrm{SBHR} = \frac{\sigma_n + (1 - T_c N/T_a)\,\sigma_b}{\sigma_n + \sigma_b},\qquad T_c = \min\!\left(T_a/N,\; L_m/v_t\right). \tag{6} $$

We compare the SBHR at the shadow center in SAR images obtained by the two imaging modes. As shown in Table I, compared to video SAR, collaborative imaging reduces the SBHR when the moving-target speed is greater than Lm/Ta. Furthermore, for a moving target with a size of 2 × 5 m, the SBHR curves of its shadow versus target velocity and number of UAVs are shown in Fig. 4, where the red dotted line represents the SBHR detection requirement (−1.5 dB) and the equivalent backscatter coefficients of noise and background are set to −34.5 and −16.5 dB, respectively. It should be pointed out that collaborative imaging degenerates to video SAR imaging when the number of UAVs is 1. Obviously, the greater the number of UAVs, the lower the SBHR of the moving-target shadow, and the higher the maximum detectable velocity. It can be concluded that collaborative imaging has the potential for more accurate shadow-aided detection of moving targets.
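The sketch below mirrors the logic of Fig. 4 using (6): it sweeps target speed for several swarm sizes and reports the largest speed whose shadow still satisfies the −1.5 dB requirement. The single-radar synthetic time Ta is an assumed value, since the paper's Table II is not reproduced here.

```python
import numpy as np

# Max detectable target speed by shadow, per (6) and the Fig. 4 setup.
sigma_n, sigma_b = 10**-3.45, 10**-1.65   # -34.5 dB noise, -16.5 dB background
Ta, Lm, thresh_db = 2.0, 5.0, -1.5        # assumed Ta (s); 5 m target; -1.5 dB

for N in (1, 2, 4, 6):
    vt = np.arange(0.5, 80.0, 0.5)                      # target speeds (m/s)
    Tc = np.minimum(Ta / N, Lm / vt)                    # occlusion time
    sbhr = 10 * np.log10((sigma_n + (1 - Tc * N / Ta) * sigma_b)
                         / (sigma_n + sigma_b))
    ok = vt[sbhr <= thresh_db]                          # detectable speeds
    print(f"N = {N}: max detectable speed ~ {ok.max():.1f} m/s")
```

With these assumed values, the maximum detectable speed grows roughly linearly with N, matching the trend of Fig. 4.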

TABLE I SBHR at the Shadow Center in SAR Images From the Two Imaging Modes

Fig. 4. SBHR curves of moving-target shadow versus target velocity and number of UAVs, where the red dotted line represents the SBHR detection requirement (−1.5 dB).

A. Operating Mode and Waveform Design

The basis of collaborative imaging with a swarm of UAV-borne radars is the simultaneous transmission of multiple orthogonal waveforms, with each waveform assigned to one UAV-borne radar. Since the frequency-modulated continuous waveform (FMCW) is commonly used in UAV-borne radars for the sake of miniaturization and light weight, a waveform suitable for FMCW radar should be selected from the various orthogonal waveforms [18], [19], [20]. In FMCW radar systems, dechirping is employed to relax sampling requirements: a beat signal whose bandwidth is narrower than the transmission bandwidth is generated from the backscattered signal. Exploiting this property of dechirping, an orthogonal waveform named beat frequency division (BFD) has been proposed [20]. Although the BFD waveform is not orthogonal during transmission, high orthogonality is obtained after dechirping. More specifically, when the BFD waveform is adopted, as shown in Fig. 5, each UAV-borne radar transmits signals at the same chirp rate γ but with a small frequency offset Δfb. Most of the transmission frequencies of the UAV-borne radars overlap because the frequency offset Δfb is small, which makes subsequent coherent image fusion possible. After dechirping, the superposed reflected signals from the multiple UAV-borne radars can be separated when the frequency offset Δfb satisfies

$$ \Delta f_b > \frac{2 R_{\mathrm{swath}}\,\gamma}{c} \tag{7} $$

where Rswath is the range swath of the observation scene. Finally, the desired signal can be obtained by filtering.
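A quick check of the separability condition (7); the chirp rate (bandwidth over sweep duration) and swath width below are illustrative assumptions.

```python
# Minimum beat-frequency offset for separable beat signals, per (7).
c = 3e8
B, Td = 500e6, 1e-3          # assumed sweep bandwidth (Hz) and duration (s)
gamma_cr = B / Td            # chirp rate (Hz/s)
R_swath = 2e3                # assumed range swath (m)
print(f"dfb > {2 * R_swath * gamma_cr / c / 1e6:.2f} MHz")   # ~6.67 MHz
```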

Fig. 5. Time-frequency diagrams of radar signals. (a) Transmit signal. (b) Receive signal. (c) Beat signal. (d) Filtered signal.

Due to the spatial distribution of the UAV platforms, time-frequency synchronization errors and phase synchronization errors are inevitable in a swarm of UAV-borne radars. To reduce the effect of synchronization errors, it is crucial to choose a suitable operating mode for the swarm. Common distributed radar systems operate either in a one-transmitter-multiple-receiver mode or in a multiple-transmitter-multiple-receiver mode, where the transmitting and receiving oscillators are generally separated. Consequently, both modes require precise calibration of the synchronization errors caused by the different radar local oscillators among distributed UAVs, which is extremely difficult. To alleviate the effect of synchronization errors, the swarm of UAV-borne radars in this article operates in a self-transmitting and self-receiving mode, where the transmitting and receiving ends share the same reference signal source. Consequently, the frequency and phase synchronization errors cancel as in a monostatic SAR, enabling coherent processing across multiple platforms. In addition, time-frequency synchronization errors affect orthogonality when the BFD waveform is adopted for collaborative imaging. The details are formalized below.

When the BFD waveform is adopted, the signal transmitted by the nth UAV-borne radar can be expressed as

$$ S_n^t(t, t_r) = \mathrm{rect}\!\left(\frac{t_r}{T_d}\right) \exp\!\left[\,j2\pi\bigl(f_c + (n-1)\Delta f_b\bigr)t + j\pi\gamma t_r^2\right] \tag{8} $$

where tr is the fast time, Td is the sweep duration, and γ is the chirp rate.

Since the swarm of UAV-borne radars is spatially distributed, there are time-frequency synchronization errors and phase synchronization errors between different UAV-borne radars, and thus the signal transmitted by the mth UAV-borne radar can be expressed as

$$ \begin{aligned} S_m^t(t, t_r) ={}& \mathrm{rect}\!\left(\frac{t_r + \delta t_{nm}}{T_d}\right) \exp\!\left[\,j2\pi(f_c + \delta f_{nm})(t + \delta t_{nm})\right] \\ &\times \exp\!\left[\,j2\pi(m-1)\Delta f_b (t + \delta t_{nm})\right] \exp\!\left[\,j\pi\gamma(t_r + \delta t_{nm})^2 + j\beta_{nm}\right] \end{aligned} \tag{9} $$

where δtnm, δfnm, and βnm represent the time, frequency, and phase synchronization errors between the nth and the mth UAV-borne radars, respectively. When n = m, δtnm = 0, δfnm = 0, and βnm = 0.

The reflected signal received by the nth UAV-borne radar is

$$ \begin{aligned} S_n^r(t, t_r) ={}& \sum_{m=1}^{N} \mathrm{rect}\!\left(\frac{t_r + \delta t_{nm} - R_{nm}(t)/c}{T_d}\right) \exp\!\left[\,j\pi\gamma\!\left(t_r + \delta t_{nm} - \frac{R_{nm}(t)}{c}\right)^{\!2} + j\beta_{nm}\right] \\ &\times \exp\!\left[\,j2\pi(f_c + \delta f_{nm})\!\left(t + \delta t_{nm} - \frac{R_{nm}(t)}{c}\right)\right] \exp\!\left[\,j2\pi(m-1)\Delta f_b\!\left(t + \delta t_{nm} - \frac{R_{nm}(t)}{c}\right)\right] \end{aligned} \tag{10} $$

where Rnm(t) is the sum of the distances from the nth and the mth UAV-borne radars to a ground point target.

In the dechirping processing, the reference signal is a replica of the transmitted waveform delayed by the echo delay of the scene center, and the reference signal of the nth UAV-borne radar can be written as

$$ S_n^{\mathrm{ref}}(t, t_r) = \mathrm{rect}\!\left(\frac{t_r - 2R_s/c}{T_{\mathrm{ref}}}\right) \exp\!\left[\,j\pi\gamma\!\left(t_r - \frac{2R_s}{c}\right)^{\!2}\right] \exp\!\left[\,j2\pi\bigl(f_c + (n-1)\Delta f_b\bigr)\!\left(t - \frac{2R_s}{c}\right)\right] \tag{11} $$

where Tref is the sweep duration of the reference signal and Rs is the distance between the UAV and the scene center.

After dechirping, the beat signal of the nth UAV-borne radar can be expressed as

$$ \begin{aligned} S_n^b(t, t_r) ={}& \sum_{m=1}^{N} \mathrm{rect}\!\left(\frac{t_r + \delta t_{nm} - R_{nm}(t)/c}{T_d}\right) \mathrm{rect}\!\left(\frac{t_r - 2R_s/c}{T_{\mathrm{ref}}}\right) \\ &\times \exp\!\left[-j2\pi\gamma\!\left(t_r + \delta t_{nm} - \frac{2R_s}{c}\right)\!\left(\frac{R_{nm}(t)}{c} - \frac{2R_s}{c}\right)\right] \\ &\times \exp\!\left[\,j2\pi\bigl((m-n)\Delta f_b + \gamma\,\delta t_{nm} + \delta f_{nm}\bigr)\!\left(t - \frac{R_{nm}(t)}{c}\right)\right] \\ &\times \exp\!\left[-j2\pi\bigl(f_c + (n-1)\Delta f_b\bigr)\!\left(\frac{R_{nm}(t)}{c} - \frac{2R_s}{c}\right)\right] \exp\!\left[\,j\pi\gamma\!\left(\frac{R_{nm}(t)}{c} - \frac{2R_s}{c}\right)^{\!2}\right] \\ &\times \exp\!\left[\,j\pi\gamma\,\delta t_{nm}^2 + j\beta_{nm}\right]. \end{aligned} \tag{12} $$

Inspecting the second exponential term in (12), we find that the time and frequency synchronization errors affect the signal separation. In this case, to ensure that the interference from the other UAV-borne radars can be easily filtered out, the frequency offset Δfb should satisfy

$$ \Delta f_b > \frac{2 R_{\mathrm{swath}}\,\gamma}{c} + 2\gamma\,|\delta t_{\max}| + 2\,|\delta f_{\max}| \tag{13} $$

where δtmax and δfmax represent the maximum time and frequency synchronization errors between adjacent UAV-borne radars, respectively.
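Extending the earlier check of (7), (13) widens the required offset by a guard band for synchronization errors; the error bounds here are assumptions.

```python
# Beat-frequency offset with synchronization guard band, per (13).
c, B, Td, R_swath = 3e8, 500e6, 1e-3, 2e3
gamma_cr = B / Td                  # chirp rate (Hz/s)
dt_max, df_max = 10e-9, 1e3        # assumed max time/frequency sync errors
dfb = 2 * R_swath * gamma_cr / c + 2 * gamma_cr * abs(dt_max) + 2 * abs(df_max)
print(f"dfb > {dfb / 1e6:.3f} MHz")   # ~6.679 MHz
```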

To remove the last exponential term in (12), which represents the synchronization errors, a low-pass filter is employed to filter out the interference from the other UAV-borne radars, and the signal from the nth UAV-borne radar itself becomes

$$ \begin{aligned} S_n^f(t, t_r) ={}& \mathrm{rect}\!\left(\frac{t_r - 2R_{nn}(t)/c}{T_d}\right) \mathrm{rect}\!\left(\frac{t_r - 2R_s/c}{T_{\mathrm{ref}}}\right) \\ &\times \exp\!\left[-j2\pi\gamma\!\left(\frac{2R_{nn}(t)}{c} - \frac{2R_s}{c}\right)\!\left(t_r - \frac{2R_s}{c}\right)\right] \\ &\times \exp\!\left[-j2\pi\bigl(f_c + (n-1)\Delta f_b\bigr)\!\left(\frac{2R_{nn}(t)}{c} - \frac{2R_s}{c}\right)\right] \exp\!\left[\,j\pi\gamma\!\left(\frac{2R_{nn}(t)}{c} - \frac{2R_s}{c}\right)^{\!2}\right]. \end{aligned} \tag{14} $$

After compensating for the residual video phase (RVP) [24], the filtered signal can be expressed as

$$ \begin{aligned} S_n^f(t, t_r) ={}& \mathrm{rect}\!\left(\frac{t_r - 2R_s/c}{T_d}\right) \exp\!\left[-j2\pi\gamma\!\left(t_r - \frac{2R_s}{c}\right)\!\left(\frac{2R_{nn}(t)}{c} - \frac{2R_s}{c}\right)\right] \\ &\times \exp\!\left[-j2\pi\bigl(f_c + (n-1)\Delta f_b\bigr)\!\left(\frac{2R_{nn}(t)}{c} - \frac{2R_s}{c}\right)\right]. \end{aligned} \tag{15} $$

For simplicity, the filtered signal can be rewritten as

$$ S_n^f(t, K_r) = \mathrm{rect}\!\left(\frac{K_r}{4\pi B/c}\right) \exp\!\left[-j(K_r + K_{nc})\bigl(R_{nn}(t) - R_s\bigr)\right] \tag{16} $$

where B is the bandwidth, Kr is the range wavenumber, and Knc = 4π(fc + (n−1)Δfb)/c is the center wavenumber of the echo data from the nth UAV-borne radar.

Further, after compensating for the reference range Rs, we have

$$ S_n^f(t, K_r) = \mathrm{rect}\!\left(\frac{K_r}{4\pi B/c}\right) \exp\!\left[-j(K_r + K_{nc})\,R_{nn}(t)\right]. \tag{17} $$

B. Imaging Algorithm and Configuration Design

A simple spatial configuration of the swarm of UAVs for the collaborative imaging is shown in Fig. 6, where N UAV-borne radars move along a circular flight path of radius Rc at the same speed v. The beams of the N UAV-borne radars are spotlighted on the same observed scene of radius Ra. The position of the first UAV-borne radar is (Rc sinθ, Rc cosθ, H), where θ ∈ [0, 2π] is the aspect angle and H is the height of the UAV. Meanwhile, the position of the nth UAV-borne radar is (Rc sin(θ + (n−1)Δθ), Rc cos(θ + (n−1)Δθ), H), where Δθ is the azimuth-angle interval between adjacent UAV-borne radars and is defined as

$$ \Delta\theta = \frac{c}{2 f_c (N-1)\rho_a}. \tag{18} $$

In fact, Δθ is determined by the desired azimuth resolution ρa. After RcΔθ/v seconds of flight, the echo data of the N UAV-borne radars can be fused to form a long-aperture observation, and then, through imaging processing, one image frame with high azimuth resolution is obtained.
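A short sketch of (18) and the per-frame flight time RcΔθ/v; the circle radius, speed, and carrier frequency are illustrative assumptions.

```python
import math

# Angular spacing between adjacent UAVs, per (18), and per-frame flight time.
c, fc, rho_a, N = 3e8, 24e9, 0.3, 6
Rc, v = 3e3, 20.0                          # assumed circle radius (m), speed (m/s)
dtheta = c / (2 * fc * (N - 1) * rho_a)    # rad, per (18)
t_frame = Rc * dtheta / v                  # seconds of flight per fused frame
print(f"dtheta = {math.degrees(dtheta):.3f} deg, frame time = {t_frame:.3f} s")
```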

Fig. 6. Geometric observation model.

However, the center wavenumbers of the echo data from different UAV-borne radars differ due to the frequency offset. As a result, frequency-domain imaging algorithms, such as the chirp-scaling and polar-format algorithms, cannot be applied directly to the collaborative imaging. In addition, since the swarm of UAV-borne radars is spatially distributed, the multichannel reconstruction algorithm [25] is not suitable either. By contrast, the back-projection (BP) algorithm can fuse the data from the N UAV-borne radars directly by projecting them onto the same imaging coordinate system.

To improve the processing efficiency of the BP algorithm, the FBP algorithm [26] and the fast factorized BP (FFBP) algorithm [28] split the synthetic aperture into multiple short subapertures and then fuse the low-resolution images generated from the subapertures by 2-D interpolation. However, the interpolation inevitably introduces errors and also limits efficiency. In the Cartesian factorized BP (CFBP) algorithm [29], [30], two spectrum-compressing filters are used to unfold the spectra of the low-resolution subaperture images, so the low-resolution images can be fused without interpolation to produce a high-resolution image. Unlike the CFBP, which projects data onto a Cartesian coordinate system, the AFBP reconstructs the image in a unified pseudo-polar coordinate system and efficiently achieves image fusion by wavenumber spectrum connection. Moreover, owing to its inherent processing architecture, the AFBP can be implemented on distributed processors for further efficiency, which fits well with the spatially distributed nature of the swarm.

When the AFBP is applied to the collaborative imaging, the echo data of the N UAV-borne radars are projected onto a unified ground pseudo-polar coordinate system. The coarse image generated from the echo data of the nth UAV-borne radar can be expressed as

$$ I_n(\rho, \alpha) = \int_{t}\int_{K_r} S_n^f(t, K_r)\, \exp\!\left[\,j(K_r + K_{nc})\,R(\rho, \alpha)\right] dK_r\, dt \tag{19} $$

where

$$ R(\rho, \alpha) = \sqrt{\bigl(R_c\sin\phi - \rho\alpha\bigr)^2 + \bigl(R_c(1 - \cos\phi) - \rho\sqrt{1-\alpha^2}\,\bigr)^2 + H^2},\qquad \phi \in \bigl[(n-1)\Delta\theta - N\Delta\theta/2,\; n\Delta\theta - N\Delta\theta/2\bigr] \tag{20} $$

is the range from the nth UAV-borne radar to the ground point located at (ρ, α, 0) in the unified ground pseudo-polar coordinate system.

The Fourier transform of In(ρ,α) with respect to both ρ and α can be expressed as

$$ \tilde{I}_n(K_\rho, K_\alpha) = \int_{\rho}\int_{\alpha} I_n(\rho, \alpha)\, \exp\!\left(-jK_\rho\rho - jK_\alpha\alpha\right) d\alpha\, d\rho. \tag{21} $$

According to the principle of stationary phase, we have

$$ \frac{\partial}{\partial \rho}\Bigl[(K_r + K_{nc})\,R(\rho,\alpha) - K_\rho\,\rho\Bigr] = 0,\qquad \frac{\partial}{\partial \alpha}\Bigl[(K_r + K_{nc})\,R(\rho,\alpha) - K_\alpha\,\alpha\Bigr] = 0. \tag{22} $$

Then, Kα and Kρ can be approximated as

$$ K_\alpha \approx (K_r + K_{nc})\,\frac{R_c\sin\phi}{\sqrt{R_c^2 + H^2}},\qquad K_\rho \approx (K_r + K_{nc})\,\frac{R_c}{\sqrt{R_c^2 + H^2}}. \tag{23} $$

Further, the centers of Kα and Kρ can be written as

$$ K_{cn\alpha} = K_{nc}\,\frac{R_c\sin\phi}{\sqrt{R_c^2 + H^2}},\qquad K_{cn\rho} = K_{nc}\,\frac{R_c}{\sqrt{R_c^2 + H^2}}. \tag{24} $$

Obviously, due to the frequency offset, the 2-D wavenumber spectra of the coarse images from different UAVs are not aligned along the radial wavenumber Kρ, which degrades the imaging performance. We thus design a useful configuration of the swarm of UAVs for the collaborative imaging. Specifically, the N UAV-borne radars move along N circular flight paths with the same radius Rc but at different heights. When the position of the first UAV-borne radar is (Rc sinθ, Rc cosθ, H), the position of the nth UAV-borne radar is (Rc sin(θ + (n−1)Δθ), Rc cos(θ + (n−1)Δθ), Hn). Then, we make the radial wavenumber centers Kcnρ of the different UAV-borne radars identical, which can be expressed as

$$ K_{cn\rho} = K_{c1\rho}. \tag{25} $$

Namely,

$$ \frac{K_{nc}\,R_c}{\sqrt{R_c^2 + H_n^2}} = \frac{K_{1c}\,R_c}{\sqrt{R_c^2 + H^2}}. \tag{26} $$

Hence, the height Hn of the nth UAV-borne radar should satisfy

$$ H_n = \sqrt{\left(\frac{f_c + (n-1)\Delta f_b}{f_c}\right)^{2}\left(R_c^2 + H^2\right) - R_c^2}. \tag{27} $$
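A short sketch of (27), computing the heights that align the radial wavenumber centers; the geometry values are assumptions, while the 40 MHz offset matches the simulation section.

```python
import math

# UAV heights aligning the radial wavenumber centers, per (27).
fc, dfb = 24e9, 40e6     # assumed carrier (Hz); 40 MHz beat-frequency offset
Rc, H = 3e3, 1e3         # assumed circle radius and first-UAV height (m)
for n in range(1, 7):
    Hn = math.sqrt(((fc + (n - 1) * dfb) / fc) ** 2 * (Rc**2 + H**2) - Rc**2)
    print(f"UAV {n}: height = {Hn:.2f} m")   # n = 1 reproduces H = 1000 m
```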

The flowchart of the collaborative imaging based on the AFBP is illustrated in Fig. 7. First, in the unified ground pseudo-polar coordinate system, N subimages with low azimuth resolution are generated from the echo data of the N UAV-borne radars individually by the BP integral. Second, the N subimages are transformed into the wavenumber domain by FFT. Due to the coarse angular interval of the subimages, their angular wavenumber spectra may be folded. The spectrum center correction unfolds the spectra by circular shifting: for the nth subimage, the center of its angular wavenumber spectrum is shifted from zero to Kcnα. After the spectrum center correction, the spectrum connection directly connects the wavenumber spectra of the N subimages into the complete wavenumber spectrum. Subsequently, the complete wavenumber spectrum is transformed back to the image domain to obtain a high-resolution SAR image in the unified pseudo-polar coordinate system. Finally, geometric correction is performed by projecting the high-resolution SAR image onto the Cartesian coordinate system via image-domain interpolation. Note that the imaging efficiency is limited when the number of UAVs N is small. A simple but effective solution is to further divide the echo of each UAV-borne radar into M subapertures. A sketch of the whole flow follows.
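A compact sketch of the Fig. 7 flow, reusing the toy bp_subimage above; spectrum_center (the Kcnα shift in spectral bins) and polar_to_cartesian (the final geometric correction) are hypothetical helpers, and all shapes are illustrative.

```python
import numpy as np

# Collaborative AFBP, following Fig. 7. bp_subimage is the toy BP above;
# spectrum_center and polar_to_cartesian are hypothetical helpers.
def collaborative_afbp(echoes, geoms):
    # Step 1: N coarse subimages on the unified pseudo-polar grid.
    subimages = [bp_subimage(**e, **g) for e, g in zip(echoes, geoms)]
    # Step 2: to the wavenumber domain.
    spectra = [np.fft.fftshift(np.fft.fft2(img)) for img in subimages]
    # Step 3: spectrum center correction (circular shift to K_cn_alpha),
    # then gap-free connection along the angular wavenumber axis.
    spectra = [np.roll(s, spectrum_center(n, geoms), axis=1)
               for n, s in enumerate(spectra)]
    full = np.concatenate(spectra, axis=1)
    # Step 4: back to the image domain at full azimuth resolution.
    hi_res = np.fft.ifft2(np.fft.ifftshift(full))
    # Geometric correction to Cartesian coordinates.
    return polar_to_cartesian(hi_res)
```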

Fig. 7. Flowchart of collaborative imaging of UAV swarm based on AFBP.

SECTION IV. Simulation Results

Simulation experiments have been used to assess the performance of the proposed collaborative high frame-rate imaging approach.

We compare the collaborative high frame-rate imaging system with the video SAR system in terms of imaging performance and synthetic aperture time. The collaborative imaging system consists of six UAV-borne radars, and the beat frequency offset is 40 MHz. The parameters of the first UAV-borne radar are given in Table II. The heights of the other UAVs are calculated according to (27), and the imaging geometry is shown in Fig. 8. Nine stationary point targets with the same radar cross section are present, and their positions are listed in Table III. The parameters of the traditional video SAR system are the same as those of the first UAV-borne radar.

TABLE II Parameters of the Collaborative Imaging System

TABLE III Position of Nine Point Targets

Fig. 8. Collaborative imaging geometry of UAV swarm.

Fig. 9(a) and (b) shows the imaging results of the traditional video SAR system with frame rates of 2.6 and 0.43 Hz, respectively. For a video SAR system operating in a common microwave band, high frame rate and high resolution are contradictory requirements. Fig. 9(c) is the result of the collaborative imaging system with a frame rate of 2.6 Hz. Obviously, compared to the traditional video SAR system, the collaborative imaging system achieves a higher frame rate at the same azimuth resolution, which indicates that it enables both high frame-rate and high-resolution imaging in the common microwave band rather than the THz band.

Fig. 9. Imaging results obtained by the two imaging modes. (a) and (b) Results of traditional video SAR system with the frame rate of 2.6 and 0.43 Hz, respectively. (c) Result of the collaborative imaging system with the frame rate of 2.6 Hz.

To verify the effectiveness of the adopted BFD waveform, the collaborative imaging result from the echo data without BFD demodulation is given in Fig. 10. When the BFD separation is not performed, the signals from the six UAV-borne radars interfere with each other, and thus the collaborative imaging result is unacceptable.

Fig. 10. Collaborative imaging result without BFD demodulation.

Additionally, we compare the imaging performance of the collaborative imaging system under two spatial configurations. When the six UAV-borne radars follow the same circular flight path, the imaging result in the pseudo-polar coordinate system is given in Fig. 11(a), and the corresponding wavenumber spectrum is shown in Fig. 11(c). Meanwhile, Fig. 11(b) displays the corresponding interpolated result in the Cartesian coordinate system for easier assessment of resolution. It can be seen that the frequency offset leads to misalignment of the wavenumber spectrum along the radial wavenumber Kρ, resulting in a distorted image. In contrast, when the six UAV-borne radars fly in formation according to the designed spatial configuration, more satisfactory imaging results are obtained in the pseudo-polar and Cartesian coordinate systems, shown respectively in Fig. 11(d) and (e), with the corresponding wavenumber spectrum in Fig. 11(f). Obviously, the designed spatial configuration can compensate for the effect of the frequency offset on the collaborative imaging.

Fig. 11. Simulation results of the collaborative imaging system under two spatial configurations. (a) and (b) Imaging results in pseudo-polar and Cartesian coordinate systems, respectively, when all UAV-borne radars move along the same circular flight path. (d) and (e) Imaging results in pseudo-polar and Cartesian coordinate systems, respectively, when the collaborative imaging system works under the designed spatial configuration. (c) and (f) Corresponding wavenumber spectra of (a) and (d) under the two spatial configurations.

However, it is very challenging in practice to control multiple UAVs to fly precisely at the ideal heights, and thus height deviations are inevitable. We conduct a simulation to investigate the effect of height deviations on imaging performance. In the simulation, the actual heights of the six UAVs are listed in Table IV, where the height deviations obey a Gaussian distribution with unit variance and mean δh, and upward and downward deviations are equally likely. Fig. 12 shows the imaging results at different levels of height deviation. Fig. 12(a), (c), and (e) displays the wavenumber spectra at different levels of height deviation. It can be seen that the misalignment of the wavenumber spectrum along the radial wavenumber becomes more severe as the height deviation increases. Fig. 12(b), (d), and (f) shows the imaging results of point 9 at different levels of height deviation. Obviously, as the height deviation increases, the imaging results become more deformed and the sidelobes more irregular, but the resolution is not significantly degraded.

TABLE IV Flight Heights of UAVs at Different Levels of Deviation
Fig. 12. Imaging results at different levels of height deviation. (a), (c), and (e) Wavenumber spectra when height deviations obey the Gaussian distribution with the mean of 1, 5, and 10, respectively. (b), (d), and (f) Corresponding imaging results of a single point target.

The effect of the beat frequency offset is also a concern. The main simulation parameters are shown in Table II, and the beat frequency offset is set to 30, 40, and 50 MHz, respectively. The imaging results of the collaborative imaging system with different beat frequency offsets are given in Fig. 13. Thanks to the designed configuration, the results show no noticeable change as the frequency offset increases.

Fig. 13. Effect of the beat frequency offset on the collaborative imaging performance. (a)–(c) Result of the collaborative imaging with the beat frequency offset of 30, 40, and 50 MHz, respectively.

Furthermore, for the collaborative imaging system with an imaging frame rate of 2.6 Hz, the influence of the number of UAVs on imaging performance is investigated. Fig. 14 shows the imaging results when the number of UAVs is 2, 4, and 6, respectively. It can be seen that as the number of UAVs increases, the resolution improves. Consequently, by increasing the number of UAVs, the collaborative imaging system can obtain SAR images with both high frame rate and high resolution.

Fig. 14. Influence of the number of UAVs on collaborative imaging performance. (a)–(c) Imaging result with two, four, and six UAVs, respectively.

In addition, we investigate the influence of trajectory errors on imaging performance. Fig. 15(a) gives the imaging result under trajectory errors, where the energy is significantly defocused along the azimuth direction. Fortunately, the AFBP algorithm establishes a Fourier-transform relationship between the image domain and the wavenumber domain, and thus traditional autofocusing algorithms can be employed to compensate for the effects of trajectory errors. By employing the phase gradient autofocus (PGA) algorithm [31], we obtain a focused imaging result despite the presence of trajectory errors among the multiple UAV platforms, as shown in Fig. 15(b). Therefore, the proposed approach has the potential to mitigate the effect of trajectory errors.

Fig. 15. Imaging results under the same trajectory error (a) before and (b) after autofocusing.

Finally, a collaborative imaging simulation is used to investigate the effect of the number of UAVs on moving-target shadows. The collaborative imaging results with the same resolution ρa = 0.3 m but different numbers of UAVs (1, 2, 4, and 6) are shown in Fig. 16. Four moving targets with different velocities (5, 10, 15, and 20 m/s) are arranged in a homogeneous scene. The equivalent backscatter coefficients of noise and background in the homogeneous scene are set to −34.5 and −16.5 dB, respectively. It should be pointed out that collaborative imaging degenerates to video SAR imaging when the number of UAVs decreases to one. Compared to the traditional video SAR system, the collaborative imaging system relies on more platforms to create a large aperture within a shorter synthetic time, which results in a shorter target movement distance. Therefore, moving targets in imagery obtained by the collaborative imaging system are more likely to produce distinct shadows, which is highly desirable for shadow detection. Moreover, increasing the number of UAVs makes shadows more distinct and therefore significantly improves the maximum detectable velocity.

Fig. 16. Simulation results of moving target shadow with different target velocities. (a)–(d) Result with one, two, four, and six UAVs, respectively.

SECTION V. Conclusion

Collaborative operation of airborne radar networks had received little research interest until recently. This article is, to the best of our knowledge, the first to discuss high frame-rate collaborative imaging in the research field related to the emerging technology of swarms of UAV-borne radars.

The collaborative imaging system consists of multiple self-transmitting and self-receiving UAV-borne radars. Each UAV-borne radar acquires short-aperture data in a short time, and the multiple aperture data are then fused to achieve the desired azimuth resolution using the AFBP algorithm. Moreover, the BFD waveform is introduced for collaborative operation of the distributed UAV-borne radars, and a useful spatial configuration is designed to suppress the effect of the BFD waveform on collaborative imaging. Simulation results have been presented to assess the performance of the proposed approach.


New Multi-Mode Radar Incorporates Video SAR | Unmanned Systems Technology

Caroline Rees


General Atomics Aeronautical Systems, Inc. (GA-ASI) has released the Eagle Eye radar, a new Multi-mode Radar (MMR) that has been installed and flown on a U.S. Army-operated Gray Eagle Extended Range (GE-ER) Unmanned Aerial Vehicle (UAV). 

Eagle Eye is a high-performance radar system that delivers high-resolution, photographic-quality imagery that can be captured through clouds, rain, dust, smoke and fog at multiple times the range of previous radars. It’s a drop-in solution for GE-ER and is designed to meet the range and accuracy to Detect, Identify, Locate & Report (DILR) stationary and moving targets relevant for Multi-Domain Operations (MDO) with Enhanced Range Cannon Artillery (ERCA). Eagle Eye radar can deliver precision air-to-surface targeting accuracy and superb wide-area search capabilities in support of Long-Range Precision Fires.

Featuring Synthetic Aperture Radar (SAR), Ground/Dismount Moving Target Indicator (GMTI/DMTI), and robust Maritime Wide Area Search (MWAS) modes, Eagle Eye’s search modes provide the wide-area coverage for any integrated sensor suite, allowing for cross-cue to a narrow Field-Of-View (FOV) Electro-Optical/Infrared (EO/IR) sensor.

The Eagle Eye’s first flight on the Army GE-ER aircraft took place in December 2021, incorporating the new Video SAR capability. Video SAR enables continuous collection and processing of radar data, allowing persistent observation of targets day or night and during inclement weather or atmospheric conditions. In addition, Eagle Eye’s processing techniques enable three modes – SAR Shadow Moving Detection, SAR Stationary Vehicle Detection and Moving Vehicle Detection as part of its Moving Target Indicator – to operate simultaneously.  

“The Video SAR in Eagle Eye provides all-weather tracking and revolutionizes precision targeting of both moving and stationary targets at the same time,” said GA-ASI Vice President of Army Programs Don Cattell. “This is a critical capability in an MDO environment to ensure military aviation, ground force and artillery have constant situational awareness and targeting of enemy combatants.”



GA-ASI Introduces New Eagle Eye Radar

New Radar Flies on U.S. Army Gray Eagle UAS; Features New Video SAR Capability

SAN DIEGO – 05 April 2022 – General Atomics Aeronautical Systems, Inc. (GA-ASI), a leader in Multi-mode Radar technology for Unmanned Aircraft Systems, introduces the Eagle Eye radar. The new MMR is installed and has flown on a U.S. Army-operated Gray Eagle Extended Range (GE-ER) UAS. Eagle Eye joins GA-ASI’s line of radar products, which includes the Lynx® MMR.


About GA-ASI

General Atomics Aeronautical Systems, Inc. (GA-ASI), an affiliate of General Atomics, is a leading designer and manufacturer of proven, reliable remotely piloted aircraft (RPA) systems, radars, and electro-optic and related mission systems, including the Predator® RPA series and the Lynx® Multi-mode Radar. With more than seven million flight hours, GA-ASI provides long-endurance, mission-capable aircraft with integrated sensor and data link systems required to deliver persistent flight that enables situational awareness and rapid strike. The company also produces a variety of ground control stations and sensor control/image analysis software, offers pilot training and support services, and develops meta-material antennas. For more information, visit www.ga-asi.com

Avenger, Lynx, Predator, SeaGuardian and SkyGuardian are registered trademarks of General Atomics Aeronautical Systems, Inc.

ga-asi Apr 5, 2022

 
