Flying Objects Classification Based on Micro–Doppler Signature Data From UAV Borne Radar
Fig. 2. Flying object data sample. (a) ULA, (b) URA, and (c) UCA.
SECTION I. Introduction
Unmanned aerial vehicles (UAVs) have seen rapid technological advancement in recent years, enabling their extensive use in various applications [1]. Although UAVs are receiving growing attention from public and private sectors, they also constitute a serious threat to airspace safety and may jeopardize civilian privacy and security. The use of civilian drones in national airspace has raised concerns about unlicensed and inexperienced pilots entering restricted areas and interfering with flight systems. The most problematic scenario is the use of UAVs for illegal surveillance and terrorist attacks [2]. The deployment of anti-drone equipment is therefore urgently required. Such a drone defense system must detect, recognize, and track a drone's movements. Various video, audio, radar, and radio frequency (RF) surveillance technologies are used for micro-UAV detection and categorization [3].
Radar is often preferred because it operates in all weather conditions. Deep learning (DL) has grown tremendously in recent years. With the ability to select, extract, and analyze features from raw datasets without relying on manual feature selection and extraction, DL approaches are recognized as among the most influential and successful techniques. If the initial hyperparameters are adequately calibrated, DL approaches can adapt to the diversity of the presented dataset without degrading classification performance, making these algorithms highly efficient and time-effective [4]. DIAT-
The work in this letter has the following contributions:
Data acquired by an array of HB100 radars mounted on a UAV are used to classify flying objects (drones, helicopters, and artificial birds). The effects of various HB100 array orientations and placements, namely the uniform linear array (ULA), uniform rectangular array (URA), and uniform circular array (UCA), are explored. Furthermore, based on the experimental data, the activities or modes of the drones are classified.
A hybrid algorithm is proposed and used to categorize the flying objects and to identify the drone activities for the various configurations of the HB100 radar array.
The classification accuracy of the proposed algorithm is analyzed by taking different parameters, such as batch size, population size (PS), and array configuration, into account.
Note: Matrices and vectors are represented by boldface letters. In this work, the terms drone and UAV are used interchangeably. Section II describes the experimental setup and the dataset.
SECTION II. Experimental Set-Up and Dataset Description
The framework of the experimental setup for categorizing the flying objects and identifying their activities is shown in Fig. 1. The HB100 radar mounted on a UAV captures the micro-Doppler effect caused by the flying objects depicted in Fig. 2. The transmitted signal, after acquiring the micro-Doppler effect generated by the targeted objects, is reflected back to the radar, and the result is obtained at the intermediate frequency (IF) terminal [7]. Because the radar output is on the order of microvolts, the radar is connected to an amplifier circuit (AC) to strengthen the signal [8]. A Zigbee module is linked to the AC output to transfer the data to a desktop or laptop for further MATLAB processing, such as data preprocessing, formatting, dataset generation, and subsequent analysis. A more detailed composition of the dataset is given in Table I and Fig. 3(a). In Table I, the term "Ratio" denotes the number of samples of a particular class divided by the total number of samples over all classes. The dataset is split into training, testing, and validation sets in a 70:20:10 ratio.
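For illustration, a minimal sketch of this preprocessing chain is given below, assuming the amplified IF output arrives over the Zigbee link as a one-dimensional sample array. The sampling rate, STFT window length, and split routine are illustrative assumptions, not parameters reported in this letter.

```python
# Minimal sketch of the preprocessing chain, assuming the amplified IF output
# is available as a 1-D NumPy array. fs, nperseg, and the split logic are
# illustrative assumptions, not values taken from this letter.
import numpy as np
from scipy.signal import stft

def if_to_mds(if_samples, fs=5_000, nperseg=256):
    """Turn a raw IF recording into a micro-Doppler spectrogram in dB."""
    _, _, Z = stft(if_samples, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
    return 20.0 * np.log10(np.abs(Z) + 1e-12)

def split_70_20_10(samples, labels, seed=0):
    """Shuffle and split the dataset 70:20:10 into train/test/validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(samples))
    n_train, n_test = int(0.7 * len(samples)), int(0.2 * len(samples))
    train, test, val = np.split(idx, [n_train, n_train + n_test])
    return ((samples[train], labels[train]),
            (samples[test], labels[test]),
            (samples[val], labels[val]))
```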
Algorithm 1: Proposed Algorithm
CNN:
for l = 1 to 4
    Convolution: Z = W_j · Y + B
    ReLU: Z' = 0 if Z ≤ 0, Y if Z > 0
    Pooling: X = max(Z')
end for
Output Prediction: True Positive, True Negative, False Positive, False Negative
FF: Calculate the accuracy [5].
Q(k) = [FF_1(k), FF_2(k), ..., FF_j(k)]
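For illustration, the CNN loop of Algorithm 1 can be sketched directly in NumPy with one two-dimensional filter per layer. The kernel size, input size, and pooling factor below are assumptions, and the ReLU is written in its standard form (the positive part of Z).

```python
# Literal NumPy sketch of Algorithm 1's four stages of convolution
# (Z = W_j . Y + B), ReLU, and max pooling. Filter count, kernel size,
# and pool size are illustrative assumptions.
import numpy as np
from scipy.signal import convolve2d

def cnn_forward(Y, weights, biases, pool=2):
    """Run the 4-layer convolution/ReLU/max-pooling cascade."""
    X = Y
    for l in range(4):
        # Convolution: Z = W_j . Y + B (one 2-D filter per layer for brevity)
        Z = convolve2d(X, weights[l], mode="valid") + biases[l]
        # ReLU: keep positive responses, zero out the rest
        Z_prime = np.where(Z > 0, Z, 0.0)
        # Pooling: X = max(Z') over non-overlapping pool x pool windows
        h, w = Z_prime.shape
        h, w = h - h % pool, w - w % pool
        X = Z_prime[:h, :w].reshape(h // pool, pool, w // pool, pool).max(axis=(1, 3))
    return X

# Example: random spectrogram patch and randomly initialised filters
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 3)) for _ in range(4)]
biases = [0.0] * 4
features = cnn_forward(rng.standard_normal((64, 64)), weights, biases)
```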
SECTION III. Proposed Algorithm
The proposed algorithm combines a convolutional neural network (CNN) and the shuffled frog leaping algorithm (SFLA). CNNs are commonly employed in pattern recognition because of their capacity to extract and categorize features effectively [9]. SFLA is a metaheuristic optimization technique inspired by the foraging behavior of frogs [10]. It employs a global search, in which successful solutions are exchanged among different groups (memeplexes) of frogs, and a local search, in which solutions are improved by altering their components. Together, the CNN and SFLA provide effective feature extraction, classification, and optimization. To train the CNN on the micro-Doppler signature (MDS) data, the population is initialized by randomly assigning weights to a known number of filters. The evaluation entails feeding the experimental MDS,
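A minimal sketch of the SFLA weight update described above is given below: candidate filter-weight vectors ("frogs") are scored by a fitness function, sorted, partitioned into memeplexes for local search, and shuffled for the global step. The population size, memeplex count, and toy fitness function are placeholders; in the hybrid algorithm, the fitness would be the classification accuracy of the CNN built from the candidate weights.

```python
# Hedged sketch of a standard SFLA loop over a flat weight vector; all
# numerical settings and the surrogate fitness below are assumptions.
import numpy as np

def sfla_optimise(fitness, dim, pop_size=50, n_memeplexes=5,
                  n_generations=100, n_local_steps=5, seed=0):
    rng = np.random.default_rng(seed)
    frogs = rng.standard_normal((pop_size, dim))        # candidate weight vectors
    for _ in range(n_generations):
        # Sort the population by fitness (best first) and record the global best
        order = np.argsort([fitness(f) for f in frogs])[::-1]
        frogs = frogs[order]
        best_global = frogs[0].copy()
        # Round-robin partition into memeplexes, then local search in each
        for m in range(n_memeplexes):
            idx = np.arange(m, pop_size, n_memeplexes)
            for _ in range(n_local_steps):
                scores = np.array([fitness(frogs[i]) for i in idx])
                worst = idx[np.argmin(scores)]
                best_local = idx[np.argmax(scores)]
                # Move the worst frog toward the memeplex best
                candidate = frogs[worst] + rng.random() * (frogs[best_local] - frogs[worst])
                if fitness(candidate) <= fitness(frogs[worst]):
                    # Fall back to the global best, then to a random restart
                    candidate = frogs[worst] + rng.random() * (best_global - frogs[worst])
                    if fitness(candidate) <= fitness(frogs[worst]):
                        candidate = rng.standard_normal(dim)
                frogs[worst] = candidate
        rng.shuffle(frogs)                               # global shuffle / information exchange
    return frogs[np.argmax([fitness(f) for f in frogs])]

# In the hybrid scheme, fitness would rebuild the CNN from the flat weight
# vector and return its validation accuracy; a toy surrogate is used here.
best_weights = sfla_optimise(lambda w: -float(np.sum(w ** 2)), dim=36)
```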
SECTION IV. Result and Discussion
The proposed algorithm is assessed on the MDS data gathered from the HB100 radar array mounted on a UAV. The data are collected for different arrangements of the HB100 antenna array (ULA, URA, and UCA). Tables II–IV report the classification accuracy for the ULA, URA, and UCA, respectively, obtained by varying the batch size (32, 64, and 128) and the number of iterations at different angles (0°, 30°, and 45°). Results are given both for the four-class problem (artificial bird, helicopter, drone 1, and drone 2) and for the eight-class problem that additionally separates the UAV activities listed in Table I (artificial bird; helicopter; drone 1 in the on/off/connected, flying, and hovering modes; and drone 2 in the on/off/connected, flying, and hovering modes). It can be seen that the ULA accuracy is lower near 0° than at 30° and 45°, whereas the URA works well near 0°. This is due to the nonuniform operation of the individual antenna elements in the ULA [12]. The ULA's field of view is therefore constrained, and it cannot recognize or evaluate targets accurately in all directions. The URA, on the other hand, has a symmetric array structure that enables a nearly 360° field of view with little variation in beamwidth or sidelobe level. The reference-point distribution of the URA also allows higher resolution than that of the UCA: the circular layout of the UCA limits its ability to distinguish widely separated targets, whereas the URA can do so more effectively.
Another finding is that accuracy grows with batch size, up to a batch size of 128 and 500 iterations, after which it declines in every case. The highest accuracy in this experimental study is obtained at a batch size of 128 with 500 iterations. This is because the proposed algorithm updates the weights both locally and globally, so accuracy can improve even at a batch size of 128. However, accuracy declines beyond 500 iterations, since the proposed algorithm tends to overfit after many iterations with large batch sizes.
Three population sizes, PS = 25, 50, and 60, are considered. Tables II–IV show that PS = 50 provides the best accuracy in all cases. A smaller population offers insufficient diversity in the search space, while a larger population increases the complexity of the categorization, which is why PS = 60 yields lower accuracy than PS = 50. A suitable value must therefore be chosen, because both low and high populations degrade accuracy: smaller populations suffer from the weights' limited search space, and larger populations from the added complexity.
The proposed algorithm is a promising approach for classification in real-world scenarios. Figs. 4 and 5 compare the classification accuracy of the proposed algorithm with that of the existing algorithms [5], [13], [14], [15] for the four-class and eight-class cases, respectively. The proposed algorithm performs better in every case considered. As the convolutional layers expose more information about the distinctive characteristics of the different flying objects, accurate classification becomes possible. Utilizing the sorting, subgrouping, and replacement building blocks of the SFLA also helps avoid early convergence and explores additional diversity without running into overfitting issues. However, the proposed algorithm requires considerable computing power, which may be a drawback for real-time applications. It is also sensitive to its hyperparameters, which must be tuned carefully to achieve accurate results. From an application perspective, a high level of accuracy is crucial, as misclassification of flying objects can lead to false alarms or missed detections. The proposed method is observed to outperform the existing methods, and this high accuracy translates directly into the reliability of the system in real-world applications such as airspace surveillance and flying-object classification.
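As a minimal sketch of how such accuracy and error figures can be derived, the snippet below builds a per-class confusion matrix and reports the overall accuracy together with per-class false-alarm and miss rates; the labels and predictions shown are dummy placeholders, not results from this letter.

```python
# Illustrative computation of accuracy, false-alarm, and miss rates from a
# confusion matrix; inputs are placeholders.
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def summary(cm):
    total = cm.sum()
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp            # other classes predicted as this class
    fn = cm.sum(axis=1) - tp            # this class predicted as another class
    tn = total - tp - fp - fn
    accuracy = tp.sum() / total
    false_alarm = fp / np.maximum(fp + tn, 1)   # per-class false-alarm rate
    miss = fn / np.maximum(fn + tp, 1)          # per-class miss rate
    return accuracy, false_alarm, miss

cm = confusion_matrix([0, 1, 2, 3, 1], [0, 1, 2, 2, 1], n_classes=4)
print(summary(cm))
```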
Fig. 6 depicts the sensitivity of the ULA, URA, and UCA. The sensitivity of an array configuration is determined by the placement of its antennas as well as the angle of arrival (AoA). The mathematical relation between sensitivity and array configuration is given in [12].
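As a generic illustration only (not the specific sensitivity relation of [12]), the sketch below evaluates narrowband array factors for a ULA and a UCA at the angles considered in this letter; the element count and half-wavelength spacing/radius are assumptions.

```python
# Generic narrowband array-factor sketch showing how the unsteered response of
# a ULA and a UCA varies with angle of arrival; n, d/lambda, and r/lambda are
# illustrative assumptions.
import numpy as np

def ula_response(theta_deg, n=8, d_over_lambda=0.5):
    """Magnitude of the ULA array factor toward theta (0 deg = endfire)."""
    theta = np.deg2rad(theta_deg)
    phases = 2 * np.pi * d_over_lambda * np.arange(n) * np.cos(theta)
    return np.abs(np.exp(1j * phases).sum()) / n

def uca_response(theta_deg, n=8, r_over_lambda=0.5):
    """Magnitude of the UCA array factor in the azimuth plane toward theta."""
    theta = np.deg2rad(theta_deg)
    phi = 2 * np.pi * np.arange(n) / n                 # element angles on the circle
    phases = 2 * np.pi * r_over_lambda * np.cos(theta - phi)
    return np.abs(np.exp(1j * phases).sum()) / n

for angle in (0, 30, 45):
    print(angle, round(ula_response(angle), 3), round(uca_response(angle), 3))
```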
SECTION V. Conclusion
For an intruder monitoring system to function properly over a restricted area, flying objects must be classified. For this application, the proposed strategy outperforms the prevailing techniques in categorizing flying objects together with their activities. The main parameters of the CNN model are its weights, which are updated via the SFLA; this also prevents the process from converging abruptly. The ULA, URA, and UCA have been explored in various orientations, with the URA excelling in the endfire direction and the ULA performing better at 30° and 45°. The proposed technique attains accuracies of 98.4% (URA, endfire direction), 99.7% (ULA, 30°), and 99.9% (ULA, 45°) for the four classes, and 98.1% (URA), 99.4% (ULA, 30°), and 99.6% (ULA, 45°) for the eight classes.