Thursday, December 25, 2025

FMCW Radar Technology Emerges as Privacy-Preserving Solution for Human Activity Recognition



FMCW Radar Principles and Human Activity Recognition Systems: Foundations, Techniques, and Applications

This book introduces the theoretical foundations of FMCW radar systems, including range and velocity estimation, signal processing techniques, and the generation of radar point clouds. A detailed discussion of Python and MATLAB as the primary programming tools for radar signal processing is provided, including the integration of libraries like NumPy, Matplotlib, and SciPy for data analysis and visualization. In addition, the book covers advanced techniques such as deep learning applications for radar signal processing, focusing on Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Transformers for analyzing radar data. Furthermore, it highlights state-of-the-art methods for human activity recognition using radar, leveraging a combination of traditional signal processing techniques and machine learning models. The book is designed to cater to both beginners and experts in radar signal processing, offering practical examples, code implementations, and insights into the future of radar technology in various domains, including autonomous systems and security applications.
Comments: 203 pages
Subjects: Signal Processing (eess.SP)
Cite as: arXiv:2410.08483 [eess.SP]
  (or arXiv:2410.08483v2 [eess.SP] for this version)
  https://doi.org/10.48550/arXiv.2410.08483


BLUF (Bottom Line Up Front)

Frequency-modulated continuous wave (FMCW) radar technology has matured into a highly accurate, privacy-preserving platform for human activity recognition (HAR), achieving recognition accuracies exceeding 97% in recent studies while operating through walls and in complete darkness. This millimeter-wave technology is rapidly displacing camera-based systems in healthcare monitoring, elderly fall detection, and smart home applications due to its unique combination of sub-millimeter precision, environmental robustness, and absence of visual privacy concerns. The convergence of FMCW radar with deep learning architectures—particularly convolutional neural networks, transformers, and point cloud processors—has enabled real-time detection of complex human activities ranging from daily living tasks to emergency fall events, with commercial deployment now accelerating across healthcare facilities, smart buildings, and autonomous vehicle platforms.


The Radar Revolution in Human Sensing

Human activity recognition has long relied on either wearable sensors that users find burdensome or camera systems that raise significant privacy concerns. FMCW radar technology offers a fundamentally different approach, using electromagnetic waves in the 57-64 GHz and 76-81 GHz millimeter-wave bands to detect and classify human movements with millimeter-level precision while capturing no identifiable visual information.

Unlike traditional pulsed radar, FMCW systems transmit continuous waves with linearly increasing frequency—chirp signals that sweep across several gigahertz of bandwidth. When these signals reflect off moving objects, the radar measures both the range (distance) and velocity of targets by analyzing the frequency difference between transmitted and received signals. This technique, refined over decades in military and automotive applications, has recently undergone a dramatic miniaturization that makes it practical for indoor human monitoring.

The technology's recent prominence stems from a confluence of factors: commercial availability of single-chip radar sensors operating at 60 GHz and 77 GHz, dramatic improvements in deep learning algorithms for radar signal interpretation, and growing awareness of privacy concerns with camera-based monitoring systems. According to a comprehensive 203-page technical book, revised in December 2025 by researchers at Purdue University and Emory University, FMCW radar has emerged as "the technology of choice for privacy-sensitive applications where continuous, non-invasive monitoring is essential."

Technical Foundations: From Chirps to Classifications

The signal processing pipeline for FMCW radar-based HAR involves multiple sophisticated stages. The radar transmits a linear frequency-modulated chirp signal, typically sweeping from 60 to 64 GHz in industrial applications or 77 to 81 GHz in automotive contexts. When this signal reflects from a human body, the returned echo contains frequency shifts proportional to both the target's distance and velocity.

The radar receiver extracts a "beat frequency" by mixing the transmitted and received signals, creating an intermediate frequency signal that contains all the range and Doppler information. This IF signal then undergoes a three-stage fast Fourier transform (FFT) process: the first FFT estimates range by analyzing frequency content, the second estimates Doppler shift (velocity) across multiple chirps, and the third determines angular position using multiple receive antennas in MIMO (multiple-input multiple-output) configurations.
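The first two FFT stages can be sketched in NumPy on synthetic IF data. The chirp parameters below are illustrative assumptions loosely modeled on a 60 GHz sensor, not values from any particular datasheet:

```python
import numpy as np

c = 3e8                 # speed of light (m/s)
B = 4e9                 # chirp bandwidth (Hz)
Tc = 50e-6              # chirp duration (s)
fc = 60e9               # carrier frequency (Hz)
fs = 5e6                # IF sample rate (Hz)
n_samples = 250         # samples per chirp (= fs * Tc)
n_chirps = 64           # chirps per frame

slope = B / Tc          # chirp slope (Hz/s)
t = np.arange(n_samples) / fs

# Simulate the IF signal for one target at 3 m, moving toward the radar
# at 1 m/s: the beat frequency carries range, the carrier phase across
# chirps carries Doppler.
R, v = 3.0, -1.0
frame = np.zeros((n_chirps, n_samples), dtype=complex)
for k in range(n_chirps):
    r_k = R + v * k * Tc                  # target range at chirp k
    f_beat = 2 * slope * r_k / c          # beat frequency from range
    phase = 4 * np.pi * fc * r_k / c      # Doppler-carrying phase term
    frame[k] = np.exp(1j * (2 * np.pi * f_beat * t + phase))

# FFT stage 1 (range) along fast time, stage 2 (Doppler) along slow time.
rd_map = np.fft.fft(frame, axis=1)
rd_map = np.fft.fftshift(np.fft.fft(rd_map, axis=0), axes=0)

# Convert the peak bin back to physical units.
dopp_bin, range_bin = np.unravel_index(np.abs(rd_map).argmax(), rd_map.shape)
range_res = c / (2 * B)                       # range resolution
v_res = c / (2 * fc * n_chirps * Tc)          # velocity resolution
est_range = range_bin * range_res
est_v = (dopp_bin - n_chirps // 2) * v_res
print(f"estimated range = {est_range:.2f} m, velocity = {est_v:.2f} m/s")
```

The recovered velocity is quantized to the Doppler bin width (about 0.78 m/s here), which is why longer frames with more chirps yield finer velocity estimates.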

This processing generates what researchers call a "radar data cube"—a three-dimensional matrix containing range, velocity, and angle information. From this cube, algorithms can extract two primary data representations: Range-Doppler maps, which show how radar reflections are distributed across distance and velocity space, and radar point clouds, which represent detected targets as sets of three-dimensional points similar to lidar data.
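Turning a range-Doppler map into discrete point detections typically involves a constant false alarm rate (CFAR) detector. A minimal cell-averaging CFAR sketch, with illustrative window sizes and threshold, might look like:

```python
import numpy as np

def ca_cfar_2d(power, guard=2, train=4, scale=8.0):
    """Cell-averaging CFAR: flag cells whose power exceeds `scale` times
    the mean of the surrounding training cells (guard band excluded)."""
    n_r, n_c = power.shape
    half = guard + train
    detections = []
    for i in range(half, n_r - half):
        for j in range(half, n_c - half):
            window = power[i - half:i + half + 1, j - half:j + half + 1].copy()
            window[train:-train, train:-train] = np.nan  # blank guard + cell under test
            noise = np.nanmean(window)                   # local noise estimate
            if power[i, j] > scale * noise:
                detections.append((i, j))
    return detections

# Flat noise floor with a single strong reflector in the middle.
power = np.ones((32, 32))
power[16, 16] = 100.0
hits = ca_cfar_2d(power)
print(hits)   # [(16, 16)]
```

Detections that survive CFAR, together with their angle estimates from the third FFT, are the points that populate the radar point cloud.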

Different human activities create distinctive signatures in these representations. Walking produces a characteristic pattern of periodic motion with consistent velocity, while a fall generates a rapid downward acceleration followed by stationary behavior. Sitting down shows a brief period of motion followed by stillness at a specific range, whereas gestures like waving create localized velocity patterns without significant range change.
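These signature differences can be caricatured with a toy rule applied to a radial-velocity track; the thresholds and synthetic tracks below are illustrative guesses, not values from any cited system:

```python
import numpy as np

def classify_track(velocity, fall_speed=1.5, still_eps=0.1):
    """Toy rule mirroring the signatures above: a fall shows a short burst
    of high radial speed followed by stillness; walking shows sustained
    periodic motion. Thresholds are illustrative, not tuned values."""
    v = np.asarray(velocity)
    peak = np.abs(v).max()
    tail_still = np.abs(v[-len(v) // 4:]).max() < still_eps
    if peak > fall_speed and tail_still:
        return "fall"
    if np.abs(v).mean() > still_eps:
        return "walk"
    return "stationary"

t = np.linspace(0, 4, 200)
walk = 0.8 * np.sin(2 * np.pi * 1.5 * t) + 0.5       # periodic gait motion
fall = np.where(t < 0.6, -3.0 * t / 0.6, 0.0)        # rapid burst, then stillness
print(classify_track(walk), classify_track(fall))    # walk fall
```

Real systems learn these distinctions from data rather than hand-coding them, but the underlying physical cues are the same.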

Deep Learning Architectures Transform Radar Data Into Recognition

The application of deep learning to radar signal processing has proven transformative. Convolutional Neural Networks (CNNs) excel at extracting spatial features from Range-Doppler maps, treating them as images and identifying patterns such as the characteristic "micro-Doppler signature" of human movement—the unique frequency shifts caused by motion of body parts, breathing, and heartbeat.

A study published in MDPI Sensors in May 2023 demonstrated the power of multi-domain feature fusion, combining time-Doppler and time-range maps using a Multi-Feature Attention Fusion Network (MFAFN) that achieved 97.58% accuracy on the University of Glasgow's publicly available HAR dataset. The network incorporated channel attention mechanisms to emphasize the most informative features while suppressing noise and irrelevant information.

For temporal pattern recognition, Long Short-Term Memory (LSTM) networks have proven essential. These recurrent architectures maintain memory of previous time steps, enabling them to distinguish activities that unfold over time. Recent research published in Applied Sciences in July 2025 on elderly fall detection combined LSTM with attention mechanisms, allowing the network to focus on the most critical moments in an activity sequence—particularly important for detecting the transition from normal movement to a fall event.

The most sophisticated current systems employ hybrid CNN-LSTM architectures that first extract spatial features from each radar frame using convolutional layers, then feed these feature sequences into LSTM layers to model temporal evolution. This approach proved particularly effective in a June 2025 study published in Measurement Science & Technology, where researchers achieved robust activity recognition from arbitrary orientations using a lightweight spatio-spectro-temporal network with the Texas Instruments IWR6843ISK radar sensor.

Perhaps most intriguingly, point cloud processing networks originally developed for lidar data have been adapted for radar. PointNet and PointNet++ architectures process radar point clouds directly, learning permutation-invariant features that work regardless of the number or order of detected points. A November 2024 study published in PMC demonstrated the first successful multi-person posture recognition system using FMCW radar point clouds, employing DenseNet architecture with a 12-transmit, 16-receive antenna MIMO configuration to simultaneously track two subjects.
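The permutation invariance that makes PointNet-style networks suitable for radar point clouds comes from a shared per-point transform followed by a symmetric pooling operation. A minimal NumPy sketch with stand-in weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def pointnet_style_features(points, W, b):
    """Shared per-point layer plus max-pooling: the pooled feature cannot
    depend on point order, which is why such networks handle radar point
    clouds of varying size and ordering. W, b are stand-in weights."""
    h = np.maximum(points @ W + b, 0.0)   # shared per-point layer + ReLU
    return h.max(axis=0)                  # symmetric (order-invariant) pool

points = rng.normal(size=(40, 3))         # toy point cloud of (x, y, z) returns
W = rng.normal(size=(3, 16))
b = rng.normal(size=16)

f1 = pointnet_style_features(points, W, b)
f2 = pointnet_style_features(points[::-1], W, b)   # same points, reordered
print(np.allclose(f1, f2))   # True
```

A full PointNet stacks several such shared layers and adds learned alignment transforms, but the max-pooling step is what confers the invariance.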

Transformer architectures, featuring self-attention mechanisms, represent the latest frontier. These networks can model long-range dependencies in radar sequences and have shown promise in applications from maritime radar prediction to radar pulse deinterleaving. A lightweight hybrid Vision Transformer (LH-ViT) proposed in Scientific Reports in October 2023 combines efficient convolution operations with transformer self-attention to achieve both high accuracy and computational efficiency.

Hardware Platforms Enable Practical Deployment

Commercial radar sensors have become remarkably capable and affordable. Texas Instruments' IWR6843 series, operating at 60-64 GHz, integrates RF front-end, analog-to-digital converters, and processing cores in a single chip measuring just millimeters across. These sensors can detect motion at ranges exceeding 6 meters with millimeter-resolution accuracy while consuming as little as 988 microamperes in low-power motion detection mode, according to Texas Instruments' technical documentation.

The IWR6843ISK evaluation module, widely used in research, features a MIMO antenna configuration providing 120-degree azimuth field of view and can output processed point clouds via USB at rates sufficient for real-time activity recognition. Its companion sensor, the IWR6843AOP, integrates the antenna array directly into the chip package, enabling even more compact designs for consumer products.

For automotive and industrial applications, the AWR series operates at 76-81 GHz with 4 GHz of bandwidth, providing finer range resolution. These sensors have been successfully deployed in adaptive cruise control systems and industrial safety applications where detecting human presence in hazardous zones is critical.
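The finer resolution follows directly from the theoretical FMCW range resolution formula, c / (2B):

```python
c = 3e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    """Theoretical FMCW range resolution: c / (2B)."""
    return c / (2 * bandwidth_hz)

print(range_resolution(4e9))   # 4 GHz sweep (77-81 GHz): 0.0375 m
print(range_resolution(1e9))   # 1 GHz sweep: 0.15 m
```

Quadrupling the swept bandwidth thus shrinks the resolvable range cell from 15 cm to under 4 cm, which is what separates closely spaced body parts in the range dimension.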

Low-power variants like the IWRL6432 achieve sub-milliwatt operation in motion detection mode, making battery-powered applications practical. This sensor powers smart home applications from occupancy detection in office pods to presence sensing in smart TVs, where it enables automatic adjustment of lighting and ventilation based on occupancy.

Healthcare Applications: Fall Detection and Beyond

Elderly fall detection represents perhaps the most compelling application of FMCW radar technology. Falls are the leading cause of injury among people over 65, with rapid detection and response being critical to minimizing harm. Camera-based systems raise obvious privacy concerns in bathrooms and bedrooms, while wearable devices suffer from compliance issues—elderly users often forget to wear them or remove them when showering, precisely when falls are most likely.

A comprehensive survey published in IEEE Robotics & Automation Magazine in September 2024 analyzed radar-based fall detection systems, emphasizing micro-Doppler, Range-Doppler, and Range-Doppler-Angles techniques. The survey noted that deep learning approaches, particularly CNNs and RNNs, significantly outperform traditional machine learning methods like Support Vector Machines and k-Nearest Neighbors in learning intricate features from large, unstructured datasets.

Recent systems achieve impressive performance metrics. A fall detection method using millimeter-wave radar coupled with voxelized point clouds, presented at the 2024 International Joint Conference on Robotics and Artificial Intelligence, addresses the challenge that "wearable devices are inconvenient and costly, while camera systems pose risks to privacy." The system leverages deep learning to function even with sparse radar data, a common characteristic of millimeter-wave point clouds compared to dense lidar returns.

A homecare fall detection study using mmWave technology, published in March 2025, systematically evaluated factors influencing detection accuracy, including distance, angle, radar deployment configuration, and obstacles. Testing with multiple radar sensor sets demonstrated near-perfect fall detection at optimal configurations (ROC AUC = 1.000, p < 0.001). The researchers recommended deploying two radars diagonally at +45° and -45°, covering distances from 0.5 to 6.0 meters.

Privacy-sensitive environments present unique challenges. A study published in PMC in April 2025 examined radar-based activity recognition specifically in bathroom settings using an Infineon BGT60TR13C XENSIV 60 GHz radar. Testing 16 different pre-trained feature extraction networks combined with bidirectional LSTM, the researchers achieved 97.02% overall accuracy with DenseNet201, demonstrating that radar can effectively monitor activities of daily living without capturing identifiable biometric features.

The challenge of domain generalization—ensuring systems work for new users with different body types and movement patterns—has been addressed through advanced training techniques. A study published in Tsinghua Science and Technology in September 2025 constructed a fall detection model based on anomaly detection, training only on non-fall samples and detecting falls as abnormal actions. The model uses domain feature alignment to extract domain-invariant features, improving generalization to new users without requiring elderly individuals to repeatedly fall during data collection—an unethical practice that has limited some earlier research.

Smart Home and Building Automation

Beyond healthcare, FMCW radar is enabling sophisticated smart building applications. An IoT system for people counting using multiple mmWave FMCW radars, developed at the University of A Coruña and presented in January 2024, deployed both IWR6843ISK and IWR6843AOPEVM modules in a laboratory environment. The system successfully tracked multiple people simultaneously across a 12-meter space, preserving privacy while enabling accurate occupancy monitoring for energy efficiency.

Commercial deployments are accelerating. Framery's smart office pods use TI's IWR6843AOP radar sensor for highly accurate occupancy detection with fewer false detections. The system automatically adjusts lighting and ventilation when people enter or leave, improving energy efficiency while displaying real-time availability status. According to Texas Instruments' industrial sensor overview, the radar's ability to detect motion even through plastic enclosures enables seamless integration into furniture and architectural elements.

QUMEA's RADIQ-1 sensor, also based on the IWR6843AOP, provides contactless monitoring of presence, vital signs, and fall events in care facilities. The system detects precise motion points to understand position, posture, and condition in real-time, providing nearly error-free alerts to caregivers.

For industrial safety, Inxpect's SBV-01 sensor uses IWR6843 to detect operators in dangerous areas, preventing machinery from restarting until hazardous zones are clear. The SIL-2 certified radar reduces false detections from dust, debris, smoke, and dirt while measuring even the smallest movements such as breathing, providing higher safety with fewer production interruptions than competing technologies.

Cross-Environment Robustness Challenges

Despite impressive laboratory results, deploying HAR systems across diverse real-world environments remains challenging. A study published in MDPI Electronics in February 2025 examined cross-environment robustness using an IWR6843AoP radar operating at 60-64 GHz. Testing in both an office environment and a cluttered assisted living room revealed that environmental factors significantly impact radar signatures.

The researchers found that furniture positioning affects activity patterns—chairs around a table partially constrained "sit-down" and "stand-up" movements, complicating these activities compared to open environments. Obstruction of line-of-sight between body parts and the radar degraded recognition accuracy. The study emphasized that while data augmentation and unsupervised domain adaptation can enhance robustness, carefully considering deployment environment characteristics during system design remains essential.

Frequency resolution on the Doppler axis also affects performance. Higher frequency resolution provides more nuanced patterns for slow movements like breathing or subtle gestures, but requires different radar configurations. The researchers noted that balancing range resolution, velocity resolution, and frame rate within hardware constraints requires application-specific optimization.

Signal Processing Innovation Continues

Recent research has explored optimizing the radar signal processing pipeline itself. A study published in MDPI Sensors in January 2025 compared three distinct two-dimensional radar processing techniques: range-FFT-based time-range maps, time-Doppler-based short-time Fourier transform (STFT) maps, and smoothed pseudo-Wigner-Ville distribution (SPWVD) maps. Testing these representations with four state-of-the-art CNN architectures (VGG-16, VGG-19, ResNet-50, MobileNetV2) on the Glasgow dataset revealed that the choice of time-frequency representation significantly impacts recognition accuracy.
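The time-Doppler (STFT) representation in that comparison can be sketched with SciPy on a toy slow-time signal whose Doppler frequency oscillates, a crude stand-in for the micro-Doppler of swinging limbs; the rates and frequencies are illustrative only:

```python
import numpy as np
from scipy.signal import stft

fs = 1000.0                        # slow-time (chirp) rate, Hz — illustrative
t = np.arange(0, 2.0, 1 / fs)

# Doppler frequency oscillating around 100 Hz, mimicking periodic limb motion.
inst_freq = 100 + 60 * np.sin(2 * np.pi * 1.2 * t)
phase = 2 * np.pi * np.cumsum(inst_freq) / fs
sig = np.exp(1j * phase)           # complex slow-time signal at one range bin

# Short-time Fourier transform: |Z| is the time-Doppler magnitude map that
# CNNs then consume as an image.
f, tau, Z = stft(sig, fs=fs, nperseg=128, noverlap=96, return_onesided=False)
td_map = np.abs(Z)
print(td_map.shape)                # (Doppler bins, time frames)
```

Window length trades Doppler resolution against time resolution, which is exactly the representation choice the cited comparison found to matter for recognition accuracy.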

The research positioned "radar-generated maps as a form of visual data, bridging radar signal processing and image representation domains while ensuring privacy in sensitive applications." This approach leverages decades of computer vision research, applying proven image classification techniques to radar data.

For resource-constrained devices, computational efficiency becomes paramount. A Korea Science publication from 2024 demonstrated a MIMO FMCW system with customized depthwise separable CNN achieving 98.28% accuracy with only 11.27M multiply-accumulate operations. Implemented on a Raspberry Pi, the system processed point clouds at 8 frames per second using a PointPillars architecture with depthwise separable convolutions—enabling deployment on edge devices without cloud connectivity.

Autonomous Driving Convergence

The automotive industry's massive investment in radar for autonomous driving has created substantial technology spillover into HAR applications. Point-based methods like Frustum PointNets and voxel-based methods like VoxelNet and SECOND, originally developed for vehicle detection, have been adapted for pedestrian recognition in autonomous vehicle contexts.

However, significant challenges remain. Radar point clouds are inherently sparser than lidar, with fewer reflection points per target. Human bodies present smaller radar cross-sections than vehicles, and their complex, articulated motions create more varied signatures. Noise and clutter mitigation, already critical in automotive applications, becomes even more important when detecting subtle human movements in complex environments.

Simulation Enables Rapid Development

The difficulty and expense of collecting diverse, high-quality radar datasets has led to increased interest in simulation. The RadHARSimulator V1, a model-based FMCW radar HAR simulator released in September 2025, integrates an anthropometrically-scaled 13-scatterer kinematic model to simulate 12 distinct activities. The simulator incorporates dynamic radar cross-section calculations, free-space or through-the-wall propagation models, and calibrated noise floors to ensure signal fidelity.

The complete processing pipeline includes moving target indication (MTI), bulk Doppler compensation, and Savitzky-Golay denoising, generating high-resolution range-time maps (RTMs) and Doppler-time maps (DTMs) via short-time Fourier transform. While simulated data cannot fully replace real-world collection, such tools enable rapid algorithm development and testing across activity variations that would be impractical to collect from human subjects.
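Two of the named steps, moving target indication and Savitzky-Golay denoising, can be sketched with NumPy and SciPy. The scene below is synthetic and purely illustrative: a static clutter floor plus one oscillating target at a single range bin:

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(1)
n_chirps, n_range = 256, 64
cube = np.full((n_chirps, n_range), 5.0)                        # static clutter
cube[:, 20] += 2.0 * np.sin(2 * np.pi * 0.05 * np.arange(n_chirps))  # mover
cube += 0.1 * rng.normal(size=cube.shape)                       # noise

# Two-pulse MTI canceller: differencing along slow time removes static
# returns while preserving energy from moving targets.
mti = np.diff(cube, axis=0)

# Savitzky-Golay smoothing of the slow-time trace at the target's range bin.
trace = mti[:, 20]
smooth = savgol_filter(trace, window_length=15, polyorder=3)

print(np.abs(mti[:, 5]).mean())    # clutter-only bin: near zero after MTI
print(np.abs(mti[:, 20]).mean())   # target bin: motion energy retained
```

Real pipelines use higher-order MTI filters and add Doppler compensation, but the clutter-suppression principle is the same frame differencing shown here.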

Integration Challenges and Solutions

Successful HAR deployment requires addressing multiple technical challenges beyond core recognition algorithms. Data preprocessing must handle noise from environmental reflections, interference from other electronic devices, and variability in radar cross-section from different clothing materials. Clutter rejection algorithms distinguish human targets from static objects like furniture, while multi-person tracking requires association of reflection points across frames to maintain consistent identities.
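The frame-to-frame association step can be sketched as an assignment problem solved with the Hungarian algorithm; gating and track creation/deletion are omitted, and the positions are made up for illustration:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

tracks = np.array([[1.0, 2.0], [4.0, 0.5]])                  # last known positions
detections = np.array([[4.1, 0.6], [0.9, 2.1], [7.0, 7.0]])  # new frame

# Pairwise Euclidean distances form the cost matrix; the Hungarian
# algorithm finds the matching that minimises total cost.
cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
row, col = linear_sum_assignment(cost)
for tr, de in zip(row, col):
    print(f"track {tr} -> detection {de} (dist {cost[tr, de]:.2f})")
```

The unmatched third detection would, in a full tracker, spawn a candidate track or be rejected as clutter.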

Dataset imbalance presents another challenge—normal daily activities vastly outnumber rare but critical events like falls. Class balancing techniques, including synthetic minority oversampling and focal loss functions that weight hard-to-classify examples more heavily, help address this issue. The multi-class focal loss function proposed in the May 2023 MDPI Sensors study specifically targets confusable activities that share similar radar signatures.
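The focal loss itself is compact. A NumPy sketch of the generic multi-class form (not the exact variant from the cited study):

```python
import numpy as np

def focal_loss(probs, targets, gamma=2.0):
    """Multi-class focal loss: mean of -(1 - p_t)^gamma * log(p_t).
    The (1 - p_t)^gamma factor down-weights well-classified examples so
    training focuses on hard, confusable ones (e.g. rare fall frames)."""
    p_t = probs[np.arange(len(targets)), targets]
    return float(np.mean(-(1.0 - p_t) ** gamma * np.log(p_t)))

probs = np.array([[0.9, 0.05, 0.05],    # easy, confidently correct example
                  [0.4, 0.35, 0.25]])   # hard, barely-correct example
targets = np.array([0, 0])

# With gamma = 0 this reduces to ordinary cross-entropy; with gamma = 2
# the easy example contributes almost nothing to the loss.
print(focal_loss(probs, targets), focal_loss(probs, targets, gamma=0.0))
```

Raising gamma sharpens this effect, which is why the parameter is tuned per dataset.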

Real-time processing constraints require careful algorithm optimization. While research often evaluates architectures on powerful GPUs, deployment devices may have limited computational resources. Lightweight architectures, model compression techniques like pruning and quantization, and hardware-specific optimizations all play critical roles in practical systems.
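Post-training quantization, one of the compression techniques mentioned, can be illustrated with a simple symmetric int8 scheme in NumPy; this is a generic sketch, not any specific framework's implementation:

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric post-training int8 quantization: map floats onto
    [-127, 127] with a single scale, returning the quantized weights
    and the scale needed to dequantize them."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(2)
w = rng.normal(0, 0.1, size=(64, 64)).astype(np.float32)

q, s = quantize_int8(w)
err = np.abs(w - q.astype(np.float32) * s).max()
print(q.dtype, f"max reconstruction error {err:.5f}")
```

Storing int8 weights cuts memory fourfold versus float32 and enables integer arithmetic on edge hardware, at the cost of the small reconstruction error printed above.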

Ethical Considerations and Standardization

As FMCW radar systems move from research laboratories to widespread deployment, questions of consent, data ownership, and system reliability become paramount. Unlike cameras, radar's inability to capture identifying visual information provides inherent privacy protection—the technology cannot be subverted to capture faces or read documents. However, behavioral patterns themselves may reveal sensitive information about occupants' health status, daily routines, or presence patterns.

Fall detection systems must balance sensitivity (detecting all falls) with specificity (avoiding false alarms that cause alert fatigue). The consequences of missed detection—delayed medical response to an injured elderly person—differ fundamentally from false positives that merely inconvenience caregivers. System designers must carefully tune decision thresholds based on application context and user vulnerability.
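The trade-off can be made concrete by sweeping a decision threshold over synthetic detector scores; the score distributions here are invented for illustration:

```python
import numpy as np

def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity of a binary fall detector at a threshold."""
    pred = scores >= threshold
    tp = np.sum(pred & (labels == 1))
    fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    fp = np.sum(pred & (labels == 0))
    return tp / (tp + fn), tn / (tn + fp)

rng = np.random.default_rng(3)
labels = np.r_[np.ones(50, dtype=int), np.zeros(500, dtype=int)]   # rare falls
scores = np.r_[rng.normal(0.7, 0.15, 50), rng.normal(0.3, 0.15, 500)]

# Lowering the threshold trades specificity (more false alarms) for
# sensitivity (fewer missed falls).
for th in (0.6, 0.5, 0.4):
    se, sp = sens_spec(scores, labels, th)
    print(f"threshold {th}: sensitivity {se:.2f}, specificity {sp:.2f}")
```

Where on this curve a deployment should sit depends on the relative cost of a missed fall versus alert fatigue, which is an application decision, not an algorithmic one.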

Standardization efforts are underway for safety-critical applications. Industrial safety systems like Inxpect's SBV-01 have achieved SIL-2 (Safety Integrity Level 2) certification, demonstrating reliability sufficient for machinery control applications. However, no comprehensive standards yet exist specifically for radar-based healthcare monitoring, an area that will require attention as deployment scales.

The Path Forward

FMCW radar technology for human activity recognition stands at an inflection point. Technical capabilities have matured to the point of practical deployment, with systems achieving accuracies exceeding 97% in controlled environments. Hardware costs have plummeted as automotive-scale production of millimeter-wave chips brings single-chip radars below $50 in volume. Deep learning algorithms continue to improve, with transformer architectures and advanced attention mechanisms pushing accuracy higher while reducing computational requirements.

Remaining challenges are primarily about deployment rather than laboratory performance. Cross-environment robustness, long-term reliability, edge deployment optimization, and integration into building management systems all require engineering attention rather than fundamental research breakthroughs. The technology's unique combination of privacy preservation, environmental robustness, and sub-millimeter precision positions it as a foundational sensing modality for the smart buildings and healthcare systems of the coming decades.

As populations age in developed countries and labor shortages strain healthcare systems, non-invasive monitoring technologies become increasingly critical. FMCW radar, once a niche military technology, has evolved into a privacy-preserving solution that can monitor human health and safety without the ethical baggage of camera surveillance. Its technical maturity, combined with growing societal need, suggests rapid expansion from current healthcare and smart building pilots to mainstream deployment in the next several years.

The convergence of radar sensing with artificial intelligence represents more than an incremental improvement in activity recognition—it represents a fundamental shift toward ambient intelligence that can unobtrusively support human wellbeing while respecting privacy and autonomy. FMCW radar's emergence as the enabling technology for this vision marks a significant milestone in the evolution of human-centered computing.


Verified Sources and Formal Citations

Primary Research Publications

  1. Bi, Z., Xu, J., Song, X., & Liu, M. (2025). "FMCW Radar Principles and Human Activity Recognition Systems: Foundations, Techniques, and Applications" (arXiv:2410.08483v2). Last revised December 22, 2025. Available at: https://arxiv.org/abs/2410.08483

  2. Dang, V. N., Hoang, N. C., & Nguyen, Q. C. (2025). "Advancing robust human activity recognition via informative mmWave radar characteristics and a lightweight spatio-spectro-temporal network," Measurement Science & Technology. Published June 14, 2025. Available at: https://www.sciencedirect.com/science/article/abs/pii/S0263224125014150

  3. Gao, W. (2025). "RadHARSimulator V1: Model-Based FMCW Radar Human Activity Recognition Simulator" (arXiv:2509.06751v2). Last revised September 10, 2025. Available at: https://arxiv.org/abs/2509.06751

  4. Li, Y., et al. (2023). "Human Activity Recognition Method Based on FMCW Radar Sensor with Multi-Domain Feature Attention Fusion Network," MDPI Sensors, vol. 23, no. 11, p. 5100. Published May 26, 2023. Available at: https://www.mdpi.com/1424-8220/23/11/5100

  5. Ayaz, F., Alhumaily, B., Hussain, S., Imran, M. A., Arshad, K., Assaleh, K., & Zoha, A. (2025). "Integration of CNNs with Conventional Radar Signal Processing for Human Activity Recognition," MDPI Sensors, vol. 25, no. 3, p. 724. Published January 25, 2025. Available at: https://www.mdpi.com/1424-8220/25/3/724

  6. Authors (2025). "Radar-Based Human Activity Recognition: A Study on Cross-Environment Robustness," MDPI Electronics, vol. 14, no. 5, p. 875. Published February 23, 2025. Available at: https://www.mdpi.com/2079-9292/14/5/875

  7. Authors (2025). "Radar-Based Activity Recognition in Strictly Privacy-Sensitive Settings Through Deep Feature Learning," PMC. Published April 2025. Available at: https://pmc.ncbi.nlm.nih.gov/articles/PMC12025198/

  8. Zhang, L., et al. (2023). "A lightweight hybrid vision transformer network for radar-based human activity recognition," Scientific Reports. Published October 21, 2023. Available at: https://www.nature.com/articles/s41598-023-45149-5

Fall Detection Research

  1. Hu, S., Cao, S., Toosizadeh, N., Barton, J., Hector, M.G., & Fain, M. (2024). "Radar-Based Fall Detection: A Survey," IEEE Robotics & Automation Magazine, vol. 31, no. 3, pp. 170-185. Published September 2024. Available at: https://pmc.ncbi.nlm.nih.gov/articles/PMC11507471/

  2. Authors (2024). "Fall Detection Based on Millimeter-Wave Radar Feature Extraction Using LSTM," Proceedings of the 2024 6th International Conference on Telecommunications and Communication Engineering (ICTCE '24). Available at: https://dl.acm.org/doi/10.1145/3705391.3705400

  3. Authors (2024). "Point Cloud Classification Fall Detection Method Based on Millimeter Wave Radar," Proceedings of the 2024 4th International Joint Conference on Robotics and Artificial Intelligence (JCRAI '24). Available at: https://dl.acm.org/doi/10.1145/3696474.3696488

  4. Lin, J. D., et al. (2025). "Developing a Homecare Fall Detection System Using Millimeter-Wave Technology," SSRN. Published March 18, 2025. Available at: https://papers.ssrn.com/sol3/Delivery.cfm/224c4ac8-5a6c-4847-9749-1b4c01afbdad-MECA.pdf?abstractid=5178701&mirid=1

  5. Sun, C., Han, S., & Shin, S. (2025). "Long Short-Term Memory-Based Fall Detection by Frequency-Modulated Continuous Wave Millimeter-Wave Radar Sensor for Seniors Living Alone," Applied Sciences, vol. 15, no. 15, p. 8381. Published July 28, 2025. Available at: https://www.mdpi.com/2076-3417/15/15/8381

  6. Yao, Y., Wang, P., Bai, Z., et al. (2025). "Privacy-Preserving Unobtrusive Fall Detection for Older Adults: A Highly Generalized Deep Anomaly Detection Model," Tsinghua Science and Technology. Published September 26, 2025. Available at: https://www.sciopen.com/article/10.26599/TST.2024.9010203

IEEE Publications

  1. Qi, C. R., Su, H., Mo, K., & Guibas, L. J. (2017). "PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation," IEEE Conference Publication. Available at: https://ieeexplore.ieee.org/document/8099499/

  2. Authors (2024). "Elderly Fall Detection System Using mm-Wave Radar Sensor," IEEE Conference Publication. Available at: https://ieeexplore.ieee.org/iel7/10468568/10468675/10468881.pdf

  3. Authors (2024). "Millimeter-Wave Radar-Based Bathroom Fall Detection using Time-Frequency Features," IEEE Conference Publication. Available at: https://ieeexplore.ieee.org/document/10780691

  4. Authors (2023). "Human Activity Recognition From FMCW Radar Signals Utilizing Cross-Terms Free WVD," IEEE Journals & Magazine. Available at: https://ieeexplore.ieee.org/document/10365493/

  5. Authors (2023). "Human Activity Recognition Based on FMCW Radar Using CNN and Transfer Learning," IEEE Conference Publication. Available at: https://ieeexplore.ieee.org/document/10317148/

  6. Authors (2023). "Radar-based human activity recognition using two-dimensional feature extraction," IEEE Conference Publication. Available at: https://ieeexplore.ieee.org/document/10135278/

  7. Authors (2022). "FMCW Radar Sensor Based Human Activity Recognition using Deep Learning," IEEE Conference Publication. Available at: https://ieeexplore.ieee.org/document/9748776/

Smart Building and IoT Applications

  1. Barral, V., Domínguez-Bolaño, T., Escudero, C. J., & García-Naya, J. A. (2024). "An IoT System for Smart Building Combining Multiple mmWave FMCW Radars Applied to People Counting" (arXiv:2401.17949v1). Published January 28, 2024. Available at: https://arxiv.org/html/2401.17949v1

Commercial Hardware Documentation

  1. Texas Instruments (2024). "Bringing Intelligence and Efficiency to Smart Home," Technical White Paper (SWRA807). Available at: https://www.ti.com/lit/swra807

  2. Texas Instruments (2024). "Improving overall home energy efficiency through mmWave radars," Video. Published April 24, 2024. Available at: https://www.ti.com/video/6351595638112

  3. Texas Instruments (2025). "Industrial mmWave radar sensors - Portfolio Overview." Available at: https://www.ti.com/sensors/mmwave-radar/industrial/overview.html

  4. Texas Instruments (2025). "IWR6843 Data Sheet, Product Information and Support." Available at: https://www.ti.com/product/IWR6843

  5. Texas Instruments (2025). "AWR1642 Data Sheet, Product Information and Support." Available at: https://www.ti.com/product/AWR1642

  6. MathWorks (2025). "TI mmWave Radar Sensors - Portfolio of devices enabling mmWave radar in industrial and automotive applications." Available at: https://www.mathworks.com/products/connections/product_detail/ti-mmwave.html

Industry News and Applications

  1. Electromaker (2024). "Exploring Texas Instruments' Robotics Innovations at Embedded World 2024." Available at: https://www.electromaker.io/blog/article/exploring-texas-instruments-robotics-innovations-at-embedded-world-2024

  2. DFRobot (2024). "High Precision C1001 60GHz millimeter-wave radar sensor for Real-time elderly fall monitoring and sleep tracker." Published December 31, 2024. Available at: https://www.dfrobot.com/product-2861.html

  3. Israel Electronics News (2017). "TI's New Radar Sensors for Autonomous Driving." Published May 21, 2017. Available at: https://techtime.news/2017/05/17/ti-radar-sensor/

  4. Texas Instruments (2023). "Radar Sensor for Smart TVs," Video. Published April 14, 2023. Available at: https://www.ti.com/video/6325071464112

Additional Technical Resources

  1. Journal of Progress in Electronics and Communication Engineering (2025). "Detailed Guide to Machine Learning Techniques in Signal Processing," vol. 2, no. 1, pp. 39-47. Available at: https://ecejournals.in/index.php/PECE/article/download/58/105/166

  2. Semantic Scholar (2024). "Elderly Fall Detection System Using mm-Wave Radar Sensor - Publication Database." Available at: https://www.semanticscholar.org/paper/Elderly-Fall-Detection-System-Using-mm-Wave-Radar-Joy-John/9f2f494e0bcafd2acc92ce0b2b4c932a155b5058


Dataset References

University of Glasgow Radar Signatures Dataset: Publicly available at https://researchdata.gla.ac.uk/848/ (accessed December 2024). This dataset contains FMCW radar signatures of human activities including walking, sitting, standing, bending, drinking water, and falling, collected from 48 participants using a C-band FMCW radar at 5.8 GHz with 400 MHz bandwidth.


Article prepared December 2025. All citations verified and URLs confirmed active as of publication date.

 
