Thursday, April 2, 2026

Unsupervised SAR Change Detection of Small Objects via Fusion of Difference Images


Example intermediate outputs of the proposed SAR change detection framework. 
(a) Example of SAR image pair. 
(b) Log-ratio difference image. 
(c) NCB difference image. 
(d) Fuzzified log-ratio image. 
(e) Fused image. All difference images are displayed in dB scale with identical dynamic ranges for fair visual comparison. 


Science & Technology Review  ·  Radar Intelligence  ·  April 2026

Remote Sensing · Signal Processing · Defense Technology · Satellite Intelligence · Artificial Intelligence
Synthetic Aperture Radar · Change Detection · Surveillance

The Invisible Eye: Seeing the Unseen with Radar

A new algorithm from South Korean engineers can find a moved truck in a radar image where no human analyst — and no conventional software — would think to look. The method arrives as the global competition for all-weather battlefield intelligence reaches a fever pitch.

Analysis based on peer-reviewed research  ·  IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 19, 2026  ·  April 2, 2026

Bottom Line Up Front: Engineers at Pohang University of Science and Technology (POSTECH) have developed an unsupervised radar change-detection algorithm that reliably spots the movement of small ground objects — including individual vehicles — by fusing two complementary radar difference-image types through a statistically derived fuzzy-logic weighting scheme. The method outperforms existing approaches on most benchmark datasets and arrives at a moment when commercial SAR satellite constellations are transforming military and humanitarian surveillance worldwide.

On a muddy road outside a forest somewhere in northern Sweden, fifty military vehicles sit concealed beneath a canopy of fir trees. To optical satellite cameras, they are invisible — lost in shadow or obscured by cloud cover. To an ordinary radar image, they blend seamlessly into the clutter of the ground. But compare two radar passes taken days apart, and suddenly the story changes. A truck that moved three meters is a ghost of electromagnetic echo — a whisper of altered phase and shifted intensity written in microwave light. The challenge, for more than three decades, has been writing software capable of reading that whisper reliably.

A paper published in March 2026 in the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing takes a meaningful step toward meeting that challenge. Researchers Geon Lee and Kyung-Tae Kim of POSTECH's Intelligent Radar Signal Processing Laboratory describe a new unsupervised change detection algorithm specifically designed for the detection of small objects — ground vehicles, parked aircraft, shipping containers — in Synthetic Aperture Radar (SAR) imagery. Their approach fuses two fundamentally different types of radar difference images using a probabilistic fuzzy-logic framework, producing a composite image in which statistically improbable intensity changes stand out with unusual clarity against background clutter. In tests against four datasets — including the CARABAS very-high-frequency radar dataset with fifty concealed vehicles in Swedish forest — the new method achieved the best F1-score on three of the four evaluated scenes and perfect precision on the fourth.

"Even the slightest changes on the earth's surface can be detected. A civil application is storm damage in forests. A military application is recognizing primitive camouflage."

— Radartutorial.eu, on SAR Coherent Change Detection

What Synthetic Aperture Radar Actually Does

SAR is a form of active microwave imaging that uses the motion of an aircraft or satellite to synthesize a very large antenna aperture, achieving spatial resolutions far finer than would be possible with a physical antenna of the same size. Because it generates its own illumination in the microwave spectrum, SAR operates day or night and penetrates cloud cover, smoke, rain, and — at certain frequencies — even forest canopy. These properties make it uniquely suited for persistent surveillance in environments where optical imagery fails.

Change detection using SAR takes advantage of a property called coherence. When a radar sensor passes over the same terrain twice from a nearly identical geometry, stable surfaces — asphalt, concrete, undisturbed soil — produce nearly identical echoes, both in amplitude and in the fine-grained phase of the returned microwave signal. When something has changed — a vehicle has moved, soil has been disturbed, a building has collapsed — the coherence between the two passes drops precipitously. That loss of coherence, or alternatively the change in pixel-by-pixel intensity, is the raw material of change detection.

The most straightforward method, called amplitude change detection (ACD), simply computes the ratio of pixel intensities between two SAR images. A more sensitive method called coherent change detection (CCD) uses the full complex signal — amplitude and phase together — to detect changes too subtle to register in intensity alone. Vehicle tracks across a lawn, for example, may produce no detectable intensity change whatsoever, but the slight depression of grass blades alters the phase of the returned echo enough to appear as a sharp dark streak in a coherence image. CCD is famously capable of detecting disturbances at scales of a few centimeters in displacement, even from altitudes of hundreds of kilometers.
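A toy numpy sketch (illustrative only, not drawn from any of the systems discussed) makes the ACD/CCD distinction concrete: scrambling only the phase of a repeat pass leaves the intensity image untouched but collapses coherence.

```python
import numpy as np

def coherence(f, g):
    """Sample coherence of two co-registered complex SAR patches:
    |sum(f * conj(g))| / sqrt(sum|f|^2 * sum|g|^2), always in [0, 1]."""
    num = np.abs(np.sum(f * np.conj(g)))
    den = np.sqrt(np.sum(np.abs(f) ** 2) * np.sum(np.abs(g) ** 2))
    return num / den

def log_ratio_db(i1, i2, eps=1e-12):
    """ACD-style log-ratio of pixel intensities, in dB."""
    return 10.0 * np.log10((i2 + eps) / (i1 + eps))

rng = np.random.default_rng(0)
# Complex speckle for one pass; a stable repeat pass returns the same echo.
f = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
g_stable = f.copy()
# Random phase rotation per pixel: intensity is unchanged, phase is not
# (the toy analogue of grass pressed down by a vehicle track).
g_track = f * np.exp(1j * rng.uniform(0, 2 * np.pi, f.shape))

gamma_stable = coherence(f, g_stable)                 # ~1.0
gamma_track = coherence(f, g_track)                   # collapses toward 0
acd_change = np.abs(
    log_ratio_db(np.abs(f) ** 2, np.abs(g_track) ** 2)
).max()                                               # ~0 dB: ACD sees nothing
```

Phase-only disturbances are exactly the changes CCD catches and ACD misses; real systems estimate coherence in a small sliding window rather than over a whole patch, as here.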

Figure 1. Simplified flowchart of the Lee & Kim (2026) fusion-based SAR change detection framework. The method fuses a log-ratio image with a Berger noncoherent correlation (NCB) image after statistically derived fuzzification, then applies Otsu's threshold to generate a binary change map. Morphological filtering and DBSCAN clustering complete the object-level evaluation pipeline. Adapted from Figure 8 of the source paper.

The Small-Object Problem: Why Existing Methods Fail

Identifying large-scale changes — deforestation, urban destruction, flooded plains — is a solved problem in SAR change detection. The statistical fingerprint of a missing forest or a collapsed city block dominates the difference image histogram with enough energy that conventional thresholding algorithms such as Otsu's method and the Kittler-Illingworth (KI) criterion find the optimal discrimination boundary reliably.

Small objects are an entirely different matter. A ground vehicle, even a large military truck, might occupy only a few hundred pixels in a high-resolution SAR image covering many square kilometers. Changed pixels of this type represent perhaps 0.1% to 0.7% of the total image area — a fraction so small that conventional statistical thresholding algorithms effectively ignore it. The distribution of pixel intensities in the difference image is overwhelmed by the statistics of the unchanged background, and the small bump contributed by moved vehicles is swamped in the tail of the clutter distribution.
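This failure mode is easy to reproduce. In the toy experiment below (made-up Gaussian statistics with hypothetical class means, far tamer than real speckle), changed pixels are 0.3% of the image, and a brute-force Otsu threshold lands inside the clutter distribution rather than in the gap below the changed population:

```python
import numpy as np

def otsu(x, bins=256):
    """Brute-force Otsu: pick the threshold maximising the
    between-class variance w0 * w1 * (m0 - m1)**2 of the histogram."""
    hist, edges = np.histogram(x, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_v = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0
        m1 = (hist[i:] * centers[i:]).sum() / w1
        v = w0 * w1 * (m0 - m1) ** 2
        if v > best_v:
            best_t, best_v = centers[i], v
    return best_t

rng = np.random.default_rng(42)
clutter = rng.normal(0.0, 1.0, 99_700)   # 99.7% unchanged pixels
changed = rng.normal(4.0, 0.5, 300)      # 0.3% moved-vehicle pixels
diff = np.concatenate([clutter, changed])

t = otsu(diff)
false_alarm_rate = (clutter > t).mean()
# The changed-pixel bump is too small to influence the between-class
# variance; Otsu splits the clutter near its own mean instead, flagging
# roughly half of the unchanged scene as "changed".
```

With the changed class this sparse, the histogram's optimal split in Otsu's sense is inside the clutter, which is precisely why small-object change detection needs something better than a global threshold on a single difference image.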

This problem is compounded by speckle — the coherent noise that gives SAR images their characteristic grainy appearance — and by the need for a high target-to-clutter ratio (TCR), defined as the ratio of average pixel intensity in the changed region to that in the unchanged background. The paper demonstrates mathematically that the degree of overlap between the changed and unchanged pixel distributions directly controls detection performance: as overlap increases, the F1-score degrades, and for small objects the overlap is always large.

Lee and Kim's core insight is that two widely used types of SAR difference images — the log-ratio image, which compares pixel-by-pixel intensities, and the Berger noncoherent correlation image (NCB), which evaluates patch-level similarity — respond differently to the same physical changes. The log-ratio preserves sharp edges and is relatively insensitive to multiplicative speckle fluctuations; the NCB suppresses speckle through local averaging but blurs fine-scale edges. Crucially, strong clutter responses rarely appear strong in both images simultaneously, while true object changes tend to appear in both. The Hadamard product — element-wise multiplication — of the two processed images therefore suppresses clutter-induced false alarms while preserving true change signals.

The Method at a Glance

  • Log-ratio image (LR): Computes the logarithm of the pixel intensity ratio between two SAR passes. Sensitive to local intensity changes, resistant to multiplicative noise.
  • Noncoherent Berger image (NCB): Measures patch-level correlation between two SAR images using a sliding window. Suppresses speckle, sensitive to subtle textural de-correlation.
  • Fuzzification: Applies a Bayesian posterior-probability membership function derived from the observed log-ratio statistics (modeled as generalized Gaussian distributions) to map LR pixels to the [0,1] range of the NCB image. Membership function boundaries are determined automatically from tail-region statistical estimation, reducing bias caused by the sparse changed-pixel population.
  • Hadamard product: Multiplies the fuzzified LR and NCB images pixel-by-pixel, amplifying pixels that are simultaneously high in both cues and suppressing those high in only one.
  • Otsu's threshold + DBSCAN: Global threshold applied to the fused image; DBSCAN clustering converts pixel-level detections to object-level precision/recall evaluation.

Processing time for a 1,150 × 2,800 pixel SAR image pair: approximately 7.9 seconds in MATLAB, versus 7,456 seconds for the leading graph-matching competitor.
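The two difference images at the heart of the pipeline can be sketched as follows. The log-ratio is the standard form; for the NCB, a plain windowed normalized cross-correlation stands in for Berger's statistic, whose exact formulation in the paper differs in detail:

```python
import numpy as np

def log_ratio(i1, i2, eps=1e-12):
    """Pixel-wise log-ratio difference image: sharp edges, but speckle
    fluctuations pass straight through."""
    return np.log((i2 + eps) / (i1 + eps))

def ncb(i1, i2, win=7, eps=1e-12):
    """Patch-level noncoherent correlation mapped to a [0, 1] change
    score (1 = full decorrelation). A plain normalised cross-correlation
    stands in for Berger's statistic here."""
    h, w = i1.shape
    r = win // 2
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            a = i1[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
            b = i2[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
            rho = (a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + eps)
            out[y, x] = 1.0 - rho
    return out

# Toy scene: speckle-like exponential intensities; the background is
# stable up to mild multiplicative noise, while a 10x10 block of
# independent, brighter returns simulates a vehicle that appeared
# between the two passes.
rng = np.random.default_rng(7)
i1 = rng.exponential(1.0, (32, 32))
i2 = i1 * rng.uniform(0.9, 1.1, i1.shape)
i2[10:20, 10:20] = rng.exponential(10.0, (10, 10))

lr_img = log_ratio(i1, i2)   # large positive values inside the block
ncb_img = ncb(i1, i2)        # decorrelation score high inside the block
```

Note the complementary responses the paper exploits: the appeared object is bright in the log-ratio and decorrelated in the patch statistic, while a pure brightness rescaling would move the log-ratio but leave the patch correlation near 1.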

A Pedigree in the Sky: The Lynx Radar and the Military Origins of Change Detection

The problem Lee and Kim are solving is not new to defense engineers. The U.S. military and its contractors have grappled with small-object change detection since at least the 1990s, when the first high-resolution airborne SAR systems demonstrated that ground vehicle movements could be discerned by comparing successive radar passes.

General Atomics Aeronautical Systems, Inc. (GA-ASI) played a central role in operationalizing this capability. The company's Lynx multimode SAR/GMTI radar, developed under corporate funding with significant Sandia National Laboratories antenna design work, was among the first airborne systems to integrate coherent change detection (CCD) as a production operational mode. In CCD mode, Lynx registers and compares complex SAR images — preserving both amplitude and phase — at resolutions as fine as 0.1 meters in spotlight mode, making it capable in principle of detecting centimeter-scale surface disturbances. Lynx was deployed on the Predator and Sky Warrior UAS platforms and accumulated more than 1,000 collective mission hours on Sky Warrior alone in support of U.S. Army operations in Iraq by May 2010. The radar was also supplied to the U.S. Air Force, U.S. Navy, U.S. Department of Homeland Security, the Iraqi Air Force, and Customs and Border Protection, where it was used to detect border crossing activity along the U.S. southwest border despite cloud cover and ground haze.

What made Lynx noteworthy from an engineering standpoint was its integration of SAR, GMTI (ground moving target indicator), dismount moving target indicator (DMTI), and CCD into a single lightweight, low-power package flyable on a UAV. The GMTI mode detected moving vehicles; the CCD mode detected vehicles that had moved between passes. These are complementary capabilities: GMTI catches targets in motion at collection time; CCD reveals that a target was in one place during the first pass and a different place during the second, regardless of whether it was moving at either moment. Together they dramatically narrow the window during which a tactical force can shelter its vehicles from aerial detection.

"Both Ukrainian intelligence and the Ukrainian military utilize these capabilities — they receive the images, analyze them, and make appropriate decisions."

— Rafał Modrzewski, CEO, ICEYE, August 2025

The Commercial Revolution: SAR Satellites and the Ukraine Crucible

The capabilities once confined to classified U.S. military platforms are now racing into commercial hands at a pace that would have seemed implausible a decade ago. Three companies — ICEYE (Finland/U.S.), Capella Space (U.S.), and Umbra (U.S.) — have built constellations of small SAR satellites offering sub-meter resolution imagery available commercially, and in some cases at resolutions finer than those of many classified government systems from the previous generation.

ICEYE, the market leader, operated 62 satellites as of late 2025 following five additional launches aboard a SpaceX Transporter-15 rideshare mission in November of that year. Its fourth-generation satellites deliver imagery at resolutions down to 16 centimeters. In 2025, ICEYE partnered with French group Safran.AI to apply artificial intelligence to SAR and optical data, including for Ukrainian military intelligence, aiming to reduce the latency between satellite tasking, image capture, and operational decision-making. Umbra, meanwhile, provides ultra-high-resolution SAR commercial data products down to 0.25 meters resolution, representing the finest resolution commercially available from space. The National Reconnaissance Office extended two-year commercial SAR contracts to Capella Space, ICEYE US, and Umbra from July 2024 through July 2026 under its Strategic Commercial Enhancements Broad Agency Announcement program.

The conflict in Ukraine has served as the most intensive real-world stress test of commercial SAR change detection ever conducted. ICEYE satellites have enabled near real-time monitoring of troop movements, with both Ukrainian intelligence and the military using the imagery to inform operational decisions. Requirements have evolved significantly since the war's opening months: in March 2024, ICEYE released a product called Dwell, which detects military equipment hidden under tree canopy — developed directly in response to feedback from the field. Ukraine reportedly purchased dedicated access to one ICEYE satellite through a crowdfunded campaign in 2022, a novel procurement model born of wartime necessity.

The operational lessons from Ukraine are reshaping the commercial SAR industry's development priorities. As one expert observer noted, the Ukraine war has shown that "it's very hard to create camouflage that can fool multiple phenomenologies — you can create a camouflage that fools SAR but doesn't fool optical, or that fools optical but doesn't fool SAR." This insight argues for exactly the kind of multi-cue fusion that the POSTECH paper formalizes in the academic domain.

The Commercial SAR Constellation Landscape (2026)

  • ICEYE (Finland/U.S.): 62+ satellites; Gen4 resolution to 16 cm; daily revisit most Earth locations; NRO contract holder; dedicated Ukraine partnership; €250M+ revenue, €1.5B backlog reported for 2025.
  • Umbra (U.S.): 25 cm resolution, the finest offered commercially; NRO contract holder; NASA CSDA participant.
  • Capella Space (U.S.): 50 cm resolution; 14-satellite constellation serving U.S. federal agencies; NRO contract holder.
  • Airbus Defence & Space: TerraSAR-X and TanDEM-X X-band heritage systems; mature government customer base.
  • iQPS (Japan): In April 2025, Japan provided QPS-SAR access to Ukraine's GUR military intelligence agency — the first time Japan had shared its most advanced space-based surveillance with a foreign military.

The Algorithm in Detail: Fuzzy Logic Meets Bayesian Statistics

What distinguishes the POSTECH work technically is the method by which the fuzzification step is derived. Most prior fuzzy change detection methods used heuristically designed membership functions — sigmoid curves, trapezoidal functions — whose parameters were chosen by the researcher. Lee and Kim instead derive their membership function automatically from the statistical distribution of the log-ratio image itself, using a generalized Gaussian (GG) distribution model and the Kittler-Illingworth Bayesian parameter estimation framework.

The GG distribution is a flexible family that nests the Laplacian (heavy-tailed) and Gaussian (light-tailed) distributions as special cases. It is well-suited to modeling log-ratio image histograms, which typically have sharper peaks and heavier tails than a standard Gaussian would predict. The distribution is fitted separately to three pixel classes: decreasing change (negative log-ratio; the target was brighter in the first pass), unchanged (log-ratio near zero), and increasing change (positive log-ratio; brighter in the second pass).
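The GG density is compact enough to write down directly (a generic sketch of the family, not the paper's fitting code):

```python
import math
import numpy as np

def gg_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    """Generalized Gaussian density:
    p(x) = beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x - mu| / alpha)**beta)
    beta = 1 is the Laplacian, beta = 2 the Gaussian (with sigma = alpha
    / sqrt(2)); beta < 2 gives the sharper peak and heavier tails typical
    of log-ratio histograms."""
    c = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return c * np.exp(-(np.abs(x - mu) / alpha) ** beta)

# Sanity check: each member of the family integrates to ~1.
x = np.linspace(-10, 10, 20_001)
dx = x[1] - x[0]
areas = {b: gg_pdf(x, beta=b).sum() * dx for b in (1.0, 2.0, 4.0)}
```

In the paper's setup, one such density is fitted to each of the three log-ratio classes (decreasing, unchanged, increasing), and the posterior class probabilities built from them become the fuzzy membership values.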

The key innovation for small-object scenarios is the tail-region parameter estimation. When changed pixels are extremely sparse — fewer than 1% of the image — conventional maximum-likelihood estimation of the changed-class distribution is corrupted by overlap with the unchanged class. The authors instead estimate the changed-class statistics exclusively from the outer tails of the observed histogram, where there is minimal contamination from unchanged pixels. Under a symmetry assumption for the generalized Gaussian model, this gives unbiased (or less biased) estimates of the class mean and variance, from which the fuzzy membership function is constructed. The paper validates the improved model accuracy using Kullback-Leibler divergence, demonstrating consistently better fit to empirical changed-pixel histograms across all four test datasets.
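A toy mixture shows why the tail matters (Gaussian stand-ins for the GG classes, with hypothetical means; the 0.3% changed fraction matches the small-object regime):

```python
import numpy as np

rng = np.random.default_rng(1)
clutter = rng.normal(0.0, 1.0, 99_700)   # unchanged class
changed = rng.normal(4.0, 0.5, 300)      # sparse changed class (0.3%)
lr = np.concatenate([clutter, changed])

# Naive: treat everything above a loose cut as "changed" and average.
# Clutter leaking across the cut drags the estimate far below the true
# class mean of 4.0.
naive_mean = lr[lr > 2.0].mean()

# Tail-only: use just the extreme upper tail, where clutter is rare.
# (The paper derives the tail boundary from the fitted unchanged-class
# model and corrects the moments under a symmetry assumption; the raw
# quantile cut here is a simplification.)
tail_mean = lr[lr > np.quantile(lr, 0.999)].mean()
```

The tail-only estimate is not exact either (it averages only the largest changed pixels), but it lands much closer to the true changed-class mean than the contaminated naive estimate, which is the bias reduction the paper formalizes.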

Once the fuzzified log-ratio image is formed — mapping each pixel to a membership value in [0, 1] representing its probability of being a true change — it is multiplied element-wise (Hadamard product) with the NCB image, which is also naturally in [0, 1]. The resulting fused image has a statistical distribution far more amenable to Otsu's threshold than either input alone, because clutter pixels rarely score highly in both channels simultaneously.
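Under those definitions the fusion step itself is one line, and its effect on Otsu's threshold can be shown with synthetic cue images (a hedged toy with hand-placed clutter and target blocks, not the authors' code):

```python
import numpy as np

def otsu(x, bins=256):
    """Brute-force Otsu threshold (maximise between-class variance)."""
    hist, edges = np.histogram(x, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_v = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0
        m1 = (hist[i:] * centers[i:]).sum() / w1
        v = w0 * w1 * (m0 - m1) ** 2
        if v > best_v:
            best_t, best_v = centers[i], v
    return best_t

rng = np.random.default_rng(3)
fuzzy_lr = rng.uniform(0.0, 0.1, (32, 32))   # background: low in both cues
nc = rng.uniform(0.0, 0.1, (32, 32))
fuzzy_lr[2:6, 2:6] = 0.9      # clutter spike in the LR cue only
nc[20:24, 2:6] = 0.9          # clutter spike in the NCB cue only
fuzzy_lr[12:18, 12:18] = 0.9  # true change: high in BOTH cues
nc[12:18, 12:18] = 0.9

fused = fuzzy_lr * nc         # Hadamard product of the two [0, 1] cues
change_map = fused > otsu(fused)
# Single-cue clutter fuses to roughly 0.9 * 0.05 = 0.045 and falls below
# the threshold; the true object fuses to ~0.81 and survives.
```

The fused histogram has a clean gap between clutter and true changes, which is exactly what makes the final global thresholding step reliable.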

Experimental Validation: The CARABAS Dataset and Beyond

The paper evaluates the method on four datasets of increasing difficulty. The simulated dataset used target chips from the MSTAR (Moving and Stationary Target Acquisition and Recognition) database superimposed on synthetic clutter, providing ground truth with pixel-level precision. Three real SAR datasets followed: an airborne X-band dataset over forested terrain (51 changed objects), an urban dataset with buildings and parking lots (88 changed objects), and the CARABAS dataset.

The CARABAS dataset is particularly challenging and instructive. Collected in northern Sweden using a VHF-band (very high frequency) airborne radar, it contains 50 concealed military vehicles in forested areas. VHF-band SAR penetrates forest canopy effectively but produces coarser imagery than X-band systems, and the target-to-clutter statistics are consequently more difficult. Against this dataset, the graph-matching MGAM method edged out the POSTECH approach on F1-score, though the proposed method still delivered perfect precision (1.0000) and near-perfect recall (0.96). The authors attribute the gap partly to the conservative nature of the Hadamard product: it requires strong evidence in both channels before a pixel receives a high fused score, which can suppress spatially compact targets in lower-resolution imagery. On the simulated, forest, and urban datasets, the proposed method achieved the best F1-score outright.

Computationally, the method is practical for operational use. At approximately 8 seconds per image pair in a non-optimized MATLAB implementation, it is roughly twice as fast as the VEM method, and almost 1,000 times faster than the MGAM graph-matching approach, while matching or exceeding MGAM's detection performance on most datasets.

Broader Context: AI, SAR, and the Future of Persistent Surveillance

The POSTECH paper represents one branch of a rapidly proliferating research ecosystem. Recent related work has explored graph matching for change representation, Bayesian variational expectation maximization (VEM) for SAR coherent change detection, and deep learning approaches that apply convolutional neural networks to learn change features directly from labeled data. Each family of methods carries different tradeoffs between computational cost, sensitivity, interpretability, and dependence on training labels.

The unsupervised methods — including the POSTECH approach — are particularly important operationally because they require no labeled ground truth. In military or disaster-response contexts, labeled multitemporal training data over the relevant scene type may simply not exist. The ability to detect changes reliably without prior knowledge of what those changes should look like is a critical operational advantage.

A parallel development reshaping the field is the integration of AI-based automatic detection and classification with SAR data pipelines. ICEYE's partnership with Polish AI analytics firm Satim will enable automatic detection and classification of military objects in ICEYE's images, with systems capable of identifying specific vehicle types from SAR imagery and providing automated detection when image quality is insufficient for full classification.

The growing commercial SAR satellite market — valued at approximately $6.82 billion in 2025 and projected to reach $15.29 billion by 2032 — is driving both the supply of imagery and the demand for algorithms capable of extracting actionable intelligence from it automatically and at scale. As revisit times drop toward sub-hourly and resolutions approach the optical domain, the bottleneck is shifting from data collection to data exploitation. Algorithms like the one developed by Lee and Kim are part of the answer to that bottleneck.

The military implications extend well beyond Ukraine. Space-based capabilities have become a great equalizer and force multiplier in modern conflict, allowing forces to gain situational awareness at scale. Leading up to and during the Russian invasion, satellite imagery captured the buildup of forces along the Ukrainian border and documented movements into Ukrainian territory. The same principles apply to any future conflict in which adversaries attempt to conceal vehicle movements under forest canopy, inside urban environments, or behind clouds and darkness — precisely the scenarios that motivate high-sensitivity small-object SAR change detection.

Related Research: Key Papers in the Field

  • Tucker, Ash & Potter (2023). SAR coherent change detection with variational expectation maximization. IEEE Trans. Aerosp. Electron. Syst., 59(3), 2163–2175. The VEM method, one of the primary baselines in the POSTECH comparison.
  • Wang et al. (2025). Unsupervised SAR image change detection using multilevel graph attribute matching. Int. J. Remote Sens., 46(12), 4705–4735. The MGAM method; the strongest competitor in the CARABAS scenario.
  • Zhang et al. (2022). Unsupervised SAR image change detection for few changed area based on histogram fitting error minimization. IEEE Trans. Geosci. Remote Sens., 60. HFEM method; third major baseline.
  • Zitzlsberger & Podhoranyi (2024). Monitoring of urban changes with multimodal Sentinel-1 and -2 data in Mariupol, Ukraine. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., 17, 5245–5265. Operational SAR change detection in active conflict.
  • Preiss & Stacy (2006). Coherent change detection: Theoretical description and experimental results. DSTO-TR-1851. The foundational technical treatment of CCD, used as a reference throughout the POSTECH paper.

What Comes Next

Lee and Kim are candid about the method's limitations. Highly reflective linear artifacts in urban environments can generate persistent false alarms, and the Hadamard product's conservative character may cost recall on spatially compact targets in low-resolution imagery. The paper identifies three directions for future work: multiscale analysis to improve discrimination between true targets and specular clutter; richer feature sets incorporating target-specific electromagnetic signatures; and threshold selection criteria specifically optimized for the fused difference image, moving beyond the general-purpose Otsu criterion.

Deeper changes are coming to the field from the AI side. Deep learning approaches trained on MSTAR and similar labeled databases can in principle learn the specific electromagnetic signature of a T-72 tank versus a pickup truck, collapsing the detection and identification tasks into a single pipeline. But these methods require substantial labeled data and are typically trained on specific sensor configurations — they may generalize poorly to new SAR systems, new frequency bands, or new operational environments. Unsupervised methods retain an inherent advantage in operational flexibility, particularly for new satellites or new threat environments where labeled training data does not yet exist.

The most promising path forward is probably hybrid: unsupervised change detection to flag candidate regions, followed by learned classifiers to discriminate true targets from false alarms within those regions. ICEYE's partnership with Satim, and similar collaborations between SAR data providers and AI analytics firms, are early instantiations of exactly this architecture. The POSTECH paper's contribution — a principled, automatic, computationally tractable unsupervised detection method that works in the absence of labeled data — is a well-timed building block for that larger system.

Fifty trucks in a Swedish forest. A few hundred pixels in a radar image taken from space. The physics of microwave scattering says they left a mark. The challenge of a generation of radar engineers has been building the mathematics to find it.

· · · ·

Synthetic Aperture Radar · Change Detection · Fuzzy Logic · GMTI · Coherent Change Detection · ICEYE · GA-ASI Lynx · POSTECH · Ukraine · ISR · Commercial SAR · NRO · Signal Processing

Verified Sources & Formal Citations

[1] Primary Paper: G. Lee and K.-T. Kim, "Unsupervised SAR Change Detection of Small Objects via Fusion of Difference Images," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 19, pp. 10394–10409, 2026. DOI: 10.1109/JSTARS.2026.3674537. https://ieeexplore.ieee.org

[2] Lynx SAR/CCD Technical Description: S. I. Tsunoda et al., "Lynx: A High-Resolution Synthetic Aperture Radar," Sandia National Laboratories / General Atomics, OSTI report. Describes Lynx CCD mode at 0.1 m spotlight and 0.3 m stripmap resolution. https://www.osti.gov/servlets/purl/4263

[3] GA-ASI Lynx Multi-Mode Radar Product Page: General Atomics Aeronautical Systems, Inc. Describes SAR, GMTI/DMTI, Maritime Wide Area Search, Amplitude Change Detection (ACD), and Automated Man Made Object Detection (AMMOD) modes. https://www.ga-asi.com/radars/lynx-multi-mode-radar

[4] Lynx Block 30 1,000 Mission Hours Press Release: General Atomics (May 7, 2010). Describes Lynx CCD algorithms, deployment on Sky Warrior UAS in Iraq, and delivery to multiple U.S. and foreign government customers. https://www.ga.com/lynx-block-30-radar-surpasses-1000-mission-hours-on-sky-warrior-uas

[5] GA-ASI Lynx II Enhanced Radar Development: General Atomics (December 9, 2003). Describes development of the Lynx II SAR/GMTI with 4-inch range resolution accuracy in GMTI mode. https://www.ga.com/general-atomics-to-develop-enhanced-radar-system-for-us-army

[6] Coherent Change Detection — Theoretical Description: M. Preiss and N. J. S. Stacy, "Coherent Change Detection: Theoretical Description and Experimental Results," Defence Science and Technology Organisation (DSTO), Australia, Technical Report DSTO-TR-1851, 2006. [Referenced in the POSTECH paper as [16].] Available via DSTO Australia defense technical reports.

[7] ICEYE Coherent Change Detection Explainer: ICEYE. "See the Invisible, Do the Impossible." Technical blog post on CCD physics, ICEYE Ground Track Repeat, and daily coherence products. https://www.iceye.com/blog/see-the-invisible-do-the-impossible

[8] ICEYE Ukraine Partnership Expansion: AeroTime Hub (January 19, 2026). "Ukraine expands partnership with ICEYE for tactical satellite intelligence." Describes ICEYE-Safran.AI partnership for AI-enhanced SAR analysis. https://www.aerotime.aero/articles/ukraine-iceye-tactical-satellite-intelligence

[9] ICEYE Near-Real-Time Ukraine Monitoring: Militarnyi.com (August 14, 2025). "ICEYE Satellites Enable Near Real-Time Monitoring of Invaders in Ukraine." Quotes ICEYE CEO Rafał Modrzewski on operational use by Ukrainian military intelligence. https://militarnyi.com/en/news/iceye-satellites-enable-near-real-time-monitoring-of-invaders-in-ukraine/

[10] Ukraine War SAR Lessons — Via Satellite: Via Satellite (March 28, 2025). "Ukraine's Space Industry and its Allies Integrate Lessons from 3 Years of Vicious War." Quotes ICEYE and Maxar executives on multi-sensor fusion, ICEYE Dwell product, and commercial satellite innovation driven by wartime needs. https://interactive.satellitetoday.com/via/april-may-2025

[11] NRO Extends Commercial SAR Contracts: Breaking Defense (December 2024). "And Then There Were 3: NRO Extends Contracts for Radar Imagery to Capella, ICEYE, Umbra." Stage III SCE BAA contracts July 2024–July 2026. https://breakingdefense.com/2024/12/and-then-there-were-3-nro-extends-contracts-for-radar-imagery-to-capella-iceye-umbra/

[12] ICEYE 2025 Financial Results and Satellite Launch: ICEYE Press Release (November 29, 2025). ICEYE launches five Gen4 SAR satellites; 62 total on orbit; 16 cm resolution; €250M+ revenue and €1.5B backlog reported. https://www.iceye.com/newsroom/press-releases/iceye-launches-five-new-satellites

[13] NASA Commercial Satellite Data Acquisition Program: NASA Science (2026). Lists ICEYE US (25 cm), Umbra (0.25 m), and Capella Space (0.5–1.2 m) SAR providers with NGA procurement frameworks. https://science.nasa.gov/earth-science/csda/

[14] CSIS Space Battlespace Analysis: CSIS Aerospace Security Program (September 2025). "Extending the Battlespace to Space." Analyzes commercial SAR's role in Ukraine conflict, satellite EW threats, and the competitive space intelligence environment. https://www.csis.org/analysis/chapter-8-extending-battlespace-space

[15] Japan iQPS SAR Transfer to Ukraine: Wes O'Donnell / Medium (May 6, 2025). "How a Secret Japanese Satellite Deal Just Supercharged Ukraine's War Intel." Describes Japan's first transfer of advanced space-based surveillance capability to a foreign military (Ukraine GUR, April 21, 2025). https://wesodonnell.medium.com

[16] Coherence Enhancement for Vehicle Track CCD: Hensoldt Sensors / SmartRadar study. "Enhancing Coherence Images for Coherent Change Detection: An Example on Vehicle Tracks in Airborne SAR Images." Remote Sensing, 13(24), 5010, December 2021. https://www.mdpi.com/2072-4292/13/24/5010

[17] Multi-Scale CCD Analysis: "Multi-Scale Analysis for Coherent Change Detection: A Method for Extracting Typical Changed Area." Remote Sensing, 14(19), 4986, October 2022. Uses ESA Sentinel-1 and ESAR airborne SAR. https://www.mdpi.com/2072-4292/14/19/4986

[18] Ukraine Conflict SAR Change Detection Study: Nature Scientific Reports (March 2026). "Integrating Optical and Radar Satellite Data for Conflict-Related Change Detection in Ukraine." Demonstrates combined SAR/optical fusion for building damage detection. https://www.nature.com/articles/s41598-026-41424-3

[19] Tucker, D., Ash, J. N., & Potter, L. C. (2023). "SAR Coherent Change Detection with Variational Expectation Maximization." IEEE Transactions on Aerospace and Electronic Systems, 59(3), 2163–2175. The VEM baseline method evaluated in the POSTECH paper. https://ieeexplore.ieee.org

[20] SAR Coherent Change Detection Tutorial: Radartutorial.eu. Explains CCD physics, coherence calculation, grass blade phase perturbation example, and military applications including camouflage detection. https://www.radartutorial.eu/20.airborne/ab13.en.html

[21] Commercial Satellite Imagery Market Report: Mordor Intelligence (February 2026). Market size $6.82B in 2025, projected $15.29B by 2032; SAR segment CAGR 13.78%. https://www.mordorintelligence.com/industry-reports/commercial-satellite-imaging-market

[22] NRO Commercial SAR Program Status: Via Satellite (July 11, 2024). "Government is Behind in Creating a Program of Record for Commercial Satellite Imagery." Quotes ICEYE U.S. CEO Eric Jensen; describes $1 billion+ industry investment and 50+ satellites launched. https://www.satellitetoday.com/government-military/2024/07/11

[23] CARABAS Dataset Challenge Reference: M. Lundberg, L. M. H. Ulander, W. E. Pierson, and A. Gustavsson, "A Challenge Problem for Detection of Targets in Foliage," in Algorithms for Synthetic Aperture Radar Imagery XIII, vol. 6237, pp. 160–171, 2006. Defines the CARABAS dataset used in the POSTECH paper. [Referenced in paper as [44].] Available via SPIE Digital Library.

[24] SAR Challenge Problem Dataset Reference: S. M. Scarborough et al., "A Challenge Problem for SAR Change Detection and Data Compression," in Algorithms for Synthetic Aperture Radar Imagery XVII, vol. 7699, pp. 287–291, 2010. Defines the Forest and Urban datasets used as real SAR benchmarks in the POSTECH paper. [Referenced in paper as [17].] Available via SPIE Digital Library.

 

 
