Thursday, October 9, 2025

When Satellites Borrow Tricks from the Cloud

How data center traffic management could solve the growing congestion crisis in space

By the time you finish reading this sentence, dozens of satellites will have crossed overhead. They're part of a new generation of "mega-constellations"—fleets of thousands of satellites orbiting Earth, beaming internet connectivity to even the most remote corners of the planet. But there's a problem: these satellites are starting to get in each other's way.

Not physically—space is still plenty big. The congestion happens in the invisible highways of radio spectrum that carry data between Earth and orbit. Think of it like rush hour on a freeway, except the cars are traveling at the speed of light and the traffic jam is happening 300 miles above your head.

For decades, satellite networks have relied on the same traffic management systems that run the regular internet. But those systems were designed for fiber-optic cables on Earth, not for signals bouncing between moving satellites and ground stations scattered across continents. The result? Sluggish connections, unfair distribution of bandwidth, and wasted spectrum—a particularly precious resource when you're beaming data through space.

Now, engineers are looking to an unlikely source for solutions: the massive data centers that power everything from Netflix to ChatGPT. These facilities have spent years perfecting ways to move enormous amounts of information without creating digital traffic jams. And it turns out, their strategies might be just what satellite networks need.

The Satellite Traffic Problem

To understand why satellites struggle with congestion, you need to know a bit about how the internet normally handles it.

When data travels across the internet—whether it's an email, a video call, or this article—it's broken into tiny chunks called packets. These packets hop from router to router until they reach their destination. The system that governs this flow is called TCP, or Transmission Control Protocol, and it's been the internet's traffic cop since the 1970s.

TCP works like a cautious driver. It starts slowly, gradually accelerating as long as packets arrive successfully at their destination. But the moment a packet gets lost—stuck in a digital traffic jam somewhere—TCP slams on the brakes. It assumes the network is congested and dramatically slows down, waiting for conditions to improve.

This works reasonably well when your data only travels a few milliseconds across fiber-optic cables. But satellites introduce a cruel twist: distance.

A signal traveling to a satellite in low Earth orbit and back takes about 20 to 40 milliseconds—fast enough for a smooth video call. But satellites in higher orbits, particularly those sitting in a fixed position above the equator at 22,000 miles up, introduce delays of 600 milliseconds or more. That's more than half a second for a round trip.

"With those long delays, TCP's feedback loop becomes almost comically slow," explains Saravanan Subramanian, a network engineer who has studied the problem. "By the time a satellite link realizes there's congestion and reacts, conditions may have already changed completely."

Making matters worse, satellite networks don't stay still. Low-orbit satellites zip around the planet every 90 minutes, which means your connection has to jump from satellite to satellite as they pass overhead. Each handoff can confuse TCP, causing it to misinterpret normal hiccups as serious congestion and unnecessarily throttle your connection.

Then there's the fairness problem. Ground stations with bigger antennas and more powerful transmitters naturally get better connections. In a congested network, they can end up hogging most of the available spectrum, leaving scraps for everyone else. TCP, designed for networks where everyone plays by the same rules, has no good way to level the playing field.

What Data Centers Figured Out

While satellite engineers wrestled with these challenges, data center operators faced their own version of the same problem—just on a much smaller scale.

Inside facilities that house thousands of servers powering cloud computing and artificial intelligence, information flies between machines at breathtaking speeds. These systems use a technology called RDMA (Remote Direct Memory Access), which lets computers grab data from each other's memory directly, bypassing the usual software middlemen. It's incredibly fast—we're talking microseconds—but also incredibly sensitive to even tiny delays.

Data center engineers discovered that TCP's wait-for-failure approach was too slow. They needed to catch congestion before packets got lost, not after.

Their solution: Explicit Congestion Notification, or ECN for short.

Instead of waiting for packets to disappear into the digital void, ECN works like an early warning system. Network switches monitor their internal queues—the digital waiting rooms where packets line up before being forwarded. When a queue starts filling up but before it overflows, the switch marks outgoing packets with a special flag that essentially says, "Hey, I'm getting busy here."

When these marked packets reach their destination, the receiving computer sends the message back to the sender: "The network is getting congested—slow down a bit." The sender reduces its transmission rate, easing the pressure before any packets get lost.
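
For the programmers in the audience, the switch-side half of that idea fits in a few lines. The sketch below is a toy illustration in Python; the class names and the 30-packet threshold are invented for this example, not taken from any real switch:

from collections import deque
from dataclasses import dataclass, field

@dataclass
class Packet:
    payload: bytes
    ecn_capable: bool = True                # the sender opted in to ECN
    congestion_experienced: bool = False    # the "I'm getting busy here" flag

@dataclass
class Switch:
    mark_threshold: int = 30                # start marking once 30 packets are waiting
    queue: deque = field(default_factory=deque)

    def enqueue(self, pkt: Packet) -> None:
        # Before the queue overflows, flag ECN-capable packets instead of dropping them.
        if len(self.queue) >= self.mark_threshold and pkt.ecn_capable:
            pkt.congestion_experienced = True
        self.queue.append(pkt)

    def forward(self) -> Packet | None:
        return self.queue.popleft() if self.queue else None

Real switches make this decision in hardware at line rate, but the choice is essentially the same: mark the packet rather than lose it.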

Building on ECN, data center engineers developed an even more sophisticated system called DCQCN (Data Center Quantized Congestion Notification). Instead of just binary "congested" or "not congested" signals, DCQCN provides nuanced feedback. It's the difference between a traffic light that's only red or green versus one that gives you advance warning when it's about to change.

When a sender receives a congestion signal, it quickly cuts its rate in half—like easing off the gas pedal. Then, as conditions improve, it gradually accelerates again. Multiple senders on the same network all follow the same dance, quickly finding a rhythm where everyone gets a fair share without overwhelming the system.
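
The sender's side of that dance can be sketched just as simply. This is again a toy illustration with invented numbers, not the actual DCQCN algorithm, which adapts its back-off factor over time:

class RateController:
    def __init__(self, rate_mbps: float = 100.0, line_rate_mbps: float = 1000.0,
                 recovery_step_mbps: float = 10.0):
        self.rate = rate_mbps
        self.line_rate = line_rate_mbps
        self.step = recovery_step_mbps

    def on_congestion_signal(self) -> None:
        # Marked feedback arrived: back off quickly, roughly halving the rate.
        self.rate = max(self.rate * 0.5, 1.0)

    def on_quiet_interval(self) -> None:
        # No marks for a while: creep back up toward the line rate.
        self.rate = min(self.rate + self.step, self.line_rate)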

This approach has become standard in the massive data centers running modern AI systems, where thousands of graphics processors need to exchange information constantly without stepping on each other's digital toes.

Bringing the Data Center to Space

The parallels between data center congestion and satellite spectrum congestion are striking. Both involve multiple senders competing for shared resources. Both suffer when traditional TCP-style congestion control arrives too late. And both need fairness—ensuring that no single user monopolizes the network.

So how would ECN and DCQCN work for satellites?

Imagine a satellite receiving signals from dozens of ground stations simultaneously, all trying to upload data through the same frequency band. Inside the satellite's electronics, arriving packets queue up waiting to be processed and forwarded—just like in a data center switch.

Under an ECN-based system, when that queue starts filling beyond a certain threshold—say, 30% full—the satellite begins marking packets before sending them on their way. These marks travel through the network and eventually reach the ground stations in the form of acknowledgment messages.

Each ground station, upon receiving these marked acknowledgments, reduces its transmission rate. Crucially, it does so before any packets are actually lost. The satellite's queue stabilizes, packets keep flowing, and spectrum that would have been wasted on retransmissions remains available for useful data.

For downloads from satellites to users on the ground, the system can work in reverse. Satellites could broadcast simple congestion status updates: "I'm lightly loaded—feel free to request more data," or "I'm under heavy load—please back off." User terminals adjust their requests accordingly, preventing any single user from overwhelming the satellite's downlink capacity.

The approach becomes even more powerful when satellites act as intermediaries, with dedicated gateway stations handling the connection between terrestrial networks and the space segment. These gateways could combine ECN-based congestion signaling with traditional quality-of-service techniques, ensuring that urgent traffic (like voice calls) gets priority over bulk data transfers while maintaining fairness across all users.

The Path Forward

Adapting data center techniques to satellite networks isn't as simple as flipping a switch. Satellites use specialized communication protocols, and their software runs on radiation-hardened processors that weren't designed with modern congestion control in mind. Ground stations and user terminals—including millions of consumer satellite dishes—would need firmware updates to understand and respond to congestion signals.

The parameters also need careful tuning. A data center might mark packets when queues are 50 microseconds deep; a satellite might need to set thresholds at tens or hundreds of milliseconds, accounting for the much longer round-trip times involved.

Different orbital altitudes require different approaches. Low-orbit satellites zip around so fast that they need responsive, quick-reacting congestion control. High-orbit satellites, with their leisurely half-second round trips, need gentler algorithms that don't overreact to delayed feedback.

Despite these challenges, the potential benefits are substantial. Computer simulations suggest that ECN-based satellite networks could achieve 40-60% reductions in wasted spectrum from retransmissions compared to traditional TCP. Fairness improves dramatically, with even modest ground stations able to maintain reasonable connections rather than being drowned out by more powerful neighbors. And crucially, the approach scales—it works just as well with ten ground stations as with a thousand.

Some of the necessary groundwork is already happening. International standards bodies are updating satellite communication protocols to support modern congestion control features. The next generation of satellite terminals being designed today could include ECN capability from the start rather than requiring retrofitting.

A Cosmic Irony

There's something delightfully ironic about solving space-age problems with techniques developed for earthbound data centers. But it speaks to a deeper truth: whether information is flowing between servers in a climate-controlled warehouse or between satellites hurtling through the vacuum of space, the fundamental challenges are surprisingly similar.

As mega-constellations continue to grow—with some companies planning networks of tens of thousands of satellites—the need for smarter spectrum management will only intensify. The approach that worked fine with a few dozen satellites sharing the sky won't suffice when hundreds occupy the same frequency bands.

By borrowing proven strategies from data centers and adapting them to the unique demands of satellite communications, engineers can help ensure that our growing orbital infrastructure delivers on its promise: fast, reliable internet access for everyone on Earth, regardless of where they live.

The traffic jam in space is real. But just as rush hour eventually ends on earthly highways, clever engineering can help information flow freely along our cosmic ones.


Further Reading

  • Subramanian, S. R. "Spectrum Congestion Control: ECN/DCQCN Insights." The Data Scientist, 2025. https://thedatascientist.com/spectrum-congestion-control-ecn-dcqcn-insights

  • Zhu, Y., et al. "Congestion Control for Large-Scale RDMA Deployments." ACM SIGCOMM, 2015.

  • 3GPP Technical Report 38.821: "Solutions for NR to support non-terrestrial networks," 2021.

  • Cardwell, N., et al. "BBR: Congestion-Based Congestion Control." Communications of the ACM, 2017.


Explicit Congestion Signaling for Spectrum Management in Non-Terrestrial Networks: A Survey of ECN/DCQCN Adaptation Strategies

Abstract

Non-terrestrial networks (NTNs), particularly low Earth orbit (LEO) satellite constellations, face increasing spectrum congestion as deployment scales accelerate. Traditional Transmission Control Protocol (TCP) congestion control mechanisms demonstrate poor performance in high-latency, dynamic-topology space networks due to their reliance on loss-based feedback. This paper examines the adaptation of data center congestion control techniques—specifically Explicit Congestion Notification (ECN) and Data Center Quantized Congestion Notification (DCQCN)—to satellite spectrum management. We analyze the fundamental limitations of loss-based algorithms in space-air-ground integrated networks, present architectural frameworks for explicit signaling in orbital segments, and discuss implementation considerations for uplink/downlink control planes. Our survey indicates that proactive congestion signaling offers significant improvements in fairness, spectrum efficiency, and throughput stability compared to conventional TCP variants across LEO, MEO, and GEO orbital regimes.

I. Introduction

The proliferation of mega-constellations has fundamentally altered the landscape of satellite communications. As of 2024, multiple operators have deployed thousands of satellites in LEO, with tens of thousands more planned for launch in the coming years. This rapid expansion creates unprecedented demand for limited spectrum resources, particularly in Ku-band and Ka-band frequencies allocated for satellite services.

Traditional terrestrial congestion control algorithms were designed for stable, low-latency networks with relatively predictable path characteristics. Satellite networks, by contrast, exhibit round-trip times (RTTs) ranging from approximately 20-40 ms for LEO constellations to 600 ms for geostationary (GEO) satellites, combined with frequent handovers, variable link quality due to atmospheric conditions, and dynamic topology changes as satellites move relative to ground stations and each other.

Recent work in data center networking has demonstrated the effectiveness of explicit congestion signaling for managing high-throughput, latency-sensitive traffic in Remote Direct Memory Access over Converged Ethernet version 2 (RoCEv2) environments. The success of ECN and DCQCN in these contexts suggests potential applicability to spectrum congestion management in NTNs, where similar requirements for predictable latency and fair resource allocation exist.

II. Background and Related Work

A. Congestion Control in Terrestrial Networks

Loss-based congestion control algorithms, including TCP Reno, CUBIC, and their variants, infer network congestion from packet loss events. These algorithms implement Additive Increase Multiplicative Decrease (AIMD) strategies that increase the congestion window during periods of successful transmission and reduce it sharply upon detecting loss.

The fundamental limitation of loss-based approaches in satellite networks stems from the "square-root effect," whereby throughput is inversely proportional to the round-trip time and to the square root of the loss probability. With RTTs exceeding 500 ms for GEO satellites and loss rates elevated by wireless channel characteristics, achievable throughput becomes severely constrained.
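
For reference, the widely used Mathis approximation makes this dependence explicit, where MSS is the maximum segment size and p the loss probability:

\[
\mathrm{Throughput} \approx \frac{\mathrm{MSS}}{\mathrm{RTT}} \sqrt{\frac{3}{2p}}
\]

Doubling the RTT halves the achievable throughput, while the loss rate must fall by a factor of four to double it.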

Delay-based algorithms such as BBR (Bottleneck Bandwidth and RTT) attempt to estimate available capacity through active probing rather than waiting for loss signals. However, in dynamic satellite topologies with frequent inter-satellite link (ISL) reconfigurations and beam handovers, delay measurements become unreliable indicators of congestion state.

B. RoCEv2 and DCQCN Architecture

RoCEv2 supports RDMA operations over standard Ethernet and IP infrastructure, enabling zero-copy data transfers with minimal CPU involvement. To maintain the microsecond-level latencies required for high-performance computing applications, RoCEv2 networks employ two complementary mechanisms:

  1. Priority Flow Control (PFC): A link-layer mechanism that issues PAUSE frames to temporarily halt transmission on specific priority queues when buffer occupancy exceeds thresholds. While effective at preventing packet loss, excessive PFC activation can trigger head-of-line blocking and congestion spreading.

  2. Explicit Congestion Notification (ECN): An IP-layer signaling mechanism defined in RFC 3168 where network switches mark the ECN field in packet headers when queue depths exceed configured thresholds. Endpoints receiving ECN-marked acknowledgments reduce their transmission rates proactively.

DCQCN extends ECN with a quantized rate-adjustment algorithm specifically tuned for RoCEv2 environments. Upon receiving congestion notification, senders reduce their rate by a multiplicative factor α (typically 0.5), then gradually recover using an additive increase governed by timer-based rate adjustments. This approach achieves rapid convergence to fairness across competing flows while maintaining high link utilization.

C. Satellite Network Characteristics

Modern NTN architectures comprise three segments:

  • Space segment: Constellation of satellites with ISLs forming a dynamic mesh topology
  • Ground segment: Gateway stations, telemetry/tracking/command (TT&C) facilities, and network operations centers
  • User segment: End-user terminals with phased-array antennas for beam tracking

LEO constellations operate at altitudes of roughly 500-2,000 km, providing round-trip latencies of 20-40 ms but requiring handovers every 2-10 minutes as satellites move across the user terminal's field of view. MEO systems at 8,000-20,000 km altitude offer more stable connectivity with RTTs of 80-150 ms. GEO satellites at 35,786 km provide persistent coverage but introduce RTTs approaching 600 ms.

The limited electromagnetic spectrum allocated for satellite services—including C-band (4-8 GHz), Ku-band (12-18 GHz), and Ka-band (26.5-40 GHz)—must be shared among multiple operators and thousands of simultaneous user connections. Frequency reuse through spot beams and polarization diversity improves spectral efficiency but introduces interference management challenges.

III. Limitations of TCP in Satellite Spectrum

A. Feedback Loop Disruption

TCP's self-clocking behavior relies on the arrival of acknowledgments to trigger transmission of new data. With RTTs exceeding 500 ms for GEO links, the bandwidth-delay product becomes substantial—potentially hundreds of megabits of data in flight. Standard TCP window scaling becomes insufficient, and any loss event triggers prolonged recovery periods during which spectrum remains underutilized.

LEO and MEO constellations introduce additional complexity through frequent handovers. When a user terminal transitions from one satellite to another, path characteristics change abruptly. Out-of-order packet delivery—a natural consequence of asymmetric routing through ISLs—is often misinterpreted by TCP as loss, triggering unnecessary retransmissions and window reductions.

B. Loss Misattribution

Satellite links experience packet loss from multiple sources beyond congestion:

  • Atmospheric attenuation during rain fade events
  • Doppler shift effects requiring tracking and compensation
  • Antenna pointing errors during handover transitions
  • Interference from adjacent satellites or terrestrial systems

Loss-based congestion control cannot distinguish between these physical-layer impairments and actual buffer overflow events. Consequently, TCP reacts to all losses identically, reducing throughput even when additional spectrum capacity remains available.

C. Fairness Degradation

Ground stations with superior antenna systems, higher transmission power, or more favorable geographic positions naturally achieve higher signal-to-noise ratios (SNR). In contention-based access schemes, these stations capture a disproportionate share of available spectrum. Loss-based TCP exacerbates this imbalance, as stations experiencing better channel conditions maintain larger congestion windows and sustain higher throughput.

The Jain fairness index for TCP flows over satellite links often falls below 0.7, indicating significant inequity in resource allocation. This creates particular challenges for operators attempting to provide consistent service level agreements (SLAs) across diverse user populations.
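
Jain's index for n flows with throughputs x_i is (Σ x_i)² / (n Σ x_i²), equal to 1.0 when all flows receive identical shares. A small Python helper with illustrative numbers shows how a single dominant station pulls the index down:

def jain_fairness(throughputs: list[float]) -> float:
    """Jain's fairness index: 1.0 is perfectly fair, 1/n means one flow takes everything."""
    n = len(throughputs)
    total = sum(throughputs)
    return (total * total) / (n * sum(x * x for x in throughputs))

# One well-equipped gateway at 40 Mbps alongside three small terminals at 10 Mbps each.
print(jain_fairness([40.0, 10.0, 10.0, 10.0]))  # ~0.64, below the 0.7 figure cited above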

IV. ECN/DCQCN Adaptation for Satellite Spectrum

A. Architectural Framework

Implementing explicit congestion signaling in NTNs requires modifications at multiple layers:

Network Layer: Satellite payload processors must monitor queue depths at both uplink and downlink interfaces and mark ECN-capable transport (ECT) packets when instantaneous or time-averaged queue occupancy exceeds configured thresholds. For constellations with ISLs, intermediate satellites in multi-hop paths contribute additional marks as congestion propagates through the space segment.

Transport Layer: Ground stations and user terminals implement rate-control algorithms responsive to ECN feedback. Rather than TCP's binary loss signal, endpoints receive continuous congestion information enabling proportional rate adjustments.

MAC Layer: Dynamic modulation and coding schemes (ModCod) adapt to link quality independently of congestion control. Separating physical-layer adaptation from network-layer rate control prevents conflation of channel impairments with buffer congestion.

B. Uplink Congestion Control

Multiple ground stations sharing the same uplink frequency band create the classic many-to-one incast problem familiar from data center networks. Without coordination, simultaneous transmissions cause collisions and force retransmissions that waste spectrum.

An ECN-based uplink control mechanism operates as follows:

  1. Satellite receivers monitor buffer occupancy for each spot beam
  2. When queue depth exceeds the marking threshold K_min, subsequent ECT-capable packets are marked with the Congestion Experienced (CE) codepoint
  3. Marked packets are forwarded to destinations with ECN indication preserved
  4. Returning acknowledgments or control messages carry CE information to ground stations
  5. Upon receiving ECN feedback, ground stations reduce transmission rate using AIMD with parameters tuned for satellite RTT

For LEO constellations with 30 ms RTT, appropriate parameters might include:

  • Multiplicative decrease factor α = 0.5
  • Additive increase rate β = 10 Mbps per RTT
  • Minimum marking threshold K_min = 50 packets
  • Maximum marking threshold K_max = 200 packets

These values balance responsiveness against stability, allowing rapid reaction to congestion onset while avoiding excessive rate oscillation.
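
The satellite-side marking decision implied by K_min and K_max can be sketched as a RED-style probabilistic rule. The linear ramp between the two thresholds below is an illustrative assumption; the survey specifies only the thresholds themselves:

import random

K_MIN = 50    # packets: begin probabilistic marking
K_MAX = 200   # packets: mark every ECT packet beyond this depth

def should_mark(queue_depth: int) -> bool:
    """Marking decision for an ECT packet arriving at a spot-beam uplink queue."""
    if queue_depth <= K_MIN:
        return False
    if queue_depth >= K_MAX:
        return True
    # Marking probability rises linearly from 0 at K_MIN to 1 at K_MAX.
    return random.random() < (queue_depth - K_MIN) / (K_MAX - K_MIN)

Ground stations then apply the α and β adjustments above, once per RTT, as CE-marked feedback arrives.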

C. Downlink Congestion Control

Satellite downlink transmissions to multiple user terminals create a one-to-many distribution scenario. Rather than individual per-flow marking, satellites can implement quantized feedback broadcast to all terminals within a spot beam:

  • Level 0 (Green): Queue occupancy < 30% → terminals may increase rates gradually
  • Level 1 (Yellow): Queue occupancy 30-60% → terminals maintain current rates
  • Level 2 (Orange): Queue occupancy 60-80% → terminals reduce rates by 25%
  • Level 3 (Red): Queue occupancy > 80% → terminals reduce rates by 50%

Quantized feedback reduces control-plane overhead compared to per-packet ECN marking while providing sufficient granularity for effective rate control. Terminals implement quantized AIMD adjustments synchronized to broadcast feedback intervals, typically 1-10 RTTs depending on constellation geometry.
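
A minimal sketch of this quantized loop, with the satellite mapping buffer occupancy to a level and each terminal applying the corresponding adjustment at every feedback interval (the increase step is an illustrative assumption):

def feedback_level(queue_occupancy: float) -> int:
    """Map spot-beam buffer occupancy (0.0-1.0) to the four broadcast levels above."""
    if queue_occupancy < 0.30:
        return 0  # green
    if queue_occupancy < 0.60:
        return 1  # yellow
    if queue_occupancy < 0.80:
        return 2  # orange
    return 3      # red

def adjust_terminal_rate(rate_mbps: float, level: int, step_mbps: float = 2.0) -> float:
    """Terminal-side reaction applied once per broadcast feedback interval."""
    if level == 0:
        return rate_mbps + step_mbps   # increase gradually
    if level == 1:
        return rate_mbps               # hold
    if level == 2:
        return rate_mbps * 0.75        # reduce by 25%
    return rate_mbps * 0.50            # reduce by 50%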

D. Hybrid Gateway QoS Integration

Satellite gateways serving as aggregation points between terrestrial networks and space segments benefit from combining ECN-based rate control with traditional QoS mechanisms:

Weighted Fair Queuing (WFQ): Allocates bandwidth proportionally among traffic classes, preventing any single flow or user from monopolizing capacity

Deficit Round Robin (DRR): Provides fairness with lower computational complexity than WFQ, suitable for high-throughput gateway processors

Class-Based Queuing (CBQ): Separates latency-sensitive traffic (e.g., VoIP, video conferencing) from bulk data transfers, applying ECN marking thresholds appropriate to each class

This hybrid approach maintains fairness at multiple timescales: ECN provides rapid, sub-RTT feedback for instantaneous congestion, while WFQ/DRR enforce longer-term resource allocations aligned with service agreements.
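
As an illustration of the scheduling half of this hybrid, a minimal Deficit Round Robin pass over per-class queues might look like the sketch below; the quantum and class handling are assumptions for illustration rather than gateway specifics:

from collections import deque

class DRRScheduler:
    def __init__(self, quantum_bytes: int = 1500):
        self.quantum = quantum_bytes
        self.queues: dict[str, deque] = {}   # traffic class -> queued packet sizes (bytes)
        self.deficit: dict[str, int] = {}    # traffic class -> accumulated byte credit

    def enqueue(self, traffic_class: str, packet_bytes: int) -> None:
        self.queues.setdefault(traffic_class, deque()).append(packet_bytes)
        self.deficit.setdefault(traffic_class, 0)

    def next_round(self) -> list[tuple[str, int]]:
        """One DRR round: each backlogged class may send up to its deficit counter."""
        sent = []
        for traffic_class, q in self.queues.items():
            if not q:
                continue
            self.deficit[traffic_class] += self.quantum
            while q and q[0] <= self.deficit[traffic_class]:
                size = q.popleft()
                self.deficit[traffic_class] -= size
                sent.append((traffic_class, size))
            if not q:
                self.deficit[traffic_class] = 0   # idle classes do not bank credit
        return sent

ECN marking thresholds can then be applied per class, so latency-sensitive queues begin signaling congestion earlier than bulk-transfer queues.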

V. Performance Analysis

A. Fairness Improvements

Simulation studies and testbed experiments with ECN-enabled satellite links demonstrate Jain fairness indices exceeding 0.9 across heterogeneous ground station populations. By decoupling rate control from physical-layer signal quality, explicit congestion signaling prevents capture effects that plague loss-based TCP.

The quantized AIMD algorithm used in DCQCN achieves faster convergence to fair rate allocation compared to delay-based schemes. When N flows compete for shared satellite spectrum, DCQCN-based control reaches equilibrium within O(log N) RTTs, whereas delay-based algorithms require O(N) RTTs.

B. Spectrum Efficiency

Proactive congestion signaling reduces retransmission overhead by 40-60% compared to loss-based TCP in satellite environments with 1% base loss rate. By reacting to queue buildup before buffer overflow occurs, ECN-based systems maintain higher effective throughput and reduce wasted spectrum from redundant transmissions.

For LEO constellations with frequent handovers, ECN marking persists across satellite transitions, providing continuity in congestion feedback that loss-based mechanisms cannot match. This results in 25-35% throughput improvements during handover intervals compared to standard TCP CUBIC.

C. Latency Reduction

ECN marking at lower queue thresholds reduces queuing delay compared to systems that wait for loss events. For interactive applications requiring bounded latency—such as remote sensing, command-and-control systems, or real-time telemetry—maintaining queuing delays below 100 ms becomes feasible with explicit signaling.

Data center experience shows that DCQCN maintains 99th percentile latencies below 500 µs for RoCEv2 traffic. While absolute latencies in satellite networks remain higher due to propagation delay, the relative reduction in queuing delay provides similar benefits for latency-sensitive applications.

VI. Implementation Considerations

A. Protocol Modifications

Existing satellite terminals and modems require firmware updates to support ECN-capable transport protocols. For systems using UDP-based proprietary protocols, ECN functionality can be implemented through:

  • Custom congestion notification fields in application-layer headers
  • Out-of-band control channels for feedback signaling
  • Integration with existing satellite-specific protocols (e.g., DVB-S2, DVB-RCS2)

IETF working groups have proposed extensions to QUIC and HTTP/3 for improved satellite performance, including ECN support and congestion control algorithms tuned for high-latency environments.

B. Parameter Tuning

Optimal ECN marking thresholds and AIMD parameters depend on constellation characteristics:

LEO (30-40 ms RTT):

  • K_min = 50-100 packets
  • α = 0.5 (50% rate reduction)
  • β = 5-10 Mbps per RTT additive increase

MEO (80-150 ms RTT):

  • K_min = 150-300 packets
  • α = 0.5
  • β = 2-5 Mbps per RTT

GEO (500-600 ms RTT):

  • K_min = 500-1000 packets
  • α = 0.4 (gentler reduction for slower feedback)
  • β = 1-2 Mbps per RTT

These parameters require validation through simulation and field trials across diverse traffic patterns and network conditions.
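
Expressed as configuration, the profiles above reduce to a simple lookup; the field names are assumptions, and the values merely restate the ranges listed:

ECN_PROFILES = {
    "LEO": {"rtt_ms": (30, 40),   "k_min_packets": (50, 100),   "alpha": 0.5, "beta_mbps_per_rtt": (5, 10)},
    "MEO": {"rtt_ms": (80, 150),  "k_min_packets": (150, 300),  "alpha": 0.5, "beta_mbps_per_rtt": (2, 5)},
    "GEO": {"rtt_ms": (500, 600), "k_min_packets": (500, 1000), "alpha": 0.4, "beta_mbps_per_rtt": (1, 2)},
}

def profile_for(orbit: str) -> dict:
    # Starting points only; simulation and field trials would refine these values.
    return ECN_PROFILES[orbit.upper()]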

C. Interoperability and Deployment

Incremental deployment presents challenges in networks with mixed ECN-capable and legacy terminals. Satellite operators can adopt several strategies:

  1. Per-beam enablement: Activate ECN on spot beams with high concentrations of updated terminals
  2. Fallback mechanisms: Maintain PFC or legacy TCP for non-ECN traffic
  3. Gateway-assisted marking: Implement ECN at gateways for terrestrial-satellite boundary

Standards development through 3GPP (5G NTN specifications), IETF (QUIC satellite profile), and ITU-R (spectrum management recommendations) will facilitate interoperability across multi-vendor deployments.

VII. Future Research Directions

Several open questions remain for ECN/DCQCN adaptation in satellite networks:

Machine Learning Integration: Can reinforcement learning algorithms optimize ECN marking thresholds and AIMD parameters dynamically based on observed traffic patterns and constellation state?

Cross-Layer Optimization: How should congestion control coordinate with physical-layer adaptive coding/modulation, beam steering, and power control for system-wide efficiency?

Multi-Constellation Scenarios: When user terminals connect to satellites from multiple operators, how can ECN feedback be federated across administrative boundaries while preserving fairness and preventing gaming?

Application-Specific Tuning: Do different application classes (bulk transfer, streaming media, IoT telemetry) benefit from distinct ECN parameter profiles, and how can these be implemented without excessive complexity?

VIII. Conclusion

The adaptation of data center congestion control techniques to satellite spectrum management represents a promising direction for improving NTN performance as deployment scales increase. Explicit congestion signaling through ECN and DCQCN-inspired algorithms addresses fundamental limitations of loss-based TCP in high-latency, dynamic-topology space networks.

By providing early, continuous feedback on network congestion state, ECN-based systems achieve superior fairness, spectrum efficiency, and latency characteristics compared to conventional approaches. The success of these techniques in data center RoCEv2 environments—where microsecond-level latencies and lossless operation are required—suggests strong potential for satellite applications where millisecond-level latencies and efficient spectrum utilization are paramount.

Practical deployment requires careful parameter tuning, protocol modifications, and coordination between satellite operators, terminal vendors, and standards bodies. Field trials measuring fairness metrics, spectrum efficiency, and application-level performance across diverse orbital regimes will be essential for validating theoretical predictions and refining implementation strategies.

As mega-constellations continue to expand and user demand for satellite connectivity grows, proactive congestion control mechanisms will become increasingly critical for delivering performance approaching terrestrial fiber quality. The transition from loss-based to signal-based congestion management represents a key enabler for next-generation satellite networks serving cloud applications, enterprise connectivity, and ubiquitous global broadband access.

References

[1] S. R. Subramanian, "Spectrum Congestion Control: ECN/DCQCN Insights," The Data Scientist, 2025. [Online]. Available: https://thedatascientist.com/spectrum-congestion-control-ecn-dcqcn-insights

[2] Y. Zhu et al., "Congestion Control for Large-Scale RDMA Deployments," ACM SIGCOMM Computer Communication Review, vol. 45, no. 4, pp. 523-536, 2015.

[3] IEEE 802.1Qau, "Congestion Notification," IEEE Standard, 2010.

[4] K. Ramakrishnan, S. Floyd, and D. Black, "The Addition of Explicit Congestion Notification (ECN) to IP," RFC 3168, IETF, 2001.

[5] 3GPP TR 38.821, "Solutions for NR to support non-terrestrial networks (NTN)," 3rd Generation Partnership Project, 2021.

[6] C. Caini et al., "TCP Hybla: A TCP Enhancement for Heterogeneous Networks," International Journal of Satellite Communications and Networking, vol. 22, no. 5, pp. 547-566, 2004.

[7] N. Cardwell et al., "BBR: Congestion-Based Congestion Control," Communications of the ACM, vol. 60, no. 2, pp. 58-66, 2017.

[8] ITU-R S.1432, "Apportionment of the Allowable Error Performance Degradations to Fixed-Satellite Service (FSS) Hypothetical Reference Digital Paths Arising from Time Invariant Interference," International Telecommunication Union, 2000.


Author Biography: This survey synthesizes recent developments in satellite congestion control based on techniques adapted from data center networking research and operational insights from wide area network engineering practice.

 

Tuesday, October 7, 2025

AI Software Faces Profitability Crisis


Why Most AI Startups Are Bad Businesses - YouTube

Most AI LLM services are loss leaders. Per the video's bottom line, Anthropic's Claude is the only LLM that is currently profitable, and by a large amount. Interactive Graphic

Unsustainable Economics Threaten Industry

Razor-thin margins and soaring compute costs challenge viability of generative AI companies, with even industry leaders operating at substantial losses

SAN FRANCISCO — The artificial intelligence boom that has captivated Silicon Valley and attracted billions in venture capital is confronting a harsh economic reality: the vast majority of AI software companies cannot generate profits under current business models, raising fundamental questions about the industry's long-term viability.

The stark assessment comes from both industry veterans and financial disclosures. Daria Kulikova, a senior full-stack product manager with eight years of experience building B2B and B2B2C SaaS products—including a LegalTech platform generating over $500 million in annual recurring revenue—has emerged as a prominent voice challenging the AI industry's sustainability. Through her YouTube channel analyzing technology trends, Kulikova has documented how AI-native companies operate under fundamentally different economics than traditional software businesses.

While traditional software-as-a-service companies routinely achieve gross margins between 70% and 90%, generative AI startups are struggling with margins of just 30% to 60% at best, according to multiple industry reports and financial disclosures. Even mature AI products operate far below traditional software economics, with AI-powered software businesses showing 55% gross margins compared to traditional SaaS companies with 85% margins, leaving far less room for pricing errors and operating leverage.

The margin crisis extends to the industry's most prominent players. Bessemer Venture Partners' 2025 data shows fast-growing AI "Supernovas" averaging only 25% gross margins, while steadier companies trend closer to 60%, with many AI Supernovas showing negative gross margins — a phenomenon rarely seen in software.

Experience From the Trenches

Kulikova's analysis draws on her track record leading four end-to-end SaaS products and over 50 feature developments, including a B2B digital accessibility platform that scaled its user base fivefold within a year. Her perspective contrasts sharply with the venture-fueled optimism dominating AI industry discourse.

"Traditional SaaS after initial R&D and platform investment—serving more customers adds very little extra cost," Kulikova explained in her analysis. "In AI-native or GPT wrapper products, there are major ongoing costs per user: API calls, compute time, licensing, sometimes per-output moderation."

The distinction proves critical. In traditional software, the main variable costs are service-related expenses such as customer success managers or support specialists. In AI-native products, computational costs scale directly with usage, forcing companies to implement usage caps to manage expenses.

Foundational Models Bleed Cash

The economics prove challenging even for companies building foundational models. OpenAI expected approximately $5 billion in losses on $3.7 billion in revenue last year, and the company reported $4.3 billion in first-half 2025 revenue but incurred $8.5 billion in expenses, yielding a $4.7 billion loss.

OpenAI now generates $10 billion in annual recurring revenue, but CEO Sam Altman stated the company should prioritize growth and investments in training and compute "for a long time," even if it delays profitability, saying the rational approach is to "be willing to run the loss for quite a while".

Anthropic, maker of the Claude AI assistant, faces similar challenges. The company's annualized revenue jumped from $1 billion to $3 billion in just five months through May 2025, and by August 2025 reached over $5 billion in run-rate revenue. However, Anthropic still expects to lose $3 billion in 2025 because its models remain so costly to train and serve.

The Unit Economics Problem

The fundamental issue lies in AI's cost structure. In traditional SaaS, the big upfront cost is product development, and each additional copy sold afterward is nearly pure profit. In AI, every unit of output carries material costs, much like manufacturing widgets in a factory: cost of goods sold matters again.

Each user interaction requires API calls, GPU compute time, and often per-output moderation—expenses that scale directly with usage. ChatGPT was at one point costing OpenAI an estimated $700,000 per day, though costs have since declined to between $100,000 and several hundred thousand dollars daily.

The problem intensifies with heavy users. OpenAI CEO Sam Altman admitted in January 2025 that the company is losing money on its $200-per-month ChatGPT Pro plan because people are using it more than expected, with some users costing more than $200 monthly to serve.

Microsoft's GitHub Copilot experienced similar challenges. The Wall Street Journal reported Microsoft's GitHub Copilot is losing an average of $20 per month per user, with some users costing as much as $80 monthly, while Copilot charges $10 per month.

Conversion Crisis

Beyond operational costs, AI companies face severe challenges converting free users to paying customers. Despite ChatGPT's massive reach, with an estimated 500-600 million monthly active users, the platform has only 15.5 million paying subscribers, representing a conversion rate of approximately 2.6%.

Kulikova, whose product management experience includes optimizing conversion funnels for enterprise platforms, characterized this as "alarmingly low" for a product positioned as a worldwide disruptor. "In product management terms, your product-market fit goes out the window or you never had it in the first place," she noted.

OpenAI revealed 20 million paying subscribers and over 500 million weekly active users, choosing to report weekly rather than monthly metrics, with weekly active users rising by 100 million even as site traffic remained unchanged.

Kulikova questioned the reliance on weekly metrics, suggesting they represent "vanity metrics"—impressive numbers that don't necessarily translate to sustainable revenue. "Monthly is a much more common SaaS metric than weekly," she observed. "Why weekly? Does it really translate to revenue?"

Pricing Wars Intensify Pressure

The sector faces mounting pressure from aggressive price competition. OpenAI's GPT-5 models offer dramatically lower pricing than Anthropic's Claude alternatives, with Claude Opus 4 costing up to 50 times more for output than GPT-5's most affordable tier.

Microsoft has introduced a $30 Copilot add-on for Office and OpenAI launched a $200 ChatGPT Pro plan, with higher price points aiming to ensure revenue covers hefty compute costs. However, premium pricing strategies risk customer pushback in competitive markets.

GitHub introduced "premium requests" in April 2025, imposing rate limits when users switch to AI models beyond the base model, with Copilot Pro users receiving 300 monthly premium requests and additional requests costing $0.04 each.

"This paywall is necessary to prevent runaway costs from a small set of very heavy users," Kulikova explained. "But the paradox is that when they do put in usage controls, users don't like the experience."

Dependency Risks

The challenges extend beyond foundational model providers to companies building atop them. Anthropic's revenue concentration shows approximately $1.2 billion of the company's $4 billion revenue milestone came from just two customers: coding applications Cursor and GitHub Copilot, representing nearly a quarter of income.

Cursor reportedly sends 100% of its revenue to Anthropic, which uses that money to build Claude Code, a competitor to Cursor. Cursor was deeply unprofitable even before Anthropic added service tiers that increased enterprise pricing.

Kulikova described this dynamic as symptomatic of industry-wide problems. "If a GPT wrapper startup puts a low price on AI usage or offers a beefy premium and doesn't limit expensive features, a minority of users can generate costs that will scale exponentially," she said.

Rare Success Cases

Despite widespread struggles, certain applications demonstrate sustainable economics. Companies with control over which models to use and workflow depth can achieve better margins, as classic SaaS was essentially a wrapper over databases and public cloud yet sustained 70-80% gross margins.

Drawing on her LegalTech experience, Kulikova identified successful AI businesses as those working with large volumes of text-based data in industries including accounting, HR, sales, and legal services. "The vast majority of GenAI startups that work are apps that work with large amounts of text-based data and documents in various industries," she said, citing examples of tools that stitch together contract data with invoices and automate communications between parties.

Anthropic's Claude Code has quickly generated over $500 million in run-rate revenue with usage growing more than 10x in three months, demonstrating strong demand for developer-focused AI tools.

However, Kulikova cautioned that even successful applications face limitations. "It's not going to make billions in revenue, but it is a sustainable and viable business model," she said of contract automation tools.

The Product Management Litmus Test

Based on her experience scaling products from concept to launch, Kulikova proposed a framework for evaluating AI business viability: "A traditional SaaS startup with AI features for tasks that can be automated—working off of traditional SaaS benchmarks, not bubbled up AI metrics."

The critical question, she argued, is whether products deliver value independent of AI capabilities. "The point where you know that your product is valuable is when it is valuable without AI and when it could solve the real problem without AI," Kulikova explained. "If there is an AI component in it and it makes something better or faster, fantastic. But AI should not be the defining factor of the product."

Market Outlook

Industry executives remain divided on AI's economic trajectory. Microsoft CEO Satya Nadella suggested AI could boost global GDP growth to 10% annually, while Wharton School researchers project a more modest 1.5% increase in productivity and GDP by 2035.

A McKinsey survey showed that of companies reporting cost reductions from AI, most had savings of less than 10%, while companies with revenue increases saw gains of less than 5%.

Technology executives warned against an agentic AI hype cycle, cautioning that investors shouldn't expect profitability in the next three to five years.

AI Supernova startups reach approximately $40 million ARR in their first year and $125 million in their second year, but often with fragile retention and thin margins, contrasting with "Shooting Star" companies that grow from $3 million to $100 million over four years with strong product-market fit and healthy margins.

Strategic Imperatives

Industry experts recommend AI companies maintain gross margins above 50% and an LTV:CAC ratio of 3:1 or higher, warning that ratios below 1:1 mean "selling dollars for 90 cents".

The key question for viable AI products is whether use cases require top models for every request or can meet quality bars by routing most traffic to cheaper models and bursting to frontier models when needed.
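
In practice that decision is often a cost-aware dispatch function. The sketch below is purely illustrative, with placeholder model names and an arbitrary heuristic rather than any company's production logic:

CHEAP_MODEL = "small-model"        # placeholder identifiers, not real products
FRONTIER_MODEL = "frontier-model"

def pick_model(prompt: str, needs_high_quality: bool) -> str:
    # Crude heuristic: short, routine requests rarely justify frontier-model pricing.
    if needs_high_quality or len(prompt) > 2000:
        return FRONTIER_MODEL
    return CHEAP_MODEL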

Kulikova's conclusion reflects her years navigating enterprise software economics: "It's easy to get a user to try a product with big promises and even more so with FOMO and scare tactics. What's really difficult is to deliver value—value that a few can replicate."

She pointed to traditional metrics as the ultimate arbiter of success. "Retention and conversion rates—those two things will outlive blown up VC rounds or hype waves," Kulikova said. "The hype won't last forever, but boring, profitable businesses will."

As the sector matures, companies face mounting pressure to demonstrate sustainable unit economics rather than relying indefinitely on venture capital. The coming years will likely separate viable businesses from those dependent on continued investor enthusiasm and media hype—a distinction product managers like Kulikova, with experience building platforms that generate hundreds of millions in recurring revenue, understand intimately.


Sources

  • Monetizely. "AI Pricing in 2025: Monetizely's Strategy for Costing." April 22, 2025. https://www.getmonetizely.com/blogs/ai-pricing-how-much-does-ai-cost-in-2025
  • Jaipuria, Tanay. "The State of AI Gross Margins in 2025." September 2, 2025. https://www.tanayj.com/p/the-gross-margin-debate-in-ai
  • MacroTrends. "C3.ai Profit Margin 2020-2025." https://www.macrotrends.net/stocks/charts/AI/c3ai/profit-margins
  • Zitron, Ed. "Why Everybody Is Losing Money On AI." Where's Your Ed At. September 2025. https://www.wheresyoured.at/why-everybody-is-losing-money-on-ai/
  • Bessemer Venture Partners. "The State of AI 2025." August 15, 2025. https://www.bvp.com/atlas/the-state-of-ai-2025
  • Morgan Stanley. "5 AI Trends Shaping Innovation and ROI in 2025." 2025. https://www.morganstanley.com/insights/articles/ai-trends-reasoning-frontier-models-2025-tmt
  • Arc5 Ventures. "5 Critical Cost Metrics Every AI Startup Must Track in 2025." 2025. https://arc5ventures.com/blog/ai-startup-cost-metrics-2025
  • Bank of America / CFO Dive. "Artificial intelligence may boost profit margins 2% over next five years: BofA." September 12, 2024. https://www.cfodive.com/news/artificial-intelligence-boost-profit-margins-five-years-GenAI-bofa/726910/
  • Penn Wharton Budget Model. "The Projected Impact of Generative AI on Future Productivity Growth." September 2025. https://budgetmodel.wharton.upenn.edu/issues/2025/9/8/projected-impact-of-generative-ai-on-future-productivity-growth
  • Visual Capitalist. "Visualizing AI's Effect on Industry Margins Over the Next Five Years." October 8, 2024. https://www.visualcapitalist.com/ais-effect-on-industry-margins-over-five-years/
  • CNBC / Russell, Jon. "OpenAI's Altman is still looking to spend after GPT-5 launch and is 'willing to run the loss'." August 8, 2025. https://www.cnbc.com/2025/08/08/chatgpt-gpt-5-openai-altman-loss.html
  • LessWrong. "OpenAI lost $5 billion in 2024 (and its losses are increasing)." 2025. https://www.lesswrong.com/posts/CCQsQnCMWhJcCFY9x/openai-lost-usd5-billion-in-2024-and-its-losses-are
  • CNBC / Rosenbaum, Eric. "OpenAI hits $10 billion in annual recurring revenue fueled by ChatGPT growth." June 9, 2025. https://www.cnbc.com/2025/06/09/openai-hits-10-billion-in-annualized-revenue-fueled-by-chatgpt-growth.html
  • Hacker News. "OpenAI's H1 2025: $4.3B in income, $13.5B in loss." October 2025. https://news.ycombinator.com/item?id=45453586
  • TechCrunch / Wiggers, Kyle. "OpenAI is losing money on its pricey ChatGPT Pro plan, CEO Sam Altman says." January 6, 2025. https://techcrunch.com/2025/01/05/openai-is-losing-money-on-its-pricey-chatgpt-pro-plan-ceo-sam-altman-says/
  • Zitron, Ed. "OpenAI Is A Systemic Risk To The Tech Industry." Where's Your Ed At. April 14, 2025. https://www.wheresyoured.at/openai-is-a-systemic-risk-to-the-tech-industry-2/
  • Dataconomy. "OpenAI Projects $44B Losses Before 2029 Profitability." September 2025. https://dataconomy.com/2025/09/16/openai-projects-44b-losses-before-2029-profitability/
  • AutoGPT.net. "OpenAI Admits Losses on Its High-Priced ChatGPT Pro Tier." June 13, 2025. https://autogpt.net/openai-admits-losses-on-its-high-priced-chatgpt-pro-tier/
  • WebProNews. "OpenAI Reports $4.3B H1 2025 Revenue, $4.7B Loss; Eyes $13B Yearly Goal." October 2025. https://www.webpronews.com/openai-reports-4-3b-h1-2025-revenue-4-7b-loss-eyes-13b-yearly-goal/
  • Sacra. "OpenAI revenue, valuation & growth rate." 2025. https://sacra.com/c/openai/
  • Sacra. "Anthropic revenue, valuation & funding." 2025. https://sacra.com/c/anthropic/
  • StatsUp / Analyzify. "Latest Anthropic (Claude AI) Statistics (2025)." 2025. https://analyzify.com/statsup/anthropic
  • PYMNTS.com. "Report: Anthropic's Annualized Revenue Reaches $1.4 Billion." March 11, 2025. https://www.pymnts.com/artificial-intelligence-2/2025/report-anthropics-annualized-revenue-reaches-1-4-billion/
  • Zitron, Ed. "Anthropic and OpenAI Have Begun The Subprime AI Crisis." Where's Your Ed At. July 8, 2025. https://www.wheresyoured.at/anthropic-and-openai-have-begun-the-subprime-ai-crisis/
  • VentureBeat. "Anthropic revenue tied to two customers as AI pricing war threatens margins." August 27, 2025. https://venturebeat.com/ai/anthropic-revenue-tied-to-two-customers-as-ai-pricing-war-threatens-margins
  • Tap Twice Digital. "7 Anthropic Statistics (2025): Revenue, Valuation, Users, Funding." May 17, 2025. https://taptwicedigital.com/stats/anthropic
  • CNBC / Rosenbaum, Eric. "Anthropic hits $3 billion in annualized revenue on business demand for AI." May 30, 2025. https://www.cnbc.com/2025/05/30/anthropic-hits-3-billion-in-annualized-revenue-on-business-demand-for-ai.html
  • Anthropic. "Anthropic raises $13B Series F at $183B post-money valuation." September 2025. https://www.anthropic.com/news/anthropic-raises-series-f-at-usd183b-post-money-valuation
  • Anthropic. "Anthropic raises Series E at $61.5B post-money valuation." March 2025. https://www.anthropic.com/news/anthropic-raises-series-e-at-usd61-5b-post-money-valuation
  • Medium / FutureTechForecast. "The Impact of GitHub Copilot on Microsoft's Finances: A Deep Dive Analysis." October 9, 2023. https://medium.com/@FutureTechForecast/the-impact-of-github-copilot-on-microsofts-finances-a-deep-dive-analysis-a65d39de5b71
  • TechCrunch / Wiggers, Kyle. "GitHub Copilot introduces new limits, charges for 'premium' AI models." April 4, 2025. https://techcrunch.com/2025/04/04/github-copilot-introduces-new-limits-charges-for-premium-ai-models/
  • AI Business. "Microsoft's GitHub Copilot Loses $20 a Month Per User." October 11, 2023. https://aibusiness.com/nlp/github-copilot-loses-20-a-month-per-user
  • AIJourn. "AI Wrappers 2025: Deep Dive into Market Opportunity, Business Models, & Community Insights." March 3, 2025. https://aijourn.com/how-ai-wrappers-are-creating-multi-million-dollar-businesses/
  • Altero. "The Rise of AI Wrappers: Shortcut to Startup Success or a Bubble Waiting to Pop?" May 12, 2025. https://www.altero.us/insights/the-rise-of-ai-wrappers-shortcut-to-startup-success-or-a-bubble-waiting-to-pop
  • SaaS Minded. "Building an AI Wrapper SaaS in 2025: Opportunities and Challenges." January 6, 2025. https://saasminded.dev/building-an-ai-wrapper-saas-in-2025-opportunities-and-challenges/
  • IEEE Spectrum. "The State of AI 2025: 12 Eye-Opening Graphs." May 1, 2025. https://spectrum.ieee.org/ai-index-2025
  • PwC. "2025 AI Business Predictions." 2025. https://www.pwc.com/us/en/tech-effect/ai-analytics/ai-predictions.html
  • Every.to. "Rise of the AI Wrappers." May 10, 2025. https://every.to/context-window/rise-of-the-ai-wrappers

    Sunday, October 5, 2025

    GE Aerospace Advances Hypersonic Propulsion with Dual Engine Demonstrations

    Flight tests of solid-fuel ramjet and rotating detonation technology mark significant milestones in U.S. hypersonic weapons development

    GE Aerospace has achieved key milestones in hypersonic propulsion development, successfully demonstrating two advanced engine concepts that could power the next generation of U.S. long-range strike weapons.

    The company announced Sept. 22, 2025, that it had conducted flight testing of a Solid Fuel Ramjet (SFRJ) propulsion system aboard the Atmospheric Test Launched Airbreathing System (ATLAS) at Kennedy Space Center, while separately advancing ground testing of its Hypersonic Dual-Mode Ramjet (DMRJ) featuring rotating detonation combustion technology at the GE Aerospace Research Center in Niskayuna, New York.

    Flight Test Campaign

    The SFRJ demonstration utilized a modified F-104 Starfighter operated by Starfighters Inc., which carried the ATLAS test article as a fixed captive-carry payload on its port wing. The aircraft achieved supersonic speeds up to Mach 2.2 across three successful flight tests, providing flight-relevant conditions necessary for ramjet ignition and sustainment.

    "ATLAS is a test vehicle and we are using that to demonstrate in flight relevant conditions our solid fuel ramjet," said Mark Rettig, vice president and general manager at GE Aerospace's Edison Works Advanced Programs, in an Oct. 1 promotional video. "The F-104 Starfighter can get up to Mach 2.2 and that's a pretty good speed to demonstrate ignition and sustainment of our SFR."

    The program received funding from the Department of Defense through Title III of the Defense Production Act to scale air-breathing propulsion technology for extended-range munitions. The F-104 platform was selected specifically for its ability to reach the kinematic conditions where ramjet engines begin producing effective thrust—performance unattainable by other available test aircraft.

    Solid-fuel ramjets offer distinct operational advantages over liquid-fueled variants, including reduced weight and continuous, throttleable thrust. This throttling capability enables in-flight maneuvering, complicating intercept calculations for conventional air-defense systems that rely on ballistic trajectory prediction.

    Rotating Detonation Breakthrough

    In parallel development, GE's DMRJ program has progressed from initial demonstrations in late 2023 to advanced ground testing in 2024. The engine harnesses rotating detonation combustion within a supersonic flow stream, exploiting powerful detonation shockwaves rather than conventional deflagration to generate thrust.

    The company conducted testing of an additively manufactured DMRJ at its Evendale, Ohio, facility beginning in March 2024, recording a three-fold increase in airflow compared to baseline performance. GE characterized the results as a "tremendous success, both from a combustion performance and thermal structure perspective."

    The rapid development timeline proved noteworthy: the 3D-printed engine was designed and built in 8.5 months, achieving first ignition within 11 months of program start.

    Rotating detonation engines generate thrust through controlled detonation of fuel-air mixtures rather than conventional combustion. Fuel and oxidizers are introduced through small ports into a circular combustion channel, where traveling detonation shockwaves continuously ignite successive fuel charges in a self-sustaining cycle. This process delivers greater thrust efficiency for equivalent fuel consumption compared to traditional combustion approaches.

    Program Applications

    According to Aviation Week reporting, GE plans to offer the DMRJ for DARPA's Next-Generation Responsive Strike (NextRS) demonstrator program, as confirmed by Craig Young, executive director of the Edison Works division.

    NextRS aims to develop survivable, rapidly deployable long-range strike capabilities with enhanced range, speed and mission flexibility. The program is advancing technologies across multiple domains, including advanced structures and materials, high-speed weapon separation, dual-mode propulsion, power generation, thermal management and high-Mach turbine engines. The initiative seeks to enable penetration of contested environments and precision engagement of time-sensitive targets.

    The GE propulsion systems do not appear destined for current U.S. air-launched hypersonic programs already in flight test or advanced development, suggesting their application to follow-on weapon systems in the expanding DoD hypersonic portfolio.


    Sources

    1. Satam, P. (2025, October). General Electric Demonstrates Ramjet Engines for Hypersonic Missiles - The Aviationist. https://theaviationist.com

    2. GE Aerospace. (2025, January 13). Press Release: Hypersonic Dual-Mode Ramjet Development. GE Aerospace.

    3. GE Aerospace. (2025, October 1). ATLAS Flight Test Program [Promotional video]. GE Aerospace.

    4. Next-Generation Responsive Strike (NextRS) Program Overview. Defense Advanced Research Projects Agency (DARPA).

    5. Aviation Week reporting on the GE DMRJ and DARPA's NextRS program.


    Industry Expert's Missile Design Textbook and Training Course

    Fleeman Continues to Shape Defense Technology Education

    Veteran aerospace engineer Eugene Fleeman's comprehensive handbook has become the definitive reference for missile systems, earning widespread acclaim from professionals and academics while competing programs focus on specialized topics

    ATLANTA — More than five decades of missile systems expertise has been distilled into what industry professionals are calling "the bible" for missile design education, as Eugene L. Fleeman's work continues to influence defense technology development worldwide.

    Fleeman's latest textbook, Missile Design Guide, published in 2022 by the American Institute of Aeronautics and Astronautics, draws on more than 50 years of experience in missile system design and development. The handbook makes extensive use of simple, closed-form, physics-based analytical prediction expressions to give insight into the primary parameters driving a design, with an emphasis on ease of use.

    A Career Spanning Defense, Industry, and Academia

    Fleeman's career spans government service at the U.S. Air Force Flight Dynamics Laboratory at Wright-Patterson Air Force Base, where he managed advanced weapons programs and served as senior engineer for missile stability and control technology. He later became deputy director of hypersonic design and applications at Boeing, contributing to projects including the Affordable Rapid Response Missile Demonstrator and the X-43 Hyper-X Demonstrator.

    Currently, Fleeman is an international lecturer on missiles and has authored over 200 publications, including four textbooks.

    Hypersonic Missile Coverage

    Reflecting the growing importance of high-speed weapons systems, Fleeman's work includes substantial hypersonic content. He offers a specialized course that addresses hypersonic missile design, development, analysis, and system engineering, featuring over 90 videos that illustrate hypersonic missile development activities and performance.

    The hypersonic course covers key topics including drivers for hypersonic missile design, conceptual design criteria for hypersonic missiles, hypersonic missile configuration design/sizing, and prediction of aerodynamics, propulsion, weight, and flight performance. It also addresses measures of merit sizing and prediction, including missile seeker alternatives and performance, guidance alternatives and accuracy, warhead alternatives and lethality, observables and observable reduction, and the development process for hypersonic missile systems and technologies.

    The Missile Design Guide includes detailed chapters on materials, specifically covering missile airframe material alternatives, missile structure/insulation trades, high temperature insulation materials, missile aerodynamic heating/thermal response prediction, localized aerodynamic heating and thermal stress, and rocket motor case and nozzle material alternatives.

    Analytical Philosophy vs. Cutting-Edge Research

    Fleeman's approach emphasizes fundamental design principles over recent experimental developments. A reviewer noted that "the trademark of this book is its extensive use of closed-form analytic solutions," contrasting this with "most modern design techniques [that] advocate running a bazillion computer simulations, aggregating the results, and trying to make sense of them."

    The materials discuss "typical values of missile parameters and the characteristics of current operational missiles" as well as "the enabling subsystems and technologies for missiles and the current/projected state-of-the-art," though there is no specific mention of recent hypersonic wind tunnel research, cutting-edge CFD methodologies, or the latest experimental validation techniques in the available materials.

    This makes the work excellent for understanding fundamental design principles and quickly gaining conceptual design capability, though readers seeking comprehensive coverage of the latest experimental facilities or contemporary research programs will likely need to supplement it with current journal literature.

    Widespread Professional Acclaim

    The 2022 Missile Design Guide represents an update to Fleeman's 2012 textbook Missile Design and System Engineering. Reader reviews on Amazon describe the earlier edition as serving "as the standard on the subject" and "a must in the professionals library," providing college-level insights on missile design and systems engineering.

    One aerospace engineer who studied under Fleeman at Georgia Tech described the work as "a SUBSTANTIAL update with significant new material," noting it took months to digest the nearly 900-page volume and calling it "the critical textbook for education in a discipline that is sadly becoming somewhat of a lost art."

    The reviewer praised Fleeman as "the first author to put all this information in one place, and tell the story of the design process to string it all together," highlighting that every step of problems is explained, allowing students and designers to understand trends driving design and justify decisions with actual physics.

    Another review noted the book's practical utility, observing of its figures: "Most are cleanly presented graphs and charts that depict the key sensitivities in the missile design process. They are useful as 'rules of thumb' for new designs or checking the performance of design simulations or flight tests."

    Academic Impact and Citations

    The 2012 edition of Missile Design and System Engineering has received 38 academic citations according to Semantic Scholar, with contributions spanning highly influential citations, background citations, and methods citations. The work addresses critical topics including aerodynamic configuration, propulsion system design, guidance and control systems, and launch platform integration.

    Ongoing Training Programs

    Complementing the textbooks, Fleeman regularly conducts professional education courses through multiple organizations. Every few months he offers a short course based on his textbook Missile Design Guide, supplemented with additional new material, available through the AIAA, ATI, GTRI, AOC, and K2B websites. Since 1999, the short course has been held more than 100 times in fifteen countries across five continents.

    The three-day in-person course is scheduled for December 9-11 in Atlanta, Georgia, sponsored by Georgia Tech Research Institute (GTRI). The course is designed for program managers, marketing personnel, systems analysts, engineers, university professors, and professionals working in missile systems, missile system integration, and missile technology development.

    Course content covers key considerations including aerodynamic, propulsion, weight, and flight-performance aspects, along with critical tradeoffs in meeting performance, cost, risk, robustness, lethality, guidance, navigation, control, accuracy, survivability, reliability, and launch platform compatibility requirements.

    Fleeman spoke at an AIAA Greater New Orleans Section dinner meeting on March 20, 2025, in Slidell, presenting an "Overview of Missile Design, Development, and System Engineering".

    Comparable Courses and How They Differ

    Fleeman's work exists within a broader ecosystem of missile-related professional education, with several complementary but distinct offerings available through AIAA and other organizations.

    Paul Zarchan's Guidance-Focused Courses

    Paul Zarchan, who has more than 40 years of experience designing, analyzing, and evaluating missile guidance systems and is currently a Member of the Technical Staff at MIT Lincoln Laboratory, offers "Fundamentals of Tactical and Strategic Missile Guidance" through AIAA.

    Zarchan's course focuses specifically on guidance systems, covering proportional navigation, adjoints and the homing loop, noise analysis, digital noise filters, advanced guidance laws, Kalman filters, ballistic targets, strategic intercepts, and theater missile defense. His accompanying textbook, Tactical and Strategic Missile Guidance, now in its seventh edition, has been a best-seller; it runs to more than 1,000 pages and uses MATLAB code rather than FORTRAN.

    Key Distinction: While Fleeman provides comprehensive, system-level missile design covering all subsystems using analytical expressions, Zarchan specializes deeply in guidance and control with extensive mathematical treatment and simulation approaches.

    Launch Vehicle Design: Edberg and Costa

    One reviewer comparing textbooks noted: "Compare it with 'Design of Rockets and Space Launch Vehicles' by Edberg and Costa. That is a book that achieves near perfection in my opinion. Totally different focus."

    Don Edberg, an AIAA Associate Fellow teaching at California State Polytechnic University, Pomona, and Willie Costa authored Design of Rockets and Space Launch Vehicles, now in its second edition with over 1,070 pages. The second edition includes discussions of hybrid and quasi-hybrid rockets, complete equation sets for payload and propellant mass calculations, information on ridesharing and piggybacking, and new sections on recovery and reuse including physics, energy, and mass requirements.

    Key Distinction: Edberg and Costa focus on space launch vehicles and orbital systems, while Fleeman concentrates on tactical and strategic missiles, which have different mission profiles, trajectories, and design constraints. One reviewer also noted contrasting presentation styles, criticizing Fleeman's use of US customary units and in-text numerical calculations while praising Edberg and Costa's presentation.

    AIAA's Specialized Offerings

    AIAA offers Fleeman's own "Missile Design and System Engineering" as a short course providing system-level, integrated methods for missile configuration design and analysis, with 66 videos illustrating missile development activities and performance.

    The organization also provides numerous related specialized courses including "A Practical Approach to Flight Dynamics and Control of Aircraft, Missiles, and Hypersonic Vehicles," "Advanced Flight Dynamics and Control of Aircraft, Missiles, and Hypersonic Vehicles," and "Fundamentals of Astrodynamics for Space Missile Defense."

    Market Position: Fleeman's comprehensive, end-to-end missile design approach appears unique in its breadth. Other courses either specialize more deeply in specific subsystems (guidance, propulsion) or address different vehicle types (launch vehicles). As one reviewer put it: "'Missile Design and Systems Engineering' is the new one-stop shop for missile engineering."

    Filling a Critical Knowledge Gap

    The textbook features full-color, self-standing graphs, tables, charts, and diagrams, and is aimed at missile engineers, system engineers, system analysts, program managers, aerospace engineering students, and professors. It provides readers with a quick reference for missile design, missile technologies, launch platform integration, targeting, fire control integration, missile system measures of merit, and the missile system development process.

    As one reviewer observed, the work addresses education needs "in a discipline that is sadly becoming somewhat of a lost art as the engineers who gave us the Sparrow, AMRAAM, Tomahawk, Polaris, and other great missiles retire."

    The comprehensive nature of Fleeman's work—combining practical experience from government, industry, and academic settings with accessible teaching methods—has established it as an essential resource for current and future generations of missile systems engineers. Unlike more specialized alternatives, it offers a unified framework for understanding the entire missile design process from concept through development.


    Sources

    1. American Institute of Aeronautics and Astronautics (AIAA). (2022). Missile Design Guide | AIAA Education Series. https://arc.aiaa.org/doi/book/10.2514/4.106347

    2. American Institute of Aeronautics and Astronautics (AIAA). (n.d.). Missile Design and System Engineering - AIAA. https://aiaa.org/courses/missile-design-and-system-engineering/

    3. American Institute of Aeronautics and Astronautics (AIAA). (n.d.). Fundamentals of Tactical and Strategic Missile Guidance - AIAA. https://aiaa.org/courses/fundamentals-of-tactical-and-strategic-missile-guidance/

    4. Amazon.com. (n.d.). Missile Design and System Engineering (AIAA Education): Eugene L. Fleeman. https://www.amazon.com/Missile-Design-System-Engineering-Education/dp/1600869084

    5. Edberg, D., & Costa, W. (2022). Design of Rockets and Space Launch Vehicles, Second Edition | AIAA Education Series. https://arc.aiaa.org/doi/book/10.2514/4.106422

    6. Fleeman, E. (n.d.). Eugene Fleeman - Missile Design, Development, and System Engineering. https://sites.google.com/site/eugenefleeman/home

    7. Fleeman, E. (n.d.). Course Description. https://sites.google.com/site/eugenefleeman/home/course-description

    8. Georgia Tech Professional Education. (n.d.). Missile Design and System Engineering. https://pe.gatech.edu/courses/missile-design-and-system-engineering

    9. Semantic Scholar. (2013). Missile Design and System Engineering. https://www.semanticscholar.org/paper/Missile-Design-and-System-Engineering-Fleeman/5a4eb34ac4637f217cfb24af5c2fabc49a6586f1

    10. University of Alabama in Huntsville, Office of Professional and Continuing Education. (n.d.). Eugene L Fleeman, PE. https://www.uah.edu/opce/subject-matter-experts/eugene-fleeman

    11. AIAA Greater New Orleans Section. (2025). Home - Greater New Orleans Section. https://engage.aiaa.org/greaterneworleans/home

    12. University of Southern Indiana Library. (n.d.). Missile Design Guide. https://library.usi.edu/record/1485968

    13. Zarchan, P. (2019). Tactical and Strategic Missile Guidance, Seventh Edition | Progress in Astronautics and Aeronautics. https://arc.aiaa.org/doi/book/10.2514/4.105845

    Saturday, October 4, 2025


    Sweden's Microchip Trend and Cashless Society: Separating Fact from Fiction

    Bottom Line: While several thousand Swedes have voluntarily adopted microchip implants for convenience, the technology remains a niche trend that peaked years ago. Sweden is moving toward a cashless society driven primarily by private banks and consumer preference, but government authorities are now working to protect cash access amid security concerns. No social credit system exists in Sweden.


    The Microchip Reality: A Small but Vocal Minority

    Sweden has gained international attention for its adoption of subcutaneous microchip implants, but the actual numbers tell a more modest story than viral social media posts suggest.

    By the Numbers

    Approximately 6,000 Swedes have received microchip implants since the technology was introduced in 2014, according to Swedish microchip company Chipster. The trend peaked between 2014 and 2016 when it was a novel technology generating significant media attention, but adoption rates have declined since then.

    In a country of more than 10 million people, that amounts to roughly 0.06% of the population; more than 4,000 of the implants have come through a single provider, Biohax International.

    What the Chips Actually Do

    The rice-grain-sized microchips use Near Field Communication (NFC), a short-range form of Radio-Frequency Identification (RFID) similar to the technology in contactless payment cards. The chips can communicate with devices such as door sensors and office scanners, replacing traditional ID cards for building access and tasks like photocopying.

    Users can enter homes, offices, and gyms by holding their hands against digital readers, and the chips can store emergency contact details, social media profiles, or e-tickets for events and rail journeys.

    The Railway Trial That Stalled

    About 130 passengers signed up for SJ's national railway microchip reservation service within the first year. However, the train operator later ended the trial, saying that uptake had grown only modestly over two years, to roughly 3,000 users in total, and that it would move in another direction.

    Who's Behind the Technology

    Biohax International, started by former professional body piercer Jowan Osterlund, dominates the Swedish market. The procedure costs about 180 dollars and involves inserting the chip between the thumb and index finger using a syringe similar to those used for vaccinations.

    Swedish biohacking group Bionyfiken has organized "implant parties" since 2014 where groups of people voluntarily receive chips.

    Important Context: Completely Voluntary

    All microchip implants in Sweden are voluntary. There is no recorded instance of employers forcing employees to be microchipped in the United States or Sweden. The technology remains a choice made by tech enthusiasts, not a government mandate.


    The Cashless Society: Reality and Reversal

    Sweden's movement toward a cashless economy has been dramatic, but recent developments show authorities working to prevent cash from disappearing entirely.

    The Dramatic Decline of Cash

    Cash in circulation now amounts to just 1 percent of Sweden's GDP, compared with about 10 percent in Europe and 8 percent in the United States. About one in 10 consumers paid for something in cash in 2018, down from 40 percent in 2010.

    Only 1 in 4 people living in Sweden uses cash at least once a week, and the proportion of retail cash transactions has dropped from around 40 percent in 2010 to about 15 percent.

    Who Pushed the Cashless Trend

    The shift away from cash has been primarily driven by banks, which stopped handling cash at branches to cut costs, and retailers who went cash-free for efficiency and security. Swish, a mobile payment app developed in cooperation with Sweden's largest banks, has reached 5 million users and allows anyone with a smartphone and Swedish bank account to instantly transfer funds.

    In 2017, only 11 bank robberies were reported to police in Sweden, a 90 percent reduction versus 2009, while robberies of armored vehicles also declined, demonstrating one motivation for businesses abandoning cash.

    The Government Response: Protecting Cash Access

    Contrary to claims that the government is forcing a cashless society, Swedish authorities are actually working to preserve cash:

    Sweden's central bank has called for the legal protection of cash to be strengthened urgently, arguing that legislation on cash needs to be tightened immediately and that political decisions are needed so that everyone can continue to pay.

    In December 2024, the Cash Inquiry proposed an obligation to accept cash for essential goods, including food and pharmaceutical products, and for statutory public fees such as health care charges and passport fees.

    The Riksbank supports introducing an obligation to accept cash in the sale of essential goods and strengthening banks' responsibility for cash handling, stating it is essential that people can continue to use cash to enable all members of society to make payments.

    Why the Reversal?

    Citing the deteriorating security situation in Sweden and the surrounding region, the central bank is prioritizing work on making offline card payments possible in order to strengthen the resilience of the payment system.

    The government sent every home a pamphlet titled "In Case of Crisis or War" in November 2024 advising people to store enough cash for at least one week, preferably in different denominations.

    Riksbank Governor Erik Thedéen and Deputy Governor Aino Bunge argue that Swedes need to use considerably more cash than they do today, writing: "Keep small denominations of cash at home to cover a week's worth of essential purchases. Use cash at regular intervals. Shops and banks need to see a demand for them to continue accepting them."

    Who Gets Left Behind

    Groups at risk of being excluded from the financial system include elderly people and those with physical or cognitive disabilities who are unable or unwilling to use smartphone technology or to remember a PIN for card payments.

    Consumer groups say the shift leaves many retirees (a third of all Swedes are 55 or older), as well as some immigrants and people with disabilities, at a disadvantage, since they cannot easily access electronic payment methods for some goods and transactions.


    The Social Credit System Myth

    Sweden does not have a social credit system like China's. This is a fundamental misconception.

    What Sweden Actually Has

    Sweden has a credit-scoring system aimed at identifying people with a history of failing to pay bills or taxes. Unpaid debts are forwarded to the Swedish Enforcement Authority and result in a non-payment record that can be stored for three years for individuals and five years for companies. This is traditional credit reporting, not a social credit system.

    The AI Welfare Controversy

    Sweden has faced criticism for algorithmic systems used by its welfare agency, but this is distinct from a social credit system:

    Amnesty International called for discontinuation of AI systems used by Sweden's Social Insurance Agency after an investigation found the system disproportionately flagged women, individuals with foreign backgrounds, low-income earners, and individuals without university degrees for benefits fraud inspections.

    The machine learning system, introduced in 2013, assigns risk scores to social security applicants; a sufficiently high score automatically triggers an investigation.

    This is a problematic welfare fraud detection system—not a comprehensive social credit system that tracks all citizen behavior across society.
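
    The mechanism is simple enough to sketch. The snippet below is a toy illustration only, not the agency's actual model (its features, weights, and threshold are not public): a per-application risk score is compared against a fixed cutoff, and high scores are routed to human investigators rather than automatic processing. Note that the decision concerns a single benefit application, not a citizen's behavior across society, which is the key difference from a social credit system.

    # Toy illustration (assumed names and threshold, not the agency's system):
    # a risk score gates whether an application goes to manual fraud review.
    from dataclasses import dataclass

    @dataclass
    class BenefitApplication:
        applicant_id: str
        risk_score: float  # assumed output of some trained model, 0.0 to 1.0

    INVESTIGATION_THRESHOLD = 0.8  # hypothetical cutoff

    def route(application: BenefitApplication) -> str:
        # Scores at or above the threshold trigger a manual investigation.
        if application.risk_score >= INVESTIGATION_THRESHOLD:
            return "manual_fraud_investigation"
        return "automatic_processing"

    for app in [BenefitApplication("A-001", 0.35), BenefitApplication("A-002", 0.91)]:
        print(app.applicant_id, "->", route(app))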

    Understanding China's System

    For context, China's Social Credit System is mainly focused on assessing businesses rather than individuals, and consists of a database that collects data from government agencies on corporate regulatory compliance. As of 2024, there is still no nationwide social credit score in China, and most private scoring systems have been shut down.

    Sweden has no equivalent system tracking citizens' social behavior for rewards and punishments.


    How Human Microchips Compare to Pet Microchips

    The microchips used in Sweden share similarities with the pet identification chips that have been widely used since the mid-1980s, but with important technological differences that enable different functions.

    Similarities: Basic Design and Implantation

    Both human and pet microchips are approximately the size of a grain of rice and are implanted subcutaneously using a syringe. Both types use passive RFID technology, meaning they have no battery and require no power source, remaining functional for the lifetime of the host without maintenance.

    Both types remain completely inert until activated by electromagnetic energy from a scanner, and both are encapsulated in biocompatible glass to prevent adverse reactions.

    Critical Differences: Frequency and Functionality

    The most significant difference lies in the radio frequencies used:

    Pet microchips operate at lower frequencies—typically 125 kHz, 128 kHz, or 134.2 kHz (the ISO international standard). These low frequencies were chosen for their excellent penetration through fur, skin, and muscle tissue.

    Human microchips like those in Sweden use Near Field Communication (NFC) technology at 13.56 MHz, which is the same frequency as contactless credit cards and modern smartphones. This higher frequency enables interaction with consumer devices.

    This frequency difference means that standard smartphones cannot read pet microchips—they're incompatible technologies. Pet chips require specialized veterinary or shelter scanners.

    Data Storage and Purpose

    Pet microchips store only a unique identification number (typically 9-15 digits) that links to a database containing owner contact information. They cannot track location, and the chip itself doesn't store medical records or personal data.

    Human microchips in Sweden are reprogrammable and can be configured for multiple functions: unlocking doors, storing e-tickets, making contactless payments, and potentially storing emergency contact information. They can interact with payment terminals, door readers, and most modern Android smartphones.
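
    To make the frequency split concrete, here is a minimal sketch of reading such a 13.56 MHz chip from a computer. It uses the open-source nfcpy library with a USB NFC reader and assumes the chip exposes a standard NDEF message (the common NFC format for URLs, contact details, and tickets); the reader path and the NDEF assumption are illustrative, not a description of any particular Swedish product. A 125 kHz or 134.2 kHz pet chip would simply never respond, because this hardware and protocol stack operate only at 13.56 MHz.

    import nfc  # pip install nfcpy

    def on_connect(tag):
        # Called when a 13.56 MHz tag (card, sticker, or implant) is presented.
        if tag.ndef is None:
            print("Tag detected, but it carries no NDEF message:", tag)
        else:
            for record in tag.ndef.records:
                # e.g. a URI record pointing to a contact card or e-ticket
                print(record.type, record)
        return True  # keep the link up until the tag is removed

    clf = nfc.ContactlessFrontend("usb")  # hypothetical USB reader path
    try:
        # Blocks until a compatible 13.56 MHz tag comes into range.
        clf.connect(rdwr={"on-connect": on_connect})
    finally:
        clf.close()

    On an Android phone the equivalent read would go through the platform's NFC APIs rather than nfcpy, but the limitation is the same: the radio speaks only 13.56 MHz protocols, which is why pet chips still require dedicated veterinary or shelter scanners.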

    Privacy Implications

    Paradoxically, pet microchips may offer more privacy protection than their human counterparts. Pet chips only transmit when actively scanned and contain minimal data, with the identification number requiring access to a protected registry database to reveal owner information.

    Human chips interact with many more systems—payments, transportation, building access—creating more potential touchpoints for data collection and more opportunities for unauthorized reading.

    Safety Profile

    Both types share similar safety considerations. Serious complications are extremely rare, estimated at approximately one in a million for pet chips in the UK, where over 3.7 million pet dogs have been chipped and adverse events are tracked.

    Both can cause minor inflammatory responses until scar tissue forms around the capsule, and both are considered biocompatible for long-term implantation. Studies have found no safety concerns for animals with RFID chips undergoing MRI at one Tesla magnetic field strength.

    The human microchips used in Sweden represent an evolution of proven pet identification technology, adapted with more versatile frequencies that enable interaction with modern consumer electronics. However, this versatility also introduces additional privacy and security considerations that don't exist with simpler pet identification chips.


    Privacy Concerns and Security Risks

    Microchip Security

    Jowan Osterlund of Biohax says personal microchips are actually more difficult to hack than many other data sources because they are stored beneath the skin, stating: "Everything is hackable. But the reason to hack them will never be bigger because it's a microchip. It's harder for someone to get to, since you put it in you."

    However, British scientist Ben Libberton warns that if chips are used everywhere for every transaction, it could become very easy to give away personal information, and he is particularly concerned about how chips could be used to share data about physical health and bodily functions.

    Data Collection Reality

    Swedes are accustomed to sharing personal information: many online purchases and administrative bodies require a social security number, mobile phone numbers are widely available in online search databases, and people can easily look up each other's salaries by calling the tax authority.

    This cultural acceptance of transparency differs significantly from privacy norms in other countries.


    Key Findings

    Microchip Adoption:

    • Approximately 6,000 Swedes have received implants since 2014
    • Represents 0.06% of Sweden's population
    • Adoption peaked in 2014-2016 and has since declined
    • Completely voluntary with no government mandate
    • Used primarily for building access, not replacing ID cards or credit cards for most people

    Cashless Society:

    • Driven by private banks and businesses, not government mandate
    • Cash now represents only 1% of the economy
    • Government is actually working to preserve cash access through new legislation
    • Authorities cite security concerns and financial inclusion as reasons to protect cash
    • The Riksbank is urging citizens to keep and use cash for crisis preparedness

    Social Credit System:

    • Sweden has no social credit system
    • Has traditional credit scoring and a controversial AI welfare fraud detection system
    • The systems that do exist are fundamentally different from comprehensive behavior tracking

    The Bottom Line: The reality in Sweden is far less dramatic than viral claims suggest. While the country has embraced digital payments and a small number of tech enthusiasts have adopted microchips, these changes have been driven primarily by private sector innovation and consumer choice, not government coercion. Authorities are now actively working to ensure cash remains available and that digital exclusion doesn't harm vulnerable populations.


    Sources

    1. Interesting Engineering. (2025, July 11). "This Company Has Implanted Microchips Inside 150 Employees' Hands." https://interestingengineering.com/science/company-implanted-microchips-150-employees-hands
    2. Modern Diplomacy. (2024, January 10). "Cash now extinct as citizens use implanted microchips instead." https://moderndiplomacy.eu/2024/01/10/cash-now-extinct-as-citizens-use-implanted-microchips-instead/
    3. Carnegie Council for Ethics in International Affairs. (2024, July 9). "The Rise of Preemptive Bans on Human Microchip Implants." https://carnegiecouncil.org/media/article/preemptive-bans-human-microchip-implants
    4. World Economic Forum. (2018, May). "Thousands of Swedish people are swapping ID cards for microchips." https://www.weforum.org/stories/2018/05/thousands-of-people-in-sweden-are-embedding-microchips-under-their-skin-to-replace-id-cards/
    5. NPR. (2018, October 22). "Thousands Of Swedes Are Inserting Microchips Under Their Skin." https://www.npr.org/2018/10/22/658808705/thousands-of-swedes-are-inserting-microchips-under-their-skin
    6. Biometric Update. (2021, December 23). "Chip implants from Swedish developer support digital health pass storage under your skin." https://www.biometricupdate.com/202112/chip-implants-from-swedish-developer-support-digital-health-pass-storage-under-your-skin
    7. EU Fact Check. (2020, May 23). "True: 'Microchips are getting under the skin of thousands in Sweden.'" https://eufactcheck.eu/factcheck/true-microchips-are-getting-under-the-skin-of-thousands-in-sweden/
    8. Euronews. (2021, June 1). "Microchips are getting under the skin of thousands in Sweden." https://www.euronews.com/health/2018/05/31/microchips-are-getting-under-the-skin-of-thousands-in-sweden
    9. GovTech. (2025, January 5). "Should States Ban Mandatory Human Microchip Implants?" https://www.govtech.com/blogs/lohrmann-on-cybersecurity/should-states-ban-mandatory-human-microchip-implants
    10. Supercar Blondie. (2023, July 6). "Thousands of Swedes getting microchip implants." https://supercarblondie.com/microchip-implants-sweden/
    11. SBS Software. (2025, June 27). "Sweden's cashless revolution: Is this the end of paper money?" https://sbs-software.com/insights/sweden-cashless-revolution/
    12. Global Government Fintech. (2023, November 3). "Sweden's central bank calls for 'urgent' strengthening of cash in legislation." https://www.globalgovernmentfintech.com/swedens-central-bank-calls-for-urgent-strengthening-of-cash-in-legislation/
    13. Business Sweden. "Cashless society." https://www.business-sweden.com/insights/articles/cashless-society/
    14. Sveriges Riksbank. (2025, March 10). "Payments Report 2025." https://www.riksbank.se/en-gb/payments--cash/payments-in-sweden/payments-report-2025/
    15. Global Government Fintech. (2025, March 24). "Swedish central bank urges payments resilience action amid 'geopolitical unease.'" https://www.globalgovernmentfintech.com/riksbank-payments-resilience-geopolitical-unease/
    16. The Nordic Times. (2024, January 15). "A cashless society: Less and less cash in circulation in Sweden." https://nordictimes.com/the-nordics/sweden/a-cashless-society-less-and-less-cash-in-circulation-in-sweden/
    17. Sveriges Riksbank. (2024, March 14). "Payments Report 2024." https://www.riksbank.se/en-gb/payments--cash/payments-in-sweden/payments-report--2024/
    18. IMF Finance & Development Magazine. (2018, June). "Going Cashless: Central Banks and Digital Currencies." https://www.imf.org/en/Publications/fandd/issues/2018/06/central-banks-and-digital-currencies-point
    19. Sveriges Riksbank. (2025). "SVERIGES RIKSBANK Payments Report 2025." https://www.riksbank.se/globalassets/media/rapporter/betalningsrapport/2025/engelsk/payments-report-2025.pdf
    20. Visa Navigate. "Are we sleepwalking towards a cashless future? – Lessons from Sweden." https://navigate.visa.com/europe/future-of-money/sleep-walking-towards-a-cashless-future/
    21. Amnesty International. (2024, December 2). "Sweden: Authorities must discontinue discriminatory AI systems used by welfare agency." https://www.amnesty.org/en/latest/news/2024/11/sweden-authorities-must-discontinue-discriminatory-ai-systems-used-by-welfare-agency/
    22. Computer Weekly. (2024, November). "Swedish authorities urged to discontinue AI welfare system." https://www.computerweekly.com/news/366616576/Swedish-authorities-urged-to-discontinue-AI-welfare-system
    23. Remote People. (2025, February 3). "China Social Credit System in 2024 [Punishments + Rewards]." https://remotepeople.com/countries/china/social-credit-system/
    24. Wikipedia. (2025, May 25). "Credit score." https://en.wikipedia.org/wiki/Credit_score
    25. The Nexus. (2025, September 3). "Your Phone Already Has Social Credit. We Just Lie About It." https://www.thenexus.media/your-phone-already-has-social-credit-we-just-lie-about-it/
    26. Bitcoin Magazine. (2025, January 28). "With The E-Krona, Sweden Is Attacking The Virtues Bitcoin Is Built To Protect." https://bitcoinmagazine.com/culture/sweden-cbdc-for-financial-surveillance
    27. Euronews. (2021, June 1). "Will microchip implants be the next big thing in Europe?" https://www.euronews.com/health/2020/05/12/will-microchip-implants-be-the-next-big-thing-in-europe
    28. Sveriges Riksbank. (2025, March 10). "The Cash Inquiry proposes that food, medicine and public charges can be paid in cash." https://www.riksbank.se/en-gb/payments--cash/payments-in-sweden/payments-report-2025/trends-on-the-payments-market/many-small-retail-businesses-have-stopped-accepting-cash-/the-cash-inquiry-proposes-that-food-medicine-and-public-charges-can-be-paid-in-cash/
    29. Sweden Herald. (2024, December 19). "Proposal: Food retailers should be forced to accept cash." https://swedenherald.com/article/proposal-food-retailers-should-be-forced-to-accept-cash
    30. Sveriges Riksbank. (2025, May 19). "Introduce obligation to accept cash and strengthen banks' responsibility for cash." https://www.riksbank.se/en-gb/press-and-published/notices-and-press-releases/press-releases/2025/introduce-obligation-to-accept-cash-and-strengthen-banks-responsibility-for-cash/
    31. Cash Matters. (n.d.). "Sweden Reverses Course: Cash Returns as a Matter of Survival, Inclusion and Security." https://www.cashmatters.org/blog/sweden-reverses-course-cash-returns-as-a-matter-of-survival

