Tesla LiDAR stance accelerates NHTSA investigation into FSD - TheStreet
As federal investigations multiply, jury verdicts pile up, and a fatal crash record grows, the evidence increasingly suggests that Tesla's camera-only approach to autonomous driving is an ideological bet — not an engineering consensus.
Bottom Line Up Front: Tesla's Full Self-Driving (FSD) system, which relies exclusively on cameras and neural networks while rejecting LiDAR and radar, has accumulated a damning evidentiary record: nine federally documented crashes linked to degraded camera visibility (including one fatality), three concurrent NHTSA investigations covering 3.2 million vehicles, a landmark $243 million jury verdict for misleading marketing of driver-assist technology, and a robotaxi crash rate estimated at roughly eight times that of average human drivers — even when safety monitors were present. Every major competitor uses sensor fusion (LiDAR + radar + cameras). Waymo's sensor-rich approach has achieved 6.8× fewer casualty crashes than the human benchmark. Until Tesla can demonstrate, at verifiable scale, that cameras alone match the safety record of multi-sensor systems, the weight of evidence supports treating FSD as a supervised driver-assistance tool — not autonomous driving — and treating the company's camera-only philosophy as an unproven bet with public-safety consequences.
The Philosophical Divide: Cameras vs. Sensor Fusion
No question divides the autonomous-vehicle industry more starkly than this one: do self-driving cars need LiDAR? On one side stands virtually the entire field — Waymo, Zoox, Mercedes-Benz, Toyota, Aurora, and others — all of which deploy multi-sensor architectures combining cameras, radar, and light detection and ranging (LiDAR) arrays. On the other side stands Elon Musk, who has called LiDAR a "fool's errand," dismissing it as "expensive hardware that's worthless on the car."
Tesla's position is not merely a cost-cutting measure — it is a fully articulated engineering philosophy. Musk's argument: humans navigate the world with two eyes; cameras mimic that biological system; therefore cameras, supplemented by powerful AI, should be sufficient. He has extended this logic by pointing to Tesla's enormous fleet-data advantage: by early 2026, FSD had logged 3.6 billion cumulative miles of driving data, roughly triple the total from just a year earlier.
But as of March 2026, the real-world record tells a more complicated and troubling story. Federal regulators have escalated their scrutiny of FSD to its highest level yet. Courts have rendered the first major liability verdict against Tesla's driver-assistance technology. And the company's own robotaxi pilot has produced a crash rate that independent analysts say should alarm policymakers and consumers alike.
What LiDAR Does — and Why It Matters
LiDAR (Light Detection and Ranging) uses pulsed laser light to measure distances and construct detailed, real-time three-dimensional maps of a vehicle's surroundings. Unlike cameras, which interpret depth through image processing and learned inference, LiDAR directly measures distance to objects with centimeter-level precision. Unlike radar, it can resolve fine detail — distinguishing a child from a trash can, or a stopped vehicle from a shadow.
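The distance measurement itself is straightforward time-of-flight physics: the sensor times a laser pulse's round trip and halves it. A minimal illustrative sketch (the pulse timing below is a made-up example, not any vendor's implementation):

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2.
C = 299_792_458  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target given a laser pulse's round-trip time in seconds."""
    return C * round_trip_s / 2

# A pulse returning after ~667 nanoseconds indicates a target ~100 m away.
print(round(tof_distance_m(667e-9), 1))  # ~100.0
```

Centimeter-level precision follows from timing resolution: a 1 cm change in range shifts the round trip by only about 67 picoseconds, which modern LiDAR electronics can resolve.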
Critically, LiDAR operates independently of ambient light and can function in conditions — fog, bright glare, airborne dust — where camera-based perception degrades significantly. In sensor-fusion architectures, LiDAR data is cross-referenced against camera imagery, a process that allows the system to precisely locate objects in three-dimensional space even when one sensor is compromised.
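A common textbook way to combine such measurements is inverse-variance weighting: each sensor's range estimate is weighted by its confidence, so a degraded sensor (say, a glare-blinded camera) contributes proportionally less. This is a generic sketch of the idea, not Tesla's or Waymo's actual pipeline, and the noise figures are hypothetical:

```python
def fuse_ranges(estimates):
    """Inverse-variance weighted fusion of independent range estimates.

    estimates: list of (range_m, std_dev_m) pairs, one per sensor.
    Returns the fused range and its standard deviation.
    """
    weights = [1 / sd**2 for _, sd in estimates]
    fused = sum(w * r for (r, _), w in zip(estimates, weights)) / sum(weights)
    fused_sd = (1 / sum(weights)) ** 0.5
    return fused, fused_sd

# Hypothetical readings for the same object:
# camera depth inference is noisy (+/- 2.0 m), LiDAR is precise (+/- 0.05 m).
camera = (41.0, 2.0)
lidar = (39.8, 0.05)
fused, sd = fuse_ranges([camera, lidar])
print(round(fused, 2), round(sd, 3))  # fused estimate sits almost on the LiDAR reading
```

The design point is redundancy: if one sensor's confidence collapses, its weight goes to zero and the other modality still carries the estimate, which a single-modality system cannot do.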
Sensor Comparison: Key Capabilities
| Capability | Camera (Tesla Vision) | Radar | LiDAR |
|---|---|---|---|
| 3D Distance Measurement | Inferred (AI) | Limited resolution | Direct / Precise |
| Performance in Fog/Rain | Severely degraded | Good | Moderate (rain) / Good (fog) |
| Performance in Bright Glare | Severely degraded | Unaffected | Largely unaffected |
| Fine Object Detail | Excellent (clear conditions) | Poor | Excellent |
| Redundancy to Camera | None (same modality) | Yes | Yes (different physics) |
| Cost (approx. vehicle impact) | Very low | Low | Moderate–high |
Waymo's fifth-generation vehicles deploy five LiDARs, six radars, and 29 cameras. When Waymo VP Srikanth Thirumalai presented at the AI4 conference in August 2025, he showed video of LiDAR sensors detecting pedestrians preparing to step into a roadway — in each case before the vehicle's cameras had registered any threat. Each time, the vehicle stopped or maneuvered safely. Thirumalai declined to say directly whether he considered camera-only systems safe for public roads, but said that "objective measures" require safety comparisons at scale, and that claims of sensor parity need to be demonstrated, not asserted.
The Federal Investigation: Three Concurrent Probes
As of March 2026, the National Highway Traffic Safety Administration (NHTSA) is running three simultaneous federal investigations into Tesla FSD — an unprecedented level of regulatory scrutiny for a single driver-assistance product.
Investigation 1: Degraded Visibility Crashes (EA26002)
The most significant probe was escalated on March 19, 2026, from a Preliminary Evaluation to an Engineering Analysis — the final investigative step before NHTSA can demand a recall. The investigation, covering an estimated 3.2 million Tesla vehicles equipped with FSD, centers on nine documented crashes in which FSD's degradation detection system allegedly failed to warn drivers when cameras were impaired by sun glare, fog, airborne dust, or other common environmental conditions.
Of the nine incidents, one was fatal. In multiple crashes, NHTSA found that FSD lost track of or failed to detect a lead vehicle entirely. The agency further noted that Tesla had disclosed "internal data and labeling limitations" that may have resulted in under-reporting of similar incidents — meaning the actual number of affected crashes could be higher than currently known. NHTSA also found that a software update Tesla deployed to address the problem may have remediated only three of the nine documented crashes, leaving the remainder unaddressed.
The Engineering Analysis phase — historically the final step before a recall demand — gives Tesla a deadline to submit detailed technical documentation on how FSD's neural networks process degraded visual inputs, what fallback behaviors the system employs when sensor confidence drops, and crash reconstruction data from the incidents under review.
Investigation 2: Traffic Safety Violations (PE25012)
A separate NHTSA preliminary evaluation opened in October 2025 is examining 58 incidents in which FSD vehicles executed maneuvers constituting traffic safety law violations — including proceeding through red traffic signals and driving against the direction of travel on public roadways. The investigation covers approximately 2.9 million vehicles. In six incidents, FSD-engaged vehicles ran red lights and were involved in crashes; four of those crashes resulted in injuries. Multiple incidents of wrong-way driving and illegal lane changes have also been documented.
Investigation 3: Crash Reporting Failures
A third concurrent NHTSA inquiry is examining Tesla's compliance with the agency's Standing General Order requiring crash reporting. Investigators have found that Tesla submitted required crash reports months late in multiple cases. The probe raises questions about whether Tesla's crash data — which the company cites to support FSD safety claims — is complete and timely.
- EA26002 — Degraded visibility crashes; 9 incidents, 1 fatality; covers ~3.2M vehicles; Engineering Analysis phase (pre-recall)
- PE25012 — Traffic violations; 58 incidents, 6 red-light crashes with injuries; covers ~2.9M vehicles; Preliminary Evaluation
- Crash Reporting Probe — Late SGO submissions; ongoing
- Total NHTSA Tesla investigations since 2016: More than 40
The Legal Record: Courts Begin to Hold Tesla Accountable
For years, Tesla successfully deflected litigation over Autopilot and FSD crashes, arguing that the driver — not the system — bore ultimate responsibility, and that the "Supervised" label absolved the company of liability for driver over-reliance. That legal posture suffered major defeats in 2025 and early 2026.
Benavides v. Tesla: The Landmark $243 Million Verdict
In August 2025, a Miami federal jury in Benavides v. Tesla awarded more than $240 million in damages — including $200 million in punitive damages — to the family of 22-year-old Naibel Benavides Leon, killed in a 2019 crash when a Tesla Model S on Autopilot ran a stop sign at approximately 62 miles per hour. The jury found Tesla 33% liable and assigned the remaining 67% of fault to the driver, who had admitted to being distracted.
The verdict was the first successful trial judgment against Tesla's Autopilot system in the United States. The jury accepted plaintiffs' arguments that Tesla had designed Autopilot to be activatable on unsafe roads, had failed to adequately monitor driver attentiveness, and had engaged in misleading marketing. Critically, recovered vehicle data demonstrated that the Autopilot system had detected obstacles prior to the crash but failed to brake — directly contradicting Tesla's claim that driver behavior was the sole cause.
In February 2026, U.S. District Judge Beth Bloom upheld the verdict, rejecting Tesla's motion for a new trial and ruling that evidence at trial "more than supported" the jury's finding. Tesla has appealed but exhausted its post-trial motions at the district court level.
California Courts and the Branding Question
In December 2025, a California judge ruled that Tesla's marketing of its driver-assistance products was misleading and violated state law, calling "Full Self-Driving" a product name that was "actually, unambiguously false and counterfactual." Tesla subsequently discontinued Autopilot as a standalone product in the United States and Canada — a significant admission of the marketing problem the company had long denied.
The Cybertruck FSD Lawsuit and the LiDAR Allegation
In March 2026, a new Texas lawsuit filed in Harris County District Court over a Cybertruck FSD crash on a Houston freeway included an explicit product liability claim over the "absence of LiDAR" as a design defect. The complaint alleged the vehicle was "defective and unreasonably dangerous" due to the lack of LiDAR, an ineffective automatic emergency braking system, inadequate driver monitoring, and misleading marketing. The lawsuit also alleges Tesla negligently hired and retained Elon Musk as CEO — an unusual but legally significant framing that directly attributes the camera-only decision to executive misconduct. The case is pending.
- Apr 2019: Key Largo fatal crash. Tesla Model S on Autopilot runs a stop sign at 62 mph; Naibel Benavides Leon is killed. The crash becomes the foundation of the landmark liability suit.
- 2021: Tesla removes radar. Tesla discontinues radar on new vehicles, transitioning to camera-only "Tesla Vision." Driver reports of worsened performance in fog and rain follow.
- Oct 2024: NHTSA opens PE24031. Preliminary evaluation opened after four FSD crashes in reduced-visibility conditions, including one pedestrian fatality.
- Aug 2025: $243M Florida verdict. Miami jury finds Tesla 33% liable in the Benavides case; first successful Autopilot trial verdict in U.S. history. Punitive damages: $200M.
- Oct 2025: NHTSA opens PE25012. New probe into FSD traffic safety violations: 58 incidents, including red-light running and wrong-way driving; covers ~2.9M vehicles.
- Dec 2025: California court calls FSD name "unambiguously false." Judge rules Tesla's "Full Self-Driving" marketing violates state law; Tesla discontinues Autopilot as a standalone product in the U.S. and Canada.
- Feb 2026: Florida verdict upheld. Judge Bloom upholds the $243M jury verdict; Tesla's post-trial motions exhausted at the district court level. Appeal filed.
- Mar 2026: NHTSA escalates to Engineering Analysis (EA26002). Probe upgraded to pre-recall phase; covers 3.2M vehicles; 9 crashes documented. Third concurrent FSD investigation. Texas LiDAR defect lawsuit filed.
The Robotaxi Reality Check
Tesla launched its paid robotaxi service in Austin, Texas in June 2025 — initially with human safety monitors in the front passenger seat. The company then began gradually removing monitors from a small number of vehicles starting in January 2026. Elon Musk described the development as Tesla joining "the exclusive club of companies operating truly driverless public transit."
The data behind that milestone, however, was more sobering. By mid-October 2025, Tesla had reported seven crash incidents to NHTSA from its Austin fleet — despite safety monitors whose explicit purpose was preventing additional incidents. Based on Tesla's disclosure that the fleet had traveled approximately 250,000 miles through early November, independent analysts calculated a crash rate of roughly once every 60,000 miles — compared to the average human driver's roughly 500,000 miles between crashes. That implied Tesla's supervised robotaxi fleet was crashing at more than eight times the human rate.
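The analysts' comparison reduces to dividing the two miles-between-crashes figures cited above:

```python
# Crash-rate comparison from the miles-between-crashes figures cited above.
human_miles_per_crash = 500_000     # rough benchmark for average human drivers
robotaxi_miles_per_crash = 60_000   # analysts' estimate for the Austin fleet

ratio = human_miles_per_crash / robotaxi_miles_per_crash
print(f"Robotaxi fleet crashes ~{ratio:.1f}x as often per mile")  # ~8.3x
```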
Philip Koopman, an emeritus professor at Carnegie Mellon University and a leading autonomous-systems safety researcher, noted that for a fleet of 30 or fewer vehicles staffed with trained safety supervisors, the number of reportable accidents "should have been fewer than seven." He also observed that Tesla had withheld the narrative descriptions of each crash from its NHTSA reports, making independent safety analysis impossible.
Further complicating Tesla's narrative: a reverse-engineering analysis by a Texas A&M engineering student found that the Austin robotaxi service was unavailable roughly 60% of the time, and that only 1–5 vehicles were in active operation simultaneously during most hours — far below Musk's stated goal of 500 vehicles by year-end 2025. Reports also emerged that Tesla's "unsupervised" robotaxis were being followed by trailing Tesla vehicles carrying monitors — suggesting the removal of in-vehicle safety monitors may have been an optics exercise rather than a genuine operational transition.
By contrast, Waymo — operating with LiDAR, radar, and cameras — has accumulated more than 100 million fully driverless miles, achieved 6.8× fewer casualty crashes per million miles than the human benchmark, and operates profitable robotaxi services in Phoenix, San Francisco, Austin, and Los Angeles.
The Expert Consensus — and the Dissent
The autonomous-driving research community is not uniformly opposed to camera-based approaches. Several researchers have noted that the camera-only philosophy could, in theory, be vindicated by sufficiently powerful AI and sufficient training data. Rich Sutton's influential "Bitter Lesson" framework — which argues that general learning systems eventually outperform expert-designed solutions — is frequently invoked by Tesla partisans. And Xpeng Motors in China has also moved toward a camera-first architecture, suggesting the approach is not unique to one eccentric billionaire.
Moreover, Tesla's own safety data, when presented on Tesla's terms, looks impressive: the company claims FSD users travel approximately 2.9 million miles between major collisions, compared to a national average of 505,000 miles. The company's overall crash rate with Autopilot engaged on highways is approximately 8× better than human drivers — a real achievement that deserves acknowledgment.
But these statistics carry important caveats. Highway driving — where Autopilot excels — has a substantially lower baseline crash rate than urban driving, making direct comparisons to the national average misleading. Tesla's three concurrent NHTSA investigations directly question the completeness of the company's crash reporting. And Waymo co-CEO Tekedra Mawakana noted in November 2025 that there are no standardized, comparable safety metrics between the two systems, and challenged the industry to release transparent data: "If you are not being transparent, then…you are not doing what is necessary to earn the right to make the road safer."
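The baseline problem is easy to see with a toy calculation: a system driven mostly on highways can beat the all-roads average even if it is no safer than humans on any given road type. The rates below are invented purely for illustration, not drawn from either company's data:

```python
# Hypothetical crash rates per million miles (NOT real data):
# assume humans and the assist system are EQUALLY safe on each road type.
rates = {"highway": 0.8, "urban": 3.0}       # crashes per million miles
human_mix = {"highway": 0.4, "urban": 0.6}   # typical human mileage mix
assist_mix = {"highway": 0.9, "urban": 0.1}  # system engaged mostly on highways

def blended_rate(mix):
    """Aggregate crash rate for a given mileage mix across road types."""
    return sum(mix[road] * rates[road] for road in rates)

human = blended_rate(human_mix)    # 0.4*0.8 + 0.6*3.0 = 2.12
assist = blended_rate(assist_mix)  # 0.9*0.8 + 0.1*3.0 = 1.02
print(human / assist)  # >2x "better" with zero per-road safety advantage
```

This mileage-mix effect is why apples-to-apples comparisons require per-road-type rates, which is exactly the standardized data Mawakana says the industry has not published.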
What Tesla Owners Should Know Right Now
The escalation of NHTSA's visibility-crash investigation to Engineering Analysis status means that regulators now believe sufficient evidence of a safety defect may exist to warrant a recall. While the process could still result in a mandatory over-the-air software update rather than a physical recall, the agency's position is clear: FSD's degradation detection system, in its current and previously updated forms, may fail under common driving conditions.
Consumer Guidance
Owners using FSD (Supervised) should manually disengage the system any time visibility is compromised — including sun glare, fog, dust, heavy rain, or smoke. Do not rely on FSD's degradation alert to warn you in time; NHTSA's investigation has documented nine cases in which that warning arrived too late or not at all. Treat FSD as a Level 2 driver-assistance system requiring constant, active supervision — not as autonomous driving. The term "Full Self-Driving" has been ruled misleading by a California court. Do not allow the name to shape your expectation of the system's capabilities.
The Broader Industry Implication
For the broader autonomous vehicle industry, NHTSA's escalation carries implications beyond Tesla. Every AV developer must ultimately demonstrate, at scale, that their chosen sensor architecture can safely navigate the full range of conditions public roads present. Waymo, Zoox, and Aurora have chosen sensor fusion precisely because it provides independent physics-based redundancy when any single sensing modality fails — the automotive engineering equivalent of defense-in-depth.
Tesla's camera-only architecture is not inherently doomed. Advances in AI, compute, and neural network architecture may eventually close the gap. But as of March 2026, the record indicates that gap is real, measurable, and consequential. An engineering philosophy that was once a bold bet has become the subject of three federal investigations, a landmark jury verdict, a judicial finding of false advertising, and a casualty record that independent researchers have flagged as statistically alarming.
Elon Musk has been right about many things that experts dismissed. He may yet be proven right about cameras. But "may yet" is not a safety standard, and the current evidentiary record does not support the conclusion that Tesla FSD has achieved, or is imminently approaching, the safety profile that autonomous operation of motor vehicles on public roads requires.
Verified Sources & Formal Citations
- Engineering Analysis EA26002 — Tesla FSD Degraded Visibility Investigation. NHTSA Office of Defects Investigation · March 19, 2026. https://static.nhtsa.gov/odi/inv/2024/INIM-PE24031-62887.pdf
- Preliminary Evaluation PE25012 — Tesla FSD Traffic Safety Violations. NHTSA Office of Defects Investigation · October 7, 2025. https://static.nhtsa.gov/odi/inv/2025/INOA-PE25012-19171.pdf
- NHTSA Is One Step Away from Having to Recall FSD in Visibility Crash Probe. Electrek · March 19, 2026. https://electrek.co/2026/03/19/nhtsa-upgrades-tesla-fsd-visibility-investigation-3-2-million-vehicles/
- NHTSA Escalates Tesla FSD Investigation After Additional Crashes. CBT News · March 19, 2026. https://www.cbtnews.com/nhtsa-escalates-tesla-fsd-investigation/
- NHTSA Upgrades Tesla FSD Probe One Step Short of Recall. Automotive World · March 19, 2026. https://www.automotiveworld.com/news/nhtsa-upgrades-tesla-fsd-probe-one-step-short-of-recall/
- Tesla LiDAR Stance Accelerates NHTSA Investigation into FSD. TheStreet · March 20, 2026. https://www.thestreet.com/automotive/tesla-lidar-stance-accelerates-nhtsa-investigation-into-fsd
- Tesla Has to Pay Historic $243 Million Judgement Over Autopilot Crash, Judge Says. Electrek · February 20, 2026. https://electrek.co/2026/02/20/tesla-has-to-pay-historical-243-million-judgement-over-autopilot-crash-judge-says/
- Jury Orders Tesla to Pay More Than $240 Million in Autopilot Crash. NPR · August 2, 2025. https://www.npr.org/2025/08/02/nx-s1-5490930/tesla-autopilot-crash-jury-240-million-florida
- Benavides v. Tesla: A Defense-Side Perspective on Florida's Landmark Autopilot Verdict. Walsworth LLP (WSHB Law) · 2025. https://www.wshblaw.com/publication-benavides-v-tesla-a-defense-side-perspective-on-floridas-landmark-autopilot-verdict
- Tesla Cybertruck Owner Sues Over FSD Crash, Alleges 'Negligent' Retention of Musk. Electrek · March 11, 2026. https://electrek.co/2026/03/11/tesla-cybertruck-fsd-lawsuit-musk-negligent-hiring/
- Tesla Starts Robotaxi Rides Without Safety Monitor in Austin. Electrek · January 22, 2026. https://electrek.co/2026/01/22/tesla-starts-robotaxi-rides-without-safety-monitor-in-austin-what-you-need-to-know/
- Tesla's Robotaxi Project in Austin Is Much Smaller Than Musk Claims. Electrek · December 22, 2025. https://electrek.co/2025/12/22/tesla-robotaxi-project-austin-much-smaller-than-musk-claims/
- Tesla Starts Driverless Robotaxi Tests in Austin — Philip Koopman Comments. CNBC · December 15, 2025. https://www.cnbc.com/2025/12/15/tesla-tests-driverless-cars-in-austin-without-humans-on-board.html
- Tesla's Robotaxi Launches in Austin with Safety Drivers in Passenger Seat. KVUE (ABC Austin) · June 2025. https://www.kvue.com/article/money/cars/austin-tesla-robotaxi-launch/269-9d0118a0-a22a-486e-ac6c-a23b84e45d33
- Waymo Experimenting with Generative AI, but Exec Says LiDAR and Radar Sensors Important to Self-Driving Safety 'Under All Conditions'. Fortune · August 15, 2025. https://fortune.com/2025/08/15/waymo-srikanth-thirumalai-interview-ai4-conference-las-vegas-lidar-radar-self-driving-safety-tesla/
- Tesla Releases Detailed Safety Report After Waymo Co-CEO Called for More Data. TechCrunch · November 14, 2025. https://techcrunch.com/2025/11/14/tesla-releases-detailed-safety-report-after-waymo-co-ceo-called-for-more-data/
- Waymo and Tesla's Self-Driving Systems Are More Similar Than People Think. Understanding AI (Timothy B. Lee) · December 17, 2025. https://www.understandingai.org/p/waymo-and-teslas-self-driving-systems
- Camera versus LiDAR: Waymo vs. Tesla Compared. The Last Driver License Holder · July 1, 2025. https://thelastdriverlicenseholder.com/2025/06/25/camera-versus-lidar/
- Tesla's Big Bet: Cameras Over LiDAR for Self Driving Cars. Vik's Newsletter · November 17, 2024. https://www.viksnewsletter.com/p/teslas-big-bet-cameras-over-lidar
- Tesla Bet on 'Pure Vision' for Self-Driving. That's Why It's in Hot Water. InsideEVs · October 22, 2024. https://insideevs.com/news/738204/tesla-pure-vision-camera-only/
- Tesla Autopilot Hardware. Wikipedia, citing primary Tesla and FCC filings · accessed March 2026. https://en.wikipedia.org/wiki/Tesla_Autopilot_hardware
- NHTSA Launches New Tesla 'Full Self-Driving' Investigation on Nearly 2.9 Million Vehicles. Repairer Driven News · October 10, 2025. https://www.repairerdrivennews.com/2025/10/10/nhtsa-launches-new-tesla-full-self-driving-investigation-on-nearly-2-9-million-vehicles/
- Tesla's Self-Driving Ambitions Hit a Wall: NHTSA Probe Puts a March 2026 Deadline on Answers. WebProNews · March 2026. https://www.webpronews.com/teslas-self-driving-ambitions-hit-a-wall-nhtsa-probe-puts-a-march-2026-deadline-on-answers/
- Tesla vs. Waymo — Who Is Closer to Level 5 Autonomous Driving? Think Autonomous · September 10, 2025. https://www.thinkautonomous.ai/blog/tesla-vs-waymo-two-opposite-visions/
- Mark Rober Tesla Autopilot vs. LiDAR Comparison Video — Analysis. Electrek · March 23, 2025. https://electrek.co/2025/03/23/everyones-missing-the-point-of-the-tesla-vision-vs-lidar-wile-e-coyote-video/