The Safest Autonomous Fleet Is Capped at 2,500 Cars. The Least Safe One Has No Limit.
Nine. That is the number of crashes NHTSA has recorded for Tesla's robotaxi fleet in Austin between July and November 2025, across roughly 500,000 miles of operation. One crash every 55,000 miles, with a safety monitor sitting in every car.
Now multiply. On April 23, 2026, Elon Musk confirmed during Tesla's Q1 earnings call that Cybercab production has begun. Lars Moravy, Tesla's VP of Vehicle Engineering, confirmed something more consequential: because Tesla designed the Cybercab to comply with all existing Federal Motor Vehicle Safety Standards, it self-certifies like a Camry. No exemption needed, no unit cap, no federal pre-review of whether the software can actually drive. Tesla can build as many Cybercabs as its factory can stamp out, using the same regulatory pathway available to every sedan, pickup truck, and minivan on American roads.
Meanwhile, Waymo, which has driven 170.7 million fully driverless miles and demonstrated 82-92% fewer crashes than human drivers across four cities, remains capped at 2,500 vehicles per year.
Nobody has quantified what this regulatory asymmetry produces at scale, so we ran the numbers ourselves.
How Self-Certification Bypasses the Cap
Federal vehicle safety regulation in the United States operates on a self-certification model. Manufacturers attest that their vehicles meet all applicable FMVSS requirements. NHTSA does not pre-approve vehicles before sale; it investigates after the fact, when crashes, complaints, or defect trends surface. This system has governed every car sold in America since 1966, and it works reasonably well for vehicles with steering wheels, brake pedals, and human drivers.
Autonomous vehicles introduced a problem that nobody anticipated when those rules were written. Some AV designs eliminate steering wheels and pedals entirely, which means they cannot comply with FMVSS standards written for human-operated cars (standards like FMVSS 101, which requires dashboard controls, or FMVSS 135, which specifies brake pedal force). To deploy these vehicles legally, companies must apply to NHTSA for an exemption. Congress capped exemptions at 2,500 units per manufacturer per year, a number chosen in the mid-2010s when autonomous vehicles were science projects deployed in fenced-off test zones. Waymo, Cruise, and Nuro have all navigated this bottleneck by filing detailed safety cases and waiting months for approval.
Tesla took a different path. It built the Cybercab with no steering wheel and no pedals but engineered it to satisfy every FMVSS requirement through alternative compliance methods: airbags for passengers, crashworthiness standards, seatbelts, lighting, mirrors (or camera-based equivalents approved under NHTSA's 2022 ADS rule updates). Because the Cybercab meets FMVSS on paper, it self-certifies. No exemption application. No 2,500-unit ceiling.
Here is the catch. FMVSS tests whether a vehicle is safe to crash in. It does not test whether the vehicle can drive safely. Crashworthiness and driving competence are entirely separate regulatory questions, and only one of them has a federal answer.
Crash Rates: Running the Numbers Nobody Ran
NHTSA's Standing General Order data through November 2025 documents nine crashes from Tesla's Austin robotaxi fleet in approximately 500,000 miles. Every crash narrative was redacted with the notation: "[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]." We do not know whether Tesla was at fault in any of these incidents. We do not know how severe they were. We know only that they occurred and that Tesla chose not to disclose the details.
Waymo, by contrast, publishes full narrative descriptions for every reportable incident on its safety impact page. Its 170.7 million rider-only miles across Phoenix (68.6M), San Francisco Bay Area (53.5M), Los Angeles (37.9M), and Austin (10.7M) show 92% fewer serious-injury-or-worse crashes, 83% fewer airbag deployments, and 82% fewer injury-causing crashes compared to human driver baselines. Swiss Re independently validated these results in a December 2024 reinsurance study.
Compare the two systems at scale:
| Metric | Tesla Robotaxi | Waymo | Human Drivers |
|---|---|---|---|
| Miles driven | ~500,000 | 170.7 million | ~3.2 trillion/year |
| Reported crashes | 9 | Published per incident | ~6.7 million police-reported/year |
| Crash rate (all severity) | 1 per ~55,000 mi | 82-92% below human | ~1 per 200,000 mi (incl. est. unreported) |
| Safety monitor present | Yes (in every car) | No (fully driverless) | N/A |
| Crash narratives public | All redacted | All published | Police reports (FOIA) |
| Annual production cap | None (self-certified) | 2,500 (exemption) | N/A |
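The per-mile rates in the table follow directly from the raw figures above. A minimal sketch, using the article's 1-per-200,000-mile figure as the human all-severity baseline:

```python
# Derive the crash rates behind the table from the article's raw figures.
tesla_miles = 500_000         # Austin robotaxi miles, Jul-Nov 2025 (NHTSA SGO)
tesla_crashes = 9             # reported crashes over that span
human_mi_per_crash = 200_000  # all-severity human baseline used in the article

tesla_mi_per_crash = tesla_miles / tesla_crashes  # ~55,556 mi per crash
ratio = human_mi_per_crash / tesla_mi_per_crash   # how much more often Tesla crashes

print(f"Tesla: one crash per {tesla_mi_per_crash:,.0f} miles")
print(f"That is ~{ratio:.1f}x the human crash rate")
```

The text's "1 per 55,000 miles" rounds the 55,556-mile figure down; the 3.6x ratio used later in the article falls out of the same division.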
What 50,000 Cybercabs Produce
Tesla has not disclosed production targets for the Cybercab. But Musk has repeatedly described autonomous vehicles as a multi-million-unit opportunity, and Tesla built 408,386 vehicles in Q1 2026 alone. A 50,000-unit annual Cybercab run is conservative for a company operating at that scale.
At 50,000 Cybercabs driving 30,000 miles per year each (a reasonable estimate for a ride-hail fleet that operates 12-16 hours daily), the fleet accumulates 1.5 billion miles annually. At Tesla's current crash rate of 1 per 55,000 miles, that yields 27,273 crashes per year. At the human all-severity rate of 1 per 200,000 miles, the same mileage would produce 7,500 crashes.
Excess crashes attributable to the autonomous system, using Tesla's own NHTSA-reported rate against the human baseline: 19,773 per year, a number that grows linearly with every additional Cybercab that rolls off the line in a factory operating without any federal constraint on volume.
If Waymo stays at 2,500 units running the same 30,000 miles annually, its fleet covers 75 million miles. At its demonstrated 82% reduction in injury crashes relative to humans, Waymo's fleet would be expected to produce dramatically fewer injuries per mile than either Tesla or human drivers, but across 20 times fewer vehicles.
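The fleet projections in the last two paragraphs reduce to a few lines of arithmetic. A sketch under the same assumptions (50,000 Cybercabs, 2,500 Waymos, 30,000 miles per vehicle per year):

```python
# Project annual fleet crashes under the article's stated assumptions.
MILES_PER_VEHICLE = 30_000    # assumed annual ride-hail mileage per car
TESLA_MI_PER_CRASH = 55_000   # ~500,000 mi / 9 crashes, rounded as in the text
HUMAN_MI_PER_CRASH = 200_000  # all-severity human baseline

def fleet_crashes(vehicles: int, mi_per_crash: int) -> int:
    """Expected annual crashes for a fleet at a given per-mile crash rate."""
    return round(vehicles * MILES_PER_VEHICLE / mi_per_crash)

tesla = fleet_crashes(50_000, TESLA_MI_PER_CRASH)  # 27,273 at Tesla's rate
human = fleet_crashes(50_000, HUMAN_MI_PER_CRASH)  # 7,500 at the human rate
print(f"Excess crashes/year at Tesla's rate: {tesla - human:,}")  # 19,773

waymo_miles = 2_500 * MILES_PER_VEHICLE  # 75 million miles at the exemption cap
print(f"Waymo fleet miles/year at the cap: {waymo_miles:,}")
```

Every figure here is an input from the article, not a measurement; change any one assumption and the excess-crash number moves linearly with it.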
Put differently: the regulatory framework grants unlimited deployment to the system crashing 3.6 times more often than humans while capping the system crashing 82-92% less often. Congress is currently debating whether to raise Waymo's cap from 2,500 to 90,000 via the SELF DRIVE Act of 2026. Tesla does not need the SELF DRIVE Act. It never did.
Leadership Exodus, Redacted Data
Production has begun, but the team that built the Cybercab has not stuck around to see it through, and the exodus reads like a warning label that nobody attached to the product. Victor Nechita, the vehicle program manager, departed in February 2026, days after the first unit came off the line. Thomas Dmytryk, who spent 11 years at Tesla directing OTA updates and ride-hailing infrastructure, left around the same time. Mark Lupkey, assembly leader, followed in March. Tesla has no original program managers remaining for any production vehicle.
Pair that with the crash narrative redactions. Nine crashes occurred, and NHTSA's public filings contain zero detail about what happened in any of them. Waymo publishes complete incident descriptions: vehicle speed, road conditions, fault determination, third-party involvement. Tesla filed "[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]" for every single narrative field. A company preparing to deploy unlimited autonomous vehicles on public roads chose opacity over disclosure at the one regulatory checkpoint where transparency was required.
A Strong Counterargument, Stated Fairly
Tesla's Cybercab complies with every physical safety standard on the books. Crashworthiness, airbags, seatbelts, lighting, structural integrity. Self-certification works exactly as designed: it verifies that a vehicle protects occupants in a crash. It was never intended to evaluate whether software can drive. Conflating vehicle safety with driving performance is a category error. If Congress wants to regulate autonomous driving software, it should pass legislation specifically addressing software competence, not stretch a 60-year-old vehicle safety framework to cover a problem it was never built for. Tesla is doing nothing illegal, nothing improper, and nothing that any other manufacturer could not do if they designed their AV to meet FMVSS. Blaming Tesla for exploiting a regulatory gap is blaming the player for understanding the rules.
What This Analysis Did Not Prove
Several caveats limit the strength of these conclusions. Tesla's 500,000-mile dataset is small; nine crashes is a sample size from which any rate estimate carries wide confidence intervals. We do not know fault distribution because Tesla redacted every narrative. Some or all of those crashes could have been caused by other drivers. Tesla's FSD software updates continuously; version 14-lite, expected in June 2026, could materially improve the crash rate. Musk himself set a target of "probably Q4" 2026 for unsupervised FSD, implying the company expects significant near-term improvement. Waymo and Tesla also operate in different geographies and traffic conditions (Phoenix suburbs versus Austin urban core), making direct rate comparisons imperfect. And our 30,000-mile-per-vehicle annual assumption is an estimate; actual fleet utilization could be higher or lower depending on ride-hail demand and operational hours.
What You Can Do
If you live in Austin or another Cybercab deployment city: Track NHTSA's Standing General Order database quarterly. It is the only public source of Tesla AV crash data. If crash counts rise in proportion with fleet size, the per-mile crash rate is not improving as deployment scales, and that is a leading indicator of trouble. If Tesla continues redacting narratives while expanding deployment, file a FOIA request for the unredacted reports.
If you are a state legislator: FMVSS is federal, but states control vehicle registration and operational permits. California, Arizona, and Texas each have the authority to impose state-level safety reporting requirements on autonomous fleets operating within their borders, independent of federal self-certification. Several states already do. If yours does not, the gap is yours to close.
If you work on autonomous vehicle policy: Watch the SELF DRIVE Act markup. It addresses the exemption cap but does not create a federal driving-competence standard. Any bill that raises Waymo's cap without simultaneously requiring crash-rate transparency from self-certified AVs like the Cybercab will widen, not narrow, the paradox described here.
The Bottom Line
American autonomous vehicle regulation was built for a world where the vehicle and the driver were separate entities: the government tested the car, and the DMV tested the human. Autonomy collapsed that distinction, and regulation has not caught up. Tesla found the seam. By building a car that satisfies every crashworthiness standard while shipping driving software that crashes roughly 3.6 times as often as the humans it replaces, Tesla secured unlimited production access through the same door that opens for a Honda Civic. Waymo, with 170 million miles of safety data and an 82-92% crash reduction, waits for Congress to decide whether it deserves more than 2,500 vehicles per year. What we are witnessing is not a safety debate. It is a classification problem: the rules that govern what a car is were written before cars could think, and the first company to exploit that gap will put more autonomous vehicles on American roads than every other AV company combined.
Sources
- Tesla Q1 2026 earnings: Cybercab production begins, $22.4B revenue, $0.41 EPS (Electrek, April 23, 2026)
- NHTSA Standing General Order data: 9 crashes in Tesla's Austin robotaxi fleet, July-Nov 2025 (Electrek, January 29, 2026)
- Waymo Safety Impact: 170.7 million rider-only miles, 82-92% crash reduction (Waymo, updated through December 2025)
- SELF DRIVE Act of 2026: exemption cap increase from 2,500 to 90,000 units (Sidley Austin analysis, January 8, 2026)
- NHTSA Automated Vehicles Safety overview and Standing General Order database
- Swiss Re reinsurance validation of Waymo safety data (December 2024)