Autonomous and Self-Driving Vehicles, Robotaxis, and the Risk of Accidents

From artificial intelligence to robotics, in 2025 tech crept its way ever deeper into our daily lives. Now, as we move into 2026, everything indicates that technology will keep weaving itself into our lives at an unprecedented pace. At January’s massive CES convention in Las Vegas, consumer-level items like smart glasses, folding smartphones, stair-climbing robot vacuums, and app-controlled appliances made waves, while AI-driven changes in research, administration, and creation are transforming how we do business. Tech is everywhere, and the growth of autonomous automobiles is trending right there with it.
More Driverless Cars = More Accidents Involving Them
Even though early stats suggest that autonomous vehicles drive more safely than human drivers, as more self-driving cars hit the road, more accidents involving them will naturally happen. With that growth come new and interesting legal and liability implications. Because autonomous vehicles generally shift responsibility from human drivers to technology providers, the laws governing these accidents are complex and still evolving.
Traditionally, accident laws focus on driver negligence. But with driverless cars, liability pushes toward product defects, software failures, or system design flaws. This area of law is still developing, with variations by jurisdiction. Courts, regulators, and insurers are racing to adapt and address gaps.
Self-Driving Cars Have Various Levels of Automation
Autonomous cars operate at different levels of automation, as defined by the Society of Automotive Engineers (SAE):
- Level 0-2: No or partial automation; a human driver is primarily responsible for the vehicle.
- Level 3: Conditional automation; the vehicle handles most tasks, but a human must intervene as needed.
- Level 4-5: High or full automation; the vehicle handles all driving, potentially without human input.
From Tesla’s semi-autonomous Autopilot mode to the advent of fully self-driving taxis from companies like Waymo and Zoox, our roadways are now busy with vehicles that are only partially under direct human control, or not under it at all. That new reality raises plenty of questions about safety, responsibility, and liability.
The Lag of Legislation
As we have seen with tech-driven companies like Bird (e-scooter rentals), Uber, and others, technology often presses forward quickly, relying on an ethos of disruption before regulation, and forgiveness before permission. We have already witnessed this approach with the rapidly growing segment of e-bikes and e-scooters, where legislation and regulation struggle to catch up as injuries and fatalities stack up.
The same applies to self-driving car technology. Tesla was arguably the first automaker to seriously investigate and implement self-driving technology. It began developing a semi-autonomous driving mode in the early 2010s and introduced it on consumer vehicles by late 2014. Not long after, the system registered its first fatal accident.
Notable Autonomous-Driving Traffic Incidents
- 2016: First fatal Tesla Autopilot crash; driver killed when vehicle failed to detect a tractor-trailer.
- 2017: A Navya-Keolis autonomous shuttle collided with a delivery truck on its first day of public service in downtown Las Vegas, triggering an NTSB investigation.
- 2018:
  - An Uber self-driving test vehicle fatally struck a pedestrian in Arizona (the first pedestrian death involving a fully autonomous prototype).
  - A Tesla Model X on Autopilot accelerated into a highway barrier on US-101, killing the driver and leading to an NTSB investigation and lawsuits against Tesla.
- 2022: A Tesla in Full Self-Driving mode abruptly slowed and swerved, causing an eight-vehicle chain-reaction crash with minor injuries.
- 2023: A Cruise (GM) robotaxi dragged a pedestrian in San Francisco, leading to permit revocation and eventual shutdown of its robotaxi operations.
- 2024: A Zoox vehicle braked unexpectedly in two separate incidents, causing motorcyclists to rear-end it and sustain injuries, prompting a voluntary recall, software updates, and an NHTSA investigation.
- 2025:
  - Waymo was involved in one fatality: a motorcyclist rear-ended a slowing Waymo vehicle and was then struck by a hit-and-run driver. Separately, a power outage in San Francisco caused multiple Waymo vehicles to stall and block traffic.
  - An apparently confused Zoox robotaxi came to a stop in the middle of an intersection on the Las Vegas Strip.
  - An unoccupied Zoox robotaxi collided with a passenger vehicle. Zoox issued a voluntary software recall and temporarily paused driverless testing, underscoring software prediction flaws in real-world scenarios.
Despite these incidents and accidents, and more than 10 years after self-driving technology development began, the U.S. still lacks comprehensive legislation specific to autonomous vehicles. At the federal level, regulation is limited to rules set by executive agencies, including the NHTSA. The result is a patchwork of state laws, with at least 29 states having enacted laws governing testing and deployment.
Current regulations are so lax, for instance, that it took California until 2026 to allow its law enforcement officers to issue basic “noncompliance notices” to driverless vehicles committing traffic violations.
Computers Can Drive Better Than Us, Right?
Despite early data suggesting that autonomous vehicles are involved in fewer accidents, that does not mean they are foolproof. Driverless vehicles have been observed making left turns from the center lane of a one-way street and stopping in the middle of intersections when they get confused. In a recent electrical blackout in San Francisco, robotaxis across the city came to an immediate stop, as if the hive mind had never been programmed for how to react in a power outage.
As that behavior shows, driverless vehicles are not perfect. How they react and interact is still a matter of human-developed technology and protocols, and when an accident happens, the question of liability is complex.
Potentially Liable Parties in a Driverless Car Accident
- Manufacturers & Software Developers: In fully autonomous modes, the vehicle maker or tech company is typically held responsible under product liability laws if a defect in hardware, sensors, algorithms, or software caused the crash. This can include strict liability, where proof of negligence isn’t required—just that the product was defective and caused harm.
- Vehicle Owners or Operators: Even with self-driving cars, human involvement can trigger liability. Owners might be at fault if they misuse the autonomous system, or if the vehicle was in a semi-autonomous mode requiring human intervention. In some states, the “operator” could be redefined to include passengers or remote monitors, potentially exposing them to negligence claims.
- Component Suppliers or Third Parties: Suppliers of parts like LIDAR sensors or mapping software could share liability if their contributions are defective. Government entities might also face claims for poor road infrastructure that confuses AV systems.
- Fleet Operators: For ride-sharing services or robotaxis (Waymo, Zoox), the company operating the fleet could be vicariously liable for accidents during commercial use.
Insurance Evolution & Financial Implications
As expected, insurers are busy creating autonomous-specific policies. As autonomous vehicles become more prevalent, personal auto usage — and thus, insurance usage and cost — may decrease as commercial policies for manufacturers and operators increase. Accident victims could recover damages for medical bills, lost wages, pain and suffering, or property damage via claims against deep-pocketed companies.
Proving causation from “black box” vehicle data requires expert analysis, and bringing a case against an autonomous vehicle maker and every potentially liable party can be difficult. If you are involved in a crash with a self-driving vehicle (or one that is capable of driving itself), hire a law firm that understands the complexities involved and will go the distance to make sure you get What’s Right. Call Sam & Ash Injury Law for a free case consultation, 24/7, at 702-820-1234. We fight, you win.

