Crashes involving driverless cars and other high-tech self-driving systems are rising across Southern California.
Picture it: you’re driving home on the 210, minding your own business.
A Tesla next to you has that glow-from-the-dashboard look; you can tell the driver’s only half paying attention, trusting Autopilot. Up ahead in downtown L.A., you see a sleek car with no one in the front seat, a robotaxi slowly merging into traffic. It feels like the future… until something goes wrong.
A sudden lane change.
A misread traffic light.
A system that doesn’t brake in time.
Next thing you know, there’s a crash. And now you’re stuck in a very modern nightmare:
- Was it the “self-driving” system’s fault, or the human driver’s?
- Does the car company pay, or the person behind the wheel?
- What if there was no driver at all?
If you were hurt in a crash involving a driverless car, a Tesla on Autopilot or Full Self-Driving (FSD), or any high-tech vehicle in Pasadena, Alhambra, or anywhere in Southern California, you don’t have to untangle this by yourself.
You can call Doyle Accident & Injury Attorneys at (626) 737-0036 for a free consultation about your rights and options.

The New Reality: Self-Driving and High-Tech Cars Are Already Here
This isn’t science fiction anymore.
Autonomous Vehicles on California Roads
California has become the epicenter of autonomous vehicle (AV) testing, leading the nation in both testing and collision reporting. The California DMV requires companies testing driverless cars to report any collision that causes property damage, injury, or death, and as of October 31, 2025, the DMV has logged 888 autonomous vehicle collision reports, more than any other state. (California DMV)
That number has climbed quickly from about 781 reported collisions in early 2025, showing just how fast this technology is spreading, and how often real-world crashes still happen. (Pierce Skrabanek)
Companies like Waymo have reported hundreds of crashes involving their vehicles nationwide between 2021 and 2025, with dozens of injuries and at least one fatality, even though Waymo argues that many of those collisions were caused by human drivers around them. (DAM Firm)
Advanced Driver Assistance Systems (ADAS) Like Tesla Autopilot and FSD
On top of fully driverless cars, millions of regular vehicles now come with Level 2 “hands-on” systems—like Tesla’s Autopilot and Full Self-Driving (FSD), GM Super Cruise, and others. These partially automated systems are often confused with driverless cars, even though they still require human attention.
These systems can:
- Keep the car in its lane
- Adjust speed based on traffic
- Even change lanes automatically in some cases
But they are not fully self-driving. The driver is still legally required to pay attention and be ready to take over at any moment.
The National Highway Traffic Safety Administration (NHTSA) has a special Standing General Order that requires automakers to report crashes in which one of these Level 2 systems, or a fully automated driving system, was active within 30 seconds of the crash. (NHTSA)
Teslas with Autopilot and FSD have faced especially intense scrutiny. A federal investigation found Autopilot was involved in multiple fatal and serious crashes, raising concerns that the system encourages overconfidence and misuse. (The Guardian)
Bottom line:
High-tech cars are already sharing the road with you in Southern California every day. When they crash, the legal questions get complicated fast.
Common Crash Scenarios Involving Driverless and High-Tech Vehicles
Let’s talk about how these crashes actually happen in real life.
- The “Autopilot” Rear-End on the Freeway
You’re heading east on the 210 near Pasadena. Traffic slows suddenly. You brake in time—but the Tesla behind you doesn’t.
Later, you find out the Tesla driver:
- Had Autopilot engaged
- Was glancing at their phone
- Assumed the car would react on its own
You get whiplash and a back injury. The Tesla’s front end is crumpled; your trunk is smashed.
Here, the human driver can still be held responsible. Autopilot and FSD are driver-assist systems, not replacements for an attentive driver. If they weren’t paying attention and failed to brake, they likely violated their duty to drive safely.
But if the system malfunctioned—failed to detect your car, ignored stopped traffic, or otherwise misbehaved—there may be arguments that the manufacturer shares some fault.
- The Robotaxi That Didn’t Quite Understand the Intersection
Imagine this in downtown L.A.:
You’re crossing legally in a crosswalk. A driverless vehicle—a robotaxi with no one in the front seat—is making an unprotected left turn across the intersection.
Even without a human behind the wheel, these driverless cars can still misjudge pedestrian movement or intersection timing.
In this type of crash, liability may involve:
- The AV company that designed and operates the driverless system
- The vehicle manufacturer, if hardware defects contributed
- Possibly others, depending on who controls maintenance and software updates
California regulators have already suspended and restricted robotaxi operations after high-profile pedestrian incidents in San Francisco, including a Cruise vehicle that struck and dragged a pedestrian in 2023—leading to DMV license revocation, fines, and a major pullback in Cruise’s operations. (Reuters)
- The Chain-Reaction Crash with a Driverless Car in the Middle
Sometimes, a driverless car is involved in a crash but not necessarily at fault.
For example:
- A speeding human driver plows into a row of cars at a stoplight in L.A.
- One of those cars happens to be a Waymo robotaxi
- Multiple vehicles—including yours—are damaged and several people are hurt
Recent crashes in San Francisco show that Waymo vehicles have been involved in multi-car collisions where a human driver going at an “extreme rate of speed” caused the initial crash and the robotaxi was one of several vehicles hit. (San Francisco Chronicle)
In these cases, pursuing compensation may still focus primarily on the reckless human driver who caused the pileup, but the presence of a driverless car adds extra layers of investigation and data.

Who Could Be Responsible When a High-Tech Vehicle Crashes?
Determining fault in a driverless car crash means analyzing both the automated system and the humans involved. Liability in these cases often comes down to a mix of:
- The Human Driver
Even with advanced tech, the person behind the wheel usually has a legal duty to:
- Pay attention
- Take over when needed
- Use the system as intended
If they:
- Watch movies
- Play on their phone
- Sleep
- Or treat Autopilot/FSD like full self-driving
…they can be held responsible for negligence when the car crashes.
- The Vehicle or Software Manufacturer
Sometimes, a crash is worsened—or directly caused—by:
- Faulty sensors or cameras
- Bad software updates
- Poorly designed driver monitoring systems
- Misleading marketing about what the system can really do
Federal crash data and investigations into Tesla, Waymo, Cruise, and others show that autonomy and advanced driver assistance systems are still being refined, and some crashes raise serious questions about system design, testing, and safety measures. (NHTSA)
If a system makes unreasonable decisions, fails in predictable situations, or doesn’t properly warn drivers, product liability claims may come into play.
- The Fleet Operator or AV Company
For true “driverless” vehicles (no human driver):
- The company operating the fleet (Waymo, Cruise, Zoox, etc.) is often the primary target
- They control the software, mapping, maintenance, updates, and operating areas
If they rush expansion, ignore safety warnings, or hide crash details—as regulators have accused some companies of doing—they may be exposed to civil liability and regulatory penalties.
- Government or Public Entities
In some cases, a dangerous road design or poorly marked intersection contributes to a crash:
- Confusing lane markings
- Inadequate signage
- Poor lighting at complex intersections
Claims against cities, counties, or the state are more complex and have shorter deadlines, but they can be part of the picture if a hazardous road played a major role.
Evidence in AV and High-Tech Vehicle Crashes: Data Is Everything
One big difference between a “normal” crash and a high-tech vehicle crash is the amount of data available—and how quickly it can disappear or get locked down.
Key evidence can include:
- Event Data Recorders and “Black Boxes”
Many modern vehicles, especially AVs and ADAS-equipped cars, record:
- Speed
- Braking
- Steering inputs
- Whether driver-assist systems were active
- When they disengaged
Event data recorders in driverless cars often show what the automated system was doing seconds before the collision.
- Onboard Cameras and Radar/LiDAR Data
Driverless cars and many assisted-driving systems rely on:
- Front, rear, and side cameras
- Radar
- LiDAR (laser-based distance measurement)
These feeds may record:
- What the car “saw”
- When it detected obstacles or pedestrians
- How it interpreted traffic lights and signs
- Cloud Logs and Remote Monitoring
Some companies process data in the cloud or monitor vehicles remotely. There may be:
- Logs of software warnings or error codes
- Remote interventions
- Mapping and localization data
- NHTSA and DMV Reporting
Because of NHTSA’s crash-reporting order, companies must report certain crashes involving automated systems, and the California DMV posts collision reports from AV testers. (NHTSA)
All of this can be powerful evidence—but you usually can’t get it by just asking nicely. Getting the right data often requires:
- Prompt evidence-preservation letters
- Targeted requests
- Sometimes, formal litigation
That’s one reason getting a lawyer involved early can matter so much in these cases.

What To Do If You’re Hit by a Driverless or “Self-Driving” Vehicle
If you’re involved in a crash with a robotaxi, Tesla on Autopilot/FSD, or any of these driverless cars, try to:
- Prioritize Your Safety and Health
- Call 911 if anyone is hurt
- Move to a safe location if possible
- Accept medical help at the scene
Even if you feel “okay,” see a doctor as soon as you can. Some injuries show up later.
- Treat It Like Any Other Crash—Plus a Bit More
At the scene:
- Take photos and video of all vehicles, license plates, damage, and the surroundings
- Photograph any visible sensors, cameras, or branding on the AV (Waymo, Cruise, etc.)
- Photograph the interior if possible (to show there was no human driver in a robotaxi)
- Get names and contact info of witnesses
If the vehicle is a robotaxi, note:
- The company name
- Any fleet number or vehicle ID displayed on the car
- Make Sure a Police Report Is Filed
Ask that police respond and file a report. Make sure you:
- Clearly explain what you experienced
- Mention if you saw “self-driving,” “Autopilot,” or a driverless operation in use
These details can matter later.
- Avoid Speculating About Fault or “Tech Stuff”
At the scene and in any early conversations:
- Stick to what you actually saw and felt
- Don’t guess about why the system failed or who is ultimately responsible
That’s what investigations, engineers, and lawyers are for.
- Talk to a Lawyer Before Dealing with the Companies
You may be contacted by:
- The AV company
- The vehicle manufacturer
- Multiple insurance carriers
They may be polite—but their job is to limit what they pay and protect the company, not you.
Before you give a detailed or recorded statement or sign anything, it’s smart to talk to a lawyer who understands these cases.
If your injuries were caused by reckless or distracted driving, you may also want to review our California pedestrian accident guide for additional safety tips.
How Doyle Accident & Injury Attorneys Handles Complex Tech-Injury Cases
Crashes involving driverless cars and advanced driver-assistance systems aren’t like typical fender-benders. There’s more complexity, more data—and often more finger-pointing.
Here’s how Doyle Accident & Injury Attorneys approaches them.
- Thorough Investigation and Policy Review
The firm can:
- Gather the police report, witness statements, and scene photos
- Analyze how and where the crash happened
- Review all relevant insurance policies—yours and theirs
The goal is to identify every potential source of coverage, including:
- The AV company’s policy
- The driver’s auto insurance
- Your own uninsured/underinsured motorist coverage
- Any separate commercial policy if the vehicle was used for business or ride-hailing
- Pursuing Critical Data and Expert Analysis
Doyle Accident & Injury Attorneys can:
- Send preservation letters to AV companies and manufacturers
- Seek access to event data recorders, camera footage, and system logs
- Work with experts in crash reconstruction, human factors, and vehicle automation
This helps answer key questions:
- Was the human driver paying attention?
- Was the system used as intended?
- Did the tech perform reasonably—or fail in a way that made the crash worse?
- Dealing with Big Companies and Their Lawyers
Large tech and auto companies have:
- Teams of attorneys
- PR staff
- Engineers ready to defend the system
You don’t need to go up against that alone.
Your lawyer can handle communications, negotiations, and, if necessary, litigation, while you focus on healing.
- Valuing the Full Impact of the Crash
A serious high-tech vehicle crash can leave you with:
- Medical bills
- Time off work
- Chronic pain
- Fear of driving
- Disruption to your family life
Doyle Accident & Injury Attorneys can help you pursue compensation not just for immediate expenses, but for long-term harm—physical, emotional, and financial.
You can also learn about California auto insurance changes that may affect your accident claim.
Talk to a Lawyer Before You Sign Anything
If you were injured in a crash involving:
- A driverless car (robotaxi)
- A Tesla on Autopilot or Full Self-Driving
- Another vehicle with advanced driver-assistance systems
…there is real money and responsibility at stake—and the companies involved know it.
Before you:
- Accept a quick settlement
- Sign a release
- Assume “it was just an accident” and walk away
…get advice from someone whose only job is to protect you.
Injured in a crash with a driverless or high-tech vehicle in Pasadena, Alhambra, or anywhere in Southern California? Call Doyle Accident & Injury Attorneys at (626) 737-0036 today for a free consultation about your case.
References
- California DMV – Autonomous Vehicle Collision Reports https://www.dmv.ca.gov/portal/vehicle-industry-services/autonomous-vehicles/autonomous-vehicle-collision-reports/
- Pierce Skrabanek – Self-Driving Car Accident Statistics (summary of DMV data) https://www.pstriallaw.com/legal-news/self-driving-car-accident-statistics
- NHTSA – Standing General Order on Crash Reporting for ADS and Level 2 ADAS https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting
- NHTSA – Tesla Autopilot investigation closing report (EA22002) https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf
- The Guardian – Tesla Autopilot fatal crash https://www.theguardian.com/technology/2024/apr/26/tesla-autopilot-fatal-crash
- Reuters – U.S. opens probe into Tesla vehicles over traffic violations https://www.reuters.com/business/autos-transportation/us-opens-probe-into-28-million-tesla-vehicles-over-traffic-violations-when-using-2025-10-09/
- DAM Firm – Waymo Accident Statistics https://www.damfirm.com/waymo-accident-statistics.html
- Waymo – Safety Impact https://waymo.com/safety/impact/
- Reuters – How GM’s Cruise robotaxi tech failures led it to drag a pedestrian https://www.reuters.com/business/autos-transportation/how-gms-cruise-robotaxi-tech-failures-led-it-drag-pedestrian-20-feet-2024-01-26/
- KTVU – Cruise fined $500,000 for filing a false report about a driverless car dragging a pedestrian https://www.ktvu.com/news/cruise-fined-500000-filing-false-report-about-driverless-car-dragging-pedestrian
- The Verge – Cruise criminal fine for falsified pedestrian-crash report https://www.theverge.com/2024/11/15/24297248/cruise-robotaxi-criminal-fine-falsify-report-pedestrian
- AP News – Cruise robotaxi pedestrian incident https://apnews.com/article/48591349c24c6f969c07c65f1ed4e9d8
- UC Berkeley TIMS – Autonomous Vehicles (AV) Safety Dashboard https://tims.berkeley.edu/tools/avsafety.php