
When an autonomous vehicle hits a pedestrian, who is responsible?

The tragic incident in Tempe, Arizona, in which a pedestrian was fatally struck by a self-driving car, has highlighted ongoing concerns about autonomous vehicle liability and public safety. As autonomous vehicles become more prevalent on American roads, the crash makes urgent what had, until then, been largely a hypothetical question: when an autonomous vehicle strikes a pedestrian or cyclist, who is responsible? The answer is critically important for those pursuing claims after being injured in an autonomous vehicle crash. However, because autonomous vehicles are still an evolving technology governed by complex regulatory frameworks, and because there is limited case law to guide decisions, answering this question remains challenging for legal experts, safety advocates, and accident victims alike.

The Current State of Driverless Vehicle Technology in 2025

The landscape of self-driving vehicle technology has evolved significantly since the early incidents that first brought liability questions to public attention. According to recent statistics, self-driving cars are involved in 9.1 crashes per million miles driven, while conventional vehicles are involved in 4.1 crashes per million miles driven. This means that, on average, self-driving car accidents occur more than twice as frequently as those involving human drivers, though it’s important to note that most autonomous vehicle crashes are minor incidents in which the driverless vehicle was rear-ended by another vehicle.
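For readers who want to see where the “more than twice as frequently” figure comes from, the minimal arithmetic sketch below uses only the two published rates cited above; the variable names and the rounding are ours, not part of any official dataset.

```python
# Minimal sketch: comparing the two crash rates cited above.
# The 9.1 and 4.1 figures come from the statistics referenced in this article;
# everything else (names, rounding) is illustrative.

AV_CRASHES_PER_MILLION_MILES = 9.1      # self-driving vehicles
HUMAN_CRASHES_PER_MILLION_MILES = 4.1   # conventional, human-driven vehicles

ratio = AV_CRASHES_PER_MILLION_MILES / HUMAN_CRASHES_PER_MILLION_MILES
print(f"Self-driving vehicles crash roughly {ratio:.1f}x as often per mile driven")
# Prints: Self-driving vehicles crash roughly 2.2x as often per mile driven
```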

As of 2025, California alone has logged 791 autonomous vehicle collision reports, demonstrating the growing presence of these vehicles on public roads. The data reveals that fully autonomous vehicles have been involved in fewer fatal accidents than vehicles equipped with Advanced Driver Assistance Systems (ADAS), with only one reported fatality involving a fully autonomous vehicle through March 2025. However, the increasing deployment of driverless cars across multiple states means that accidents involving autonomous systems will likely become more common as the technology expands.

The development of fully autonomous vehicles has progressed beyond experimental phases, with companies like Waymo, Tesla, and others deploying driverless vehicle fleets in major metropolitan areas including San Francisco and other California cities. These automated vehicles now operate without human operators in the driver’s seat, representing a significant shift from earlier models that required backup drivers to monitor the autonomous system and take control when necessary.

Regulatory Framework and Lack of Comprehensive Oversight

Arizona’s Permissive Approach

The accident in question happened in Arizona, which, in a bid to attract self-driving car companies, has largely left autonomous vehicles unregulated. Although a licensed driver was initially required to be in the car and ready to take over to avoid a collision, the fact that autonomous cars largely operate themselves means that the person behind the wheel may not be paying full attention to the road. Video from the Tempe crash, for example, shows the human operator looking down at her lap just seconds before the collision occurred.

Furthermore, insurance regulators in Arizona have largely taken a “wait-and-see” approach to civil liability in crashes between self-driving vehicles and pedestrians. This regulatory vacuum has created uncertainty about autonomous vehicle liability and has made it difficult to establish clear standards for determining fault when an automated vehicle collides with vulnerable road users.

California’s Evolving Regulatory Landscape

While California regulates self-driving cars more comprehensively than Arizona does, the technology continues to advance rapidly in the Golden State. The California Department of Motor Vehicles recently updated its autonomous vehicle regulations, with a 45-day public comment period that ran from April 25, 2025, through June 9, 2025. The updated regulations allow heavy-duty autonomous vehicles weighing 10,001 pounds or more to be tested under a DMV-approved permit, while also refining guidelines for light-duty automated vehicles.

The proposed regulatory framework notably gives broader discretion to the California Department of Motor Vehicles to not only revoke or suspend permits but also to take more “incremental enforcement measures” against manufacturers. These measures could include reducing the number of vehicles in an autonomous fleet, limiting hours of operation, or restricting operational areas when safety concerns arise.

California regulators have indicated that they would allow self-driving cars to pick up passengers without a backup driver present in the vehicle. This announcement shows that even after various incidents involving autonomous vehicles, the technology continues to advance and expand, with driverless cars becoming an increasingly common sight on California highways and city streets.

Understanding Liability in Autonomous Vehicle Accidents

Traditional vs. Autonomous Vehicle Liability Models

Unfortunately, answering the question of who is to blame when a self-driving car hits a pedestrian remains complex. Autonomous vehicle liability may rest with multiple parties, including the vehicle’s owner, any human operator present, the car manufacturer, software developers, or component suppliers. Unlike traditional motor vehicle accidents, where liability typically falls on a human driver who failed to exercise reasonable care, accidents involving autonomous vehicles present novel legal challenges that existing traffic laws were not designed to address.

When a crash involving an automated vehicle occurs, determining liability requires a thorough investigation of multiple factors. These include the performance of the autonomous system, any manufacturing defects in hardware or software, the behavior of the pedestrian or other road users, road conditions, weather factors, and whether the vehicle was operating within its designed parameters. This complexity means that personal liability may be distributed among several parties rather than falling solely on one responsible entity.
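To make the idea of distributed liability more concrete, the sketch below shows how a single damages figure might be apportioned once fault percentages are assigned to several parties, as happens under comparative-fault rules like California’s. The parties, percentages, and dollar amount are purely hypothetical and are not drawn from any actual case.

```python
# Illustrative only: apportioning a hypothetical damages award among several
# parties under a comparative-fault approach. None of these figures come from
# a real case; they simply show how shared liability is calculated.

total_damages = 1_000_000  # hypothetical award, in dollars

fault_shares = {
    "vehicle manufacturer": 0.50,
    "software developer": 0.30,
    "human safety operator": 0.20,
}

for party, share in fault_shares.items():
    print(f"{party}: {share:.0%} of fault -> ${total_damages * share:,.0f}")
```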

Product Liability Considerations for Autonomous Car Accidents

Liability for an accident involving an autonomous vehicle may rest primarily in the realm of product liability law. Courts, lawmakers, and regulators are increasingly considering whether such crashes should be treated as product defects rather than driver negligence. If a self-driving vehicle hits a pedestrian through no fault of the pedestrian, then the problem may lie with the product itself: the autonomous car and its various systems.

This approach would open car manufacturers to liability claims in such accidents, particularly when manufacturing defects in sensors, software algorithms, or other critical components contribute to the collision. The concept of product liability in autonomous vehicle cases is still evolving, but it represents a significant shift from traditional automotive liability models where individual drivers bore primary responsibility for crash outcomes.

The RAND Corporation has conducted extensive research on autonomous vehicle liability issues, noting that the existing automobile insurance system in the United States should be flexible enough to accommodate the introduction of autonomous vehicles. However, some changes to insurance models may be warranted as vehicles incorporate higher levels of automation, though researchers caution that it may be too early to make radical changes to existing frameworks.

Types of Autonomous Vehicle Systems and Liability Implications

Levels of Vehicle Automation

Understanding liability in autonomous vehicle crashes requires recognizing the different levels of automation present in modern vehicles. Level 2 systems, which include features like Tesla’s Autopilot, still require constant human supervision and the person driving remains responsible for maintaining control. However, when accidents occur involving these systems, questions arise about whether the car manufacturer adequately warned users about system limitations and whether the autonomous system functioned as designed.

Level 4 and Level 5 fully autonomous vehicles operate without human intervention under specific or all conditions respectively. When these driverless cars are involved in accidents, traditional notions of driver responsibility become largely irrelevant. Instead, liability analysis must focus on whether the vehicle’s manufacturer, software provider, or other entities involved in the vehicle’s design and deployment can be held liable for the incident.
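Because so much of the liability analysis turns on which automation level a vehicle falls under, a rough summary of the commonly cited SAE J3016 levels is sketched below. It is a simplification for orientation only, not a legal or regulatory standard.

```python
# Rough summary of the SAE J3016 automation levels discussed above.
# This is a simplification for illustration, not a legal or regulatory standard.

SAE_LEVELS = {
    0: "No automation - the human performs all driving tasks",
    1: "Driver assistance - a single assist feature (e.g., adaptive cruise control)",
    2: "Partial automation - system steers and accelerates, human must supervise",
    3: "Conditional automation - system drives, human must take over on request",
    4: "High automation - no human needed within a defined operating domain",
    5: "Full automation - no human needed under any conditions",
}

def human_supervision_required(level: int) -> bool:
    """At Levels 0 through 2, the human driver remains responsible for monitoring the road."""
    return level <= 2
```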

Emergency Response and Human Intervention

One critical aspect of autonomous vehicle liability involves situations where the automated vehicle encounters emergency vehicles or unusual road conditions. Current autonomous systems may struggle to respond appropriately to situations that human drivers would instinctively handle, such as moving over for emergency vehicles or navigating around unexpected obstacles. When a collision happens under these circumstances, determining whether the autonomous system should have detected and responded to the hazard becomes central to the liability analysis.

The question of human intervention also remains significant in many autonomous vehicle designs. Some systems require the human operator to remain alert and ready to take control, while others operate as fully autonomous vehicles without any expectation of human oversight. The level of human involvement expected by the manufacturer can significantly impact how courts and insurers approach liability questions when accidents occur.

Federal Government and State Regulatory Responses

Federal Oversight and Guidelines on Autonomous Vehicle Crashes

The federal government has taken a measured approach to autonomous vehicle regulation, providing guidance while allowing states to develop their own specific requirements. Federal agencies like the National Highway Traffic Safety Administration (NHTSA) collect data on autonomous vehicle crashes and work to establish safety standards, but comprehensive federal liability frameworks remain in development.

This federal approach recognizes that autonomous vehicle technology continues to evolve rapidly, and overly prescriptive regulations could stifle beneficial innovations. However, the lack of comprehensive federal standards means that liability questions must often be resolved using existing legal frameworks that were designed for human-driven vehicles.

State-by-State Variations

Different states have adopted varying approaches to autonomous vehicle liability, creating a complex patchwork of regulations and legal precedents. Some states have enacted specific legislation addressing autonomous vehicle liability, while others rely on existing motor vehicle laws and common law principles to address these novel situations.

This variation means that the outcome of autonomous vehicle liability cases may depend significantly on where the accident occurred and which state’s laws apply. Legal practitioners handling autonomous vehicle cases must navigate these varying state approaches while also considering how federal regulations and interstate commerce issues may affect their cases.

Impact on Different Road Users

Pedestrian and Cyclist Vulnerabilities

Pedestrians and cyclists face particular risks from autonomous vehicle technology, as these vulnerable road users often behave in ways that are difficult for automated systems to predict. While human drivers can make eye contact with pedestrians and use intuitive judgment about pedestrian intentions, autonomous vehicles rely on sensors and algorithms that may not capture these subtle human interactions.

When autonomous vehicles strike pedestrians or cyclists, questions arise about whether the vehicle’s sensors were adequate to detect vulnerable road users, whether the software appropriately prioritized pedestrian safety, and whether the vehicle was traveling at appropriate speeds for the conditions. These technical questions often require expert testimony and detailed analysis of vehicle data to resolve.
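As an illustration of the kind of timeline analysis an expert might perform on a vehicle’s event logs after a pedestrian collision, consider the hypothetical sketch below. The field names, timestamps, and log structure are invented for this example; real logs vary by manufacturer and are typically obtained through discovery.

```python
# Hypothetical sketch of a timeline check on post-collision vehicle logs.
# The events and timestamps are invented; real log formats differ by manufacturer.

events = [
    {"t": 0.0, "event": "pedestrian first detected by sensors"},
    {"t": 1.8, "event": "object classified as pedestrian"},
    {"t": 2.4, "event": "braking commanded"},
    {"t": 2.9, "event": "impact"},
]

first_detection = next(e["t"] for e in events if "detected" in e["event"])
braking_command = next(e["t"] for e in events if "braking" in e["event"])
print(f"Delay from first detection to braking: {braking_command - first_detection:.1f} seconds")
```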

Self-Driving Car Interactions with Other Vehicles

Autonomous vehicles must also share the road with human-driven vehicles, creating complex scenarios where mixed traffic may contribute to accidents. When a crash involving an autonomous vehicle and human drivers occurs, determining liability may require analyzing both the performance of the automated system and the behavior of human drivers who may not have anticipated how the autonomous vehicle would respond to various traffic situations.

The different response times, decision-making processes, and operational parameters of autonomous vehicles compared to human drivers can create unexpected interactions that contribute to accidents. These scenarios highlight the challenges of integrating autonomous technology into existing transportation systems designed around human drivers.

Insurance and Financial Responsibility Models

Evolving Insurance Frameworks

Traditional auto insurance models are being challenged by the rise of autonomous vehicles, as the shift from driver liability to potential manufacturer liability requires new insurance approaches. Some manufacturers are beginning to provide insurance coverage for their autonomous vehicles, while traditional insurers are developing new products designed to address the unique risks posed by driverless vehicle technology.

The question of who maintains insurance coverage – vehicle owners, manufacturers, ride-sharing companies, or other entities – remains unsettled and may vary depending on the specific type of autonomous vehicle operation involved. This uncertainty can complicate the claims process for accident victims who may be unsure which parties to pursue for compensation.

Compensation Models for Victims

Accident victims injured by autonomous vehicles may face unique challenges in obtaining compensation, as traditional first-party insurance benefits may not adequately address the complex liability questions these cases present. Some states are considering no-fault insurance models specifically designed for autonomous vehicles, while others are exploring manufacturer-backed compensation funds.

These evolving compensation models reflect the recognition that autonomous vehicle accidents may require different approaches to ensure that injured parties can obtain appropriate compensation regardless of the complex liability issues that autonomous technology presents.

Current Legal Precedents and Case Development

Limited Case Law

The relative newness of autonomous vehicle technology means that there is still limited case law addressing liability questions in autonomous vehicle accidents. Early cases have focused primarily on accidents involving semi-autonomous systems where human drivers retained some responsibility, rather than fully autonomous vehicles operating without human oversight.

As more cases involving fully autonomous vehicles work their way through the courts, legal precedents will begin to establish clearer guidelines for determining liability. However, the rapid pace of technological development means that legal frameworks must continually adapt to address new types of autonomous vehicle systems and operational scenarios.

Emerging Legal Theories

Legal practitioners are developing new theories for addressing autonomous vehicle liability, including concepts of algorithmic negligence, where the programming or training of artificial intelligence systems may be deemed inadequate for safe vehicle operation. Other emerging theories focus on duties of manufacturers to continuously update and improve autonomous systems through over-the-air updates.

These developing legal theories reflect the unique challenges posed by autonomous vehicles, where traditional concepts of negligence, product liability, and duty of care must be adapted to address the complexities of artificial intelligence and machine learning systems operating in dynamic traffic environments.

Future Outlook and Recommendations

Anticipated Regulatory Development

As autonomous vehicle technology continues to mature and deployment expands, more comprehensive regulatory frameworks are likely to emerge at both federal and state levels. These frameworks will need to balance encouraging beneficial technology development with ensuring adequate protection for all road users, including pedestrians, cyclists, and occupants of other vehicles.

The development of clearer liability standards will likely require collaboration between legal experts, technology developers, insurance professionals, and public safety advocates to ensure that autonomous vehicle liability frameworks serve the public interest while enabling continued innovation in transportation technology.

Importance of Legal Representation

Given the complex and evolving nature of autonomous vehicle liability, accident victims need experienced legal representation that understands both traditional personal injury law and the unique challenges posed by autonomous vehicle technology. The technical complexity of these cases often requires extensive investigation, expert testimony, and familiarity with rapidly changing regulatory frameworks.

At Callahan & Blaine, our attorneys stay current with developments in autonomous vehicle liability law and have the resources necessary to handle complex cases involving emerging technologies. We understand that autonomous vehicle accidents present unique challenges for victims seeking compensation and are committed to ensuring that our clients receive appropriate representation regardless of the technological complexities their cases may involve.

Why Choose Callahan & Blaine for Autonomous Vehicle Accident Cases

With over 40 years of experience, Callahan & Blaine is one of the premier litigation firms in California. Our team of skilled attorneys has successfully handled numerous accident cases, navigating the complexities of state and federal regulations to provide expert legal representation.

If you’ve been injured in an accident involving an autonomous vehicle, don’t navigate this complex legal landscape alone. Call (714) 241-4444 or contact us via our contact form today to discuss your case with a senior attorney. The experienced attorneys at Callahan & Blaine understand the unique challenges posed by autonomous vehicle liability cases and have the technical knowledge necessary to pursue maximum compensation for your injuries.

FAQ about Autonomous Vehicle Liability

Who is liable when a fully autonomous vehicle causes an accident?

Liability for accidents involving fully autonomous vehicles typically depends on the specific circumstances and may involve the vehicle manufacturer, software developer, component suppliers, vehicle owner, or fleet operator. Unlike traditional car accidents, where human drivers are usually at fault, autonomous vehicle liability often falls under product liability law. The responsible party may be determined based on whether the accident resulted from manufacturing defects, software malfunctions, inadequate sensors, or failure to exercise reasonable care in designing the autonomous system. A thorough investigation is usually required to determine which party or parties should be held liable for damages and injuries.

How do autonomous vehicle accidents compare to human driver accidents?

Current statistics show that self-driving cars have 9.1 crashes per million miles driven compared to 4.1 crashes per million miles for human-driven vehicles, making autonomous vehicles more than twice as likely to be involved in accidents. However, it’s important to note that most autonomous vehicle crashes involve minor incidents where the driverless vehicle was rear-ended by other vehicles. As of March 2025, there was only one reported fatality involving a fully autonomous vehicle, whereas vehicles with Advanced Driver Assistance Systems have been involved in more fatal accidents. The data suggests that while autonomous vehicles may have more frequent minor accidents, they may be involved in fewer severe crashes.

What should I do if I’m injured by an autonomous vehicle?

If you’re injured in an accident involving an autonomous vehicle, seek immediate medical attention and contact law enforcement to file a police report. Document the scene, including the vehicle’s make, model, and any identifying information about its autonomous system. Contact an experienced personal injury attorney who understands autonomous vehicle liability, as these cases involve complex technical and legal issues different from traditional car accidents. Your attorney will need to conduct a thorough investigation that may include analyzing vehicle data logs, examining the autonomous system’s performance, and determining whether manufacturing defects or software malfunctions contributed to the crash.

How does insurance work for autonomous vehicle accidents?

Insurance for autonomous vehicle accidents is still evolving, with coverage potentially involving the vehicle owner’s insurance, manufacturer liability coverage, fleet operator insurance, or specialized autonomous vehicle policies. Some car manufacturers are beginning to provide liability insurance for their fully autonomous vehicles, while traditional insurers are developing new products to address driverless car risks. The specific insurance arrangements may depend on whether the vehicle was privately owned, part of a ride-sharing service, or operating as a commercial driverless vehicle. Determining which insurance policies apply and how claims should be processed often requires legal expertise due to the complex liability questions these accidents present.

Can autonomous vehicle manufacturers be held liable for accidents?

Yes, car manufacturers can be held liable for autonomous vehicle accidents under product liability law, particularly when manufacturing defects, software errors, inadequate sensors, or design flaws contribute to accidents. This represents a significant shift from traditional automotive liability, where individual drivers bore primary responsibility. Manufacturers may be liable if their autonomous system failed to perform as intended, if they inadequately warned users about system limitations, or if they failed to implement reasonable safety measures. The RAND Corporation and other research organizations have noted that this shift toward manufacturer liability will likely require changes to existing insurance models and liability frameworks as autonomous vehicle technology becomes more widespread.
