Traditionally, an individual who is found to be at “fault” for an accident is liable for the damages and/or injuries that result from that accident.

As we discussed last week, self-driving vehicles complicate liability. When a car is marketed as self-driving, autonomous, or capable of an auto-pilot function, it’s unclear whether the human driver or the car itself is liable when an accident occurs.

Tesla is expected to use this grey area as its defense in a recent lawsuit against the company. At the same time, the National Highway Traffic Safety Administration (NHTSA) is taking steps to collect data that may undermine Tesla's ability to use that defense in the future. While the Tesla case and the NHTSA's new rules are specific to the US, they could be an indication of what's to come in Canada.

A Summary of Liability & Self-Driving Cars

If you’re found liable in a motor vehicle accident, it means you’re responsible for the damage and/or injuries that result from that accident. Historically, the driver who is at fault is the driver who is liable.

You may be found at fault for an accident if you caused it through negligence, recklessness, or intentional conduct. Strict liability, which applies to defective products and extra-hazardous activities, is another basis for liability.

Self-driving cars complicate this understanding of fault and liability, especially given the level of autonomy these cars have actually reached. A Level 4 or 5 self-driving car wouldn't require any interaction between the car and the human driver, but such vehicles don't exist yet. Right now, we only have Level 2 self-driving cars, which require the presence and active involvement of the human driver.

Adding to the complexity of this issue is the marketing behind these cars and even the terms used to describe them (e.g., Autopilot). It's become clear that many drivers are under the impression that these cars are capable of driving themselves, and automakers have not made sufficient effort to correct that belief.

Thus, when an accident occurs, it's unclear who is liable. On one side of the coin are those who argue that the human driver is still required to operate the vehicle and is therefore wholly responsible for any accidents that occur. On the other side are those who argue that the self-driving car and its manufacturer bear some (if not all) of the responsibility for those accidents.

Liability: An Example from Tesla

A recent lawsuit against Tesla provides an excellent example of this battle over liability.

On April 29, 2018, a Tesla Model X operating on Autopilot struck and killed a pedestrian. A lawsuit was subsequently filed against Tesla in the U.S. District Court for the Northern District of California. The lawsuit includes six causes of action, or counts, including strict product liability (design defects and failure to warn), negligence, and wrongful death.

The lawsuit, filed on behalf of the widow and daughter of the deceased pedestrian, states that Autopilot is fatally flawed in its design and that Tesla is beta-testing self-driving cars on a public that hasn't been properly informed of Autopilot's capabilities, or lack thereof. It alleges that Autopilot puts the public at risk and was responsible for the victim's death, and that Tesla is therefore liable for the resulting damages.

Tesla is expected to respond with the same defense it has used in the past: that the human driver is required to operate the vehicle even when Autopilot is engaged. With this defense, Tesla has so far avoided responsibility, accountability, and liability for injuries and damages resulting from accidents in which Autopilot was engaged.

New NHTSA Rules

In June 2021, the NHTSA introduced a new rule that might help clarify liability issues around self-driving cars in the future. According to this new rule, which only applies to the US, automakers are required to report any crash involving a self-driving system and “a hospital-treated injury, a fatality, a vehicle tow-away, an air bag deployment, or a vulnerable road user such as a pedestrian or bicyclist.”

These reports must be made within one day of the automaker learning about the accident, and updates must be provided 10 days later. On a monthly basis, automakers will also have to report any accidents that involved an injury or property damage. These rules also apply to self-driving pilot programs, such as those being run by Waymo, Zoox, and Cruise.

Failure to comply carries heavy fines that start at $22,992 per day. The maximum penalty for failure to report is more than $100 million.

A Step Toward Understanding Liability?

The majority of data we have on self-driving vehicles and their safety has come from automakers themselves. Of course, this presents a problem because it’s in the interest of automakers to make their cars appear safe.

Thanks to these new NHTSA rules, there will be far more unbiased data on the relative safety of these vehicles. The NHTSA has stated that they’ll be using the data collected to “…identify potential safety issues and impacts resulting from the operation of advanced technologies on public roads and increase transparency.”

The data will provide a clearer understanding of just how safe self-driving systems are and will also reveal whether or not there are particularly unsafe systems. In the future, this data could be used to negate a defense such as the one being used by Tesla and to inform policymakers in the formation of laws around self-driving vehicles in both the US and Canada.