Who is Liable in a Self-Driving Car Accident?

Cars have been handling some of the driving for us since at least the year 2000, when select models began offering driver-assistance features like blind-spot detection and forward-collision warning. Over time, automakers added features like automatic emergency braking and lane-centering assist. Now some cars can handle most of the driving themselves, with only minimal input from a human driver.

Since most car crashes are attributed to human error, logic suggests that taking humans out of the equation will make driving safer. Cars don’t get fatigued, fly into road rage, drive drunk, or get distracted by text messages.

The problem is that self-driving technology is still in its early stages. Machines don’t actually make decisions; they follow their programming. And accidents have occurred when drivers relied on self-driving features to keep them safe. So what happens if you trust your vehicle’s technology with some of the driving and a crash occurs? Or what if another driver turns control over to their Tesla and you get hit? Who is liable in a self-driving car accident?

Self-Driving Car Accident Examples

There have already been crashes resulting in fatalities. In 2016, 45-year-old Florida driver Joshua Brown was using the Autopilot feature on his Tesla Model S when a tractor-trailer turned left in front of him. The car went under the trailer, killing Brown. The NTSB released a 500-page report on the investigation stating that the Tesla audibly warned Brown six times to take control of the wheel, and that he received seven visual warnings on the dash. A federal investigation eventually closed without finding a defect in the Tesla or any need for a recall.

Then, in 2019, 50-year-old Jeremy Banner died in a similar accident. Just seconds after he turned on Autopilot, his Tesla Model 3 struck a tractor-trailer crossing its path and went underneath it, shearing off the Tesla’s roof and killing Banner. His family has filed a wrongful death lawsuit.

In another unsettling case, a self-driving Uber SUV struck and killed a pedestrian in Arizona. The investigation found evidence that the pedestrian may have been partly at fault: she was jaywalking and stepped into the vehicle’s path from the shadows. Other issues were at play, though. Uber’s self-driving software wasn’t programmed to recognize pedestrians outside of crosswalks, and Uber had disabled the Volvo XC90’s factory automatic emergency braking so it wouldn’t interfere with the self-driving system being tested.

Human Error Still Occurs

One problem researchers have discovered occurs during the handoff between vehicle and driver. Stanford researchers found that drivers often over-correct or under-correct when they switch off autopilot features, especially if road conditions have changed since they turned driving over to the vehicle. The subtle feedback drivers are used to feeling while driving isn’t present when they let go of the wheel, making it more likely they’ll wobble or miss a turn.

So Who Is at Fault in Self-Driving Car Accidents?

This is a relatively new issue that will continue to evolve along with the technology and its interaction with humans. As with most car accidents, there is usually more than one factor involved. In each case, investigators are going to start by asking the following questions:

  • Was the vehicle doing what it was programmed to do?
  • Did that programming include extensive safety protocols?
  • Were there malfunctioning components?
  • What was the driver doing or not doing before the accident?
  • If another vehicle was involved, how did that driver contribute?

If you or someone you love was injured in a self-driving car accident, whether you were the driver or the person struck, we can help you understand your rights. Schedule your free consultation today.