Tesla Autopilot Crash Investigation Opens Door for Product Liability Claims
Summary: The National Transportation Safety Board (NTSB) released a report on February 25, 2020, finding that the driver error in the fatal 2018 crash of a Tesla Model X P100D was foreseeable. This potentially transforms the defense of user error into a ground for liability as an anticipated misuse of the product.
The National Transportation Safety Board (NTSB) released a report on February 25, 2020, that opens the door to liability for Tesla in crashes involving its Autopilot system—even when human error occurs. The report presents the findings of the NTSB's two-year investigation into a crash in which a Tesla Model X P100D struck a concrete barrier on March 23, 2018, in Mountain View, California.
According to the report, at around 9:27 a.m. the Tesla driver was playing a video game on his phone while the vehicle's Autopilot was engaged. The vehicle struck the concrete barrier at 71 mph at the point dividing the highway lanes from an exit lane. The vehicle's guidance system failed to detect the barrier in time, and the driver was distracted by the game on his phone. The vehicle erupted into flames upon impact, and the driver was killed.
Was the Tesla Autopilot to Blame?
The Tesla manual instructs users to keep both hands on the wheel at all times, even when Autopilot is engaged. However, the driver was apparently lulled into a false sense of security and was paying attention to neither the road nor the wheel. The NTSB's investigation concluded that the driver was distracted, that the Tesla guidance system was insufficient to detect the concrete barrier, and that the vehicle lacked an effective means of monitoring the driver's level of engagement to ensure that the driver was not completely disengaging from the driving task.
This wreck appears to exemplify a concern raised by critics and car accident lawyers in the past: the autonomous feel of the system, coupled with its branding as "Autopilot," may lead users to believe the system requires less participation from the operator than it really does. The system can monitor steering wheel torque, but the NTSB says this is not an accurate measure of the driver's actual level of engagement. Resting one's hands on the wheel does not mean one is watching the road. The NTSB went further, saying that it anticipates more crashes if "Tesla does not incorporate system safeguards that limit the use of the Autopilot system to those conditions for which it was designed…"
Product Liability Legal Implications
While driver error is clearly part of the reason for this crash, product manufacturers must take notice when a government body reports findings with these implications. A product manufacturer generally has an obligation to provide protections against foreseeable misuse of a product. The NTSB report specifically calls to Tesla's attention that crashes like this will continue to happen if changes are not made to the Autopilot system. Thus, Tesla will have a hard time claiming that it could not foresee this happening again. What used to be the "driver error" defense has now become foreseeable conduct that Tesla must either account for or face potential product liability.