Operational Limits Played ‘Major Role’ in Fatal Tesla Autopilot Crash, Says NTSB

According to a preliminary report from the National Transportation Safety Board, the “operational limitations” of Tesla’s Autopilot system played a “major role” in a highly publicized crash in May of 2016 that resulted in the death of a Model S driver.

On Tuesday, the NTSB described the incident as a perfect storm of driver error and Tesla’s Autopilot design, which encouraged over-reliance on the system’s semi-autonomous features. After a meeting lasting nearly three hours, the agency’s board determined the probable cause of the accident was a combination of a semi-truck driver failing to yield the right-of-way, the Tesla driver’s unwillingness to retake the wheel, and Tesla’s own system, which may have set the stage for the accident.

While the investigation is ongoing, aided by the Florida Highway Patrol and Tesla Motors, the book is closed on this one as far as we’re concerned.

The board previously ruled that the crash was not the result of mechanical error, but it has since revised its position to make clear that the very nature of Tesla’s Autopilot was at least partially responsible. It recommended that automakers not allow drivers to use automated control systems in ways they were not intended to be used. Joshua Brown, the driver in the fatal Tesla crash, was using Autopilot in a manner the company advised against and had the feature engaged for roughly 37 of the 41 minutes leading up to the accident.

He was not holding onto the wheel during that time.

Investigators said using torque sensors on the steering wheel (to indicate when a driver is holding it) was an ineffective method for gauging operator involvement. Driving is a highly visual activity, and NTSB investigator Ensar Becic said holding the wheel does not necessarily indicate a driver is paying attention. These concerns seem to have reached other automakers already, as Cadillac’s Super Cruise system uses a small camera to monitor the operator’s eyes.
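For the curious, here’s a rough sketch in Python of why that distinction matters. Everything in it — the names, the thresholds, the structure — is our own illustrative assumption, not anything pulled from Tesla’s or Cadillac’s actual software:

```python
# Hypothetical sketch: why steering-wheel torque is a weak proxy for
# attention compared with a camera watching the driver's gaze.
from dataclasses import dataclass

@dataclass
class DriverState:
    wheel_torque_nm: float     # torque the driver applies to the wheel
    gaze_off_road_secs: float  # how long the driver's eyes have left the road

TORQUE_THRESHOLD_NM = 0.5  # assumed value; any light touch satisfies it
GAZE_TIMEOUT_SECS = 2.0    # assumed value

def torque_says_attentive(s: DriverState) -> bool:
    # A resting hand -- or even a weight hung on the rim -- passes this check.
    return s.wheel_torque_nm >= TORQUE_THRESHOLD_NM

def gaze_says_attentive(s: DriverState) -> bool:
    # A camera-based check ties "attentive" to eyes actually on the road.
    return s.gaze_off_road_secs <= GAZE_TIMEOUT_SECS

# A driver gripping the wheel while looking elsewhere for half a minute:
distracted = DriverState(wheel_torque_nm=1.2, gaze_off_road_secs=30.0)
print(torque_says_attentive(distracted))  # True: a false positive
print(gaze_says_attentive(distracted))    # False: the camera catches it
```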

Similarly, the version of Autopilot in use in May of 2016 would issue warnings to retake the wheel but would not immediately halt the car if the driver failed to do so. The board also noted that the system could be engaged at speeds of up to 90 miles per hour. According to the preliminary report on the Florida crash, the operator had the system engaged at roughly 10 mph above the posted speed limit and ignored numerous warnings to regain control of the vehicle.
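To illustrate the design gap the board described, here’s a minimal, hypothetical sketch of a hands-off timer that escalates past warnings instead of nagging indefinitely. Every threshold below is an assumption for illustration, not Tesla’s actual logic:

```python
# Hypothetical escalation ladder for a hands-off driver.
# Thresholds and actions are illustrative assumptions, not Tesla's code.

def enforcement_action(hands_off_secs: float, speed_mph: float) -> str:
    MAX_ENGAGE_MPH = 90  # the ceiling reported for the 2016-era system
    if speed_mph > MAX_ENGAGE_MPH:
        return "refuse engagement"      # don't allow the system at all
    if hands_off_secs < 15:
        return "no action"
    if hands_off_secs < 30:
        return "visual warning"
    if hands_off_secs < 45:
        return "audible warning"
    # Rather than warning forever, escalate to a controlled stop.
    return "slow to a stop with hazard lights on"

for t in (10, 20, 40, 60):
    print(f"{t}s hands-off -> {enforcement_action(t, speed_mph=74)}")
```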

“Today’s automation systems augment, rather than replace, human drivers. Drivers must always be prepared to take the wheel or apply the brakes,” NTSB Chairman Robert Sumwalt said in a statement to Reuters.

Heaping further condemnation on Tesla, the NTSB said Autopilot’s design did not sufficiently detect cross traffic and that the automaker “did little to constrain the use of Autopilot to roadways for which it was designed.”

However, before we form an angry mob and call for Elon Musk’s head, it should be noted that investigators concluded both drivers had “at least 10 seconds to observe and respond to each other” before impact. It doesn’t make Brown’s death any less tragic, but that is certainly enough time to make a decision.

On Monday, Brown’s family said the car was not to blame for the crash. Neither Tesla nor the family’s lawyer, Jack Landskroner, has indicated whether the automaker has reached a settlement on the matter.

“We heard numerous times that the car killed our son. That is simply not the case,” the family’s statement read. “There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.”

“People die every day in car accidents,” they continued. “Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements.”

[Image: Tesla Motors]

