Lack of 'safeguards' in Tesla's Autopilot contributed to fatal crash

Andrew Cummings
September 13, 2017

The US National Transportation Safety Board (NTSB) has found that Tesla's Autopilot system was partly to blame for a fatal accident in which a Model S collided with a tractor-trailer.

According to Reuters, the NTSB found that the system worked as designed, but that it lacked sufficient safeguards to ensure driver attentiveness and to restrict its use to highways and limited-access roads.

The driver of a Tesla who was killed in a crash that drew worldwide attention a year ago was too reliant on the car's "Autopilot" system when he plowed into the side of a tractor-trailer at more than 70 miles per hour, federal investigators concluded Tuesday.

Joshua Brown, a former Navy SEAL, died May 7, 2016, when his Model S struck a truck crossing the road in front of him on a Florida highway. The system uses multiple sensors linked to a computer to let the vehicle guide itself, working like a greatly enhanced cruise control, and includes automatic emergency braking designed to avoid frontal collisions.

While the board faulted Brown for not paying attention in the seconds before the crash, it noted that Autopilot did not do an adequate job of detecting other traffic and did not alert the driver early enough to allow sufficient reaction time.

The board recommended that regulators find better ways to measure driver attentiveness, such as using scanners that focus on where drivers are looking.


The NTSB's findings came an hour before the Department of Transportation and National Highway Traffic Safety Administration released new autonomous vehicle guidance, "A Vision for Safety 2.0", which explicitly excludes so-called Level 2 technologies like Autopilot.

"System safeguards were lacking", NTSB Chairman Robert Sumwalt said.

The NTSB issued 13 findings related to the crash, two of which concerned Brown's driving. "The system gave far too much leeway to the driver to divert his attention", Sumwalt said.

"It sounds to me like Tesla is sort of speaking out of both sides of their mouth in this respect", he said.

"We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times", Tesla said. The agency found, among other things, that Autopilot's adaptive cruise control and lane-centering steering were engaged when the Model S struck the trailer at 74 miles per hour.

Two minutes earlier, according to reports, Brown had set the cruise control nearly 10 miles per hour above the posted speed limit. Investigators believe neither Brown nor the car's systems detected the truck in the moments before the crash.


Brown had his hands on the wheel for 25 seconds during the 37 minutes Autopilot was activated, the NTSB wrote in a June report.

Autopilot uses torque sensors on the steering wheel to detect whether the driver is holding it. Tesla's Autopilot is a Level 2 system; Level 5 is the standard for a fully autonomous vehicle.

It's the first known fatal crash of a highway vehicle operating under automated control systems, according to the NTSB.

"Nobody wants tragedy to touch their family, but expecting to identify all limitations of an emerging technology and expecting perfection is not feasible either", Brown's family said in a statement. "That is simply not the case", the statement added.

The NTSB recommended that NHTSA require automakers to include safeguards that prevent the misuse of semi-autonomous vehicle features.

