
National Transportation Safety Board Chairman Robert Sumwalt says a 2018 crash of a Tesla in self-driving mode was due to "the lack of system safeguards to prevent foreseeable misuses of technology."

Photo: Chip Somodevilla via Getty Images

NTSB slams Tesla over deadly crash but can’t do much about it

The NTSB calls for more oversight, but it can only make recommendations.

The National Transportation Safety Board's blistering attack on Tesla over a fatal 2018 crash is grabbing headlines, but that may be all it does; the NTSB itself can't impose fines or issue enforceable regulations.

Board members will have to wait on the National Highway Traffic Safety Administration to act.


NTSB's findings come after a nearly two-year investigation into a car crash on U.S. 101 in California that left driver Wei "Walter" Huang dead. Tesla's Autopilot feature was engaged at the time of the crash, and investigators determined that Huang's hands were not on the wheel — and that he was potentially even distracted by his iPhone.


There were "many facets" involved in the incident, said NTSB Chairman Robert Sumwalt. "But what struck me the most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology."

Instead of bolstering those safeguards against things like driver distraction, the industry keeps implementing assisted driving technologies in ways that result in death or injury, Sumwalt said. "And the industry in some cases is ignoring the NTSB's recommendations intended to help prevent such tragedies," he added.

Errors in processing by Tesla's assisted driving system contributed to the crash, board staff said during the meeting.

NTSB investigates significant U.S. transportation incidents and makes recommendations but does not have the authority to set new rules for things like self-driving vehicles. That responsibility falls to NHTSA. During the meeting, members of the board expressed disappointment that the agency had taken so little action on emerging technologies like Autopilot.

In an emailed statement, NHTSA said it is carefully reviewing NTSB's findings and noted that current law requires human control of vehicles.

"NHTSA has for years recommended that developers of advanced driver assistance systems incorporate appropriate driver-vehicle interaction strategies in deployed technology and has made resources available that assist with that recommendation," the agency said.

However, so far, NHTSA's approach has been to make suggestions, not real rules of the road for assisted driving tech.

Tesla did not immediately return a request for comment.

The investigation into the March 23, 2018, crash is the second NTSB probe of an incident involving Tesla's self-driving technology to conclude. In September 2017, the agency found that a lack of "safeguards" in the carmaker's assisted driving features contributed to the death of a Florida driver. It also issued two recommendations to six automakers aimed at improving the safety of assisted driving systems — one calling for safeguards to prevent inattentive drivers from misusing the technology, and another calling for carmakers to restrict deployment of the features to only the kinds of driving conditions for which they're designed.

According to Sumwalt, Tesla was the only one of those manufacturers that didn't respond. The agency typically asks companies to respond to recommendations within 90 days.

"But it's been 881 days since these recommendations were sent to Tesla, and we've heard nothing. We're still waiting," Sumwalt said.

Though we're only just now seeing the final report, the NTSB released dozens of documents about the Huang crash last week and a preliminary report last year. According to the records, the vehicle's driving assistance systems misread the road, ultimately speeding up and crashing into the divider. The car's systems also issued visual and audio alerts during the drive prompting Huang to pay attention.

The data suggested that the driving assistance technology in Huang's vehicle had encountered issues with the same stretch of highway on previous commutes, and that he regularly used mobile apps during that drive, including on the day of the incident — although logs reviewed by the NTSB could not determine whether he was holding his phone at the time of the crash.

"During the final four seconds of travel before impact, the Tesla accelerated toward the crash attenuator, and the driver took no evasive braking or steering action to avoid a crash," NTSB staffer Donald Karol said during the Tuesday board meeting. "This level of inaction given the numerous visual cues and unobstructed view indicates the driver was inattentive and not appropriately supervising the Autopilot partial automation system."

NTSB documents also noted that the severity of the crash was worsened because a highway safety device called a crash attenuator was damaged and in need of replacement.

Huang's family is suing Tesla and the state of California over his death.

In an email to Protocol, Mark Fong, the family's lawyer, argued that Huang's Tesla "failed to perform according to its claimed capabilities" and that Huang "was using Autopilot where Tesla told its customers it was safe to use."

Regarding driver distraction, Fong added that Huang was "using Autopilot in a foreseeable manner, given the system's claimed capabilities and the way Tesla encouraged its customers to use it."

Update, Feb. 26: This story was updated to include comments from Mark Fong, the lawyer for Walter Huang's family.
