Feds Say Tesla Autopilot Is Partly to Blame for a 2018 Crash

July 31, 2019 - MorningStar


The National Transportation Safety Board says the design of Tesla's Autopilot contributed to a crash in which the driver did not actively steer for 13 minutes.

Tesla has reduced the time before drivers are warned to reapply pressure to the steering wheel, but the changes may not be adequate for investigators. Jasper Junien/Getty Images

The design of Tesla’s Autopilot feature contributed to a January 2018 accident in which a Model S sedan smashed into the back of a fire truck in Southern California, according to federal safety investigators. It is the second time the National Transportation Safety Board has found Tesla partially responsible for a crash involving the semiautomated feature. The federal board says it’s also investigating two other Autopilot-involved crashes.

No one was hurt in the 2018 crash, but investigators found that the driver had switched on Autopilot about 14 minutes before the crash and had not actively steered for the final 13 minutes. Investigators said the driver’s inattention and overreliance on Autopilot were probable causes of the crash. During those 14 minutes, the car warned the driver four times to apply pressure to the steering wheel, but he applied none in the roughly four minutes before the crash.

Investigators said the driver’s use of Autopilot was “in ways inconsistent” with Tesla’s guidance. The driver said he learned how to use Autopilot from a Tesla salesperson but did not read the owner’s manual, which tells drivers exactly when and where they should use Autopilot.


The incident emphasizes what industry watchdogs and even Tesla itself have said before: Autopilot isn’t a self-driving technology. It requires drivers’ attention, even when the road ahead looks like smooth sailing.

But investigators also seem to believe that Tesla isn’t doing enough to make Autopilot safe. In its report, the NTSB highlighted a recommendation following another Autopilot-involved crash that killed a Florida driver in 2016. The panel asked automakers to “develop applications to more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking” when using “automated vehicle control systems.” Tesla has changed how Autopilot works, requiring drivers to put pressure on the wheel more frequently while the feature is engaged. But the NTSB seems to believe it’s not enough.

“Fool me once, shame on you; fool me twice, shame on me. Fool me four, five, or six times now—that’s too much,” says David Friedman, former acting head of the National Highway Traffic Safety Administration and now director of advocacy at Consumer Reports. “If Tesla doesn’t fix Autopilot, then [the federal government] should do it for them.” (The NTSB can only recommend safety improvements; the NHTSA can enact regulations.)

Tesla said in a statement that “Tesla drivers have driven billions of miles with Autopilot engaged, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot remain safer than those operating without assistance. While our driver-monitoring system for Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored, we’ve also introduced numerous updates to make our safeguards smarter, safer, and more effective across every hardware platform we’ve deployed. Since this incident occurred, we have made updates to our system, including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated.”

The vehicle in the 2018 crash, in Culver City, California, was a 2014 model. Tesla has since revamped the hardware—the front-facing cameras and radar, the ultrasonic sensors—in its vehicles. (CEO Elon Musk has famously said that today’s Teslas have all the hardware they need to drive themselves. The electric automaker is still working on the software part.)


Investigators examined the driver’s cell phone after the crash and found that he was neither texting nor on a phone call before the incident. But the report warns that the NTSB can’t tell whether he was playing with an app on his phone. (He told investigators he wasn’t.) A witness who had been driving next to the car before it collided with the fire truck said the driver appeared to be looking down at something in his left hand.

Then there was the bagel and coffee. The driver said he had both in the car and believed the food was sitting next to him at the time of the crash. But the coffee spilled and the bagel was smashed, so he couldn't be sure they weren't in his hands.


The incident also highlights what many have criticized about Tesla’s approach to Autopilot, and about some automakers’ semiautomated strategies, which rely on humans to monitor advanced driving features. Since at least World War II, researchers have known that humans are garbage at monitoring near-perfect technology and can't be trusted to react when something goes wrong. The British Royal Air Force found that people monitoring radar sometimes missed the blips on their screens that indicated German submarines; the self-driving-vehicle developer Waymo reportedly discovered that drivers charged with monitoring its tech from behind the wheel sometimes fell asleep.

“Humans are really bad at watching paint dry, and that’s what you’re asking them to do if the car can do a lot of the functions itself,” says Friedman.

General Motors, which calls its semiautomated feature Super Cruise, takes a different approach. Rather than relying on steering wheel torque, as Tesla does, the Detroit carmaker has installed cameras inside the vehicle to monitor drivers’ eyes and ensure they’re watching the road. Tesla executives reportedly considered and then rejected that approach to driver monitoring.

Musk has acknowledged that finding the balance in semiautomated tech is difficult. “When there is a serious accident it is almost always—in fact maybe always—the case that it is an experienced user, and the issue is more one of complacency,” he has said.

But Musk has also said that publicizing Tesla crashes can kill more people in the long run by encouraging drivers not to use Autopilot, which he believes enhances safety. “It's really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe, because people might actually turn it off and then die,” he said last year.

Musk promised that Tesla would publish quarterly reports on Autopilot safety, but thus far those reports have come in the form of a short sentence in each earnings report, with no in-depth data to back it up. This week’s NTSB finding will likely put pressure on the carmaker to defend its signature tech breakthrough. And, perhaps, not a moment too soon: Musk has said Tesla could have 1 million totally self-driving vehicles on the road by next year—though the company has yet to demonstrate the tech.


Aarian Marshall writes about autonomous vehicles, transportation policy, urban planning, and everyone’s favorite topic: How to destroy traffic. (You can’t, really.) She’s an aspiring bike commuter and New Yorker going soft on San Francisco, where she’s based. Before WIRED, Marshall wrote for The Atlantic’s CityLab, GOOD, and Agri-Pulse, an agriculture…
Staff Writer

