U.S. auto safety regulators said on Thursday they found no evidence of a defect in the Tesla Motors Model S involved in the death of a man whose car collided with a truck while he was using its Autopilot system.
The case has been closely watched as automakers race to automate more driving tasks without exposing themselves to increased liability risks.
Tesla Chief Executive Elon Musk, on his Twitter account, praised the decision by the National Highway Traffic Safety Administration, which did not order a recall and put the responsibility for the accident primarily on the driver, former Navy SEAL Joshua Brown.
U.S. Transportation Secretary Anthony Foxx told reporters on Thursday that drivers have a duty to take seriously their obligation to maintain control of a vehicle. He said automakers also must explain the limits of semi-autonomous systems. In the case of Tesla’s Autopilot, one limitation was that the system could not detect a truck trailer that crossed the road in front of the victim’s Tesla.
“The (auto) industry is going to have to be clear about what the technology does and what it does not do, and communicate it clearly,” Foxx said.
Jack Landskroner, a lawyer for Brown’s family, said they plan to evaluate all the information from government agencies investigating the crash “before making any decisions or taking any position on these matters.”
U.S. Senator Gary Peters, a Michigan Democrat, said in an interview on Thursday “it is important regulators allow the flexibility and freedom to innovate, but also prevent technology that is not quite ready for prime time to get on the road.”
Confusion over who is in control
Legal experts said the agency’s decision does not mean automakers would escape liability claims in cases where driver assistance systems fail to prevent a crash.
“If it is known that drivers are misusing and being confused by your self-driving system, then that in and of itself can be a safety-related defect,” product liability lawyer Jason Stephens said.
The crash occurred near Williston, Florida, last May. Brown was operating his Model S in Autopilot mode just before he collided with a truck and was killed.
The first fatality in a Tesla vehicle operating in Autopilot mode, the incident raised questions about the safety of systems that can perform driving tasks for long stretches with little or no human intervention, but which cannot completely replace human drivers.
NHTSA said in a report that Brown did not apply the brakes and his last action was to set the cruise control at 74 miles per hour (119 kph), less than two minutes before the crash.
The agency said the truck should have been visible to Brown for at least seven seconds before impact, time enough to brake, steer, or otherwise try to avoid the vehicle. Brown “took no braking, steering or other actions to avoid the collision,” the report said.
NHTSA also said in the report that drivers could be confused about whether the system or the driver is in control of the vehicle at certain times.
NHTSA issued numerous subpoenas and requests for information to Tesla, and also sought information from Tesla supplier Mobileye. The agency asked Tesla to describe how it monitored misuse of the system and steps it took before introducing the technology to prevent misuse, but nearly all of Tesla’s answers were redacted by the agency.
Tesla said “the safety of our customers comes first, and we appreciate the thoroughness of NHTSA’s report and its conclusion.”
Musk, in a tweet, called the report “very positive.” He also cited NHTSA’s analysis of Tesla data, which suggested vehicle crash rates fell by 40 percent after the installation of the company’s Autosteer lane-keeping function.
Tesla in September unveiled improvements to Autopilot, adding new limits on hands-off driving and other features that Musk said likely would have prevented the fatality.
The agency also said its decision to close the investigation was not based on the software improvements announced in September.
In October, Musk said all new Tesla models will come with an $8,000 package for technology that allows the car to drive itself. By the end of 2017 a Tesla should be able to drive in full autonomous mode from Los Angeles to New York “without the need for a single touch” on the wheel, Musk said.
Rival automakers have said they expect to be able to field autonomous driving capability by around 2020.
The U.S. National Transportation Safety Board is also probing the crash. NHTSA said there have been no reported incidents in the United States involving a Tesla in Autopilot mode resulting in fatalities or injuries since a July crash in Pennsylvania that injured two people.
(By David Shepardson; Additional reporting by Erica Teichert in New York; Editing by Jeffrey Benkoe and Matthew Lewis)