Three people were killed in crashes in California and Indiana on Sunday. Police in both places were then left trying to answer a strange question: who was in control of the car at the time of the accident? Shortly after midnight local time on December 29, a Tesla Model S in Gardena, California, ran a red light and struck a Honda Civic, killing the two occupants of the Civic; the two occupants of the Tesla suffered non-life-threatening injuries. A few hours later, a Tesla Model 3 crashed into a fire truck parked in the left lane of an Indiana highway, killing a woman inside the Tesla and injuring her husband.
Pictured: Tesla Model S accident scene in Gardena, California
Police have not yet determined whether the drivers were using Tesla’s driver-assistance system, Autopilot, or driving manually at the time of the crashes. Autopilot has long drawn the attention of federal safety officials, and police say they hope to answer the question in the coming days.
This distinction matters for determining the cause of each accident. With the auto industry moving rapidly toward partial and fully autonomous driving, the investigations will also help answer regulators’ questions.
For years, cars have been equipped with so-called event data recorders, "black boxes" that capture key information needed to reconstruct a crash, such as braking, airbag deployment, and other related data that can be downloaded with specialized tools.
However, only automakers have access to detailed data about a car's autonomous-driving technology in the seconds before a crash. That means investigators must turn to the automakers themselves when trying to understand whether automated systems were engaged during a crash.
“We should never have to rely on manufacturers for this kind of data, because they may be liable for product defects and have an inherent conflict of interest in vehicle crashes,” said Sam Abuelsamid, an analyst at Navigant Research, a Detroit-based research firm.
He says there should be a standard way to determine which control mode was active before and during a crash.
“As we put more vehicles with partial automation systems on the road, they should be required to record information about automation status and driver behavior,” he said.
In the wake of a fatal 2016 Tesla crash, the National Transportation Safety Board (NTSB) asked the U.S. Department of Transportation to determine what data should be collected, but that work has not yet been done.
“As more manufacturers install automated systems on their vehicles, detailed data-collection rules are needed for the performance of active safety systems during crashes and for drivers’ responses, in order to improve system safety,” the NTSB warned in an accident report. “In the event of a vehicle failure or crash, automakers, regulators, and accident investigators need specific data.”
Tesla did not respond to an email seeking comment. The company stresses that drivers are ultimately responsible for controlling the vehicle when using Autopilot and must keep their hands on the steering wheel and stay attentive at all times. The company has also pushed back against criticism that Autopilot is unsafe, often citing its own quarterly data showing that drivers using Autopilot crash less frequently than those who do not.
The National Highway Traffic Safety Administration (NHTSA) said it was investigating the California crash. A spokesman for the agency declined to comment on whether Autopilot was engaged during the Tesla crash in California.
Raul Arbelaez, who heads the Vehicle Research Center at the Insurance Institute for Highway Safety, said the lack of easy access to data prevents researchers from gaining insight into how autonomous driving aids perform in a variety of situations, especially in minor crashes, which account for the large majority of traffic accidents.
“How do people interact with these technologies? Under what conditions do they work? Do they really perform poorly on dark, rainy nights? Or do they perform well under those conditions?” Arbelaez said. “I believe this would be very useful for automakers; it could help them improve their products. But in terms of investigating how these automated systems perform in car crashes, we really can’t access the data. What we want is the ability to extract data quickly after an accident without having to work with the automakers.”
In 2016, the National Transportation Safety Board and the National Highway Traffic Safety Administration investigated the first fatal accident involving Tesla’s Autopilot system. In that crash, a Tesla Model S drove under a semi-trailer crossing a highway, killing the driver.
Florida car crash
The NTSB says the Tesla involved in the Florida crash did not have a traditional event data recorder that could be read with common tools. Instead, Tesla collected a large amount of data itself and provided it to the NTSB. That data showed the Model S driver was using Autopilot at the time of the accident.
Even if the Tesla involved had carried an event data recorder, the 15 data parameters required under the 2006 rules “are not enough to answer the simplest question of who was in control of the car at the time of the accident,” the NTSB said in a detailed report.
Since at least 2018, Tesla vehicles have carried event data recorders, and the company has made tools available to the public for downloading crash data.
An NTSB spokesman said the agency has called on the U.S. Department of Transportation to determine what data parameters are needed to understand the automated vehicle-control systems involved in a crash, but that work remains unfinished.
The NTSB is investigating other Tesla crashes involving Autopilot, including a fatal crash in Mountain View, California, in March 2018. Meanwhile, NHTSA has investigated 13 Tesla crashes that it believes may have occurred while drivers were using Autopilot, including an incident in Connecticut on December 7 in which a Tesla ran into the back of a police car parked on the side of the road.
Frank Borris, a former director of NHTSA’s Office of Defects Investigation, said the number of investigations the agency has opened shows its interest in how the new technology is being used in the field. Borris said the role of accident investigation agencies is to collect real-world incidents in order to better understand emerging traffic-safety risks, including those posed by new technology.
Borris said he has long been concerned that the name “Autopilot” could lead drivers to rely too heavily on the system, including in situations Tesla did not intend it to handle. He added that there is little empirical data showing how Autopilot and other driver-assistance technologies actually perform in the real world.
Meanwhile, as automated systems become more widespread and more capable, investigators’ need for this data may grow more urgent.
In April 2019, at an investor event on autonomy held at Tesla’s headquarters in Palo Alto, California, CEO Elon Musk said the company’s first fully driverless robotaxis would be deployed the following year (2020).
On a conference call in October, Musk said Tesla should be able to upload software that would turn its cars into self-driving robotaxis “by the end of next year.”
“The level of regulatory acceptance will vary from jurisdiction to jurisdiction,” he explained. “But I think this shift from non-robotaxi to robotaxi is probably the biggest single increase in asset value in history.”