BEIJING, Feb. 25 (UPI) — The National Transportation Safety Board (NTSB) will hold a second hearing Tuesday on a fatal crash involving Tesla's Autopilot driver-assistance system, even though the electric carmaker has yet to formally respond to recommendations the agency issued more than two years ago.
In 2017, the NTSB recommended that automakers, including Tesla, add safeguards to driver-assistance systems to prevent inadvertent driver misuse, and that they limit the use of those systems to the conditions for which they were designed.
Automakers including Volkswagen, Nissan and BMW have told the NTSB how their systems ensure driver engagement, responses the agency has accepted. NTSB spokesman Chris O'Neil said Tesla had not formally communicated with the NTSB officials responsible for monitoring implementation of the safety recommendations.
“This is not the norm,” O’Neil said.
Tesla did not comment for this story, but it has said it updated Autopilot to issue more frequent warnings to inattentive drivers.
The role of Tesla's Autopilot system and other factors, including driver distraction and highway infrastructure, will be examined at Tuesday's NTSB hearing into the March 2018 crash in Mountain View, California. The crash killed Walter Huang, a 38-year-old Apple engineer, when his Tesla SUV, operating on Autopilot, struck a highway barrier.
The investigation has produced unusually public tension between the NTSB and Tesla CEO Elon Musk. The conflict peaked after Tesla released information about the crash in violation of agency rules and was removed as a party to the NTSB investigation.
As self-driving features become more common in new cars, the hearing could offer lessons for the broader auto industry. Several other automakers equip their vehicles with technology that provides automated steering, acceleration and braking, and some have installed systems to ensure drivers stay attentive. GM and Subaru use infrared cameras to track the movement of the driver's head and eyes. Nissan said last year that it would include a similar driver monitor in its highway driving-assistance system.
Tesla says Autopilot improves driving safety, pointing to internal data released quarterly showing that drivers are less likely to crash when using the feature than when driving manually. The company stresses that drivers using Autopilot must keep their hands on the steering wheel and pay attention, and the system monitors whether the driver is holding the wheel.
The company said it has adjusted how quickly its warnings respond to drivers' actions when their hands are off the steering wheel, a safeguard that federal investigators have said could easily be bypassed.
In 2017, the NTSB concluded its first investigation into a fatal Autopilot-related crash by calling on Tesla to better ensure that drivers stay attentive while using the system. It also called on major automakers to limit the use of driving-assistance systems to the driving scenarios for which they were designed.
The recommendations stemmed from the agency's investigation into a 2016 crash in which Joshua Brown, a former U.S. Navy SEAL, collided with a commercial truck on a Florida highway while using Autopilot. The agency found that Brown's over-reliance on the car's automation, and the vehicle's lack of safeguards against driver inattention, were key factors in the crash.
Last fall, the NTSB again cited driver inattention and Autopilot's design in a January 2018 crash in which a Tesla ran into a stationary fire truck on a highway near Los Angeles. The agency said Autopilot's design allows drivers to stop paying attention to the road.
In the wake of that crash, Tesla said it had updated Autopilot to issue more frequent warnings to inattentive drivers. O'Neil said the company has also been in regular contact with NTSB investigators and has provided the agency with information about its systems.
“That is not a substitute for a formal response to the safety recommendations,” O’Neil said. “The process is designed to help us understand what they are doing to implement these safety recommendations and how they are progressing. It can also help us understand whether additional recommendations are necessary.”
Investigation records from the Mountain View crash point to several factors the NTSB may highlight at Tuesday's hearing. Walter Huang's 2017 Tesla Model X accelerated and struck a concrete barrier while Autopilot was engaged and the vehicle was cruising at 75 miles per hour. The NTSB said vehicle data showed that neither the driver nor the car's automated systems applied the brakes before the collision.
Before the accident, Huang complained that Autopilot repeatedly steered his vehicle toward the same spot on that highway. Records released by the NTSB show that data recovered from his Tesla's computer confirmed such an incident at the same location four days before the fatal crash, and again at the same spot several weeks earlier.
The end of the concrete lane barrier that Huang's Tesla struck should have been fitted with a crash attenuator to absorb the impact. But the attenuator had been damaged 11 days earlier and had not been repaired by the California Department of Transportation before Huang's crash.
Citing data transmission records, the NTSB said Huang was playing a game on an Apple-issued mobile device before the crash. But the agency said the data could not show how engaged he was in the game or whether he was holding the device in his hands at the time of the crash.
Crash investigators at the National Highway Traffic Safety Administration (NHTSA) have conducted 14 investigations into possible Autopilot-related accidents, as well as 11 investigations involving semi-autonomous systems from other automakers.