On March 23, 2018, Apple engineer Walter Huang was driving his Tesla Model X when it slammed into a barrier at the gore point of a highway exit. Autopilot was engaged at the time but failed to identify the obstacle ahead. More frightening still, the system actually accelerated the vehicle as it approached the barrier.
In the end, even the Model X's strong passive-safety performance could not save Walter Huang's life. After the accident, the National Transportation Safety Board (NTSB) opened an investigation, and only recently did it finally release its report. A hearing on the full NTSB report is also scheduled for the 25th of this month.
What makes the accident truly worrying is that this is a stretch of highway almost any Silicon Valley commuter might drive: the crash site is only a few miles from Tesla's headquarters in Palo Alto, and Tesla engineers were regulars on that road.
Reading through the entire report, the NTSB has only added a modest amount of new data, and the additions mostly confirm what was already suspected. Clearly, Walter Huang made a serious mistake and must bear most of the blame; after all, Tesla Autopilot is only a driver-assistance system that still requires driver supervision. Even so, Tesla and the autonomous-driving community care about how Autopilot performed and how it could have done better.
Besides Walter Huang, Caltrans, the agency responsible for road maintenance, also bears responsibility. Days before Huang's accident, another car had hit the same barrier, though that driver was lucky enough to survive. Unfortunately, Caltrans failed to replace the crash attenuator in time; had it been intact, Huang's chances of survival would have been much higher. Judging from the available records, collisions at such highway gore points are actually quite common, and it is unclear whether today's Autopilot has evolved to the point of recognizing whether an attenuator is damaged.
It is worth noting that the NTSB report does not lay the blame on Autopilot alone. Although the system is not as miraculous as advertised, Walter Huang knew perfectly well that Autopilot did not handle this highway exit gracefully, because he had run into trouble at that very spot at least twice before.
On top of that, the report speculates about what Huang was doing in his final minutes, without ruling out the possibility that he was playing a game on his iPhone. Whether his hands were on the steering wheel as required is also disputed in the report.
What the hell happened that day?
That day, Walter Huang was driving with Autopilot engaged as usual. However, the lane markings on that stretch of highway had been worn faint by sun and weather, especially the left lane line, which doubles as one edge of the V-shaped gore markings separating the exit from the main road. In fact, a Google search shows that the markings there had already begun to fade by 2016. Autopilot apparently "misread" the scene: it may even have taken the left edge of the gore's V as the left lane line, and the true left lane line as the right one. Following that path led straight to the wreck.
Even more unfortunately, Walter Huang had set a cruising speed of 75 mph (about 120 km/h), but Autopilot had slowed down because of a vehicle ahead. Once the Model X drifted into the gore area, there was no longer any vehicle in front of it, so Autopilot began accelerating back toward 75 mph. What lay ahead, however, was not open road but a concrete barrier.
Note that a concrete barrier on a U.S. highway is supposed to be fronted by a metal crash attenuator, which absorbs energy in a collision and helps the occupants survive. Unfortunately, the attenuator here had already been crushed out of shape and could no longer absorb an impact. Worse, its deformation also contributed to Autopilot's failure to recognize the obstacle.
Some will ask: doesn't radar back up the cameras in Autopilot? It does, but radar performs poorly on stationary objects; otherwise, Teslas would not have repeatedly rear-ended fire trucks parked at the roadside.
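Radar's blind spot for stationary objects comes largely from clutter filtering: a radar measures relative (Doppler) velocity, and anything whose speed over the ground is near zero looks just like a bridge, sign, or guardrail, so it tends to be discarded. The sketch below illustrates the idea; the numbers and threshold are hypothetical, not Tesla's actual pipeline.

```python
# Illustrative sketch of why automotive radar pipelines drop stationary
# returns. Values and threshold are hypothetical, not Tesla's real logic.

def filter_radar_returns(returns, ego_speed_mps, min_ground_speed=0.5):
    """Keep only returns whose ground speed exceeds a clutter threshold.

    Each return is (range_m, doppler_mps), where doppler is the target's
    speed relative to the radar (negative = closing).
    """
    moving = []
    for range_m, doppler_mps in returns:
        # A stationary object directly ahead closes at exactly -ego_speed,
        # so its estimated ground speed is ~0 and it gets filtered out.
        ground_speed = doppler_mps + ego_speed_mps
        if abs(ground_speed) > min_ground_speed:
            moving.append((range_m, doppler_mps))
    return moving

ego = 33.5  # ~75 mph in m/s
returns = [
    (120.0, -33.5),  # concrete barrier: stationary -> dropped as clutter
    (60.0, -8.0),    # car ahead moving at ~25.5 m/s -> kept
]
print(filter_radar_returns(returns, ego))  # only the moving car survives
```

In this toy example the concrete barrier produces a perfectly valid echo, yet the filter throws it away, which is exactly the failure mode the article describes.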
Was he driving hands-free?
To keep Autopilot engaged, the driver has to apply light torque to the steering wheel from time to time to signal that he is still there. If the wheel goes untouched for too long, a visual warning appears; if the driver keeps ignoring it, an audible warning follows and the vehicle slowly brings itself to a stop. In practice, drivers usually return their attention at the first, visual stage.
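The escalation just described can be sketched as a simple state machine. The timing thresholds below are invented for illustration; Tesla's actual values vary with speed and firmware version.

```python
# Hedged sketch of a hands-off-wheel warning escalation, as described in
# the text. All thresholds are illustrative, not Tesla's real parameters.

VISUAL_AFTER_S = 30    # hypothetical: visual warning after 30 s hands-off
AUDIBLE_AFTER_S = 45   # hypothetical: audible warning after 45 s
STOP_AFTER_S = 60      # hypothetical: begin slowing to a stop after 60 s

def warning_state(hands_off_seconds):
    """Map continuous hands-off time to the current escalation stage."""
    if hands_off_seconds >= STOP_AFTER_S:
        return "slow_to_stop"
    if hands_off_seconds >= AUDIBLE_AFTER_S:
        return "audible_warning"
    if hands_off_seconds >= VISUAL_AFTER_S:
        return "visual_warning"
    return "normal"

# Touching the wheel resets the timer, which is why most drivers only
# ever see the visual stage and never reach the audible one.
for t in (10, 35, 50, 70):
    print(t, warning_state(t))
```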
Autopilot had been engaged for 19 minutes before the collision, during which it issued Walter Huang two visual warnings. About two to three minutes before the crash, Huang let go of the steering wheel for roughly 30 seconds. He took hold of it again after a visual reminder, but in the final 6 seconds before impact the car was once again being driven hands-free.
Gaming behind the wheel?
There is evidence that Walter Huang was playing a "Three Kingdoms" game while driving: his iPhone exchanged data with the game's servers in the minute before the collision, and the game requires both hands to play.
Of course, the NTSB's report cannot determine whether the game was simply transferring data automatically in the background. But if Huang really was playing while driving, he bears responsibility for the accident.
Why did the car steer the wrong way?
It is well known that, at today's level of technology, faded road markings can easily trip up an ADAS system, and Walter Huang's tragedy is closely tied to this. Tesla's system could have been smarter, but Musk has explicitly rejected the high-definition-map route.
With a high-definition map, the system would at least have known the positions of the concrete barrier and the crash attenuator. Moreover, even when road markings change or fade, structures like barriers and bridges rarely move. In other words, the accident could have been avoided.
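The argument for an HD map can be made concrete: if the map stores the positions of fixed hazards such as the gore barrier, the planner can cross-check its intended path against them even when lane markings and live sensors disagree. A toy sketch, with all coordinates and names invented for illustration:

```python
import math

# Toy illustration of an HD-map prior: known static obstacles are checked
# against the planned path even if no sensor currently detects them.
# All coordinates, names, and distances are invented for this example.

STATIC_OBSTACLES = [
    {"name": "gore_barrier", "x": 150.0, "y": 0.0, "radius": 2.0},
]

def path_hits_obstacle(path_points, obstacles, safety_margin=1.0):
    """Return the first mapped obstacle whose radius (plus a safety
    margin) intersects any point on the planned path, else None."""
    for px, py in path_points:
        for obs in obstacles:
            d = math.hypot(px - obs["x"], py - obs["y"])
            if d <= obs["radius"] + safety_margin:
                return obs["name"]
    return None

# A path that drifts into the gore area trips the map-based check,
# independent of what the camera or radar happen to report.
path = [(x, 0.0) for x in range(0, 200, 10)]
print(path_hits_obstacle(path, STATIC_OBSTACLES))  # -> gore_barrier
```

The point of the design is redundancy: the map supplies an expectation about the world, so a perception miss alone is no longer enough to cause a collision.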
Why did it plow straight into the barrier?
Beyond steering the wrong way, the vehicle's perception system failed across the board: the camera never captured a clear view of the barrier, and the radar received no usable echo. Evidently Autopilot's safety redundancy is insufficient; once the camera fails, the radar is helpless against stationary objects. On top of being poor at detecting stationary targets, the radar also has low resolution: almost none along the vertical (elevation) axis, and a horizontal angular resolution of only about 5 degrees.
Had the car carried a high-definition map, it could at least have given the computer an expected radar echo and the location of the object. As it stands, the radar's low resolution can drag the whole system down at critical moments.
The camera has no resolution problem, but a deformed attenuator is beyond what it can handle, because Tesla's neural network had never seen one. Worse still, a normal attenuator carries yellow-and-black striped warning markings, while this deformed one did not.
After the accident, Tesla upgraded its system to improve detection of unknown obstacles. Although computer-vision techniques such as motion parallax can notice all sorts of objects, neural networks still struggle to identify things they have never seen before. Given Musk's disdain for LiDAR, Tesla's camera system has a long way to go before it covers every scenario.
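Motion-based cues like the ones mentioned above do not require recognizing the object at all. A classic example is time-to-contact from "looming": if an object's apparent size grows at a certain rate, the time to impact is roughly the current size divided by that growth rate, no matter what the object is. A hedged sketch with made-up numbers:

```python
# Illustrative time-to-contact (TTC) estimate from image "looming":
# tau ~= size / d(size)/dt. This works on unknown objects because it
# never needs to classify them. All numbers here are hypothetical.

def time_to_contact(size_prev_px, size_now_px, dt_s):
    """Estimate seconds to impact from the growth of an object's
    apparent size (in pixels) between two frames dt_s apart."""
    growth_rate = (size_now_px - size_prev_px) / dt_s  # px per second
    if growth_rate <= 0:
        return float("inf")  # not approaching
    return size_now_px / growth_rate

# An obstacle that grows from 40 px to 44 px in 0.1 s is ~1.1 s away.
print(round(time_to_contact(40.0, 44.0, 0.1), 2))
```

A cue like this could in principle flag even a never-before-seen object as an imminent hazard, which is exactly where classification-based networks fall short.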
All in all, however it is tuned, today's camera-plus-radar combination still amounts only to ADAS, not full autonomy.
Who is responsible?
There is no doubt that Walter Huang paid for his own misjudgment with his life: he knew full well that Autopilot struggled on this stretch (he had hit similar problems there twice before). Playing a game while driving was also his fault. Today's Autopilot is merely an ADAS system, and drivers cannot expect it to handle everything.
Caltrans, which manages the road, also needs to do better, for instance by repairing damaged safety equipment and refreshing road markings promptly. In reality, both are hard to guarantee, so future autonomous-driving and ADAS systems must learn the lesson and cope with exactly this kind of situation. Yes, the V-shaped gore markings have since been repainted, but who knows whether a similar hazard lurks elsewhere?
Notably, the NTSB did not bring Tesla in to assist with the investigation, because Musk refused to abide by the NTSB's rules and even hung up on the NTSB chairman, an outcome that did neither side any good.
According to the investigation report, Tesla is not responsible for the accident, but there is a great deal the company can learn from it and apply on the road to full self-driving. After the accident, Tesla promptly upgraded the vehicle's ability to read faded markings and improved its detection of static objects. Even so, the camera-plus-radar combination still cannot 100% avoid rear-ending a stationary object.
In fact, a year after Walter Huang's accident, another Tesla ran into a truck, and that truck was not even completely stationary: it was slowly crossing the road. The NTSB has issued only a preliminary report on that crash, which does not yet say what Tesla Autopilot was doing at the time or offer any analysis.