Musk’s promise to have a fully functional Full Self-Driving (FSD) feature ready by the end of 2019 failed to materialise; he has now admitted that the launch of full FSD is at least “a few months” away. Musk said a hackathon would be held to attack the self-driving problem, suggesting that Tesla hopes to brute-force its way to city driving by training neural networks to identify, and act on, every object a car may encounter in a city.
Tesla CEO Elon Musk recently announced that he will hold a “super fun AI party/hackathon” with Tesla engineers and some invitees.
The promised fully autonomous driving feature is behind schedule
Elon Musk told Tesla shareholders last February that the company would complete its fully functional autonomous driving feature, Full Self-Driving (FSD), by the end of 2019.
Musk told reporters in a telephone interview last February that a release by the end of 2019 was on track:
“It will start out with driver supervision, because it will take billions of miles to reach the level of safety that no longer requires driver observation,” he said. “Then we need to convince regulators of that, and that will take some steps … But as I’ve said before, I’m confident we’ll release the fully autonomous driving package this year.”
Full Self-Driving (FSD) is the name of a Tesla package that Musk says will allow existing Tesla cars to drive on residential roads, freeways, and city streets without any human intervention.
On last week’s fourth-quarter earnings call, Musk acknowledged that Tesla is at least “a few months” away from launching a fully functional, fully autonomous driving system, Business Insider reported.
In October 2019, Musk had already been forced to acknowledge that Tesla would miss the 2019 deadline, walking back his earlier comments.
The “super fun AI party/hackathon” Musk announced will be an old-fashioned hackathon aimed at tackling the self-driving problem. In other words, Musk wants to hack his way to a solution.
Let’s take a quick look back at the recent earnings call, in his own words:
“Yes, it’s feature complete. I mean, the car is able to drive from home to work, probably without intervention. So while it will still be supervised, it will be able to drive you to work. You have autonomy on the highway at high speed, autonomy in medium-speed areas, and at low speed, which in practice simply means handling traffic lights and stop signs.
“So feature complete means it is likely to do this without human intervention, but it will still be supervised. I’ve said this several times, but it’s often misunderstood: the car can drive autonomously, but sometimes requires supervision and intervention. That is already feature complete. Feature complete does not mean being able to handle any situation (including every extreme case) anywhere on Earth; it only means most cases.”
Musk told reporters on the earnings call that the company is now lowering its ambitions, according to Business Insider:
“Feature complete just means the car can drive itself from home to work, to some extent, without any intervention. But that doesn’t mean every feature is working well.”
Musk and Tesla see this as a computer-vision problem. Tesla’s latest report quotes Musk:
“I think it now looks like it could be a few months away. And the obvious thing about autonomy and full self-driving is that a lot of work is still needed to improve the basic elements of autonomy.”
“In terms of labeling, video is labeled simultaneously across all eight cameras. As far as labeling efficiency is concerned, I can say there is a three-orders-of-magnitude increase in labeling efficiency. This is extremely important for those who understand it, and a great deal of progress has indeed been made here.”
It sounds as though Musk is talking about a hard computer-vision problem: labeling the real world. And we can vaguely sense that Tesla may be using brute-force methods to train neural networks so that they can identify every possible object in the real world and act accordingly.
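To make the brute-force idea concrete: at its core, this kind of supervised pipeline collects labeled examples of each object class and fits a classifier to them. The toy sketch below (my illustration, not Tesla’s actual pipeline; the synthetic data and softmax model are assumptions for demonstration) trains a minimal linear classifier on labeled synthetic “camera features”:

```python
import numpy as np

# Toy illustration of supervised training on labeled data (NOT Tesla's
# real system): three object classes, each a cluster of feature vectors.
rng = np.random.default_rng(0)
n_per_class, n_features, n_classes = 100, 4, 3
means = rng.normal(0, 5, size=(n_classes, n_features))
X = np.vstack([rng.normal(means[c], 1.0, size=(n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# One-layer softmax classifier, trained by gradient descent on cross-entropy.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
for _ in range(200):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    p[np.arange(len(y)), y] -= 1.0                # gradient w.r.t. logits
    W -= 0.1 * (X.T @ p) / len(y)
    b -= 0.1 * p.mean(axis=0)

accuracy = ((X @ W + b).argmax(axis=1) == y).mean()
```

The catch, and the reason labeling efficiency matters so much, is that real deployments need this to work for thousands of classes under every lighting and weather condition, which is why the labeled dataset, not the model, becomes the bottleneck.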
There is a potential flaw in this approach: the real world is not always as simple as it looks.
Tesla aside, self-driving research needs to think carefully about how city infrastructure can better serve autonomous vehicles. The reason you can’t hail a driverless Uber in a city today is not that regulators won’t allow it, but that computer vision and lidar have not yet solved the problem.