Has Consumer Reports torn Tesla’s FSD to shreds?

When it comes to Tesla, Consumer Reports has always been even-handed: never stingy with praise, and never shy about pointing a finger at the heart of a problem. In early September, Consumer Reports released its long-awaited review of Tesla’s FSD features. Clearly, the organization, which carries the aura of an independent, objective third party, was rather disappointed with Musk, bluntly assessing that Tesla’s “Full Self-Driving” feature remains very hard to come by.


For industry insiders, Consumer Reports’ review offers a measure of vindication: most experts have long worried about the irresponsible promises Tesla and Musk have made about FSD.

Over-promising in ordinary matters is one thing; over-promising on vehicle safety, where lives are at stake, is quite another. If a seller exaggerates the stain-removing power of a laundry detergent, buyers who trust the advertising suffer little real harm when the results disappoint. A vehicle on a public road, however, is a shell weighing several tons; a single misjudgment behind the wheel can cost an innocent person their life, a consequence far graver than clothes that come out less than clean.

First, let’s talk about the naming of FSD; then we’ll look in more detail at what FSD actually does and, most importantly, exactly which of Musk’s promises have yet to be fulfilled.

Those who defend the “Full Self-Driving” name are quick to explain that it is nothing more than an ambitious label. They may concede that Tesla’s FSD has not reached the level of autonomy advertised, but they are confident that Musk will one day make good on his promise. On that view, naming the product “Full Self-Driving” outright is perfectly reasonable; otherwise the product would have to be renamed someday, wasting money and hurting brand value because users would feel the company lacked a vision for the future.

However, a German court takes a different view of the matter. In July, the court ruled that Tesla’s Autopilot brand name was misleading and exaggerated, stressing that “Autopilot” implies fully autonomous driving, not driver assistance. And if you think the Autopilot name still leaves some room for interpretation, “Full Self-Driving” leaves none at all.

Tesla owners often mention that whenever they use FSD-related features, they relax behind the wheel, entrusting themselves to what is in fact a driver-assistance system rather than a fully autonomous one. Beyond the illusion created by the product name, the features themselves can be deceptive, tempting some owners to attempt “big stunts” on the road.

Some Tesla owners will counter that drivers who are fooled by FSD, or overconfident in it, are not the majority. Even if that is true, how many such drivers can society afford? Even if it is only 1%, it means someone could lose their life to that negligence. If renaming FSD could save lives, why not?

Another point of contention concerns the hefty fee Tesla owners pay for FSD; in effect, they have handed over the money without yet receiving the service. They trust Musk’s promise that the latest fully autonomous driving features will be pushed out over the air (OTA) once they are ready.

Of course, Tesla owners can say they are willing to pay for the future of FSD, and no one can fault them for that. And since they already get driver assistance, the money is not entirely wasted. Still, onlookers feel the owners have been tricked into buying something that does not exist, and no one even knows whether FSD ever will.

In short, the question of money is likely to end up in court, presumably through lawsuits brought by Tesla owners. If FSD remains forever out of reach, someone will step forward and demand that Musk refund them. Regulators may also act, taking Tesla to court on the grounds that car owners were misled.

Here we can say, bluntly, that anyone who tries to argue that FSD is already fully autonomous is not really engaging in rational discussion. They either have no idea what Tesla’s FSD actually does, or they completely misunderstand what a fully autonomous car means.

What level is Tesla’s FSD at now?

At the L4/L5 stage, the car no longer needs a driver and the AI is in charge of everything. FSD today is an L2 system, one that does not even meet the L3 bar at which the system can take responsibility for driving under limited conditions. Clearly, the name Tesla gave FSD runs well ahead of the reality.

“Despite Tesla’s significant progress in self-driving, based on Consumer Reports’ extensive testing and experience, owners should not rely on Tesla’s driver assistance to increase safety or simplify driving,” Consumer Reports wrote in its review.

In fact, Consumer Reports is not being tough enough, because anyone with a little discernment already knows that FSD is not truly fully autonomous.

It is worth noting that Autopilot now comes standard with the car; it is the extra-cost FSD package that includes features such as automatic parking, automatic lane change, Smart Summon, traffic-light recognition, and Navigate on Autopilot (NOA).

Here, let’s focus on what looks like the coolest of these features: Smart Summon.

The basic idea of Smart Summon is that an owner can use Tesla’s smartphone app to call the car over to their location, or send it to a destination pinned on the in-app map. If the car is parked in a busy shopping-mall lot, this is genuinely convenient: the car can pick you up right at the mall entrance, which is even better on rainy days.

Consumer Reports also reviewed Smart Summon specifically: “We’ve tried Smart Summon in a number of situations, including in the parking lot of Consumer Reports’ headquarters in New York State. In general, the system is not reliable.”

Seeing this, you might ask: what exactly is unreliable about Smart Summon?

The problems are as follows:

·        Tesla’s cars sometimes fail to stop at a stop sign, and sometimes do not react to it at all;

·        Tesla’s cars sometimes drive against the direction of traffic;

·        Tesla’s cars sometimes swerve and head toward other parked cars;

·        Tesla’s cars sometimes have to be backed out manually because they get too close to other cars.

If you think Consumer Reports is nitpicking, remember that this is a roughly two-ton car essentially driving itself, and that a parking lot is usually a crowded place. As Consumer Reports concludes, “all of these situations can cause confusion for other drivers and potentially harm pedestrians on the road.”

Fortunately, cars in a parking lot do not move very fast, which leaves some reaction time for other people and vehicles. Tesla, of course, is also shrewd: its fine print states plainly that the person responsible during Smart Summon is the owner who “summons” the car through the app. “Owners must also watch the car and monitor its surroundings within line of sight, because it may not detect all obstacles.”

In addition, some self-driving professionals have other concerns:

In terms of capability, Smart Summon drives more like a student who is still learning. We have all seen that kind of novice: sitting in the passenger seat, their every move makes your palms sweat, and you just pray there are no reckless drivers nearby and that pedestrians give the car a wide berth. In short, it is a nerve-racking experience.

Clearly, FSD falls short in motion planning: the system should predict its next move before a turn, rather than reacting when it is already too late to follow through. Its clumsiness in cornering also makes one wonder whether the whole system has bottlenecks in its sensors and perception.

Given that Tesla and Musk have flatly rejected LiDAR, it seems their cameras and radar do not yet meet the demands of monitoring the surrounding environment. If so, that would be another temporary win for LiDAR advocates, and it could surface in future court cases over Tesla crashes. Of course, it is also possible that Tesla’s onboard chips simply are not fast enough to finish accurate calculations before the turn.

To sum up.

Kelly Funkhouser, head of connected and automated vehicle testing at Consumer Reports, put it this way: “It seems that Tesla is focused on being the car manufacturer with the most features, not on making sure they all work.”

Some of these comments are deeply worrying. Tesla is, in effect, pulling all of us into a dangerous road test, rolling out experimental and unproven features that Musk has pieced together. Fans worship his futurism as if he were a god, while too few people sense the danger or sound a warning.

Where exactly is FSD headed? Time will tell.