Officially launched in July 2017, the Tesla Model 3 is no longer young. However, with hardware and software upgrades (the hardware moved from HW 2.5 to HW 3.0), the Model 3 remains a compelling product and will supposedly be easy to upgrade to full self-driving in the future.
At least, that’s what Tesla promised.
Compared to its big brothers, the Model S and Model X, the Model 3 is Tesla’s first truly “civilian” electric car. After the launch of HW 3.0 last year, Tesla promised that any owner who bought the FSD package would be able to get an HW 3.0 retrofit after purchase.
Of course, even if you pay the FSD money, you don’t get fully autonomous driving right now. After all, today’s FSD is essentially an enhanced version of Autopilot. Still confident in FSD, Musk plans to raise its price again in July.
As a result, Tesla fans’ enthusiasm for the Model 3 is far from exhausted. So what exactly does the new HW 3.0 bring to the Model 3? How will this well-rounded model evolve in the future? Today, we take a close look at its on-board sensors and ECUs.
In recent years, in-vehicle chips have become increasingly important, because their computing power is the mainstay of driver assistance, autonomous driving, and active safety.
To build up this capability, many OEMs and Tier 1 suppliers are arming vehicles with sensors such as cameras, radars, and LiDARs to help them perceive their surroundings. Of course, once all the information collected by the sensors comes together, the importance of the control unit becomes apparent.
Given Tesla’s “high-tech” brand image in the public eye, the hardware on the Model 3 has naturally become synonymous with cutting-edge technology.
However, System Plus CEO Romain Fraux pointed out that Tesla’s main goal in designing the Model 3 was to reduce the cost of the ADAS system in order to drive down the price of the model.
On the Model 3, Tesla comes standard with eight cameras, one radar and 12 ultrasonic sensors. As for LiDAR, Musk chose to ignore it.
Sensors for enabling ADAS
The Model 3’s sensor array consists of eight cameras that provide a 360-degree view around the vehicle with a detection radius of up to 250 meters. Twelve ultrasonic sensors complement the vision system. Together, the two sets of sensors not only ensure long-range detection capability but also double the accuracy of the previous system. In addition, the Model 3’s sensor suite incorporates a front-facing radar with enhanced processing. It provides additional environmental data for the vehicle and acts as a safety redundancy in rain, fog, and dust; it can even “see through” the car ahead, giving the Model 3 far-sighted vision.
In the camera configuration, the front is equipped with three forward-facing cameras that complement the radar. The three cameras have different technical characteristics: the main camera has a detection distance of 250 meters but a narrow field of view, while the other two reach 150 and 60 meters, respectively, with much wider fields of view. The other five cameras monitor the sides and rear of the vehicle, with a detection distance of up to 100 meters.
As for the 12 ultrasonic sensors, their operating radius is only 8 meters. However, they work at any speed and are useful for covering the vehicle’s blind spots. The data collected by these sensors is also used by Autopilot for lane changes. Finally, to determine the vehicle’s location, Tesla uses GPS.
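The detection ranges quoted above can be collected into a small sketch. Note that the sensor names here are illustrative labels, not Tesla’s; only the range figures come from the article.

```python
# Quoted detection ranges for the Model 3 sensor suite (names are illustrative).
SENSOR_RANGES_M = {
    "main_forward_camera": 250,   # narrow field of view
    "mid_forward_camera": 150,
    "wide_forward_camera": 60,
    "side_rear_cameras": 100,     # five cameras covering the sides and rear
    "ultrasonic_sensors": 8,      # twelve short-range sensors
}

def sensors_covering(distance_m: float) -> list[str]:
    """Return the sensors whose quoted range reaches a given distance."""
    return [name for name, rng in SENSOR_RANGES_M.items() if rng >= distance_m]

# Beyond 100 m, only the two longest-range forward cameras still see the target.
print(sensors_covering(120))
```

At close range everything overlaps, which is the redundancy the article describes; at long range, coverage narrows to the main forward camera (and the radar).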
To handle the forward field of view, Tesla developed a three-camera module built around three image sensors. The Model 3 also features two forward-facing side cameras, two rear-facing side cameras, and one rear-view camera.
In terms of pixels, each of the eight cameras is 1.2 megapixels, using ON Semiconductor technology dating from 2015. “They’re all cheap; they’re not technically new, and the resolution is nothing special,” Fraux said.
Buying eight cameras at a time from a single vendor also means “Tesla wants to get the best deal,” Fraux explains.
What’s the mystery of the three-camera system?
Tesla’s three-camera system uses ON Semiconductor’s 1.2-megapixel (1280×960) AR0136A CMOS sensor, with a single pixel size of 3.75 μm.
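As a quick sanity check, the “1.2-megapixel” figure follows directly from the quoted 1280×960 resolution:

```python
# Pixel count of the AR0136A at its quoted 1280 x 960 resolution.
width_px, height_px = 1280, 960
total_px = width_px * height_px
print(total_px)  # 1228800, i.e. roughly 1.2 MP
```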
For comparison, System Plus examined ZF’s S-Cam4 three-camera system, which features OmniVision’s CMOS sensors and Mobileye’s EyeQ4 vision processor.
As shown in the image above, Tesla’s PCB packaging technology differs from BMW’s: the German giants prefer to place different sensors on different PCBs. Tesla’s forward-looking three-camera module places all the CMOS sensors on a single PCB and carries no SoC for processing, whereas ZF’s S-Cam4 is backed by Mobileye’s vision processor.
ON Semiconductor’s mature sensors, combined with moving the processing off the module, let this low-cost camera module hit its cost target: ZF’s three-camera system is understood to cost about $165, while the Model 3’s costs about $65.
Tesla played it safe on the radar module as well, opting for Continental’s ARS4-B, which uses a 77 GHz radar chipset with NXP’s 32-bit MCU. Fraux points out that while MediaTek and Texas Instruments are eyeing the in-vehicle chip market, NXP and Infineon remain the undisputed leaders. Continental has a huge advantage in radar modules: the ARS4-B has landed on at least 15 models, including the Audi Q3, Volkswagen Togo, Nissan Qijun and other heavyweight products.
Continental also has a radar system called the ARS4-A, which is primarily used for collision warning, emergency brake assistance, collision mitigation, and adaptive cruise control. The strength of this product is its long-range real-time measurement capability, with accuracy of up to ±0.2 m at a range of 250 meters. Short-range measurement reaches up to 70 meters and provides direct access to the relative velocity and angular data between two objects.
Tesla Computing Unit
Tesla has developed a “liquid-cooled dual-core computing platform” that houses both the Autopilot computer and the infotainment computer. In short, they sit on two circuit boards but are integrated into a single module.
Specifically, the new computing platform integrates the ECU responsible for central infotainment with the Autopilot ECU, whereas in the HW 2.5 era, Autopilot used Nvidia’s SoC and GPU.
Even for a genius like Musk, doing everything in-house is not realistic: beyond the self-developed FSD chip, the platform still uses Nvidia GPUs, Intel processors, NXP and Infineon microcontrollers, Micron and Samsung flash memory, and ST audio amplifiers.
Evolution of Autopilot ECU
System Plus notes that Tesla’s evolution in computing power is reflected in the Autopilot ECU. In the HW 2.5 era, Tesla integrated two Nvidia Parker SoCs, an Nvidia Pascal GPU, and an Infineon TriCore CPU. By HW 3.0, Tesla had two self-developed SoCs, two GPUs, two neural network processors, and a lockstep CPU.
Tesla’s solution is much more affordable than the ZF system on the Audi A8, and it still offers safety redundancy.
HW2.5 vs. HW3.0
Fraux says that within the same volume, Tesla stuffed more parts into HW 3.0: 4,746 in total, 65 more than HW 2.5.
In terms of process, Tesla’s self-developed SoC is built on 14nm, a step ahead of the Nvidia 16nm parts in HW 2.5. Fraux points out that this is also the first time a car has used a 14nm FinFET chip.
Is a self-developed SoC really worth it?
In the automotive industry, it is rare to design ASICs (application-specific chips) in-house, because it is too risky. “Unless you have a talented hardware design team in-house,” explains Fraux. Given that the car market is not as hot as it used to be, designing chips out of your own pocket is even more costly.
But nothing is absolute: many manufacturers want to replicate Tesla’s self-developed chip approach and build a powerful core for their own Autopilot-style products.
But is it really worth spending a lot of money on self-developed chips?
“If you want a good profit report and there is enough demand for mass production, it’s worth it,” Fraux explains. Over the past few years, vehicles have integrated more and more electronic components, and leading players such as Nvidia and Intel are reluctant to play the “small margins, high volume” game. If OEMs don’t want to hand over their profits for the next five years, developing their own chips may be the best way to take fate into their own hands.
System Plus estimates that Tesla’s HW 2.5 cost $280, while HW 3.0, with the self-designed FSD chip, costs $190. “Assuming the self-developed chip cost $150 million to design and 400,000 vehicles are produced a year, the investment would pay back in about four years,” Fraux said. Clearly, for an auto giant, a self-developed chip is a good choice.
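Fraux’s payback estimate can be checked with back-of-the-envelope arithmetic, using only the figures quoted above (HW 2.5 at $280, HW 3.0 at $190, $150M development cost, 400,000 vehicles per year):

```python
# Payback period for the self-developed FSD chip, from the article's figures.
hw25_cost_usd = 280
hw30_cost_usd = 190
dev_cost_usd = 150_000_000
units_per_year = 400_000

savings_per_unit = hw25_cost_usd - hw30_cost_usd         # $90 per vehicle
annual_savings = savings_per_unit * units_per_year       # $36M per year
payback_years = dev_cost_usd / annual_savings

print(round(payback_years, 1))  # 4.2 -- consistent with Fraux's "four years"
```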