Intel Mobileye Shares Unedited Video of Self-Driving Car Road Test

Mobileye, Intel’s self-driving car division, has released a new video showing one of its autonomous vehicles driving on public roads, according to media reports. The unedited footage runs for 40 minutes and shows the car navigating a challenging urban environment.

Intel acquired Mobileye in early 2017 in a deal valued at more than $15 billion.

Of course, self-driving car demos are nothing new at this point, and most people have grown fairly numb to them. What makes Mobileye’s demo interesting and rather unique, though, is that the entire 40-minute drive uses only half of the autonomous system the company has developed. In this case, that half is a camera-only vision subsystem with eight long-range cameras and four parking cameras, and no radar or lidar sensors at all.

That is ironic, given Tesla’s decision back in 2016 to stop using Mobileye technology and switch to its own system when developing Autopilot. Autopilot is a camera-based self-driving technology of which Elon Musk has been a strong advocate. The Tesla CEO has been dismissive of lidar, calling it unnecessary and too expensive, and has focused instead on vision-based technology.

The difference between the two approaches, however, is that, unlike Tesla, Mobileye says camera-based autonomous driving is necessary but not sufficient. The Intel-owned company argues that a vision-based system is not enough on its own: even if it is safer than a human driver, the margin of improvement is still too small.

The company emphasizes not only the technology itself but also how well automation aligns with the public’s perception of it. “From a social point of view, it would be a huge achievement if all cars on the road were more than 10 times better than (human drivers),” said Amnon Shashua, President and CEO of Mobileye.

The reality, says Shashua, is that self-driving cars need to be at least 1,000 times better than human drivers to reach an acceptable level, which means only one accident every 500 million miles, against an implied human baseline of roughly one accident every 500,000 miles.

“To achieve that mean time between failures (MTBF) for a perception system, redundancy needs to be introduced, specifically system redundancy rather than sensor redundancy within a single system,” he explains. “The probability of everything failing at once is the product of the probability of each device failing on its own.”
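To make the arithmetic concrete, here is a minimal Python sketch of both claims; the hourly failure rate is an assumed figure chosen for illustration, not a Mobileye number.

```python
# Illustrative arithmetic only -- the failure rate below is an assumed
# figure for demonstration, not a Mobileye specification.

HUMAN_MILES_PER_ACCIDENT = 500_000  # implied baseline: 500M miles / 1,000
TARGET_FACTOR = 1_000               # Shashua's "1,000 times better"

target_miles = HUMAN_MILES_PER_ACCIDENT * TARGET_FACTOR
print(f"Target: one accident per {target_miles:,} miles")  # 500,000,000

# Independent subsystems multiply their failure probabilities.
# Suppose each subsystem suffers a perception failure once per
# 10,000 hours of operation (assumed):
p_fail_per_hour = 1 / 10_000

# If the two subsystems fail independently, the chance of both
# failing in the same hour is the product of the individual chances:
p_joint = p_fail_per_hour ** 2
print(f"Joint failure probability per hour: {p_joint:.0e}")  # 1e-08
print(f"Effective MTBF: {1 / p_joint:,.0f} hours")           # 100,000,000
```

The squaring is what lets two modestly reliable subsystems combine into a very reliable whole, and it holds only if their failure modes really are independent.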

Sensor fusion has become the approach of choice for many self-driving car projects: data from cameras, radar, lidar, and other sensors is merged into a single representation and fed to the driving algorithms. In such designs, redundancy usually means a second set of computing hardware, and sometimes even a second set of sensors. In theory, if one computer fails, the second steps in. It also means that a maneuver can be evaluated by both systems and executed only with the consent of each computer.
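As a rough illustration of that two-computer consent gate, here is a hypothetical Python sketch; the Perception class and function names are invented for this example and do not come from any actual self-driving stack.

```python
from dataclasses import dataclass

# Hypothetical dual-channel release gate, loosely modeling the
# "consent of each computer" idea described above. All names here
# are invented for illustration.

@dataclass
class Perception:
    obstacle_ahead: bool
    confidence: float

def channel_consents(p: Perception, threshold: float = 0.9) -> bool:
    """A channel consents only if it is confident the path is clear."""
    return p.confidence >= threshold and not p.obstacle_ahead

def release_maneuver(primary: Perception, secondary: Perception) -> bool:
    # The maneuver is executed only when BOTH independent computers
    # agree; any disagreement falls back to the conservative choice.
    return channel_consents(primary) and channel_consents(secondary)

# Example: the primary computer sees a clear road, the backup does not,
# so the maneuver is withheld.
print(release_maneuver(Perception(False, 0.97), Perception(True, 0.95)))  # False
```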

However, Mobileye’s approach deliberately avoids sensor fusion. Instead, the company runs two separate, redundant subsystems: one using camera vision alone and one using radar and lidar. “It’s like two smartphones,” says Shashua. “The likelihood of both systems experiencing a perception failure at the same time decreases significantly.”

Mobileye sees its approach as more complex but safer than the sensor-fusion designs of its competitors. The reality, of course, is that there are still no commercial self-driving cars on the market, and while several projects promise to change that, it is not entirely clear when that will happen.