Why would Apple add a lidar to the iPad?

Maybe this time the iPad is about more than the productivity you imagine before buying it. There was no press conference and no Jony Ive delivering his magnetic British narration, so the just-released iPad Pro arrived in a decidedly low-key fashion. But coming nearly two years after the previous generation launched, the upgrade has been a bright spot and has attracted plenty of attention.


With the processor and keyboard upgrades, Apple clearly sees the iPad Pro's future as a productivity tool extending further. Beyond that, the most eye-catching change on the new iPad Pro is the square camera module on its back. Although the possibility of Apple adding an extra camera to the iPad Pro was discussed as early as last year, the seemingly familiar camera array hides a new member: a lidar scanner.

An iPad Pro that does more than scan QR codes

In the decade since the first iPad's release, the camera has never been its most important feature. Compared with the audio-visual gains from better screens and speakers, users have never demanded much of the iPad's cameras; for many, "good enough to take a photo and scan a QR code" sums up its role. Comb through Apple's official records of iPad camera changes over the years and you can see this clearly.


But as Apple pushes the iPad Pro further in the direction of a productivity tool, the camera is getting more attention than before. This time Apple adopted a new dual-camera array: a 12-megapixel wide-angle lens plus a 10-megapixel ultra-wide-angle lens, with support for 2x optical zoom out and up to 5x digital zoom. With two cameras, you can finally shoot ultra-wide-angle and portrait photos on an iPad Pro.

Not only that: Apple also gave the new iPad Pro a lidar (LiDAR) scanner that no iPhone yet has. It grants the device a new spatial-positioning capability and, more importantly, far greater room for AR apps to grow beyond simple scanning.


"Even NASA will use it on its next Mars landing mission." That is how Apple officially introduces the lidar scanner. If you follow the automotive industry, lidar is probably no stranger to you. Like sonar (SONAR) and radar, it is a technology for sensing the surrounding environment, and it already has many real-world applications.

But unlike sound or radio waves, lidar works by emitting light that is almost invisible to the naked eye and measuring how long it takes to bounce back to the sensor, which lets a device position itself in space more quickly and accurately. The principle is similar to the ToF (time-of-flight) modules that phone makers such as OPPO and Huawei added earlier to improve photography, but Apple is laying far more groundwork for an AR ecosystem than those manufacturers did.
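The round-trip timing described above reduces to a one-line formula: distance is the speed of light multiplied by the measured time, halved. A minimal sketch in Python (the timing value is an illustrative made-up number, not an Apple spec):

```python
# Time-of-flight geometry behind a lidar scanner (illustrative sketch,
# not Apple's implementation): the sensor emits a near-invisible light
# pulse and measures how long it takes to bounce back. The one-way
# distance is half the round trip at the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a one-way distance in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# At the roughly 5 m range Apple quotes for the scanner, the round trip
# lasts only tens of nanoseconds, which is why lidar needs
# nanosecond-level timing hardware.
print(round(distance_from_round_trip(33.3e-9), 3))  # → 4.992 (metres)
```

The tiny time scales involved are the point: sound-based sonar can get away with millisecond timing, but light covers the same few metres in nanoseconds, which is what makes lidar both fast and demanding to build.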


In Apple's official introduction, for example, there is a medical AR app: open it, point it at a patient, and have them move as instructed, and with real-time AR monitoring it helps doctors assess the patient's recovery. The team behind it also developed the anatomy app Complete Anatomy, which presents the internal structure of the human body through AR, making it easier for medical researchers to understand human anatomy.

Another example Apple showed is the AR game Hot Lava: open it and your room turns into a level full of magma and rocks, and you control a character jumping through it to earn rewards and clear the stage. Clearly, lidar strengthens the iPad Pro's AR capabilities both as a professional tool and for games and entertainment, and in fact Apple has even more in mind.

The future of practical AR

Digital-product enthusiasts should remember that Apple unveiled its AR development framework ARKit at WWDC 2017, and it has demonstrated new scenarios for it every year since, not just games. The real-time motion capture shown at last year's event, for example, enables many interesting interactions, and using AR to help basketball players review their training results would have been unthinkable in the past.


But to turn a technology from a curiosity into a necessity, improving accuracy and refining the experience are the focus. That requires both powerful hardware and further software optimization. While single-camera phones such as the iPhone XR can also run AR apps, precise positioning in space still demands stronger hardware, and the accuracy of a lidar scanner is clearly the missing piece; this iPad Pro acts as the pioneer.

Apple's official introduction claims the lidar scanner can quickly calculate a person's height and automatically display useful vertical and edge guides, and the Measure app's new Ruler View allows more detailed measurements. A few words in a press release may not convey its power, but it is clear that Apple is preparing a major upgrade to the AR experience, and barring surprises we are likely to see more demos at this year's online WWDC.
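Apple does not document how the height measurement works, but given a per-pixel depth reading from the lidar scanner and the camera's intrinsic parameters, a height estimate follows from basic pinhole-camera geometry. A hedged Python sketch (all function names and numbers here are invented for illustration; this is not Apple's algorithm):

```python
import numpy as np

def unproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a measured depth into 3D camera
    coordinates using the standard pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def estimate_height(top_px, bottom_px, depth_m, intrinsics):
    """Estimate a person's height from the pixel positions of their head
    and feet, assuming both lie at roughly the same lidar-measured depth."""
    fx, fy, cx, cy = intrinsics
    top = unproject(*top_px, depth_m, fx, fy, cx, cy)
    bottom = unproject(*bottom_px, depth_m, fx, fy, cx, cy)
    return abs(top[1] - bottom[1])  # vertical span in metres

# Toy numbers: a 640x480 frame, focal length 500 px, person 2.5 m away
intr = (500.0, 500.0, 320.0, 240.0)
h = estimate_height((320, 60), (320, 410), 2.5, intr)
print(round(h, 2))  # → 1.75 (metres)
```

The key role of lidar in a pipeline like this is supplying an accurate `depth_m`: without a direct depth measurement, a single camera cannot tell a tall person far away from a short person up close.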


Take, for example, the kind of AR multiplayer game Google has shown at I/O, where people in different locations play a bowling game together through AR: on a lidar-equipped iPad Pro that will no longer be mere imagination. Or take more accurate map navigation: with a lidar scanner providing more precise real-time positioning in a variety of complex scenarios, navigation on the iPad Pro could make our lives easier and more interesting.

Looking back at the iPad Pro line, you will find many advanced technologies that debuted on this device, whether the A-series processors that outperform some laptops, the 120Hz high-refresh-rate display, or Face ID that works in any orientation. Many technologies users noticed for the first time appeared on the iPad Pro, raising expectations about when they would reach the iPhone.

According to AppleInsider, iOS 14 code also contains references to a ToF app, which may suggest that the iPhone 12 series arriving this fall will take a bigger step forward on the camera. After three years of paving the way for AR technology, and with this iPad Pro upgrade, Apple may be about to show us an exciting AR future.