Back in December, Google showed how a single camera can be used to create depth maps for augmented reality (AR). Today, the ARCore Depth API is finally available on Android, and several third-party apps already use it. Other vendors are also adding AR depth capabilities to their products, but mostly through extra hardware, such as ToF (time-of-flight) modules.
Depth perception is usually achieved with dual cameras; a single camera alone does not capture enough information to determine how far an object is from the camera.
Google, on the other hand, uses depth-from-motion algorithms to create depth maps from a single camera, with good precision. This ensures that virtual objects are properly occluded rather than floating in space or appearing in physically impossible places.
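The occlusion idea above comes down to a per-pixel depth comparison: a virtual object is drawn only where it is closer to the camera than the real scene. Here is a minimal illustrative sketch of that test in Python with NumPy; the helper name and the toy data are hypothetical, not part of the actual ARCore renderer.

```python
import numpy as np

def composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel occlusion test (hypothetical helper, not ARCore's code):
    keep the camera image, and overwrite a pixel with the virtual object
    only where the virtual depth is smaller (closer) than the real depth."""
    visible = virtual_depth < real_depth
    out = camera_rgb.copy()
    out[visible] = virtual_rgb[visible]
    return out

# Toy 2x2 scene: the real world is 1.0 m away in the left column and
# 3.0 m away in the right column; the virtual object sits at 2.0 m.
real_depth = np.array([[1.0, 3.0],
                       [1.0, 3.0]])
virtual_depth = np.full((2, 2), 2.0)
camera_rgb = np.zeros((2, 2, 3), dtype=np.uint8)       # black background
virtual_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)  # white virtual object

result = composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth)
# Left column: real surface at 1.0 m hides the 2.0 m object (stays black).
# Right column: object at 2.0 m is in front of the 3.0 m wall (turns white).
```

In a real AR pipeline this comparison runs in the renderer's depth buffer, but the principle is the same: without a depth map of the real scene, there is nothing to compare against, which is why virtual objects on depth-less devices always float in front of everything.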
It is reported that with the launch of ARCore 1.18 (Google Play Services for AR), the Depth API is available on “hundreds of millions of compatible Android devices.” Google will first demonstrate the feature with AR animals in Search, and more partners will showcase the technology afterward.
For now, Snapchat and Samsung, as development partners, are already taking advantage of the Depth API. Snapchat has updated several filters to use it, bringing AR occlusion effects to single-camera phones.
Samsung will use the Depth API in its Quick Measure app on the Galaxy Note 10+ and S20 Ultra. Those devices already have a ToF sensor, but combining it with the new algorithm further improves quality, reduces scan time, and speeds up measurements. Samsung will update the app in the coming months.
Google's Creative Lab also showed off new demos, such as a dominoes game built with the Depth API. In Google's ARCore Depth Lab app, other uses of the Depth API are on display, including realistic physics, surface interactions, and environment traversal.