For augmented reality systems to identify locations from the user's ground-level perspective, they must usually be "trained" first with ground images of those locations. Sturfee's City AR system works faster by skipping that step and using satellite photos instead.
Sturfee announced the technology Tuesday. City AR first builds a 3D digital grid model of the city from high-resolution 2D satellite images, capturing ground features such as the geometry of buildings, trees and roads. Each location in the model is then assigned a "visual fingerprint," producing a machine-readable "fingerprint map."
When a City AR-enabled smartphone app then photographs one of these locations from the ground, the system's cloud-based computer vision algorithm matches the app's visual information against the "fingerprints" on the map. From that match, it can determine where users are in the city and which buildings their phones are "looking" at, allowing text or graphics to be overlaid on the screen in the right place.
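The matching step described above can be pictured as a nearest-neighbor lookup: the phone's view is reduced to a descriptor vector and compared against the precomputed fingerprint map. The sketch below is purely illustrative; Sturfee's actual descriptors, map format, and matching algorithm are proprietary and not described in the article, so the vectors, keys, and `match_fingerprint` function here are all invented for demonstration.

```python
import math

def match_fingerprint(query, fingerprint_map):
    """Return the map key whose fingerprint vector is closest (Euclidean)
    to the query descriptor. A toy stand-in for real visual matching."""
    best_key, best_dist = None, math.inf
    for key, fp in fingerprint_map.items():
        dist = math.sqrt(sum((q - f) ** 2 for q, f in zip(query, fp)))
        if dist < best_dist:
            best_key, best_dist = key, dist
    return best_key

# Hypothetical map: locations keyed by (lat, lon), each with a made-up
# 4-dimensional descriptor standing in for a "visual fingerprint".
fingerprint_map = {
    (37.7749, -122.4194): [0.9, 0.1, 0.3, 0.7],
    (37.7793, -122.4192): [0.2, 0.8, 0.5, 0.1],
}

# A descriptor computed from the phone camera's view (simulated here)
# resolves to the nearest stored location.
location = match_fingerprint([0.88, 0.12, 0.28, 0.69], fingerprint_map)
print(location)
```

In a real system the descriptors would come from a learned computer vision model and the lookup would use an approximate nearest-neighbor index rather than a linear scan, but the principle of resolving a camera view to a map position is the same.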
According to Anil Cheriyadat, chief executive of Sturfee, this is much more efficient than mapping cities from the ground using camera-equipped vehicles or similar methods.
"These are operationally intensive methods, and the cost of scaling them is high," he says. "With our technology, we can create a machine-readable version of San Francisco in a week and detect and update any city changes faster."
To date, Sturfee has mapped 15 cities on three continents and has signed a multi-year licensing agreement with KDDI Corp., Japan's second-largest telecommunications provider.