Google announced Wednesday that it is upgrading image search results on U.S. phones to pull more information from its Knowledge Graph, media reported. Drawing on the billions of facts in Google's Knowledge Graph database, information about the people, places, or things associated with an image is now displayed beneath it, allowing users to learn more about the image and its context.
For example, suppose a user is searching for a beautiful state park nearby and wants to visit it. If the user wants to swim during the trip, they might click on a picture of a park with a river. Below the photo, the user may see related topics, such as the name of the river or the city the park is in. Clicking on a particular topic expands it to show a short description of the person, place, or thing it refers to, along with a link for more information and other related topics to explore.
To generate links to relevant Knowledge Graph entities, Google uses deep learning to combine its understanding of images, evaluating both visual and text signals from an image and combining them with Google's understanding of the text on the image's page. This information helps identify the people, places, or things most likely associated with a particular image. Google matches this information to existing topics in the Knowledge Graph and surfaces them in Google Images only when it is confident it has found a match.
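The article does not describe Google's actual system, but the approach it outlines, blending visual and text signals and surfacing a topic only above a confidence threshold, can be sketched in miniature. All names, scores, and weights below are illustrative assumptions, not Google's implementation.

```python
# Hypothetical sketch of confidence-gated entity matching.
# Scores, weights, and entity names are invented for illustration;
# this is not Google's actual pipeline.
from dataclasses import dataclass

@dataclass
class Candidate:
    entity: str          # a Knowledge Graph topic name (assumed)
    visual_score: float  # similarity signal from an image model, 0-1 (assumed)
    text_score: float    # relevance signal from the page's text, 0-1 (assumed)

def best_entity(candidates, threshold=0.8, visual_weight=0.6):
    """Blend visual and text signals; return a topic only when confident."""
    best = None
    for c in candidates:
        combined = (visual_weight * c.visual_score
                    + (1 - visual_weight) * c.text_score)
        # Surface nothing unless the combined signal clears the threshold.
        if combined >= threshold and (best is None or combined > best[1]):
            best = (c.entity, combined)
    return best  # None means no topic is shown under the image

matches = [
    Candidate("Merced River", visual_score=0.92, text_score=0.85),
    Candidate("Yosemite National Park", visual_score=0.70, text_score=0.60),
]
print(best_entity(matches))
```

The key design point mirrored here is the confidence gate: rather than always labeling an image with its top-scoring topic, the system returns nothing when no candidate is strong enough, trading coverage for precision.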
For U.S. users searching on their phones, the feature will begin appearing on images of people, places, and things in Google Images, and will expand to more images, languages, and surfaces over time.