Google AI tool no longer generates “male” or “female” tags for images

Google says an artificial intelligence tool it provides to developers will no longer label images by gender, because a person’s gender cannot be determined from their appearance in a photo alone.

Today, Google emailed developers to say it would make adjustments to the widely used Cloud Vision API, a tool that uses AI to analyze images and identify faces, landmarks, explicit content, and more.

Going forward, the API will tag such images with “person” rather than “man” or “woman,” to avoid having its AI algorithms convey gender bias.
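For context, here is a minimal sketch of how a developer might request labels from the Cloud Vision API using the official Python client library (google-cloud-vision). The file name `photo.jpg` is a placeholder; under the change described above, a photo of a person would be expected to come back tagged as “Person” rather than with a gendered label.

```python
# Minimal sketch: ask the Cloud Vision API for labels on a local image.
# Assumes the google-cloud-vision package is installed and credentials
# are configured; "photo.jpg" is a placeholder path.
from google.cloud import vision


def detect_labels(path: str) -> None:
    """Send a local image to the Cloud Vision API and print its labels."""
    client = vision.ImageAnnotatorClient()

    with open(path, "rb") as image_file:
        content = image_file.read()

    image = vision.Image(content=content)
    response = client.label_detection(image=image)

    # Each label carries a description (e.g. "Person") and a confidence score.
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")


detect_labels("photo.jpg")
```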

In the email to developers, Google cited its own AI guidelines, Business Insider reported: “Given that it is not possible to infer a person’s gender by appearance, we have decided to remove these labels to align with Google’s AI Principles, in particular Principle 2: avoid creating or reinforcing unfair bias.”

Artificial intelligence image recognition has been a thorn in Google’s side before. In 2015, a software engineer pointed out that Google Photos’ image recognition algorithm had classified his Black friends as “gorillas.” Google promised to fix the problem, but a 2018 follow-up report by Wired found that Google had simply blocked its AI from identifying gorillas at all, without properly addressing the underlying issue.