Google’s AI To Stop Labeling People In Photos As Man Or Woman

This is because you “can’t deduce someone’s gender by their appearance alone”

According to a Business Insider report, Google’s Cloud Vision API service will no longer label photos of people with ‘man’ or ‘woman’.

Cloud Vision API is an AI-powered tool that helps developers identify components in an image. Google sent out an email to Cloud Vision API customers stating that the tool will no longer attach gender labels to pictures.

Google explained in the email that it decided to discontinue gender labels because “you can’t deduce someone’s gender by their appearance alone” and because such labels can reinforce unethical uses of AI. Google also noted that an individual in a photo will now be tagged only as ‘person’.
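The effect of the change can be sketched as a simple post-processing step. This is a hypothetical illustration only, not Google’s actual implementation (the real change happens inside the Cloud Vision service itself): labels that previously came back as ‘man’ or ‘woman’ are now the neutral ‘person’.

```python
# Hypothetical sketch of the labeling change described above.
# Not Google's code: Cloud Vision applies this server-side.
GENDERED_LABELS = {"man", "woman"}

def neutralize_labels(labels):
    """Replace gendered person labels with 'person'; keep other labels as-is."""
    return ["person" if label.lower() in GENDERED_LABELS else label
            for label in labels]

# An image that once yielded a gendered label now yields 'person'.
print(neutralize_labels(["Woman", "Smile", "Outdoor"]))
```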

AI bias expert Frederike Kaltheuner, speaking to Business Insider, called the change “very positive”, stating: “Classifying people as male or female assumes that gender is binary. Anyone who doesn’t fit it will automatically be misclassified and misgendered. So this is about more than just bias — a person’s gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people.”

Google also noted in the email that they plan to continue evolving their AI to ensure that people are not discriminated against based on gender, race, ethnicity, income and religious belief.
