Google recently introduced a Gmail feature that uses the company’s artificial intelligence technology to complete sentences, but it has already modified the tool to keep from offending users.
If users are writing about an object, there should be no problem. But if the subject of a sentence is a human being, they will be out of luck: the AI feature will no longer suggest “he” or “she” as the subject, according to Reuters.
Product leaders at Google told the news outlet that the company’s autofill feature, “Smart Compose,” will no longer suggest gender-based pronouns because of the risk that the tool might offend users by employing the wrong descriptors.
Banning entirely value-neutral words that describe objective reality definitely isn't biased at all. https://t.co/GWOcE6FWfY
— Sean Davis (@seanmdav) November 27, 2018
Paul Lambert, a product manager for Gmail, noticed the potential for a problem in January, when he typed, “I am meeting an investor next week.” Smart Compose then suggested this follow-up question: “Do you want to meet him?”
“Not all ‘screw ups,’” Lambert said, “are equal,” noting mistakes about gender and sexual orientation are “a big, big thing.”
News of Google’s AI alteration comes the same week Twitter made headlines for banning “misgendering” and “deadnaming” on the social media network. “Misgendering” refers to using pronouns that don’t correspond with a transgender person’s chosen gender identity, and “deadnaming” is the practice of using a transgender person’s pre-transition name.
Twitter users who violate either rule are subject to temporary or permanent suspension.