
As Google’s AI drops gender labels like ‘man’, ‘woman’, Microsoft, IBM under pressure

Research showed that the AI tools used by Microsoft and IBM were more likely to misidentify the gender of dark-skinned women

Chennai: Gender-rights campaigns have pushed technology companies developing artificial intelligence, such as Google, to drop gender labels in images of people and instead use the term ‘person’. Following Google’s announcement of the change last week, Microsoft, IBM and Amazon are now under pressure to tweak how their AI labels photos of people.

Acknowledging that it is not possible to identify a person’s gender from appearance alone, Google said its image-classification service, Cloud Vision API, would stop categorising people under gender labels.
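As a rough illustration only, not drawn from Google’s announcement: with the google-cloud-vision Python client (assuming a recent version of the library and credentials configured via GOOGLE_APPLICATION_CREDENTIALS), a label-detection request like the sketch below would now return neutral descriptions such as ‘Person’ where gendered labels once appeared. The function name and file path are placeholders.

    from google.cloud import vision

    def detect_labels(path):
        """Return the label descriptions Cloud Vision assigns to an image file."""
        client = vision.ImageAnnotatorClient()
        with open(path, "rb") as f:
            image = vision.Image(content=f.read())
        response = client.label_detection(image=image)
        # Each annotation carries a description (e.g. "Person") and a confidence score.
        return [label.description for label in response.label_annotations]

    print(detect_labels("photo.jpg"))  # "photo.jpg" is a placeholder path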

“Google's move sends a message that design choices can be changed. With technology it is easy to think some things cannot be changed or are inevitable. This isn't necessarily true,” AI expert Joy Buolamwini, a computer scientist at MIT, was quoted as saying by Business Insider.

The report said Buolamwini had a direct influence on Google’s decision.

"I would encourage all companies including the ones we've audited to reexamine the identity labels they are using as demographic markers,” Buolamwini said referring to Amazon's Rekognition, IBM's Watson, and Microsoft's Azure facial recognition, among others that she examined in her research published in 2018.

The findings showed that the AI tools used by Microsoft and IBM were more likely to misidentify the gender of dark-skinned women. This also brought into focus labelling by race, class and disability.

Another argument is that, at a time when people can choose their own gender identity, AI must not box them into pre-set classifications. “All classification tags on humans should be opt-in, consensual, and revokable,” Sasha Costanza-Chock, an associate professor at MIT, was quoted as saying by Business Insider.

Giving an example, she said transgender Uber drivers were being locked out of the ride-hailing app because their physical appearance no longer matched photos on record.
