Translations are hard work: some details, nuances, and philological quirks that only a real language expert would notice can get lost along the way. And even though computer-based translation tools like Google Translate have been a lifesaver, they’re far from perfect. Some people on Twitter, like Dora Vargha, have been calling out Google Translate for being ‘sexist’ because it assigns genders to professions and activities when translating from gender-neutral languages. Scroll down to have a look and let us know what you think, dear Pandas.

But before we dive in, a small heads-up. There are some misconceptions about how the Google Neural Machine Translation system works: it uses an artificial neural network capable of deep learning, which goes through millions of existing translations and ‘learns’ from them how to translate. In short, the ‘bias’ wasn’t programmed in; the system picks it up from the existing translations it learns from. What’s more, Google recently announced its Rewriter program, which uses post-editing to address gender bias.
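
To make the idea of ‘learned’ bias a bit more concrete, here is a deliberately simplified sketch in Python. It is not Google’s actual code, and the counts are invented purely for illustration; it just shows how a purely data-driven system can end up choosing gendered pronouns by picking whichever pronoun appears most often alongside a given profession in its example translations.

```python
from collections import Counter

# Toy 'training data': invented example translations a data-driven system
# might learn from. The counts are made up purely for illustration.
example_translations = {
    "doctor": ["he"] * 80 + ["she"] * 20,
    "nurse": ["she"] * 85 + ["he"] * 15,
}

def most_likely_pronoun(profession: str) -> str:
    """Pick whichever pronoun appears most often with this profession."""
    counts = Counter(example_translations.get(profession, []))
    return counts.most_common(1)[0][0] if counts else "they"

# Nothing here was 'programmed' to be sexist; the skew comes from the data.
print(most_likely_pronoun("doctor"))  # -> he
print(most_likely_pronoun("nurse"))   # -> she
```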

Artist, musician, and writer Bani Haykal is one of the people who drew attention to the fact that Google Translate is biased when dealing with gender-neutral languages, making a video which he shared on Twitter. He told Bored Panda that one of the main reasons he made the video was to highlight the underlying problem of gender stereotypes. “We cannot look at the technology without first looking at what / who made them. As much as I want to say this is a technological or algorithmic problem, which it is, it first has to be examined with consideration of its ecology. How did it become like this?” Bani mused. “With such stereotypes, biases, harm that is very much prevalent in oppressive, patriarchal culture, politics and economics, I feel that we are perhaps missing the point to only look at the tools and consider how to eliminate bias.”

Some Twitter users have been pointing out that Google Translate may be ‘sexist’ in the way that it deals with gender-neutral languages

Image credits: DoraVargha

Someone else posted another example of this

Image credits: fdbckfdfwd

Bani explained that we ought to first consider the role of those who shape these technologies. In Bani’s opinion, moving forward and eliminating biases is only possible by taking a closer look at hiring policies and at how inclusive, diverse, and progressive the people building these tools are. “It is not enough to just say ‘we need to fix this feature.’ No, we need to fix a core component of the system that perpetuates this brokenness, the people who are part of the mechanism which develop these tools.”

Bani believes that it isn’t fair to compare real-life translators to Google Translate because they’re vastly different. “It’s comparing apples and oranges. A person who is effectively bilingual or a polyglot can translate complex expressions because of their deep understanding of a particular culture and context, so I don’t think it is fair to compare the two or suggest which is better, they are not the same!”

He mentioned that it’s important that we don’t conflate automated tools with skilled professionals. “Offsetting and attributing these skills to tools flattens our interactions to something purely technical, and in my opinion, it shouldn’t be done as it is dehumanizing. It completely reduces a person with a skill to a mere tool. We should be resisting this!”

Others have been experimenting with Google Translate to check whether this is true

Image credits: ZombieTron

Plenty of languages appear to have the same translation issues that the Twitter users pointed out

Image credits: kaiju_kirju

Image credits: lauraolin

Image credits: sofimi

Image credits: ZombieTron

Image credits: SaimDI

Image credits: ZombieTron

Some internet users believe that Google isn’t as neutral as it should be

Image credits: relevanne

Image credits: uffishthot

What’s more, Google Translate looks at the broader context to work out which translation is the most relevant. The algorithm works like a mirror: it shows us what it judges to be the most accurate rendering of what we asked for, based on the data it has seen. In other words, it’s a reflection and a compilation of how people around the world have translated things.

While Google Translate could use the gender-neutral ‘they’ instead of ‘he’ or ‘she,’ this would create a lot of potential confusion just to keep the output polite. Though ‘they’ can be used as a singular, it is also the plural pronoun and carries very different connotations. On a practical level this would be a headache, because many languages have grammatical cases and number agreement, so reading ‘they’ as a plural can change the form of the entire sentence.

Meanwhile, using the singular ‘it’ wouldn’t work either. It might be gender-neutral but it’s also dehumanizing and might even cause more offense. Let’s not forget that even though it’s a program doing the translations, language is a passionate, innate part of the human experience: making mistakes is part of the learning process and we should embrace that. If Google Translate were a student, we ought to be encouraging it to do better and to learn from its mistakes instead of trying to get it canceled for being ‘sexist.’

The issue that some people have with Google Translate could potentially be solved by providing all possible gendered versions of a translated sentence so that the user could pick the one that’s right for them (as sketched below). Alternatively, we could walk down the path of randomization and have the system assign genders at random rather than drawing on its millions of real-world examples. It’s either that or changing how translators all over the world work, so that Google’s algorithm picks up these new examples in its learning process.
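
As a rough sketch of the first idea, here is what an interface that surfaces every gendered variant might look like. The translate_all_genders helper and its hard-coded output are hypothetical and not part of any real Google API; they only illustrate the user-facing concept.

```python
from typing import Dict

def translate_all_genders(sentence: str, source: str, target: str) -> Dict[str, str]:
    """Hypothetical helper: return every gendered reading of an ambiguous sentence.

    A real system would run the translation model once per reading; here the
    results are hard-coded purely to illustrate the idea.
    """
    # Hungarian 'ő' is gender-neutral, so both English readings are valid.
    canned = {
        ("ő egy orvos", "hu", "en"): {
            "feminine": "she is a doctor",
            "masculine": "he is a doctor",
        }
    }
    return canned.get((sentence, source, target), {})

# The user sees all variants and picks the one that fits their context.
for gender, text in translate_all_genders("ő egy orvos", "hu", "en").items():
    print(f"{gender}: {text}")
```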

While some Twitter users were adamant about stamping out sexism wherever they found it, a few others asked whether ranting about Google Translate was the wisest approach, as opposed to acting on more pressing gender-based discrimination issues.

However, others pointed out that the situation is much more nuanced

Image credits: la_lea_la

There were people on both sides of the fence in this debate

Image credits: SarionBowers

Image credits: DoraVargha

Image credits: grumpwitch

Image credits: feature_envy

Image credits: FunPoliceCo

Image credits: ray_sniderII

Image credits: jcacperalta

Image credits: EG198

Is Google Translate’s ‘sexism’ something that affects you, dear Pandas? What’s your opinion about some Twitter users calling out the system for assigning genders the way that real-life translators do? How do you think Google Translate could be improved? Should it be changed at all? Share your thoughts and feelings in the comment section below.