Machine Translation Is Gender-Biased Translation
There is plenty of information available on the Internet to a global audience, but of course, with a global population, not every piece of text is going to be understandable to every reader. A Japanese reader is likely to struggle with Italian, just as an English speaker may be helpless when faced with Chinese characters.
Fortunately, there are now “quick fixes” available through different web services that allow people to get a rough, instant translation of a particular webpage or piece of content they are looking at. However, there is a problem with these translation functions, one that even a giant like Google has acknowledged and is struggling with: the gender bias baked into these automatic machine translations.
Learning by Example
Gender bias in machine translation came under much closer scrutiny when it was discovered that Google Translate was sacrificing accuracy for convenience. The example that set things off was the Turkish phrase “o bir doktor,” which Google Translate rendered in English as “He is a doctor.” The original Turkish phrase is gender-neutral (the pronoun “o” in no way implies “he”), but Google Translate automatically served English speakers the masculine version.
Of course, this isn’t because Google Translate is sexist and assumes all doctors must be male. The reason this occurred is that the translation algorithms (i.e., the mathematical formulae that govern the software’s decision-making process) analyze common examples. They then lay out ground rules based on those common examples.
In this case, because the majority of work, especially older work from the 20th century and before, tends to have a male gender bias, the algorithms decide that this common factor must be the correct one and imitate it. In other words, machine translation is gender-biased because it is following pre-existing gender-biased reference material.
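The “learn from common examples” behaviour described above can be sketched with a toy frequency model. This is a deliberately simplified illustration under assumed conditions, not Google’s actual system: the miniature corpus, the `translate_pronoun` function, and its majority-vote rule are all invented for demonstration.

```python
from collections import Counter

# Hypothetical miniature "training corpus": English sentences a system
# might have seen paired with the gender-neutral Turkish "o bir doktor".
corpus = [
    "He is a doctor",
    "He is a doctor",
    "He is a doctor",
    "She is a doctor",
]

def translate_pronoun(noun, corpus):
    """Pick the pronoun most frequently paired with `noun` in the corpus."""
    counts = Counter(
        sentence.split()[0]          # first word is the pronoun
        for sentence in corpus
        if sentence.endswith(noun)
    )
    return counts.most_common(1)[0][0]

# With three "He" examples to one "She", the majority wins every time,
# so a gender-neutral source sentence is always rendered as masculine.
print(translate_pronoun("doctor", corpus))  # → He
```

Note that the model never “decides” anything about doctors; it simply echoes whatever imbalance exists in its reference material, which is exactly the failure mode described above.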
The Human Difference
This is where nuanced, experienced human translation is necessary, especially for translations meant to last. It’s easy to see how algorithms become gender-biased when they are trained on enormous amounts of gender-biased material from past generations. But these tools won’t correct that bias on their own.
The only way that the pre-existing gender bias can be addressed without completely rewriting translation algorithms is with new content that is not gender-biased and new translations of older content that remove the gender bias. Human translators, with nuance and awareness, are required for this change to take place.
Do Your Part
Google is attempting to address the bias in its algorithms, but when the interpretation is mathematical rather than human, software will tend to err on the side of the statistically common choice, not necessarily the progressive one that brings people a step closer to equality.
This is why, when it comes to your content, you should always turn to an experienced human translator to provide the right translation when time and budget permit. That way, you can eliminate bias and take a more equitable position. Contact ITC Global Translations for a free quote today.