In this scenario, MT systems tend to favour masculine forms in translation, except when troublesome stereotypical associations come into play. Then, feminine gender might be generated in association with less prestigious occupations or activities (she cooks, he works), or even controversial descriptors (clever professor as masculine, but pretty professor as feminine). Such behaviours are not limited to ambiguous translation and can impact the representation of explicitly gendered referents, too.
Gender bias in MT can be witnessed by any user who relies on commercial systems available online. Perhaps with more pernicious effects, when reading automatically translated webpages or social media posts we might be presented with biased gender assignments without even realising it.
Language is a powerful tool that can enable the visibility of social groups, but through which we also reveal prejudices and propagate stereotypes. Thus, given the ubiquitous role that language technologies occupy in our daily lives, concerns over the appropriate use of gendered language have extended to these technologies, too.