Gender bias in English-to-Greek machine translation
- Author
- Eleni Gkovedarou (UGent), Joke Daems (UGent), and Luna De Bruyne
- Abstract
- As the demand for inclusive language increases, concern has grown over the susceptibility of machine translation (MT) systems to reinforce gender stereotypes. This study investigates gender bias in two commercial MT systems, Google Translate and DeepL, focusing on the understudied English-to-Greek language pair. We address three aspects of gender bias: i) male bias, ii) occupational stereotyping, and iii) errors in anti-stereotypical translations. Additionally, we explore the potential of prompted GPT-4o as a bias mitigation tool that provides both gender-explicit and gender-neutral alternatives when necessary. To achieve this, we introduce GendEL, a manually crafted bilingual dataset of 240 gender-ambiguous and unambiguous sentences that feature stereotypical occupational nouns and adjectives. We find persistent gender bias in translations by both MT systems; while they perform well in cases where gender is explicitly defined, with DeepL outperforming both Google Translate and GPT-4o in feminine gender-unambiguous sentences, they are far from producing gender-inclusive or neutral translations when the gender is unspecified. GPT-4o shows promise, generating appropriate gendered and neutral alternatives for most ambiguous cases, though residual biases remain evident. As one of the first comprehensive studies on gender bias in English-to-Greek MT, we provide both our data and code at [github link].
Downloads
- 2025.gitt-1.2.pdf | full text (Published version) | open access | 751.48 KB
Citation
Please use this url to cite or link to this publication: http://hdl.handle.net/1854/LU-01K361M1KK15JRRNSGCJGDP135
- MLA
- Gkovedarou, Eleni, et al. “Gender Bias in English-to-Greek Machine Translation.” Proceedings of the 3rd Workshop on Gender-Inclusive Translation Technologies (GITT 2025), edited by Janica Hackenbuchner et al., European Association for Machine Translation (EAMT), 2025, pp. 17–45.
- APA
- Gkovedarou, E., Daems, J., & Bruyne, L. D. (2025). Gender bias in English-to-Greek machine translation. In J. Hackenbuchner, L. Bentivogli, J. Daems, C. Manna, B. Savoldi, & E. Vanmassenhove (Eds.), Proceedings of the 3rd Workshop on Gender-Inclusive Translation Technologies (GITT 2025) (pp. 17–45). European Association for Machine Translation (EAMT).
- Chicago author-date
- Gkovedarou, Eleni, Joke Daems, and Luna De Bruyne. 2025. “Gender Bias in English-to-Greek Machine Translation.” In Proceedings of the 3rd Workshop on Gender-Inclusive Translation Technologies (GITT 2025), edited by Janica Hackenbuchner, Luisa Bentivogli, Joke Daems, Chiara Manna, Beatrice Savoldi, and Eva Vanmassenhove, 17–45. European Association for Machine Translation (EAMT).
- Chicago author-date (all authors)
- Gkovedarou, Eleni, Joke Daems, and Luna De Bruyne. 2025. “Gender Bias in English-to-Greek Machine Translation.” In Proceedings of the 3rd Workshop on Gender-Inclusive Translation Technologies (GITT 2025), edited by Janica Hackenbuchner, Luisa Bentivogli, Joke Daems, Chiara Manna, Beatrice Savoldi, and Eva Vanmassenhove, 17–45. European Association for Machine Translation (EAMT).
- Vancouver
- 1. Gkovedarou E, Daems J, De Bruyne L. Gender bias in English-to-Greek machine translation. In: Hackenbuchner J, Bentivogli L, Daems J, Manna C, Savoldi B, Vanmassenhove E, editors. Proceedings of the 3rd Workshop on Gender-Inclusive Translation Technologies (GITT 2025). European Association for Machine Translation (EAMT); 2025. p. 17–45.
- IEEE
- [1] E. Gkovedarou, J. Daems, and L. De Bruyne, “Gender bias in English-to-Greek machine translation,” in Proceedings of the 3rd Workshop on Gender-Inclusive Translation Technologies (GITT 2025), Geneva, Switzerland, 2025, pp. 17–45.
@inproceedings{01K361M1KK15JRRNSGCJGDP135,
abstract = {{As the demand for inclusive language increases, concern has grown over the susceptibility of machine translation (MT) systems to reinforce gender stereotypes. This study investigates gender bias in two commercial MT systems, Google Translate and DeepL, focusing on the understudied English-to-Greek language pair. We address three aspects of gender bias: i) male bias, ii) occupational stereotyping, and iii) errors in anti-stereotypical translations. Additionally, we explore the potential of prompted GPT-4o as a bias mitigation tool that provides both gender-explicit and gender-neutral alternatives when necessary. To achieve this, we introduce GendEL, a manually crafted bilingual dataset of 240 gender-ambiguous and unambiguous sentences that feature stereotypical occupational nouns and adjectives. We find persistent gender bias in translations by both MT systems; while they perform well in cases where gender is explicitly defined, with DeepL outperforming both Google Translate and GPT-4o in feminine gender-unambiguous sentences, they are far from producing gender-inclusive or neutral translations when the gender is unspecified. GPT-4o shows promise, generating appropriate gendered and neutral alternatives for most ambiguous cases, though residual biases remain evident. As one of the first comprehensive studies on gender bias in English-to-Greek MT, we provide both our data and code at [github link].}},
author = {{Gkovedarou, Eleni and Daems, Joke and De Bruyne, Luna}},
booktitle = {{Proceedings of the 3rd Workshop on Gender-Inclusive Translation Technologies (GITT 2025)}},
editor = {{Hackenbuchner, Janica and Bentivogli, Luisa and Daems, Joke and Manna, Chiara and Savoldi, Beatrice and Vanmassenhove, Eva}},
isbn = {{9782970189749}},
language = {{eng}},
location = {{Geneva, Switzerland}},
pages = {{17--45}},
publisher = {{European Association for Machine Translation (EAMT)}},
title = {{Gender bias in English-to-Greek machine translation}},
url = {{https://aclanthology.org/2025.gitt-1.2/}},
year = {{2025}},
}