dc.contributor.author
Sackl, Stefanie
dc.date.accessioned
2025-09-30T19:30:09Z
dc.date.available
2025-09-30T19:30:09Z
dc.date.issued
2025-09-29T12:59:32Z
dc.identifier
http://hdl.handle.net/10230/71289
dc.identifier.uri
http://hdl.handle.net/10230/71289
dc.description.abstract
Master's thesis, Master in International Studies on Media, Power and Diversity
dc.description.abstract
Supervisor: Pilar Medina-Bravo
dc.description.abstract
This analysis compares the kinds of gender bias apparent in human-written and artificially generated news articles. Focusing on the 2022 FIFA World Cup in Qatar, articles published by EuroNews on that topic were used to generate similar articles with ChatGPT, Perplexity, and DeepAI. Applying a feminist qualitative content analysis, the study explores how gendered language, stereotypes, and forms of discrimination manifest in both types of texts. While the results show that both human- and AI-generated articles reflect existing forms of gender bias, the AI-generated texts offered readers more context on how those inequalities can harm marginalized groups and included more diverse voices. The outputs of the different AI models were generally less biased, most likely because of content moderation and guidelines built into the systems. The findings highlight the mutual shaping of technology and society and call for more transparency and diversity in AI development.
dc.format
application/pdf
dc.rights
Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0)
dc.rights
https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights
info:eu-repo/semantics/openAccess
dc.subject
Artificial intelligence
dc.title
A Feminist Analysis of Gender Bias in Artificially Created and Human News Articles
dc.type
info:eu-repo/semantics/masterThesis