Exploring morphology-aware tokenization: a case study on Spanish language modeling

Publication date

2026-03-06T15:20:12Z

2025



Abstract

This paper investigates the extent to which integrating morphological information can improve subword tokenization and, in turn, language modeling performance. We focus on Spanish, a language with fusional morphology, where subword segmentation can benefit from linguistic structure. Instead of relying on purely data-driven strategies such as Byte Pair Encoding (BPE), we explore a linguistically grounded approach: training a tokenizer on morphologically segmented data. To do so, we develop a semi-supervised segmentation model for Spanish, building gold-standard datasets to guide and evaluate it. We then use this tokenizer to pre-train a masked language model and assess its performance on several downstream tasks. Our results show improvements over a baseline with a standard tokenizer, supporting our hypothesis that morphology-aware tokenization is a viable and principled alternative for improving language modeling.
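The strategy the abstract describes can be illustrated with a minimal sketch: pre-segmenting a corpus at morph boundaries before training a subword tokenizer, so that merges cannot cross morphological boundaries. The dictionary of segmentations below is hypothetical and hand-picked for illustration; the paper instead trains a semi-supervised segmentation model guided by gold-standard data.

```python
# Minimal sketch (not the authors' code): morphology-aware pre-segmentation
# of a Spanish corpus prior to subword-tokenizer training.

# Hypothetical morph segmentations for a few Spanish word forms.
MORPH_SEGMENTS = {
    "gatos": ["gat", "os"],          # gat- (root) + -os (masc. plural)
    "corriendo": ["corr", "iendo"],  # corr- (root) + -iendo (gerund)
    "nacional": ["nacion", "al"],    # nacion- (root) + -al (adjectivizer)
}

def presegment(word: str) -> list[str]:
    """Return the morph segments of a word if known, else the whole word."""
    return MORPH_SEGMENTS.get(word, [word])

def presegment_corpus(corpus: list[str]) -> list[str]:
    """Split each word at morph boundaries so a data-driven tokenizer
    (e.g. BPE) trained on the output cannot merge across them.
    Non-initial segments are marked as continuations, WordPiece-style."""
    out = []
    for word in corpus:
        segs = presegment(word)
        out.append(" ".join(segs[:1] + ["##" + s for s in segs[1:]]))
    return out

print(presegment_corpus(["gatos", "corriendo", "perro"]))
# Unknown words ("perro") pass through as a single unit.
```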


The work presented in this paper has been partially supported by the European Commission in the framework of the Horizon Europe program (contract number 101070278). We sincerely thank the anonymous reviewers for their valuable suggestions and thoughtful feedback, which greatly improved the quality of this paper. We also acknowledge the use of the MareNostrum 5 supercomputer at the Barcelona Supercomputing Center (BSC) for model training.

Document Type

Chapter or part of a book


Published version

Language

Catalan

Publisher

ACL (Association for Computational Linguistics)

Related items

Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing (EMNLP); 2025 Nov 4-9. Suzhou, China. Kerrville: ACL; 2025.


Rights

© ACL, Creative Commons Attribution 4.0 License

http://creativecommons.org/licenses/by/4.0/
