ROUBERTa cased
This is a RoBERTa-base language model trained from scratch exclusively on Uruguayan press text [1].
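A minimal usage sketch with the Hugging Face Transformers library. The repository ID `pln-udelar/rouberta-cased` is an assumption here; substitute the exact name shown on the model page. As a masked language model, a RoBERTa-style checkpoint is typically queried through the fill-mask pipeline:

```python
from transformers import pipeline

# Hypothetical repository ID; replace with the actual model name on the Hub.
fill_mask = pipeline("fill-mask", model="pln-udelar/rouberta-cased")

# RoBERTa-style tokenizers use <mask> as the mask token.
predictions = fill_mask("Montevideo es la <mask> de Uruguay.")
for p in predictions:
    print(f"{p['token_str']!r}: {p['score']:.3f}")
```

Each prediction is a dict containing the proposed token (`token_str`) and its probability (`score`).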
Cite this work
@inproceedings{rouberta2024,
  title={A Language Model Trained on Uruguayan Spanish News Text},
  author={Filevich, Juan Pablo and Marco, Gonzalo and Castro, Santiago and Chiruzzo, Luis and Ros{\'a}, Aiala},
  booktitle={Proceedings of the Second International Workshop Towards Digital Language Equality (TDLE): Focusing on Sustainability @ LREC-COLING 2024},
  pages={53--60},
  year={2024}
}
[1] huggingface.co/datasets/pln-udelar/uy22