RoBERTa-base-ca-v2

- Task: Fill-Mask (masked-lm)
- Library: Transformers (PyTorch)
- Language: Catalan
- Model type: roberta
- Training data: CaText (Catalan Textual Corpus)
The following sections show how to use projecte-aina/roberta-base-ca-v2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use projecte-aina/roberta-base-ca-v2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="projecte-aina/roberta-base-ca-v2")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("projecte-aina/roberta-base-ca-v2")
model = AutoModelForMaskedLM.from_pretrained("projecte-aina/roberta-base-ca-v2")
```

- Notebooks
- Google Colab
- Kaggle
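As a minimal sketch of what the fill-mask pipeline returns, the example below masks one word in a Catalan sentence and prints the model's top predictions. The sentence is an illustrative example chosen here, not one from the model card; RoBERTa-style models use the `<mask>` token for the blank.

```python
from transformers import pipeline

# Fill-mask pipeline for the Catalan RoBERTa model
pipe = pipeline("fill-mask", model="projecte-aina/roberta-base-ca-v2")

# Illustrative Catalan sentence: "Barcelona is the capital of <mask>."
results = pipe("Barcelona és la capital de <mask>.")

# Each candidate carries the predicted word ('token_str') and its probability ('score')
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

By default the pipeline returns the top 5 candidates, ranked by score; pass `top_k=` to change that.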