How to use autoevaluate/translation with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("autoevaluate/translation")
model = AutoModelForSeq2SeqLM.from_pretrained("autoevaluate/translation")
```

This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-ro on the wmt16 dataset. It achieves the results reported in the training results table below on the evaluation set.
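Once loaded, the model translates English to Romanian (the base model, Helsinki-NLP/opus-mt-en-ro, is an en→ro model). A minimal inference sketch; the example sentence and generation settings are illustrative, not from the card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("autoevaluate/translation")
model = AutoModelForSeq2SeqLM.from_pretrained("autoevaluate/translation")

# Tokenize an English sentence and generate the Romanian translation.
inputs = tokenizer("The weather is nice today.", return_tensors="pt")
outputs = model.generate(**inputs, max_length=64, num_beams=4)
translation = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(translation)
```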
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
The following results were obtained during training:
| Training Loss | Epoch | Step | Validation Loss | BLEU | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:----:|:-------:|
| 0.8302 | 0.03 | 1000 | 1.3170 | 28.5866 | 33.9575 |
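The BLEU column above is produced by the evaluation pipeline. As a rough illustration of what the metric measures, here is a minimal corpus-BLEU sketch (whitespace tokenization, a single reference per sentence, no smoothing); real evaluations should use a standard implementation such as sacrebleu:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # Multiset of all n-grams in a token list.
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(references, hypotheses, max_n=4):
    # Modified n-gram precision counts, accumulated over the whole corpus.
    clipped = [0] * max_n
    totals = [0] * max_n
    ref_len = hyp_len = 0
    for ref, hyp in zip(references, hypotheses):
        ref_tok, hyp_tok = ref.split(), hyp.split()
        ref_len += len(ref_tok)
        hyp_len += len(hyp_tok)
        for n in range(1, max_n + 1):
            hyp_counts = ngrams(hyp_tok, n)
            ref_counts = ngrams(ref_tok, n)
            totals[n - 1] += sum(hyp_counts.values())
            # Clip each hypothesis n-gram count by its count in the reference.
            clipped[n - 1] += sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
    # Without smoothing, any empty precision makes the score zero.
    if min(totals) == 0 or min(clipped) == 0:
        return 0.0
    # Geometric mean of the n-gram precisions.
    log_prec = sum(math.log(c / t) for c, t in zip(clipped, totals)) / max_n
    # Brevity penalty for hypotheses shorter than their references.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)
```

An exact match scores 100; a hypothesis sharing no n-grams with its reference scores 0.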