How to use nkthakur/flan-t5-small-translator with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("nkthakur/flan-t5-small-translator")
model = AutoModelForSeq2SeqLM.from_pretrained("nkthakur/flan-t5-small-translator")
```

This model is a fine-tuned version of google/flan-t5-small on the en-fr subset of the opus_books dataset. Its results on the evaluation set are listed in the training table below.
Try this sentence: `translate English to French: what is love?`
You should get a response like: `Qu'est-ce que l'amour?`
Ensure that you prepend `translate English to French: ` to all inputs.
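As a minimal sketch of the steps above, the helper below loads the model, prepends the required task prefix, and decodes a translation. The `translate` function name and the generation parameters (`max_new_tokens`) are illustrative choices, not part of this model card.

```python
# Sketch: translate English to French with the fine-tuned checkpoint.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("nkthakur/flan-t5-small-translator")
model = AutoModelForSeq2SeqLM.from_pretrained("nkthakur/flan-t5-small-translator")

PREFIX = "translate English to French: "  # required task prefix

def translate(text: str) -> str:
    # Prepend the prefix the model was fine-tuned with, then generate.
    inputs = tokenizer(PREFIX + text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(translate("what is love?"))
```

Omitting the prefix will typically produce degraded output, since the model only saw prefixed inputs during fine-tuning.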
This model has been trained only on the en-fr subset of the OPUS Books dataset.
The following results were recorded during training:
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|---|---|---|---|---|---|
| 0.0 | 1.0 | 6355 | nan | 1.078 | 18.0374 |
| 0.0 | 2.0 | 12710 | nan | 1.078 | 18.0374 |
Base model
google/flan-t5-small