Dataset: Helsinki-NLP/kde4
How to use NDugar/m2m100_418M-fr with Transformers:
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="NDugar/m2m100_418M-fr")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("NDugar/m2m100_418M-fr")
model = AutoModelForSeq2SeqLM.from_pretrained("NDugar/m2m100_418M-fr")

This model is a fine-tuned version of facebook/m2m100_418M on the kde4 dataset. It achieves the following results on the evaluation set:

- Loss: 0.7021
- Bleu: 51.1344
Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed
Training results:
| Training Loss | Epoch | Step | Validation Loss | Bleu |
|---|---|---|---|---|
| 0.749 | 1.0 | 23645 | 0.7021 | 51.1344 |
Base model
facebook/m2m100_418M