How to use halfrot/sft-mt5-base with Transformers:

```python
# Load the tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("halfrot/sft-mt5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("halfrot/sft-mt5-base")
```

This is the trained SFT policy for the machine translation (MT) task in the paper "ALaRM: Align Language Models via Hierarchical Rewards Modeling".
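Once loaded, the model can be used for translation with the standard seq2seq `generate` API. A minimal inference sketch follows; the example sentence, translation direction, and input prefix are assumptions for illustration — check the ALaRM paper and repository for the exact prompt format the policy was trained with.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "halfrot/sft-mt5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical input; the prefix/direction here is an assumption,
# not confirmed by the model card.
text = "Translate English to German: The committee approved the proposal."
inputs = tokenizer(text, return_tensors="pt")

# Beam search decoding; tune max_new_tokens / num_beams as needed.
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
translation = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(translation)
```

The generated text depends on the prompt format the SFT policy saw during training, so outputs with an untested prefix may be off-distribution.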
Check out our project page for more information.