roberta-swahili / flax_to_torch.py
from transformers import RobertaForMaskedLM

# Load the Flax checkpoint from the current directory and convert it to PyTorch.
model = RobertaForMaskedLM.from_pretrained("./", from_flax=True)

# Save the converted weights as a PyTorch checkpoint in the same directory.
model.save_pretrained("./")