plan on multilingual variant?

#2
by ahxxm - opened

Judging from tokenizer.json and the Training section of the blog post, it seems to be English-only (the same naming convention as the original BERT, I guess).

I came to the community section to ask the same question — any plans for a multilingual variant?

I join everyone in begging for a wonderful ModernXLM-RoBERTa-Large! :)