ru-word-stress-transformer / tokenizer_config.json
{
  "bos_token": "[BOS]",
  "eos_token": "[EOS]",
  "pad_token": "[PAD]",
  "unk_token": "[UNK]",
  "model_max_length": 40,
  "tokenizer_class": "CharTokenizer",
  "auto_map": {
    "AutoTokenizer": ["char_tokenizer.CharTokenizer", null]
  }
}
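
The `auto_map` entry points `AutoTokenizer` at the custom `CharTokenizer` class defined in `char_tokenizer.py` inside the repository, so loading it requires opting into remote code execution. A minimal loading sketch, assuming the repo id `IlyaGusev/ru-word-stress-transformer` and the standard `transformers` API:

```python
# Minimal sketch: loading this tokenizer via AutoTokenizer.
# The auto_map above resolves to char_tokenizer.CharTokenizer in the repo,
# so trust_remote_code=True is required to run that custom class.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "IlyaGusev/ru-word-stress-transformer",  # assumed repo id
    trust_remote_code=True,                  # needed for the custom CharTokenizer
)

# Character-level encoding of a single Russian word; inputs longer than
# model_max_length (40) are truncated when truncation is enabled.
enc = tokenizer("замок", truncation=True, max_length=40)
print(enc["input_ids"])
```

A character-level tokenizer is a natural fit for word-stress prediction, since stress placement depends on individual letters rather than subword units, and `model_max_length` of 40 comfortably covers the length of a single Russian word.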