ru-word-stress-transformer / tokenizer_config.json
{
  "bos_token": "[bos]",
  "do_lower_case": true,
  "eos_token": "[eos]",
  "model_max_length": 42,
  "pad_token": "[pad]",
  "tokenizer_class": "CharTokenizer",
  "unk_token": "[unk]",
  "auto_map": {
    "AutoTokenizer": ["char_tokenizer.CharTokenizer", null]
  }
}
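
The "auto_map" entry points AutoTokenizer at a custom CharTokenizer class shipped in the repo's own char_tokenizer.py, so loading this tokenizer requires trust_remote_code=True. Below is a minimal sketch of loading and using it, assuming the repo id is IlyaGusev/ru-word-stress-transformer (inferred from the page header) and that the custom tokenizer follows the standard PreTrainedTokenizer call interface:

# Load the tokenizer described by this config. trust_remote_code=True is
# needed because tokenizer_class / auto_map reference custom code
# (char_tokenizer.CharTokenizer) rather than a built-in tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "IlyaGusev/ru-word-stress-transformer",  # repo id inferred from the header
    trust_remote_code=True,
)

# Per the config: input is lowercased (do_lower_case) and split into
# characters; sequences are capped at model_max_length = 42 tokens.
encoded = tokenizer("Замок", truncation=True)
print(encoded["input_ids"])
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))

The special-token strings ("[bos]", "[eos]", "[pad]", "[unk]") are lowercase variants rather than the usual "[BOS]"-style names, so any downstream code that adds or strips special tokens should use the tokenizer's own attributes (tokenizer.bos_token, tokenizer.eos_token, etc.) instead of hard-coded literals.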