RUAccent-stressed-encoder / tokenizer_config.json
{
"name": "CharacterTokenizer",
"vocab_file": "vocab.json",
"model_max_length": 2048,
"size": 668
}
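
A minimal sketch of consuming this config, assuming `vocab.json` maps each character to an integer id; the actual `CharacterTokenizer` implementation is not shipped with this file, so the loading and encoding logic below is illustrative only:

```python
import json

# Load the tokenizer config shown above.
with open("tokenizer_config.json", encoding="utf-8") as f:
    config = json.load(f)

# Assumption: vocab.json is a flat {character: id} mapping.
with open(config["vocab_file"], encoding="utf-8") as f:
    vocab = json.load(f)

# The config declares 668 vocabulary entries.
assert len(vocab) == config["size"]

def encode(text: str) -> list[int]:
    """Character-level encoding: one id per known character,
    truncated to model_max_length (2048)."""
    max_length = config["model_max_length"]
    return [vocab[ch] for ch in text[:max_length] if ch in vocab]

print(encode("привет"))
```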