segment_nt_multi_species / tokenizer_config.json
{
"clean_up_tokenization_spaces": true,
"eos_token": null,
"model_max_length": 2048,
"tokenizer_class": "EsmTokenizer"
}
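
The config above tells transformers to instantiate an EsmTokenizer capped at 2048 tokens, with no EOS token and whitespace cleanup enabled. Below is a minimal sketch of how such a config is typically consumed; the repo id "InstaDeepAI/segment_nt_multi_species" and the example sequence are assumptions for illustration, not taken from this file.

# Sketch: loading the tokenizer described by tokenizer_config.json.
# The repo id below is an assumption; substitute the actual model repo.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("InstaDeepAI/segment_nt_multi_species")

# Fields from tokenizer_config.json surface as tokenizer attributes.
print(type(tokenizer).__name__)    # expected: EsmTokenizer (per "tokenizer_class")
print(tokenizer.model_max_length)  # expected: 2048
print(tokenizer.eos_token)         # expected: None (per "eos_token": null)

# Tokenize a short nucleotide sequence, staying well under the 2048-token limit.
encoded = tokenizer("ATGCATGCATGC")
print(encoded["input_ids"])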