Blenderbot_Prosocial / tokenizer_config.json
{
"bos_token": "__start__",
"eos_token": "__end__",
"model_max_length": 512,
"pad_token": "__null__",
"special_tokens_map_file": null,
"tokenizer_class": "BlenderbotSmallTokenizer",
"tokenizer_file": null,
"unk_token": "__unk__"
}
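As a minimal sketch of how this config is consumed, the snippet below loads the tokenizer and reads back the special tokens and length limit defined above. The repo id "TheMrguiller/Blenderbot_Prosocial" is assumed from the page header; adjust it if the actual Hub path differs.

# Load the BlenderbotSmall tokenizer whose behavior is driven by tokenizer_config.json.
# Repo id is an assumption taken from the page header.
from transformers import BlenderbotSmallTokenizer

tokenizer = BlenderbotSmallTokenizer.from_pretrained("TheMrguiller/Blenderbot_Prosocial")

# These values come straight from the config shown above.
print(tokenizer.bos_token)         # "__start__"
print(tokenizer.eos_token)         # "__end__"
print(tokenizer.pad_token)         # "__null__"
print(tokenizer.unk_token)         # "__unk__"
print(tokenizer.model_max_length)  # 512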