step3_mk7/tokenizer_config.json
{
"clean_up_tokenization_spaces": true,
"eos_token": "<|endoftext|>",
"fast_tokenizer": true,
"model_max_length": 1000000000000000019884624838656,
"pad_token": "<|endoftext|>",
"tokenizer_class": "PreTrainedTokenizerFast"
}
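This is the stock configuration Transformers writes for a fast tokenizer: the EOS and padding roles share the <|endoftext|> token, and model_max_length is the library's int(1e30) sentinel, meaning no explicit context-length limit was recorded. Below is a minimal loading sketch, assuming the other tokenizer files from the same "Upload 5 files" commit (e.g. tokenizer.json) sit next to this config in a local step3_mk7/ directory; adjust the path to match where the files actually live.

from transformers import AutoTokenizer

# from_pretrained reads tokenizer_config.json alongside the other
# tokenizer files in the given directory.
tokenizer = AutoTokenizer.from_pretrained("step3_mk7")

print(tokenizer.eos_token)         # <|endoftext|>
print(tokenizer.pad_token)         # <|endoftext|> (shared with EOS)
print(tokenizer.model_max_length)  # 1000000000000000019884624838656 == int(1e30)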