File size: 146 Bytes
Commit: 450f513
{
  "clean_up_tokenization_spaces": true,
  "model_max_length": 1000000000000000019884624838656,
  "tokenizer_class": "PreTrainedTokenizerFast"
}
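
This appears to be a Hugging Face `tokenizer_config.json`. As a minimal sketch of how such a config is typically consumed (assuming a hypothetical local directory `./my-tokenizer` that also contains the companion `tokenizer.json` required by `PreTrainedTokenizerFast`):

from transformers import AutoTokenizer

# Hypothetical path; AutoTokenizer dispatches to PreTrainedTokenizerFast
# based on the "tokenizer_class" field in tokenizer_config.json.
tokenizer = AutoTokenizer.from_pretrained("./my-tokenizer")

# Values picked up from the config above:
print(tokenizer.model_max_length)              # 1000000000000000019884624838656
print(tokenizer.clean_up_tokenization_spaces)  # True

The very large `model_max_length` is the sentinel value transformers uses (int(1e30)) when no explicit maximum sequence length is set for the tokenizer.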