banglabert / tokenizer_config.json
Initial commit 5e8e929 by abhik1505040
{
  "do_lower_case": false,
  "tokenize_chinese_chars": false,
  "special_tokens_map_file": null,
  "full_tokenizer_file": null
}
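The two boolean flags control BERT-style pre-tokenization: `do_lower_case: false` preserves input casing (lowercasing is meaningless for Bangla script and would be destructive for embedded English text), and `tokenize_chinese_chars: false` skips the pass that splits CJK characters into individual tokens. A minimal sketch of reading these settings, with the config inlined for illustration; the hub ID mentioned in the comment is an assumption, not taken from this file:

```python
import json

# The tokenizer_config.json contents, inlined so the sketch is self-contained.
config = json.loads(
    '{"do_lower_case": false, "tokenize_chinese_chars": false, '
    '"special_tokens_map_file": null, "full_tokenizer_file": null}'
)

# do_lower_case=False: keep input casing intact.
print(config["do_lower_case"])           # False
# tokenize_chinese_chars=False: do not split CJK characters apart.
print(config["tokenize_chinese_chars"])  # False
```

In normal use this file is not parsed by hand: `transformers.AutoTokenizer.from_pretrained(...)` loads it automatically alongside the vocabulary and applies these flags (e.g. with the `csebuetnlp/banglabert` hub ID, assuming that is the repository this file belongs to).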