hebrew-gpt_neo-tiny / tokenizer_config.json
Doron Adler · Checkpoint 390500 (commit 420342d)
{"do_lower_case": false, "max_len": 1024, "bos_token": "<|startoftext|>", "eos_token": "<|endoftext|>", "unk_token": "<|endoftext|>", "special_tokens_map_file": "special_tokens_map.json", "full_tokenizer_file": null}