new_GPT2tokenizer_tokenizer / special_tokens_map.json
{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "unk_token": "<|endoftext|>"}