gpt2-urdu-tokenizer / special_tokens_map.json
hadidev · add tokenizer · ff35605
{
"bos_token": "<|endoftext|>",
"eos_token": "<|endoftext|>",
"unk_token": "<|endoftext|>"
}
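
A minimal sketch of how this file is typically consumed, assuming the tokenizer lives at the Hub repository id "hadidev/gpt2-urdu-tokenizer" (inferred from the file path above, not confirmed by the source). `AutoTokenizer.from_pretrained` downloads the tokenizer files, including this `special_tokens_map.json`, and exposes the declared tokens as attributes:

```python
from transformers import AutoTokenizer

# Hypothetical repo id, inferred from the file path shown above.
tokenizer = AutoTokenizer.from_pretrained("hadidev/gpt2-urdu-tokenizer")

# All three special tokens resolve to the same string, as declared
# in special_tokens_map.json (GPT-2's usual convention).
print(tokenizer.bos_token)  # <|endoftext|>
print(tokenizer.eos_token)  # <|endoftext|>
print(tokenizer.unk_token)  # <|endoftext|>
```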