gpt_bigcode-santacoder / special_tokens_map.json
{
  "additional_special_tokens": [
    "<|endoftext|>",
    "<fim-prefix>",
    "<fim-middle>",
    "<fim-suffix>",
    "<fim-pad>"
  ]
}
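The tokens above are the sentinels SantaCoder uses for fill-in-the-middle (FIM) generation. As a minimal sketch of how they are typically assembled into a prompt, the snippet below builds a prefix-suffix-middle (PSM) style FIM prompt from these tokens; the exact ordering convention and the `build_fim_prompt` helper are illustrative assumptions here, not taken from the repo:

```python
import json

# Token list copied verbatim from special_tokens_map.json above.
SPECIAL_TOKENS = json.loads("""
{
  "additional_special_tokens": [
    "<|endoftext|>",
    "<fim-prefix>",
    "<fim-middle>",
    "<fim-suffix>",
    "<fim-pad>"
  ]
}
""")["additional_special_tokens"]


def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a prefix-suffix-middle (PSM) fill-in-the-middle prompt.

    The model is then expected to generate the missing middle span
    after the <fim-middle> sentinel. (Hypothetical helper; the PSM
    ordering is a common FIM convention, assumed here.)
    """
    return f"<fim-prefix>{prefix}<fim-suffix>{suffix}<fim-middle>"


prompt = build_fim_prompt(
    "def add(a, b):\n    return ",
    "\n\nprint(add(1, 2))",
)
```

With a loaded tokenizer, such a prompt would be tokenized with these sentinels mapped to single special-token IDs rather than split into pieces.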