law-glm-10b / special_tokens_map.json
Upload tokenizer (commit b0ce0a4)
{
  "additional_special_tokens": [
    "<|startofpiece|>",
    "<|endofpiece|>",
    "[gMASK]",
    "[sMASK]"
  ],
  "cls_token": "[CLS]",
  "eos_token": "<|endoftext|>",
  "mask_token": "[MASK]",
  "pad_token": "<|endoftext|>",
  "unk_token": "[UNK]"
}
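
As a sketch of how this file is consumed: `special_tokens_map.json` is plain JSON, so it can be inspected with Python's standard `json` module. (In normal use, `transformers.AutoTokenizer.from_pretrained` reads this file automatically when loading the repo; the manual parse below is only for illustration, with the JSON body copied verbatim from above.)

```python
import json

# Contents of special_tokens_map.json, copied from the file above.
SPECIAL_TOKENS_MAP = """\
{
  "additional_special_tokens": [
    "<|startofpiece|>",
    "<|endofpiece|>",
    "[gMASK]",
    "[sMASK]"
  ],
  "cls_token": "[CLS]",
  "eos_token": "<|endoftext|>",
  "mask_token": "[MASK]",
  "pad_token": "<|endoftext|>",
  "unk_token": "[UNK]"
}
"""

tokens = json.loads(SPECIAL_TOKENS_MAP)

# Note that the pad token is the same string as the eos token here.
print(tokens["eos_token"])                  # <|endoftext|>
print(tokens["additional_special_tokens"])  # GLM-style piece/mask markers
```

The `[gMASK]` and `[sMASK]` entries are the GLM-family gap- and sentence-level mask markers, registered here as additional special tokens so the tokenizer never splits them into subwords.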