ArgumentMining-EN-ARI-AIF-ALBERT / special_tokens_map.json
{"bos_token": "[CLS]", "eos_token": "[SEP]", "unk_token": "<unk>", "sep_token": "[SEP]", "pad_token": "<pad>", "cls_token": "[CLS]", "mask_token": {"content": "[MASK]", "single_word": false, "lstrip": true, "rstrip": false, "normalized": false}}