Tags: Text Classification · Transformers · Safetensors · English · llama · text-generation-inference · Inference Endpoints
tulu-v2.5-13b-uf-rm / special_tokens_map.json
Commit ea1a019 (verified): Upload folder using huggingface_hub
{"bos_token": {"content": "<s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false}, "eos_token": {"content": "</s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false}, "unk_token": {"content": "<unk>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false}}