eos_token should be <|eot_id|>
#1 opened by AUTOMATIC
tokenizer_config.json should list "eos_token" as "<|eot_id|>", otherwise the chat is spammed with ".assistant" continuations and never ends.
I had to change it in both tokenizer_config.json and special_tokens_map.json.
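For anyone applying the same edit by hand: the relevant field in both files is the `eos_token` entry. Assuming the stock Meta Llama 3 configs (where the value was "<|end_of_text|>", if I remember right), the changed entry looks roughly like this:

```json
{
  "eos_token": "<|eot_id|>"
}
```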
Is that the accepted fix? The files were just copied from the original Meta L3 files.
I don't believe so, as changing this in the exl2 quant affected the way the model behaved and followed instructions.
This MR is merged: https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/discussions/4
It looks like the most official fix to me.
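For context, if I remember the merged change correctly, it updates generation_config.json so that generation stops on either token rather than redefining the tokenizer's eos_token, roughly like this (token ids from memory: 128001 should be <|end_of_text|> and 128009 should be <|eot_id|>):

```json
{
  "eos_token_id": [128001, 128009]
}
```

That way the tokenizer files can stay exactly as Meta shipped them, while generate() still stops at <|eot_id|>.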
I've opened a pull request for this fix in #2; I hope it gets merged. Amazing model, shame that it has this tokenizer problem at the start.