Update config.json
#14
by Esmeetu - opened
I think `max_position_embeddings` should be 4096 here. This value is meant to be the original context size before RoPE scaling is applied, not the already-scaled one.
Reference: https://huggingface.co/docs/transformers/v4.35.2/en/model_doc/llama#transformers.LlamaConfig.rope_scaling
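For illustration, here is a minimal sketch of what the linked docs imply for `config.json`. The `rope_scaling` type and factor below are assumptions for the example, not values taken from this repository:

```python
# Sketch of the convention described in the linked LlamaConfig docs
# (transformers v4.35.2). The scaling type/factor are illustrative.
from transformers import LlamaConfig

config = LlamaConfig(
    # Per the docs, max_position_embeddings stays at the ORIGINAL
    # (pre-scaling) context length the model was trained with.
    max_position_embeddings=4096,
    # rope_scaling extends the usable context at inference time; the docs
    # explicitly say not to bump max_position_embeddings to the new maximum.
    rope_scaling={"type": "linear", "factor": 2.0},  # assumed values
)

print(config.max_position_embeddings)  # 4096, not 4096 * factor
```

With this convention, the extended window is derived as `max_position_embeddings * factor` by the scaled rotary embedding, rather than being read from an already-scaled config value.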
Esmeetu changed pull request status to closed