
See our GitHub repo for more details: https://github.com/hao-ai-lab/Consistency_LLM

Metadata:

AR loss to consistency loss ratio: 10:1

ShareGPT dataset size: 48k

n-token sequence length: 32

Jacobi trajectory data cleaning: True

Target model: LLaMA2-7B fine-tuned on ShareGPT48k

Release date: 02/26/2024
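The Jacobi trajectories referenced above come from Jacobi decoding: an n-token block is guessed, the model refines all n positions in parallel, and iteration continues until the block stops changing (a fixed point). The sketch below illustrates that fixed-point loop with a toy deterministic "model" (`toy_next_tokens` is a hypothetical stand-in, not the LLaMA2-7B target model; see the GitHub repo for the real implementation):

```python
def toy_next_tokens(prompt, draft):
    """Toy stand-in for greedy next-token prediction: token at each
    position is (previous token + 1) mod 10, computed for all draft
    positions in parallel from the current guess."""
    seq = prompt + draft
    return [(seq[i - 1] + 1) % 10 for i in range(len(prompt), len(seq))]

def jacobi_decode(prompt, n=32, max_iters=100):
    """Refine an n-token draft in parallel until it reaches a fixed
    point, i.e. one more pass leaves every position unchanged."""
    draft = [0] * n  # arbitrary initial guess
    for it in range(max_iters):
        new = toy_next_tokens(prompt, draft)
        if new == draft:          # fixed point: matches greedy AR output
            return new, it + 1
        draft = new
    return draft, max_iters

# With this toy model, the fixed point equals the sequence greedy
# autoregressive decoding would produce, reached in at most n+1 passes.
tokens, iters = jacobi_decode([3], n=4)
```

A CLLM is fine-tuned (with the consistency loss above) so that far fewer passes are needed than this worst case, which is where the speedup comes from.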