Meta-Llama-3-8B/original/params.json
{
"dim": 4096,
"n_layers": 32,
"n_heads": 32,
"n_kv_heads": 8,
"vocab_size": 128256,
"multiple_of": 1024,
"ffn_dim_multiplier": 1.3,
"norm_eps": 1e-05,
"rope_theta": 500000.0
}
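
These fields map directly onto the Llama reference `ModelArgs`: `dim` is the hidden size, `n_heads`/`n_kv_heads` give grouped-query attention (8 KV heads shared across 32 query heads), and the feed-forward width is not stored explicitly but derived from `dim`, `ffn_dim_multiplier`, and `multiple_of`. As a minimal sketch, assuming the rounding formula used in Meta's Llama reference code (the helper name `ffn_hidden_dim` and the local `params.json` path are illustrative):

```python
import json

def ffn_hidden_dim(dim: int, ffn_dim_multiplier: float, multiple_of: int) -> int:
    """SwiGLU FFN width: 2/3 of 4*dim, scaled, then rounded up to a multiple."""
    hidden = 4 * dim
    hidden = int(2 * hidden / 3)            # SwiGLU uses 2/3 of the classic 4x width
    hidden = int(ffn_dim_multiplier * hidden)
    return multiple_of * ((hidden + multiple_of - 1) // multiple_of)  # round up

# Assumes params.json (the file above) is in the working directory.
with open("params.json") as f:
    params = json.load(f)

print(ffn_hidden_dim(params["dim"],
                     params["ffn_dim_multiplier"],
                     params["multiple_of"]))
# -> 14336 for dim=4096, ffn_dim_multiplier=1.3, multiple_of=1024
```

With these values the computation runs 4*4096 = 16384 → 10922 → 14198 → 14336, matching the 14336 intermediate size reported for Llama 3 8B.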