llama-3.1-fine-tuned / params.json
Tanabodee Limpaitoon
Upload 2 files
7e65e93 verified
{
    "dim": 4096,
    "n_layers": 32,
    "n_heads": 32,
    "n_kv_heads": 8,
    "vocab_size": 128256,
    "ffn_dim_multiplier": 1.3,
    "multiple_of": 1024,
    "norm_eps": 1e-05,
    "rope_theta": 500000.0,
    "use_scaled_rope": true
}
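A note on how these fields are consumed: `ffn_dim_multiplier` and `multiple_of` do not set the feed-forward width directly. In Meta's reference Llama implementation they feed a sizing rule (start from 4x `dim`, take 2/3 of that for SwiGLU, scale by the multiplier, then round up to a multiple of `multiple_of`). A minimal sketch of that rule, assuming those semantics (the function name `ffn_hidden_dim` is my own):

```python
def ffn_hidden_dim(dim: int, ffn_dim_multiplier: float, multiple_of: int) -> int:
    """Derive the FFN hidden size the way the Llama reference code does."""
    hidden_dim = 4 * dim                          # initial 4x expansion
    hidden_dim = int(2 * hidden_dim / 3)          # SwiGLU uses 2/3 of the 4x width
    hidden_dim = int(ffn_dim_multiplier * hidden_dim)
    # round up to the nearest multiple of `multiple_of`
    return multiple_of * ((hidden_dim + multiple_of - 1) // multiple_of)

# With this file's values:
print(ffn_hidden_dim(4096, 1.3, 1024))  # → 14336
```

The other ratios here are also meaningful: `n_heads`=32 with `n_kv_heads`=8 means grouped-query attention with 4 query heads per key/value head, and the per-head dimension is `dim / n_heads` = 4096 / 32 = 128.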