LLaMA2-Accessory/config/code_34B_params.json
{
"dim": 8192,
"n_layers": 48,
"n_heads": 64,
"n_kv_heads": 8,
"multiple_of": 256,
"ffn_dim_multiplier": 1.0,
"norm_eps": 1e-5,
"rope_theta": 1000000
}
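
These are the architecture hyperparameters for the 34B code model: n_kv_heads=8 against n_heads=64 indicates grouped-query attention (8 query heads per KV head), and rope_theta=1000000 raises the RoPE base frequency for long-context use. The sketch below is a hypothetical illustration of how such a file is typically consumed; the ModelArgs dataclass and the SwiGLU FFN rounding rule are assumptions borrowed from the convention in Meta's Llama reference code, not necessarily LLaMA2-Accessory's own loader.

```python
# Minimal sketch (assumed loader, not LLaMA2-Accessory's actual code) showing
# how these hyperparameters are typically parsed and what sizes they imply.
import json
from dataclasses import dataclass

@dataclass
class ModelArgs:  # hypothetical container mirroring the JSON keys
    dim: int
    n_layers: int
    n_heads: int
    n_kv_heads: int
    multiple_of: int
    ffn_dim_multiplier: float
    norm_eps: float
    rope_theta: float

with open("config/code_34B_params.json") as f:
    args = ModelArgs(**json.load(f))

# Grouped-query attention: 64 query heads share 8 key/value heads.
assert args.n_heads % args.n_kv_heads == 0
head_dim = args.dim // args.n_heads  # 8192 / 64 = 128

# SwiGLU FFN width under the Llama-style rounding rule (an assumption here):
# take 2/3 of 4*dim, scale by ffn_dim_multiplier, round up to multiple_of.
hidden = int(2 * (4 * args.dim) / 3)
hidden = int(args.ffn_dim_multiplier * hidden)
hidden = args.multiple_of * ((hidden + args.multiple_of - 1) // args.multiple_of)
print(head_dim, hidden)  # 128 22016
```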