
Fits into 24 GB of VRAM with 24,576 tokens of context (Q4 KV cache).

Set rope_alpha to 3.75.
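The two settings above (24,576-token context and rope_alpha 3.75) are loader options for an ExLlamaV2 backend. A sketch of how they might look in a TabbyAPI config.yml, with key names assumed from TabbyAPI's sample configuration rather than taken from this card:

```yaml
# Hypothetical TabbyAPI config.yml fragment for this quant.
# Key names are assumptions; check them against your TabbyAPI version.
model:
  model_name: gemma-2-27b-it-SimPO-37K-5.5bpw-h6-exl2
  max_seq_len: 24576   # context length that fits in 24 GB of VRAM
  rope_alpha: 3.75     # NTK-aware RoPE scaling, per the note above
  cache_mode: Q4       # quantized KV cache keeps memory within budget
```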


Model tree for waldie/gemma-2-27b-it-SimPO-37K-5.5bpw-h6-exl2

Base model: google/gemma-2-27b
Quantized (5 quantizations, including this model)