Qwen2.5-32B

An ExLlamaV2 8 bpw (bits per weight) quantization of https://huggingface.co/Qwen/Qwen2.5-32B
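A minimal loading sketch using the exllamav2 Python API, assuming a CUDA GPU with enough VRAM and that the quant has already been downloaded to a local directory (the path below is a placeholder; adjust it to wherever you fetch the files, e.g. with `huggingface-cli download`):

```python
# Sketch: load the EXL2 quant and run a short generation with exllamav2.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

local_dir = "./Qwen2.5-32B-8bpw-EXL2"  # placeholder local path to the downloaded quant

config = ExLlamaV2Config(local_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)            # splits weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="Hello, my name is", max_new_tokens=32))
```

This follows the library's standard dynamic-generator usage; a 32B model at 8 bpw needs roughly 35 GB of VRAM for the weights alone, so multi-GPU autosplit or a single large-memory GPU is assumed.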
