---
license: llama3.1
language:
- en
pipeline_tag: text-generation
quantized_by: TheMelonGod
tags:
- quantized
- safetensors
- exllamav2
base_model:
- ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.3
base_model_relation: quantized
---
ExLlamaV2 quantizations of: [ArliAI - Llama-3.1-8B-ArliAI-RPMax-v1.3](https://huggingface.co/ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.3)

Quantizations (6hb):
- [8.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/8.0bpw)
- [7.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/7.5bpw)
- [7.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/7.0bpw)
- [6.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/6.5bpw)
- [6.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/6.0bpw)
- [5.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/5.5bpw)
- [5.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/5.0bpw)
- [4.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/4.5bpw)
- [4.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/4.0bpw)
- [3.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/3.5bpw)
- [3.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/3.0bpw)
- [2.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/2.5bpw)
- [2.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2/tree/2.0bpw)

If you need a specific model quantized, or a particular bits-per-weight value not listed here, please let me know. I’m happy to quantize lesser-known models.

This is my first model quantization! If you have any suggestions for improvements or feedback, feel free to reach out. Your input is greatly appreciated and helps me make quantizations better for everyone.

Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!
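
Each quantization lives on its own branch, so you only need to download the one that matches the bits per weight you want. A minimal sketch using `huggingface_hub` (the repo ID and branch names come from the links above; the local directory name is just an example):

```python
from huggingface_hub import snapshot_download

# Download only the 8.0bpw branch of the quantized repo.
# Swap `revision` for any of the bpw branches listed above.
snapshot_download(
    repo_id="TheMelonGod/Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2",
    revision="8.0bpw",
    local_dir="Llama-3.1-8B-ArliAI-RPMax-v1.3-exl2-8.0bpw",  # example output path
)
```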