---
title: README
emoji: 🔥
colorFrom: purple
colorTo: purple
sdk: static
pinned: true
---

These are my own quantizations (updated almost daily).

The difference from normal quantizations is that I quantize the output and embedding tensors to f16, and the other tensors to q5_k, q6_k, or q8_0. This produces models that are barely degraded, if at all, and smaller in size.

They run at about 3-6 t/s on CPU only using llama.cpp, and obviously faster on machines with potent GPUs.

All the models were quantized this way:

```
quantize.exe --allow-requantize --output-tensor-type f16 --token-embedding-type f16 model.f16.gguf model.f16.q5.gguf q5_k
quantize.exe --allow-requantize --output-tensor-type f16 --token-embedding-type f16 model.f16.gguf model.f16.q6.gguf q6_k
quantize.exe --allow-requantize --output-tensor-type f16 --token-embedding-type f16 model.f16.gguf model.f16.q8.gguf q8_0
quantize.exe --allow-requantize --pure model.f16.gguf model.f16.q8_p.gguf q8_0
```

Every directory also contains a pure f16 and a pure q8_0 version.
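The same recipe can be expressed as a loop. This is a minimal sketch, assuming the Linux/macOS build of llama.cpp, where the quantize tool is named `llama-quantize` (the `quantize.exe` above is its Windows counterpart); the output filenames here are illustrative:

```shell
#!/bin/sh
# Build the list of quantize commands for each mixed-precision variant:
# output and embedding tensors stay at f16, the rest go to q5_k/q6_k/q8_0.
MODEL=model.f16.gguf
CMDS=""
for Q in q5_k q6_k q8_0; do
  CMDS="$CMDS
llama-quantize --allow-requantize --output-tensor-type f16 --token-embedding-type f16 $MODEL model.f16.$Q.gguf $Q"
done
# The "pure" variant quantizes every tensor to q8_0, with no f16 exceptions.
CMDS="$CMDS
llama-quantize --allow-requantize --pure $MODEL model.f16.q8_p.gguf q8_0"
# Print the commands (pipe to sh to actually run them once llama.cpp is built).
printf '%s\n' "$CMDS"
```

Piping the printed commands to `sh` runs the four quantizations in sequence against the same f16 source file.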
* [ZeroWw/Tiger-Gemma-9B-v1-GGUF](https://huggingface.co/ZeroWw/Tiger-Gemma-9B-v1-GGUF)
* [ZeroWw/gpt2-xl-GGUF](https://huggingface.co/ZeroWw/gpt2-xl-GGUF)
* [ZeroWw/Arcee-Spark-GGUF](https://huggingface.co/ZeroWw/Arcee-Spark-GGUF)
* [ZeroWw/phillama-3.8b-v0.1-GGUF](https://huggingface.co/ZeroWw/phillama-3.8b-v0.1-GGUF)
* [ZeroWw/codegeex4-all-9b-GGUF](https://huggingface.co/ZeroWw/codegeex4-all-9b-GGUF)
* [ZeroWw/DeepSeek-V2-Lite-Chat-GGUF](https://huggingface.co/ZeroWw/DeepSeek-V2-Lite-Chat-GGUF)
* [ZeroWw/NuminaMath-7B-TIR-GGUF](https://huggingface.co/ZeroWw/NuminaMath-7B-TIR-GGUF)
* [ZeroWw/Phi-3-mini-128k-instruct-abliterated-v3-GGUF](https://huggingface.co/ZeroWw/Phi-3-mini-128k-instruct-abliterated-v3-GGUF)
* [ZeroWw/Phi-3-song-lyrics-1.0-GGUF](https://huggingface.co/ZeroWw/Phi-3-song-lyrics-1.0-GGUF)
* [ZeroWw/Meta-Llama-3-8B-Instruct-GGUF](https://huggingface.co/ZeroWw/Meta-Llama-3-8B-Instruct-GGUF)
* [ZeroWw/LLaMAX3-8B-Alpaca-GGUF](https://huggingface.co/ZeroWw/LLaMAX3-8B-Alpaca-GGUF)
* [ZeroWw/LLaMAX3-8B-GGUF](https://huggingface.co/ZeroWw/LLaMAX3-8B-GGUF)
* [ZeroWw/Moistral-11B-v3-GGUF](https://huggingface.co/ZeroWw/Moistral-11B-v3-GGUF)
* [ZeroWw/Moistral-11B-v4-GGUF](https://huggingface.co/ZeroWw/Moistral-11B-v4-GGUF)
* [ZeroWw/L3-Blackfall-Summanus-v0.1-15B-GGUF](https://huggingface.co/ZeroWw/L3-Blackfall-Summanus-v0.1-15B-GGUF)
* [ZeroWw/Smegmma-Deluxe-9B-v1-GGUF](https://huggingface.co/ZeroWw/Smegmma-Deluxe-9B-v1-GGUF)
* [ZeroWw/Smegmma-9B-v1-GGUF](https://huggingface.co/ZeroWw/Smegmma-9B-v1-GGUF)
* [ZeroWw/internlm2_5-7b-chat-GGUF](https://huggingface.co/ZeroWw/internlm2_5-7b-chat-GGUF)
* [ZeroWw/glm-4-9b-chat-GGUF](https://huggingface.co/ZeroWw/glm-4-9b-chat-GGUF)
* [ZeroWw/llama3-8B-DarkIdol-2.2-Uncensored-1048K-GGUF](https://huggingface.co/ZeroWw/llama3-8B-DarkIdol-2.2-Uncensored-1048K-GGUF)
* [ZeroWw/Gemma-2-9B-It-SPPO-Iter3-GGUF](https://huggingface.co/ZeroWw/Gemma-2-9B-It-SPPO-Iter3-GGUF)
* [ZeroWw/Phi-3-mini-4k-geminified-GGUF](https://huggingface.co/ZeroWw/Phi-3-mini-4k-geminified-GGUF)
* [ZeroWw/CodeQwen1.5-7B-Chat-GGUF](https://huggingface.co/ZeroWw/CodeQwen1.5-7B-Chat-GGUF)
* [ZeroWw/NeuralPipe-7B-slerp-GGUF](https://huggingface.co/ZeroWw/NeuralPipe-7B-slerp-GGUF)
* [ZeroWw/Llama-3-8B-Instruct-Gradient-4194k-GGUF](https://huggingface.co/ZeroWw/Llama-3-8B-Instruct-Gradient-4194k-GGUF)
* [ZeroWw/gemma-2-9b-it-GGUF](https://huggingface.co/ZeroWw/gemma-2-9b-it-GGUF)
* [ZeroWw/llama3-8B-DarkIdol-2.1-Uncensored-32K-GGUF](https://huggingface.co/ZeroWw/llama3-8B-DarkIdol-2.1-Uncensored-32K-GGUF)
* [ZeroWw/Meta-Llama-3-8B-Instruct-abliterated-v3-GGUF](https://huggingface.co/ZeroWw/Meta-Llama-3-8B-Instruct-abliterated-v3-GGUF)
* [ZeroWw/Hathor_Stable-v0.2-L3-8B-GGUF](https://huggingface.co/ZeroWw/Hathor_Stable-v0.2-L3-8B-GGUF)
* [ZeroWw/L3-Aethora-15B-V2-GGUF](https://huggingface.co/ZeroWw/L3-Aethora-15B-V2-GGUF)
* [ZeroWw/L3-8B-Stheno-v3.3-32K-GGUF](https://huggingface.co/ZeroWw/L3-8B-Stheno-v3.3-32K-GGUF)
* [ZeroWw/Llama-3-8B-Instruct-Gradient-1048k-GGUF](https://huggingface.co/ZeroWw/Llama-3-8B-Instruct-Gradient-1048k-GGUF)
* [ZeroWw/Pythia-Chat-Base-7B-GGUF](https://huggingface.co/ZeroWw/Pythia-Chat-Base-7B-GGUF)
* [ZeroWw/Yi-1.5-6B-Chat-GGUF](https://huggingface.co/ZeroWw/Yi-1.5-6B-Chat-GGUF)
* [ZeroWw/DeepSeek-Coder-V2-Lite-Base-GGUF](https://huggingface.co/ZeroWw/DeepSeek-Coder-V2-Lite-Base-GGUF)
* [ZeroWw/Yi-1.5-9B-32K-GGUF](https://huggingface.co/ZeroWw/Yi-1.5-9B-32K-GGUF)
* [ZeroWw/aya-23-8B-GGUF](https://huggingface.co/ZeroWw/aya-23-8B-GGUF)
* [ZeroWw/MixTAO-7Bx2-MoE-v8.1-GGUF](https://huggingface.co/ZeroWw/MixTAO-7Bx2-MoE-v8.1-GGUF)
* [ZeroWw/Phi-3-medium-128k-instruct-GGUF](https://huggingface.co/ZeroWw/Phi-3-medium-128k-instruct-GGUF)
* [ZeroWw/Phi-3-mini-128k-instruct-GGUF](https://huggingface.co/ZeroWw/Phi-3-mini-128k-instruct-GGUF)
* [ZeroWw/Qwen1.5-7B-Chat-GGUF](https://huggingface.co/ZeroWw/Qwen1.5-7B-Chat-GGUF)
* [ZeroWw/NeuralDaredevil-8B-abliterated-GGUF](https://huggingface.co/ZeroWw/NeuralDaredevil-8B-abliterated-GGUF)
* [ZeroWw/Mistroll-7B-v2.2-GGUF](https://huggingface.co/ZeroWw/Mistroll-7B-v2.2-GGUF)
* [ZeroWw/Samantha-Qwen-2-7B-GGUF](https://huggingface.co/ZeroWw/Samantha-Qwen-2-7B-GGUF)
* [ZeroWw/NSFW_DPO_Noromaid-7b-Mistral-7B-Instruct-v0.1-GGUF](https://huggingface.co/ZeroWw/NSFW_DPO_Noromaid-7b-Mistral-7B-Instruct-v0.1-GGUF)
* [ZeroWw/microsoft_WizardLM-2-7B-GGUF](https://huggingface.co/ZeroWw/microsoft_WizardLM-2-7B-GGUF)
* [ZeroWw/Mistral-7B-Instruct-v0.3-GGUF](https://huggingface.co/ZeroWw/Mistral-7B-Instruct-v0.3-GGUF)