v000000/L3.1-Niitorm-8B-LATCOSx2-Version-GGUFs-IMATRIX (3 likes)
Tags: GGUF, llama, Merge, llama-cpp, Inference Endpoints, conversational
Branch: main | 1 contributor | History: 28 commits
Latest commit: 1e193d7 (verified, 2 months ago) by v000000: "Upload l3.1-niitorm-8b-latcosx2.q6_k.gguf with huggingface_hub"
Files (all scanned as Safe):

| File | Size | LFS | Last commit message | Updated |
|---|---|---|---|---|
| .gitattributes | 2.43 kB | | Upload l3.1-niitorm-8b-latcosx2.q6_k.gguf with huggingface_hub | 2 months ago |
| README.md | 1.21 kB | | Update README.md | 2 months ago |
| imatrix.dat | 4.99 MB | LFS | Upload imatrix.dat | 2 months ago |
| l3.1-niitorm-8b-latcosx2-q8_0.gguf | 8.54 GB | LFS | Upload l3.1-niitorm-8b-latcosx2-q8_0.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-iq4_xs.gguf | 4.45 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-iq4_xs.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-q4_0_4_4-ARM_SOC.gguf | 4.66 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-q4_0_4_4-ARM_SOC.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-q4_0_4_8-ARM_SOC.gguf | 4.66 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-q4_0_4_8-ARM_SOC.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-q4_k_m.gguf | 4.92 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-q4_k_m.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-q4_k_s.gguf | 4.69 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-q4_k_s.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-q5_k_m.gguf | 5.73 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-q5_k_m.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-q5_k_s.gguf | 5.6 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-q5_k_s.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-q6_k.gguf | 6.6 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-q6_k.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.imat-q8_0.gguf | 8.54 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.imat-q8_0.gguf with huggingface_hub | 2 months ago |
| l3.1-niitorm-8b-latcosx2.q6_k.gguf | 6.6 GB | LFS | Upload l3.1-niitorm-8b-latcosx2.q6_k.gguf with huggingface_hub | 2 months ago |
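
Since each file was uploaded with huggingface_hub, the same library can pull a single quant back down without cloning the whole repo. Below is a minimal sketch, assuming the repo id above, the llama-cpp-python bindings, and an illustrative choice of the q4_k_m quant; the context size, GPU offload, and prompt are placeholder values, not settings recommended by the repo.

```python
# Minimal sketch: fetch one GGUF quant from this repo and run it locally.
# Assumes `pip install huggingface_hub llama-cpp-python`; parameters are illustrative.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

REPO_ID = "v000000/L3.1-Niitorm-8B-LATCOSx2-Version-GGUFs-IMATRIX"
FILENAME = "l3.1-niitorm-8b-latcosx2.imat-q4_k_m.gguf"  # any quant from the table above

# Download (or reuse from the local HF cache) just this one GGUF file.
model_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)

# Load the quantized model with llama.cpp; n_gpu_layers=-1 offloads all layers
# when a GPU-enabled build is available, otherwise it runs on CPU.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Introduce yourself in one sentence."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

The imat-* quants were produced with the bundled imatrix.dat importance matrix, and the q4_0_4_4 / q4_0_4_8 ARM_SOC files target ARM CPU inference; pick the filename accordingly for your hardware.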