Kquant03/MistralTrix-4x9B-ERP-GGUF
Tags: GGUF, Merge, Inference Endpoints
arXiv: 2101.03961
License: apache-2.0
Files and versions
1 contributor, history: 5 commits
Latest commit 792b7a9: Update README.md (Kquant03, 10 months ago)
File                    Size     LFS  Last commit       Updated
.gitattributes          1.98 kB       Upload 8 files    10 months ago
README.md               5.13 kB       Update README.md  10 months ago
ggml-model-q2_k.gguf    10 GB    LFS  Upload 8 files    10 months ago
ggml-model-q3_k_m.gguf  13.1 GB  LFS  Upload 8 files    10 months ago
ggml-model-q4_0.gguf    17 GB    LFS  Upload 8 files    10 months ago
ggml-model-q4_k_m.gguf  17 GB    LFS  Upload 8 files    10 months ago
ggml-model-q5_0.gguf    20.7 GB  LFS  Upload 8 files    10 months ago
ggml-model-q5_k_m.gguf  20.7 GB  LFS  Upload 8 files    10 months ago
ggml-model-q6_k.gguf    24.7 GB  LFS  Upload 8 files    10 months ago
ggml-model-q8_0.gguf    32 GB    LFS  Upload 8 files    10 months ago
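
Usage note: the listing above only gives the quantized GGUF artifacts, so here is a minimal sketch of fetching one of them and loading it locally. It assumes the optional packages huggingface_hub and llama-cpp-python are installed; the chosen quantization, context size, and prompt are illustrative assumptions, not taken from the model card.

```python
# Minimal sketch: download one GGUF quantization from this repo and run it locally.
# Assumes `pip install huggingface_hub llama-cpp-python`; n_ctx and the prompt are
# arbitrary choices for illustration, not settings documented by the repo.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Pick a quantization from the table above; q4_k_m is 17 GB, q2_k (10 GB) needs less RAM.
model_path = hf_hub_download(
    repo_id="Kquant03/MistralTrix-4x9B-ERP-GGUF",
    filename="ggml-model-q4_k_m.gguf",
)

llm = Llama(model_path=model_path, n_ctx=2048)  # assumed context size
out = llm("Write a short greeting.", max_tokens=64)
print(out["choices"][0]["text"])
```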