hermes42/WizardLM-2-8x22B-imatrix-GGUF
Tags: GGUF · Inference Endpoints · imatrix · conversational
Papers: arXiv:2304.12244 · arXiv:2306.08568 · arXiv:2308.09583
License: apache-2.0
Files and versions
1 contributor (hermes42) · History: 18 commits
Latest commit ea7a12e (verified, 7 months ago): Upload WizardLM-2-8x22B-imatrix-IQ2_XXS.gguf with huggingface_hub
File | Size | LFS | Last commit message | Last updated
.gitattributes | 2.85 kB | | Upload WizardLM-2-8x22B-imatrix-IQ2_XXS.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-IQ2_XXS.gguf | 37.9 GB | LFS | Upload WizardLM-2-8x22B-imatrix-IQ2_XXS.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q4_0-00001-of-00002.gguf | 42.9 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q4_0-00001-of-00002.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q4_0-00002-of-00002.gguf | 36.9 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q4_0-00002-of-00002.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q4_1-00001-of-00003.gguf | 42.9 GB | LFS | Rename WizardLM-2-8x22B-imatrix-Q4_1-00001-of-00002.gguf to WizardLM-2-8x22B-imatrix-Q4_1-00001-of-00003.gguf | 7 months ago
WizardLM-2-8x22B-imatrix-Q4_1-00002-of-00003.gguf | 42.9 GB | LFS | Rename WizardLM-2-8x22B-imatrix-Q4_1-00002-of-00002.gguf to WizardLM-2-8x22B-imatrix-Q4_1-00002-of-00003.gguf | 7 months ago
WizardLM-2-8x22B-imatrix-Q4_1-00003-of-00003.gguf | 2.35 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q4_1-00003-of-00003.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q5_0-00001-of-00003.gguf | 42.9 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q5_0-00001-of-00003.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q5_0-00002-of-00003.gguf | 42.9 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q5_0-00002-of-00003.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q5_0-00003-of-00003.gguf | 11.4 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q5_0-00003-of-00003.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q5_1-00001-of-00002.gguf | 42.9 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q5_1-00001-of-00002.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q5_1-00002-of-00002.gguf | 42.9 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q5_1-00002-of-00002.gguf with huggingface_hub | 7 months ago
WizardLM-2-8x22B-imatrix-Q5_1-00003-of-00002.gguf | 19.8 GB | LFS | Upload WizardLM-2-8x22B-imatrix-Q5_1-00003-of-00002.gguf with huggingface_hub | 7 months ago
imatrix.dat | 58.3 MB | LFS | Upload imatrix.dat with huggingface_hub | 7 months ago
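The commit messages indicate the files were uploaded with huggingface_hub, and the multi-part names (-0000N-of-0000M.gguf) match llama.cpp's split-GGUF naming; if the parts were produced with the gguf-split tool (the repo page does not say), llama.cpp can load the model by pointing at the first part, otherwise the shards may need to be joined first. The sketch below is not from this repository's model card; it only shows one way to fetch a single quantization plus the importance matrix with huggingface_hub. The choice of the Q5_0 shards and the local_dir value are assumptions for illustration.

```python
# Minimal sketch: download one quantization from the file listing above.
# Assumes `pip install huggingface_hub`; the quant choice and local_dir are illustrative.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="hermes42/WizardLM-2-8x22B-imatrix-GGUF",
    # Fetch only the three Q5_0 shards and the importance matrix file.
    allow_patterns=[
        "WizardLM-2-8x22B-imatrix-Q5_0-*.gguf",
        "imatrix.dat",
    ],
    local_dir="WizardLM-2-8x22B-imatrix-GGUF",
)
print(f"Files downloaded to: {local_path}")
```

The imatrix.dat file is presumably the importance matrix used to produce these imatrix quantizations; downloading it is optional for inference but useful if you want to re-quantize the model yourself.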