Artefact2/Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-GGUF
Tags: GGUF · English · Inference Endpoints · License: apache-2.0
1 contributor · History: 9 commits
Latest commit bfce4a7 (verified) by Artefact2, 10 months ago: "Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-imatrix.dat with huggingface_hub"
| File | Size | Last commit message | Updated |
|------|------|---------------------|---------|
| .gitattributes | 2.18 kB | Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-imatrix.dat with huggingface_hub | 10 months ago |
| Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-IQ2_XS.gguf | 13.7 GB (LFS) | Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-IQ2_XS.gguf with huggingface_hub | 10 months ago |
| Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-IQ2_XXS.gguf | 12.3 GB (LFS) | Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-IQ2_XXS.gguf with huggingface_hub | 10 months ago |
| Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-Q2_K.gguf | 17.3 GB (LFS) | Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-Q2_K.gguf with huggingface_hub | 10 months ago |
| Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-Q2_K_S.gguf | 16 GB (LFS) | Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-Q2_K_S.gguf with huggingface_hub | 10 months ago |
| Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-Q3_K_S.gguf | 20.4 GB (LFS) | Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-Q3_K_S.gguf with huggingface_hub | 10 months ago |
| Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-Q4_K_S.gguf | 26.7 GB (LFS) | Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-Q4_K_S.gguf with huggingface_hub | 10 months ago |
| Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-imatrix.dat | 25.7 MB (LFS) | Upload Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-imatrix.dat with huggingface_hub | 10 months ago |
| README.md | 414 Bytes | Create README.md | 10 months ago |
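The files above are quantized GGUF weights tracked with Git LFS and, per the commit messages, uploaded with huggingface_hub. A minimal sketch for fetching one of the smaller quantizations locally, assuming the repository id Artefact2/Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-GGUF shown above and the standard `hf_hub_download` API (filename chosen here as an example):

```python
# Minimal sketch: download one GGUF quantization from this repository.
# Assumes `pip install huggingface_hub`; repo id and filename are taken
# from the file listing above.
from huggingface_hub import hf_hub_download

REPO_ID = "Artefact2/Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-GGUF"
FILENAME = "Nous-Hermes-2-Mixtruct-v0.1-8x7B-DPO-DARE_TIES-IQ2_XXS.gguf"  # 12.3 GB

# hf_hub_download resolves the LFS pointer and returns the local cache path.
local_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)
print(local_path)
```

The returned path can then be passed to a GGUF-compatible runtime such as llama.cpp.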