Hugging Face
latestissue/rwkv-6-finch-14b-gguf
Tags: GGUF, Inference Endpoints, conversational
License: apache-2.0
Branch: main | 1 contributor | History: 79 commits
Latest commit: f16e4d8 (verified), "Upload 9 files" by latestissue, about 1 month ago
File                              Size       LFS   Last commit        Updated
.gitattributes                    3.06 kB          Upload 23 files    3 months ago
README.md                         27 Bytes         Update README.md   3 months ago
rwkv-6-finch-14b-BF16.gguf        28.5 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-F16.gguf         28.6 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-IQ3_S.gguf       6.74 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-IQ3_XXS.gguf     6.08 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-IQ4_NL.gguf      8.55 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-IQ4_XS.gguf      8.12 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-Q2_K.gguf        5.36 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-Q3_K.gguf        6.74 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q3_K_L.gguf      6.74 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q4_0.gguf        8.55 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q4_0_4_4.gguf    8.55 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q4_0_4_8.gguf    8.55 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q4_0_8_8.gguf    8.55 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q4_1.gguf        9.4 GB     LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q4_K.gguf        8.55 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-Q4_K_S.gguf      8.55 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-Q5_0.gguf        10.3 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q5_1.gguf        11.1 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q5_K.gguf        10.3 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-Q5_K_S.gguf      10.3 GB    LFS   Upload 9 files     about 1 month ago
rwkv-6-finch-14b-Q6_K.gguf        12.1 GB    LFS   Upload 13 files    about 1 month ago
rwkv-6-finch-14b-Q8_0.gguf        15.4 GB    LFS   Upload 13 files    about 1 month ago