akx / Poro-34B-gguf
GGUF
License: apache-2.0
Poro-34B-gguf · 2 contributors · History: 5 commits
Latest commit: e425e36 by akx, "Upload ggml-model-Q5_K.gguf with huggingface_hub", 12 months ago

File                   Size       Last commit
.gitattributes         1.69 kB    Upload ggml-model-Q5_K.gguf with huggingface_hub (12 months ago)
README.md              334 Bytes  Update README.md (12 months ago)
ggml-model-Q3_K.gguf   18.5 GB    LFS · Upload ggml-model-Q3_K.gguf with huggingface_hub (12 months ago)
ggml-model-Q4_K.gguf   22.4 GB    LFS · Upload ggml-model-Q4_K.gguf with huggingface_hub (12 months ago)
ggml-model-Q5_K.gguf   26.1 GB    LFS · Upload ggml-model-Q5_K.gguf with huggingface_hub (12 months ago)
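
The GGUF files above were uploaded with huggingface_hub, so they can be fetched programmatically the same way. A minimal sketch, assuming the huggingface_hub Python package is installed and the repo id akx/Poro-34B-gguf; the choice of the Q4_K file is only an example, any of the listed filenames works:

    # Sketch: download one of the quantized GGUF files listed above.
    # Assumes `pip install huggingface_hub`; pick whichever quantization fits your hardware.
    from huggingface_hub import hf_hub_download

    path = hf_hub_download(
        repo_id="akx/Poro-34B-gguf",
        filename="ggml-model-Q4_K.gguf",  # 22.4 GB; Q3_K (18.5 GB) and Q5_K (26.1 GB) are also listed
    )
    print(path)  # local cache path of the downloaded model file

The returned path can then be passed to a GGUF-compatible runtime such as llama.cpp.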