akx/Poro-34B-gguf
GGUF · Inference Endpoints · License: apache-2.0
2 contributors · History: 4 commits
Latest commit 6eaff6e by akx: Upload ggml-model-Q4_K.gguf with huggingface_hub (about 1 year ago)
Files:
.gitattributes        1.63 kB           Upload ggml-model-Q4_K.gguf with huggingface_hub    about 1 year ago
README.md             334 Bytes         Update README.md                                    about 1 year ago
ggml-model-Q3_K.gguf  18.5 GB   (LFS)   Upload ggml-model-Q3_K.gguf with huggingface_hub    about 1 year ago
ggml-model-Q4_K.gguf  22.4 GB   (LFS)   Upload ggml-model-Q4_K.gguf with huggingface_hub    about 1 year ago
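
The GGUF quants listed above can be fetched programmatically. Below is a minimal sketch using the huggingface_hub Python package (the same library named in the upload commit messages); the repo_id and filename are taken from this listing, and the rest is illustrative rather than an official usage snippet for this repo.

    from huggingface_hub import hf_hub_download

    # Download the Q4_K quant from this repo into the local Hugging Face cache
    # and print the resolved local path.
    model_path = hf_hub_download(
        repo_id="akx/Poro-34B-gguf",
        filename="ggml-model-Q4_K.gguf",
    )
    print(model_path)

The returned path can then be handed to any GGUF-aware runtime, for example llama.cpp.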