---
license: apache-2.0
---

# Poro-34B-gguf

This is a GGUF quantization of the Poro-34B model.

Please refer to the original Poro-34B repository's model card for details about the model itself.

The current revision is a quantization of the 700B-token training checkpoint.

The conversion was done with llama.cpp version b1641 (commit 6744dbe924a317e3e2a5a2a4a2037061b2223449) on a Google Compute Engine machine generously sponsored by Valohai.
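As a rough sketch of the conversion workflow described above: llama.cpp builds of that era shipped a `convert.py` script and a `quantize` binary. The exact file names, output paths, and quantization type below are illustrative assumptions, not the exact commands used for this repository.

```shell
# Sketch only: assumes a checkout of llama.cpp at tag b1641, built with `make`,
# and the original Poro-34B weights downloaded to ./Poro-34B.

# 1. Convert the Hugging Face checkpoint to an f16 GGUF file
#    (output filename is a hypothetical choice).
python convert.py ./Poro-34B --outtype f16 --outfile poro-34b-f16.gguf

# 2. Quantize the f16 GGUF to a smaller format
#    (Q4_K_M is one common choice; this repo may use a different type).
./quantize poro-34b-f16.gguf poro-34b-Q4_K_M.gguf Q4_K_M

# 3. Run inference with the quantized model using llama.cpp's main binary.
./main -m poro-34b-Q4_K_M.gguf -p "Suomen pääkaupunki on" -n 64
```

Note that newer llama.cpp releases have since renamed these tools (e.g. `convert_hf_to_gguf.py`, `llama-quantize`, `llama-cli`), so the commands above apply to builds around b1641.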