---
base_model: wolfram/miquliz-120b
inference: false
model_creator: Wolfram Ravenwolf
model_name: miquliz-120b
---

# miquliz-120b - Q4 GGUFs

## Description

This repo contains Q4_K_S and Q4_K_M GGUF format model files for Wolfram Ravenwolf's miquliz-120b.

## Provided files

| Name | Quant method | Bits | Size |
| ---- | ------------ | ---- | ---- |
| miquliz-120b.Q4_K_S.gguf | Q4_K_S | 4 | 66.81 GB |
| miquliz-120b.Q4_K_M.gguf | Q4_K_M | 4 | 70.64 GB |

Note: Hugging Face does not support uploading files larger than 50 GB, so each model is uploaded as split files that must be joined back into a single GGUF before use.
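
As a minimal sketch of the rejoining step (the actual part filenames in this repo may differ; the demo below uses small stand-in files so it is runnable anywhere), parts produced with the standard `split` utility can be concatenated back byte-for-byte with `cat`:

```shell
# Demonstration of splitting and rejoining a file, analogous to the >50 GB GGUFs here.
# The filenames are hypothetical stand-ins, not the real part names in this repo.
printf 'GGUF-demo-bytes' > model.gguf         # stand-in for a large model file
split -b 4 -d model.gguf model.gguf.part-     # split into 4-byte parts: part-00, part-01, ...
cat model.gguf.part-* > model.rejoined.gguf   # the glob sorts lexically, restoring order
cmp model.gguf model.rejoined.gguf && echo "identical"
```

For the real files, run the equivalent `cat <parts in order> > miquliz-120b.Q4_K_S.gguf` (or `.Q4_K_M.gguf`) on the downloaded parts, and make sure the target disk has room for the full joined size listed in the table above.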