| Model | Params (B) | Repo | Quantization | Final Score | Strict Prompt Score | Strict Inst Score | Loose Prompt Score | Loose Inst Score | Link |
|---|---|---|---|---|---|---|---|---|---|
| Llama 3 70B | 70 | perplexity/llama-3-sonar-large-32k-chat (OpenRouter) | FP16 | 0.8331 | 0.7671 | 0.8381 | 0.8373 | 0.8897 | https://openrouter.ai/models/perplexity/llama-3-sonar-large-32k-chat |
| Llama 3 70B | 70 | llama3-70b-8192 (Groq) | FP16 | 0.8175 | 0.7431 | 0.8213 | 0.8244 | 0.8813 | https://console.groq.com/docs/models |
| Llama 3 8B | 8 | failspy/Meta-Llama-3-8B-Instruct-abliterated-v3-GGUF | Q8_0 | 0.7589 | 0.7001 | 0.7818 | 0.7394 | 0.8141 | https://huggingface.co/failspy/Meta-Llama-3-8B-Instruct-abliterated-v3-GGUF |
| Llama 3 8B | 8 | MaziyarPanahi/Meta-Llama-3-8B-Instruct-GGUF | Q8_0 | 0.7366 | 0.6765 | 0.7614 | 0.7172 | 0.7914 | https://huggingface.co/MaziyarPanahi/Meta-Llama-3-8B-Instruct-GGUF |
| Phi 3 Medium 4K | 14 | bartowski/Phi-3-medium-4k-instruct-GGUF | Q8_0 | 0.6673 | 0.6100 | 0.7014 | 0.6322 | 0.7254 | https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-GGUF |
| Codestral 22B v0.1 | 22 | bartowski/Codestral-22B-v0.1-GGUF | Q6_K | 0.6074 | 0.5305 | 0.6415 | 0.5730 | 0.6847 | https://huggingface.co/bartowski/Codestral-22B-v0.1-GGUF |
| Mixtral 8x7B v0.1 | 56 | mixtral-8x7b-32768 (Groq) | FP16 | 0.5887 | 0.5028 | 0.6247 | 0.5545 | 0.6727 | https://console.groq.com/docs/models |
| Mistral 7B v0.3 | 7 | MaziyarPanahi/Mistral-7B-Instruct-v0.3-GGUF | Q8_0 | 0.5689 | 0.4972 | 0.5983 | 0.5397 | 0.6403 | https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-v0.3-GGUF |
| Llama 3 8B | 8 | bartowski/Hermes-2-Pro-Llama-3-8B-GGUF | Q8_0 | 0.5686 | 0.5010 | 0.6079 | 0.5287 | 0.6367 | https://huggingface.co/bartowski/Hermes-2-Pro-Llama-3-8B-GGUF |
| Mistral 7B v0.1 | 7 | TheBloke/CapybaraHermes-2.5-Mistral-7B-GGUF | Q8_0 | 0.5473 | 0.4750 | 0.5995 | 0.4935 | 0.6211 | https://huggingface.co/TheBloke/CapybaraHermes-2.5-Mistral-7B-GGUF |
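
Judging from the numbers themselves, the Final Score appears to be the unweighted mean of the four sub-scores (strict/loose at prompt and instruction level); for example, Llama 3 70B: (0.7671 + 0.8381 + 0.8373 + 0.8897) / 4 ≈ 0.8331. Below is a minimal sketch of that check, assuming the table has been exported to a CSV file named `leaderboard.csv` with the column names shown above; the filename and the averaging rule are inferences from the data, not documented by the leaderboard itself.

```python
import csv

# Assumption: the table above was saved as leaderboard.csv with the same
# column names. The averaging rule below is inferred from the scores, not
# stated by the leaderboard.
SUB_SCORES = [
    "Strict Prompt Score",
    "Strict Inst Score",
    "Loose Prompt Score",
    "Loose Inst Score",
]

with open("leaderboard.csv", newline="") as f:
    for row in csv.DictReader(f):
        mean = sum(float(row[c]) for c in SUB_SCORES) / len(SUB_SCORES)
        # e.g. Llama 3 70B: (0.7671 + 0.8381 + 0.8373 + 0.8897) / 4 ≈ 0.8331
        print(f'{row["Model"]:<20} final={row["Final Score"]} mean={mean:.4f}')
```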