---
license: cc-by-nc-4.0
---

<a href="https://www.buymeacoffee.com/PulsarAI" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>

A merge of [ehartford/dolphin-2.1-mistral-7b](https://huggingface.co/ehartford/dolphin-2.1-mistral-7b) and [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca) using the TIES merge method.

### *Weights*

- [ehartford/dolphin-2.1-mistral-7b](https://huggingface.co/ehartford/dolphin-2.1-mistral-7b): 0.5

- [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca): 0.3

### *Density*

- [ehartford/dolphin-2.1-mistral-7b](https://huggingface.co/ehartford/dolphin-2.1-mistral-7b): 0.5

- [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca): 0.5
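The card does not include the merge script itself, but the TIES procedure behind these weight and density settings (trim each task vector to its top-density fraction by magnitude, elect a per-parameter sign, then average the sign-consistent entries) can be sketched on toy NumPy arrays roughly as follows. The function name and signature here are illustrative, not the actual tooling used for this model:

```python
import numpy as np

def ties_merge(base, finetuned, densities, weights):
    """Toy sketch of TIES merging: trim, elect sign, disjoint merge."""
    task_vectors = []
    for ft, density in zip(finetuned, densities):
        tv = ft - base                       # task vector
        k = int(round(density * tv.size))    # number of entries to keep
        if 0 < k < tv.size:
            threshold = np.sort(np.abs(tv).ravel())[-k]
            tv = np.where(np.abs(tv) >= threshold, tv, 0.0)
        elif k == 0:
            tv = np.zeros_like(tv)
        task_vectors.append(tv)

    # Elect the dominant sign per parameter from the weighted sum.
    elected = np.sign(sum(w * tv for w, tv in zip(weights, task_vectors)))

    # Disjoint merge: weighted average over sign-consistent, non-zero entries.
    num = np.zeros_like(base, dtype=float)
    den = np.zeros_like(base, dtype=float)
    for w, tv in zip(weights, task_vectors):
        mask = (np.sign(tv) == elected) & (tv != 0)
        num += np.where(mask, w * tv, 0.0)
        den += np.where(mask, w, 0.0)
    return base + np.where(den > 0, num / np.where(den > 0, den, 1.0), 0.0)
```

With density 0.5, half of each task vector's entries (the smallest by magnitude) are zeroed before the sign election, which is what keeps the two fine-tunes from cancelling each other out on low-magnitude parameters.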

# Quantized versions

Quantized versions of this model are available thanks to [TheBloke](https://hf.co/TheBloke).

##### GPTQ

- [TheBloke/Dolphin2.1-OpenOrca-7B-GPTQ](https://huggingface.co/TheBloke/Dolphin2.1-OpenOrca-7B-GPTQ)

##### GGUF

- [TheBloke/Dolphin2.1-OpenOrca-7B-GGUF](https://huggingface.co/TheBloke/Dolphin2.1-OpenOrca-7B-GGUF)

##### AWQ

- [TheBloke/Dolphin2.1-OpenOrca-7B-AWQ](https://huggingface.co/TheBloke/Dolphin2.1-OpenOrca-7B-AWQ)

# [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) Evaluation Results ([Details](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Dolphin2.1-OpenOrca-7B))

| Metric               | Value |
|----------------------|-------|
| Avg.                 | 53.0  |
| ARC (25-shot)        | 63.91 |
| HellaSwag (10-shot)  | 84.26 |
| MMLU (5-shot)        | 62.66 |
| TruthfulQA (0-shot)  | 53.84 |
| Winogrande (5-shot)  | 78.22 |
| GSM8K (5-shot)       | 19.94 |
| DROP (3-shot)        | 8.17  |