Locutusque committed on
Commit fa23fe6
1 Parent(s): 713b227

Update README.md

Files changed (1)
  1. README.md +123 -1
README.md CHANGED
@@ -10,4 +10,126 @@ language:
 license: apache-2.0
 ---

- still in training. Trained on about ~21 billion tokens so far.
+Still in training. Trained on ~21 billion tokens so far.
+
+| Tasks |Version| Filter |n-shot| Metric | | Value | |Stderr|
+|----------------------------------------|-------|----------------|-----:|-----------|---|------:|---|-----:|
+|Open LLM Leaderboard | N/A| | | | | | | |
+| - arc_challenge | 1|none | 25|acc |↑ | 0.2005|± |0.0117|
+| | |none | 25|acc_norm |↑ | 0.2406|± |0.0125|
+| - gsm8k | 3|flexible-extract| 5|exact_match|↑ | 0.0083|± |0.0025|
+| | |strict-match | 5|exact_match|↑ | 0.0000|± |0.0000|
+| - hellaswag | 1|none | 10|acc |↑ | 0.2724|± |0.0044|
+| | |none | 10|acc_norm |↑ | 0.2838|± |0.0045|
+| - mmlu | 2|none | |acc |↑ | 0.2290|± |0.0035|
+| - humanities | 2|none | |acc |↑ | 0.2380|± |0.0062|
+| - formal_logic | 1|none | 5|acc |↑ | 0.2460|± |0.0385|
+| - high_school_european_history | 1|none | 5|acc |↑ | 0.1818|± |0.0301|
+| - high_school_us_history | 1|none | 5|acc |↑ | 0.2647|± |0.0310|
+| - high_school_world_history | 1|none | 5|acc |↑ | 0.2911|± |0.0296|
+| - international_law | 1|none | 5|acc |↑ | 0.2149|± |0.0375|
+| - jurisprudence | 1|none | 5|acc |↑ | 0.2685|± |0.0428|
+| - logical_fallacies | 1|none | 5|acc |↑ | 0.2209|± |0.0326|
+| - moral_disputes | 1|none | 5|acc |↑ | 0.2457|± |0.0232|
+| - moral_scenarios | 1|none | 5|acc |↑ | 0.2369|± |0.0142|
+| - philosophy | 1|none | 5|acc |↑ | 0.1865|± |0.0221|
+| - prehistory | 1|none | 5|acc |↑ | 0.1975|± |0.0222|
+| - professional_law | 1|none | 5|acc |↑ | 0.2432|± |0.0110|
+| - world_religions | 1|none | 5|acc |↑ | 0.3099|± |0.0355|
+| - other | 2|none | |acc |↑ | 0.2375|± |0.0076|
+| - business_ethics | 1|none | 5|acc |↑ | 0.3200|± |0.0469|
+| - clinical_knowledge | 1|none | 5|acc |↑ | 0.2226|± |0.0256|
+| - college_medicine | 1|none | 5|acc |↑ | 0.1965|± |0.0303|
+| - global_facts | 1|none | 5|acc |↑ | 0.1800|± |0.0386|
+| - human_aging | 1|none | 5|acc |↑ | 0.3004|± |0.0308|
+| - management | 1|none | 5|acc |↑ | 0.1942|± |0.0392|
+| - marketing | 1|none | 5|acc |↑ | 0.2735|± |0.0292|
+| - medical_genetics | 1|none | 5|acc |↑ | 0.3000|± |0.0461|
+| - miscellaneous | 1|none | 5|acc |↑ | 0.2478|± |0.0154|
+| - nutrition | 1|none | 5|acc |↑ | 0.2222|± |0.0238|
+| - professional_accounting | 1|none | 5|acc |↑ | 0.2021|± |0.0240|
+| - professional_medicine | 1|none | 5|acc |↑ | 0.1912|± |0.0239|
+| - virology | 1|none | 5|acc |↑ | 0.2590|± |0.0341|
+| - social sciences | 2|none | |acc |↑ | 0.2203|± |0.0075|
+| - econometrics | 1|none | 5|acc |↑ | 0.2368|± |0.0400|
+| - high_school_geography | 1|none | 5|acc |↑ | 0.2020|± |0.0286|
+| - high_school_government_and_politics| 1|none | 5|acc |↑ | 0.1865|± |0.0281|
+| - high_school_macroeconomics | 1|none | 5|acc |↑ | 0.2205|± |0.0210|
+| - high_school_microeconomics | 1|none | 5|acc |↑ | 0.2143|± |0.0267|
+| - high_school_psychology | 1|none | 5|acc |↑ | 0.1908|± |0.0168|
+| - human_sexuality | 1|none | 5|acc |↑ | 0.2672|± |0.0388|
+| - professional_psychology | 1|none | 5|acc |↑ | 0.2386|± |0.0172|
+| - public_relations | 1|none | 5|acc |↑ | 0.1727|± |0.0362|
+| - security_studies | 1|none | 5|acc |↑ | 0.2367|± |0.0272|
+| - sociology | 1|none | 5|acc |↑ | 0.2488|± |0.0306|
+| - us_foreign_policy | 1|none | 5|acc |↑ | 0.2600|± |0.0441|
+| - stem | 2|none | |acc |↑ | 0.2157|± |0.0073|
+| - abstract_algebra | 1|none | 5|acc |↑ | 0.2200|± |0.0416|
+| - anatomy | 1|none | 5|acc |↑ | 0.1778|± |0.0330|
+| - astronomy | 1|none | 5|acc |↑ | 0.1908|± |0.0320|
+| - college_biology | 1|none | 5|acc |↑ | 0.2778|± |0.0375|
+| - college_chemistry | 1|none | 5|acc |↑ | 0.2200|± |0.0416|
+| - college_computer_science | 1|none | 5|acc |↑ | 0.2100|± |0.0409|
+| - college_mathematics | 1|none | 5|acc |↑ | 0.2100|± |0.0409|
+| - college_physics | 1|none | 5|acc |↑ | 0.2157|± |0.0409|
+| - computer_security | 1|none | 5|acc |↑ | 0.2700|± |0.0446|
+| - conceptual_physics | 1|none | 5|acc |↑ | 0.2638|± |0.0288|
+| - electrical_engineering | 1|none | 5|acc |↑ | 0.2483|± |0.0360|
+| - elementary_mathematics | 1|none | 5|acc |↑ | 0.2037|± |0.0207|
+| - high_school_biology | 1|none | 5|acc |↑ | 0.1774|± |0.0217|
+| - high_school_chemistry | 1|none | 5|acc |↑ | 0.2020|± |0.0282|
+| - high_school_computer_science | 1|none | 5|acc |↑ | 0.2500|± |0.0435|
+| - high_school_mathematics | 1|none | 5|acc |↑ | 0.2148|± |0.0250|
+| - high_school_physics | 1|none | 5|acc |↑ | 0.2053|± |0.0330|
+| - high_school_statistics | 1|none | 5|acc |↑ | 0.1481|± |0.0242|
+| - machine_learning | 1|none | 5|acc |↑ | 0.3125|± |0.0440|
+| - truthfulqa_gen | 3|none | 0|bleu_acc |↑ | 0.2362|± |0.0149|
+| | |none | 0|bleu_diff |↑ |-1.0138|± |0.2569|
+| | |none | 0|bleu_max |↑ | 7.9522|± |0.4088|
+| | |none | 0|rouge1_acc |↑ | 0.2595|± |0.0153|
+| | |none | 0|rouge1_diff|↑ |-1.9129|± |0.4349|
+| | |none | 0|rouge1_max |↑ |21.7885|± |0.7307|
+| | |none | 0|rouge2_acc |↑ | 0.1200|± |0.0114|
+| | |none | 0|rouge2_diff|↑ |-1.9771|± |0.3475|
+| | |none | 0|rouge2_max |↑ | 9.0199|± |0.5842|
+| | |none | 0|rougeL_acc |↑ | 0.2570|± |0.0153|
+| | |none | 0|rougeL_diff|↑ |-1.8812|± |0.4185|
+| | |none | 0|rougeL_max |↑ |19.6284|± |0.6850|
+| - truthfulqa_mc1 | 2|none | 0|acc |↑ | 0.1983|± |0.0140|
+| - truthfulqa_mc2 | 2|none | 0|acc |↑ | 0.3861|± |0.0147|
+| - winogrande | 1|none | 5|acc |↑ | 0.4972|± |0.0141|
+
+| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
+|-------------------|------:|------|------|------|---|-----:|---|-----:|
+| - mmlu | 2|none | |acc |↑ |0.2290|± |0.0035|
+| - humanities | 2|none | |acc |↑ |0.2380|± |0.0062|
+| - other | 2|none | |acc |↑ |0.2375|± |0.0076|
+| - social sciences| 2|none | |acc |↑ |0.2203|± |0.0075|
+| - stem | 2|none | |acc |↑ |0.2157|± |0.0073|
+
+| Tasks |Version|Filter|n-shot| Metric | |Value | |Stderr|
+|---------------------------------|------:|------|-----:|--------|---|-----:|---|-----:|
+|agieval_nous | 0|none | |acc_norm|↑ |0.2133|± |0.0081|
+| - agieval_aqua_rat | 1|none | 0|acc |↑ |0.2047|± |0.0254|
+| | |none | 0|acc_norm|↑ |0.1969|± |0.0250|
+| - agieval_logiqa_en | 1|none | 0|acc |↑ |0.2043|± |0.0158|
+| | |none | 0|acc_norm|↑ |0.2304|± |0.0165|
+| - agieval_lsat_ar | 1|none | 0|acc |↑ |0.1739|± |0.0250|
+| | |none | 0|acc_norm|↑ |0.1957|± |0.0262|
+| - agieval_lsat_lr | 1|none | 0|acc |↑ |0.1549|± |0.0160|
+| | |none | 0|acc_norm|↑ |0.1608|± |0.0163|
+| - agieval_lsat_rc | 1|none | 0|acc |↑ |0.1636|± |0.0226|
+| | |none | 0|acc_norm|↑ |0.2119|± |0.0250|
+| - agieval_sat_en | 1|none | 0|acc |↑ |0.2670|± |0.0309|
+| | |none | 0|acc_norm|↑ |0.2621|± |0.0307|
+| - agieval_sat_en_without_passage| 1|none | 0|acc |↑ |0.2670|± |0.0309|
+| | |none | 0|acc_norm|↑ |0.2621|± |0.0307|
+| - agieval_sat_math | 1|none | 0|acc |↑ |0.2182|± |0.0279|
+| | |none | 0|acc_norm|↑ |0.2318|± |0.0285|
+|arc_challenge | 1|none | 0|acc |↑ |0.1945|± |0.0116|
+| | |none | 0|acc_norm|↑ |0.2372|± |0.0124|
+|truthfulqa_mc2 | 2|none | 0|acc |↑ |0.3861|± |0.0147|
+
+| Groups |Version|Filter|n-shot| Metric | |Value | |Stderr|
+|------------|------:|------|------|--------|---|-----:|---|-----:|
+|agieval_nous| 0|none | |acc_norm|↑ |0.2133|± |0.0081|