Update README.md
README.md CHANGED

@@ -20,7 +20,7 @@ Due to the yet-emerging good performance of tinyllama, this model is considered
 |tinyllama-2t| 0.2807| 0.5463| 0.7067| 0.5683|
 |palmer-001 | 0.2807| 0.5524| 0.7106| 0.5896|
 |sheared-1.3b| 0.2910| 0.5935| 0.7339| 0.5809|
-|palmer-
+|palmer-1.3b | 0.3157| **0.6022**| 0.7334| 0.5864|
 |falcon-rw-1b-instruct-openorca (sota) | **0.3362**| 0.5997| **0.7394**| **0.6148**|
 
 This model was trained on less than 25% of the dataset yet achieves performance competitive with the current sota on the open llm leaderboard. Wait for what's coming!
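To make the "competitive with sota" claim concrete, here is a minimal sketch that computes the per-metric gap between the added palmer-1.3b row and the falcon-rw-1b-instruct-openorca (sota) row, using the scores exactly as they appear in the table above. The table does not name its benchmark columns, so they are treated generically as metrics 1 through 4; the averaging at the end is an illustration, not a figure reported in the README.

```python
# Scores copied from the README table; benchmark column names are not
# given there, so the four values are treated as unnamed metrics 1..4.
palmer = [0.3157, 0.6022, 0.7334, 0.5864]  # palmer-1.3b row
sota = [0.3362, 0.5997, 0.7394, 0.6148]    # falcon-rw-1b-instruct-openorca row

# Per-metric gap to sota; a negative gap means palmer-1.3b is ahead.
gaps = [round(s - p, 4) for p, s in zip(palmer, sota)]
print(gaps)  # e.g. palmer-1.3b leads on the second metric

# Simple (unweighted) averages of the two rows for a single-number view.
palmer_avg = round(sum(palmer) / len(palmer), 4)
sota_avg = round(sum(sota) / len(sota), 4)
print(palmer_avg, sota_avg)
```

The largest deficit is about 0.028 on the fourth metric, while the second metric slightly favors palmer-1.3b, which is what the README's bolded **0.6022** already signals.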