Update README.md
README.md CHANGED
@@ -20,11 +20,11 @@ palmer is a series of ~1b parameter language models fine-tuned to be used as base models
 |tinyllama-3 | 0.3029 | 0.5935 | 0.7329 | **0.5959** |
 |palmer-002 | **0.3242** | **0.5956** | **0.7345** | 0.5888 |

+This model shows exceptional performance and, as of now, is the best tinyllama-size base model. Furthermore, it supports the LIMA paper's point and serves as a good open-source alternative to `babbage-002`.

 ### training
 Training took ~3.5 P100 gpu hours. It was trained on 15,000 shuffled gpt-4 samples. palmer was fine-tuned with lower learning rates to ensure it retains as much general knowledge as possible.

-
 ### prompt
 ```
 no prompt
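Since palmer is a base (completion-style) model, "no prompt" means you pass raw text with no chat or instruction template and the model simply continues it. A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the ID `appvoid/palmer-002` (an assumption):

```python
# "No prompt": plain-text completion, no chat or instruction template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "appvoid/palmer-002"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The three primary colors are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```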