---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
datasets:
  - appvoid/no-prompt-15k
---

# palmer

*a better base model*

palmer is a series of ~1B-parameter language models fine-tuned to be used as base models, rather than relying on custom prompts for tasks. This means it can be further fine-tuned on more data with custom prompts as usual, or used for downstream tasks like any other base model. The model gets the best of both worlds: some "bias" toward acting as an assistant, plus the ability to predict the next word from its internet knowledge base. It's a 1.1B llama 2 model, so you can use it with your favorite tools and frameworks.
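Since it behaves like any other base model, a plain transformers workflow is enough. Below is a minimal sketch; the repo id `appvoid/palmer-002` is assumed from this card's title, and the prompt text is only an example.

```python
# minimal generation sketch with Hugging Face transformers;
# the repo id "appvoid/palmer-002" is assumed from this card's title
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("appvoid/palmer-002")
model = AutoModelForCausalLM.from_pretrained("appvoid/palmer-002")

# no prompt template is required; plain text works as with any base model
inputs = tokenizer("The three primary colors are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```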

### evaluation

| Model         | ARC_C  | HellaSwag | PIQA   | Winogrande |
|---------------|--------|-----------|--------|------------|
| tinyllama-2   | 0.2807 | 0.5463    | 0.7067 | 0.5683     |
| palmer-001    | 0.2807 | 0.5524    | 0.7106 | 0.5896     |
| tinyllama-2.5 | 0.3191 | 0.5896    | 0.7307 | 0.5872     |
| tinyllama-3   | 0.3029 | 0.5935    | 0.7329 | 0.5959     |
| palmer-002    | 0.3242 | 0.5956    | 0.7345 | 0.5888     |

This model outperforms the other tinyllama-size base models above on three of the four benchmarks, making it, as of now, the strongest base model at this size. Furthermore, it supports the point made in the LIMA paper and serves as a good open-source alternative to openai's babbage-002.
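The four columns match standard lm-evaluation-harness task names, so scores in this style can be reproduced along the lines of the sketch below; the harness version and exact settings behind the table above are not stated in this card, and the repo id is again an assumption.

```python
# hedged sketch for reproducing benchmarks of this kind with
# EleutherAI's lm-evaluation-harness (pip install lm-eval);
# the exact settings behind the table above are not stated in this card
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=appvoid/palmer-002",  # assumed repo id
    tasks=["arc_challenge", "hellaswag", "piqa", "winogrande"],
)
print(results["results"])
```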

### training

Training took ~3.5 P100 GPU hours on 15,000 shuffled gpt-4-generated samples. palmer was fine-tuned with lower learning rates to ensure it retains as much general knowledge as possible.
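For readers who want to continue fine-tuning as suggested above, here is a hedged sketch using the transformers Trainer; the low learning rate follows the spirit of this card, but the 1e-5 value, the dataset file name, and all other hyperparameters are illustrative assumptions, not the actual training recipe.

```python
# hedged further-fine-tuning sketch; "my_corpus.txt" is a hypothetical
# text file, and all hyperparameters are illustrative assumptions
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "appvoid/palmer-002"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # llama tokenizers ship no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

dataset = load_dataset("text", data_files={"train": "my_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="palmer-finetuned",
        learning_rate=1e-5,  # a low rate, in the spirit of preserving general knowledge
        num_train_epochs=1,
        per_device_train_batch_size=1,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```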

### prompt

no prompt

Buy Me A Coffee