---
language:
- en
tags:
- pytorch
- text-generation
- causal-lm
- rwkv
license: apache-2.0
datasets:
- the_pile
---
# RWKV-4 14B
## Model Description
RWKV-4 14B is an L40-D5120 (40-layer, 5120-dimensional) causal language model trained on the Pile. See https://github.com/BlinkDL/RWKV-LM for details.
Use https://github.com/BlinkDL/ChatRWKV to run it; a minimal inference sketch is included after the hyperparameters below.
* ctx_len = 1024
* n_layer = 40
* n_embd = 5120
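
This card does not ship inference code, so the following is only a minimal sketch. It assumes the community `rwkv` pip package that ChatRWKV is built around (`pip install rwkv`), its `RWKV` / `PIPELINE` helpers, a locally downloaded checkpoint, and the `20B_tokenizer.json` file from the ChatRWKV repository; the paths, strategy string, and sampling settings are illustrative assumptions, not part of this card.

```python
# Minimal inference sketch (assumptions: `rwkv` pip package, local paths, cuda fp16 strategy).
import os
os.environ['RWKV_JIT_ON'] = '1'  # optional JIT speed-up used in the ChatRWKV examples

from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# Path to the downloaded checkpoint, given without the .pth extension;
# the strategy string places the whole model on one GPU in fp16 (illustrative choice).
model = RWKV(model='/path/to/RWKV-4-Pile-14B-20230213-8019',
             strategy='cuda fp16')

# PIPELINE wraps tokenization and sampling; 20B_tokenizer.json comes from the ChatRWKV repo.
pipeline = PIPELINE(model, '20B_tokenizer.json')
args = PIPELINE_ARGS(temperature=1.0, top_p=0.85)  # example sampling settings

prompt = "\nIn a shocking finding, scientists discovered"
print(pipeline.generate(prompt, token_count=100, args=args))
```
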
RWKV-4-Pile-14B-2023xxxx-ctx4096-testxxx.pth : Fine-tuned to ctx_len 4096.
* Likely better than the base ctx_len 1024 checkpoint; please test.
RWKV-4-Pile-14B-20230213-8019.pth : Trained on the Pile for 331B tokens.
* Pile loss 1.7579
* LAMBADA ppl 3.81, acc 71.05%
* PIQA acc 77.42%
* SC2016 acc 75.57%
* Hellaswag acc_norm 70.24%
* WinoGrande acc 62.98%
ctx4096-test282: use this checkpoint when you need a long context length; it may be slightly weaker at short context lengths.
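
For the long-context fine-tuned checkpoints, loading works the same way as in the sketch above; only the checkpoint path changes. The filename below is the placeholder listed in this card and should be replaced with the name of the file actually downloaded.

```python
# Same `rwkv` package API as in the sketch above; only the path differs.
# The filename is the placeholder from this card (substitute the real file name).
from rwkv.model import RWKV

model_long = RWKV(model='/path/to/RWKV-4-Pile-14B-2023xxxx-ctx4096-testxxx',
                  strategy='cuda fp16')
```
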