---
license: apache-2.0
datasets:
- togethercomputer/RedPajama-Data-1T
language:
- en
pipeline_tag: text-generation
library_name: transformers
---
# BSL-470M
BSL-470M is a 470M-parameter language model with the Mistral architecture, pre-trained from scratch on the CC split of RedPajama. It serves as the baseline for PDS-470M.
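Since the card lists `transformers` as the library and `text-generation` as the pipeline tag, the model can presumably be loaded with the standard `AutoModelForCausalLM` API. A minimal sketch (the repo id below is a placeholder; substitute the actual Hugging Face repo id for this model):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with the actual Hugging Face repo for BSL-470M.
model_id = "your-org/BSL-470M"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Simple greedy generation from a short prompt.
inputs = tokenizer("Language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```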
## Evaluation
PDS-selected data improves the performance of language models pre-trained from scratch and reduces pre-training computation. The improvement scales to larger model sizes.
## Citation
```bibtex
@article{gu2024data,
  title={Data Selection via Optimal Control for Language Models},
  author={Gu, Yuxian and Dong, Li and Wang, Hongning and Hao, Yaru and Dong, Qingxiu and Wei, Furu and Huang, Minlie},
  journal={arXiv preprint arXiv:2410.07064},
  year={2024}
}
```