RedPJs-B1.58-1B is a 1B-parameter model trained using the method described in *The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits*.

It was trained on 1T tokens of the RedPajama dataset, and is intended as a research proof-of-concept to test the methodology.
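The 1.58-bit method constrains every weight to the ternary set {-1, 0, +1} using absmean quantization: weights are scaled by the mean absolute value of the weight matrix, rounded, and clipped. A minimal NumPy sketch of that step (the function name is illustrative, not from the model's code):

```python
import numpy as np

def absmean_ternary_quantize(W, eps=1e-6):
    # Absmean scale: mean absolute value of the full weight matrix.
    gamma = np.mean(np.abs(W)) + eps
    # Round to the nearest integer, then clip into {-1, 0, +1}.
    W_q = np.clip(np.round(W / gamma), -1, 1)
    return W_q, gamma
```

During training, the full-precision weights are kept for the gradient update and re-quantized on each forward pass (a straight-through estimator), so the quantizer itself needs no gradient.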
