Was this also trained for 7000 steps?

#1
by Xianjun - opened

Hi,

I noticed in another repo, https://huggingface.co/EleutherAI/sae-llama-3-8b-32x-v2, you mentioned that those SAEs were trained for 7000 steps. Was this repo also trained for 7000 steps?

EleutherAI org

It was trained for over 30k steps on over 8B tokens.
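For a rough sanity check of those two figures, the implied throughput is on the order of a quarter million tokens per optimizer step. A minimal sketch of the arithmetic (the batch size and sequence length in the comment are illustrative assumptions, not settings confirmed for this repo):

```python
# Back-of-the-envelope check of the stated figures.
steps = 30_000          # "over 30k steps"
total_tokens = 8e9      # "over 8B tokens"

tokens_per_step = total_tokens / steps
print(f"Implied throughput: ~{tokens_per_step:,.0f} tokens per step")
# ~266,667 tokens/step, e.g. ~130 sequences of 2048 tokens each
# (hypothetical shapes; the actual training config is not given here).
```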
