May I ask if you are considering expanding the dataset and model scale to 30~65B?
#5 opened by Jackdiy
Personally, I am very much looking forward to your new model.
In fact, I believe the Pygmalion 13B version is superior to other models of its kind.
However, with the emergence of more high-scoring models at 30B and above, such as Falcon and Guanaco, as a fan I still hope to see Pygmalion release larger-scale models.
Of course, I have also seen models such as Manticore and Hippogriff trained on the Pygmalion dataset, and it seems you are also exploring collaborations to produce higher-quality models.
If compute is really the issue, I would be happy to assist by renting some GPUs to help train 33B and 40B models. I can also help test quantized models for you.