great model - any chance of qwen update?
magnum is one of my favorite models.
any chance we could get a magnum based on the new qwen 2.5 32B? (this one looks like it's based on qwen 1.5).
thanks.
Possibly. We have yet to experiment with the new 32B - in the meantime it's worth trying out our 27B and 34B (based on Gemma and Yi respectively).
Yes please 🙂
I've been using the 34B and it is indeed good, but the new qwen 2.5 32B is currently benchmarking higher than some 70Bs.
I would take the qwen benchmarks with a grain of salt if I were you. Admittedly, I haven't tried fine-tuning anything bigger than the 7B, but its performance was below a Minitron 8B tuned on the exact same data. Not to mention the apparent filtering of the pretraining data.