Can I get the error log for my failed model?
Thanks for the great leaderboard.
My model adheres to the guidelines properly,
and I tested it locally with lm-evaluation-harness, using the weights downloaded from huggingface_hub.
It runs fine in the local environment, but it keeps failing on the leaderboard.
Could you please provide the error log for my model?
I apologize for the inconvenience.
MODEL NAME: heavytail/kullm-polyglot-12.8b-S
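For reference, a minimal sketch of this kind of local run using lm-evaluation-harness's Python API (the task name, few-shot setting, and batch size here are illustrative; the leaderboard's exact configuration may differ):

```python
# Minimal local-evaluation sketch with lm-evaluation-harness (v0.4-style API).
# The task list below is illustrative, not the leaderboard's exact task set.
from lm_eval import evaluator

results = evaluator.simple_evaluate(
    model="hf",  # Hugging Face transformers backend
    model_args="pretrained=heavytail/kullm-polyglot-12.8b-S",
    tasks=["kobest_boolq"],
    batch_size=8,
)
print(results["results"])
```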
Hello,
Apologies for the delayed response.
I will forward your request to our PIC (person in charge).
Regards
Hello,
Here is the error log:

```
ValueError: The current architecture does not support Flash Attention 2.0. Please open an issue on GitHub to request support for this architecture: https://github.com/huggingface/transformers/issues/new
```
For models whose architectures do not support Flash Attention, I will modify the evaluation to run without Flash Attention applied.
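For reference, a minimal sketch of such a fallback when loading a model with transformers (the dtype choice and try/except pattern are illustrative assumptions, not the leaderboard's actual code):

```python
import torch
from transformers import AutoModelForCausalLM

model_name = "heavytail/kullm-polyglot-12.8b-S"

try:
    # Prefer Flash Attention 2 when the architecture supports it.
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.bfloat16,  # FA2 requires fp16/bf16 weights
        attn_implementation="flash_attention_2",
    )
except ValueError:
    # Unsupported architectures raise the ValueError shown above;
    # fall back to the default attention implementation.
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.bfloat16,
    )
```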
Thanks
@choco9966 Thank you so much for your kind assistance! That fully answers my question.