Language support problem
Hello,
In Model Details --> Model Description --> Language(s) (NLP), "Persian" is listed as a supported language.
But when I input Persian text into the "Hosted Inference API", it does not output anything. Loading the model into Google Colab didn't work either; it just returns a few tokens.
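For reference, here is a minimal sketch of what I tried in Colab (the checkpoint name below is a placeholder for illustration, not necessarily the exact one from this card):

```python
# Minimal sketch of the Colab test; the checkpoint name is a placeholder
# and may differ from the model this card describes.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/flan-t5-base"  # hypothetical checkpoint for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Persian input: "Hello, how are you?"
prompt = "Translate to English: سلام، حال شما چطور است؟"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```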
Is there anything I'm missing? Any clue would be helpful.
Thank you very much
Hi @SepehrAA,
Thanks for raising the issue.
Please have a look at this GitHub issue: https://github.com/google-research/t5x/issues/1131. It turns out the released checkpoints do not include the multilingual capabilities stated in the paper. We have recently updated the model card to reflect this and to state the languages this model actually supports.
As soon as Google releases the multilingual checkpoints, we'll make sure they are supported in HF transformers from day 0.