mt5-small-esquad-qg-ae / eval /metric.first.answer.paragraph_answer.question.asahi417_qg_esquad.default.json
{
  "validation": {
    "Bleu_1": 0.2446635702449636,
    "Bleu_2": 0.16418451278614196,
    "Bleu_3": 0.11823868647011701,
    "Bleu_4": 0.0878258444925781,
    "METEOR": 0.2136834484910009,
    "ROUGE_L": 0.229652739792337,
    "BERTScore": 0.8310507408661333
  },
  "test": {
    "Bleu_1": 0.24401122581938253,
    "Bleu_2": 0.16407196791015455,
    "Bleu_3": 0.11786223012607369,
    "Bleu_4": 0.08749247245366705,
    "METEOR": 0.21622474756077356,
    "ROUGE_L": 0.23089303417191653,
    "BERTScore": 0.8339754274687682
  }
}