mt5-small-esquad-qg / eval /metric.middle.sentence.paragraph_answer.question.asahi417_qg_esquad.default.json
{"validation": {"Bleu_1": 0.26551986740133254, "Bleu_2": 0.18090739229530525, "Bleu_3": 0.1312650889906266, "Bleu_4": 0.09791974944266672, "METEOR": 0.22270198760191523, "ROUGE_L": 0.24769038536606258, "BERTScore": 0.8335144254266562}, "test": {"Bleu_1": 0.26027154514079115, "Bleu_2": 0.1775501365996318, "Bleu_3": 0.12883948504882092, "Bleu_4": 0.09615584726648878, "METEOR": 0.22718274705628438, "ROUGE_L": 0.24626052122503178, "BERTScore": 0.8404490829277896}}