mt5-small-esquad-qg-ae / eval /metric.first.sentence.sentence_answer.question.asahi417_qg_esquad.default.json
{"validation": {"Bleu_1": 0.21792636845923863, "Bleu_2": 0.1391921074136978, "Bleu_3": 0.09777802796816526, "Bleu_4": 0.07188205080719495, "METEOR": 0.18237218489404464, "ROUGE_L": 0.19531823037044455, "BERTScore": 0.8128695168058157}, "test": {"Bleu_1": 0.2063536589641975, "Bleu_2": 0.13014381926677407, "Bleu_3": 0.09034457503515292, "Bleu_4": 0.06563927859821322, "METEOR": 0.17841670321516934, "ROUGE_L": 0.18824461469473566, "BERTScore": 0.8123440295728589}}