mt5-small-esquad-qg-ae / eval /metric.middle.sentence.sentence_answer.question.asahi417_qg_esquad.default.json
{"validation": {"Bleu_1": 0.21819959021785917, "Bleu_2": 0.13950204008420325, "Bleu_3": 0.09802172254092724, "Bleu_4": 0.07205809316645144, "METEOR": 0.18243528733366013, "ROUGE_L": 0.19549223449343636, "BERTScore": 0.8123438428444653}, "test": {"Bleu_1": 0.20619929561423309, "Bleu_2": 0.1300537207730615, "Bleu_3": 0.09025929420869548, "Bleu_4": 0.06556396851925318, "METEOR": 0.1783634674794608, "ROUGE_L": 0.18816674181811244, "BERTScore": 0.8122122343040007}}