mt5-small-dequad-qg / eval /metric.middle.sentence.paragraph_answer.question.asahi417_qg_dequad.default.json
{
  "validation": {
    "Bleu_1": 0.11813572122126885,
    "Bleu_2": 0.048813805474005025,
    "Bleu_3": 0.021358538622395132,
    "Bleu_4": 0.007024256092471934,
    "METEOR": 0.12901405970686994,
    "ROUGE_L": 0.11367946195630503,
    "BERTScore": 0.8173051628433459,
    "MoverScore": 0.5571131906947461
  },
  "test": {
    "Bleu_1": 0.10079532543417731,
    "Bleu_2": 0.039860753814126806,
    "Bleu_3": 0.015592228739478001,
    "Bleu_4": 0.004204082237335996,
    "METEOR": 0.11462500299930649,
    "ROUGE_L": 0.09990296575569915,
    "BERTScore": 0.7984951820456914,
    "MoverScore": 0.5460759035900534
  }
}
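A minimal sketch of how such a metrics file can be consumed, assuming it is read from disk under the filename shown above (the summary format printed here is illustrative, not part of the repo):

```python
import json

# Metrics copied verbatim from the eval file above; in practice this string
# would come from open(path).read() on the JSON file in the repo.
metrics_json = (
    '{"validation": {"Bleu_1": 0.11813572122126885, "Bleu_4": 0.007024256092471934, '
    '"METEOR": 0.12901405970686994, "ROUGE_L": 0.11367946195630503, '
    '"BERTScore": 0.8173051628433459, "MoverScore": 0.5571131906947461}, '
    '"test": {"Bleu_1": 0.10079532543417731, "Bleu_4": 0.004204082237335996, '
    '"METEOR": 0.11462500299930649, "ROUGE_L": 0.09990296575569915, '
    '"BERTScore": 0.7984951820456914, "MoverScore": 0.5460759035900534}}'
)

scores = json.loads(metrics_json)

# Print a compact per-split summary of the headline metrics.
for split, vals in scores.items():
    print(f"{split}: BLEU-4={vals['Bleu_4']:.4f}  "
          f"ROUGE-L={vals['ROUGE_L']:.4f}  BERTScore={vals['BERTScore']:.4f}")
```

As expected for a held-out split, every test metric is slightly below its validation counterpart.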