How to compute the aggregate score?
#35
opened by mornmirror
For each row in eval_results.csv, I tried to compute the average of the metrics across all tasks, but the result does not match agg_score. How is this score calculated?
Additionally, I tried running an evaluation with lighteval, and the results include a metric called "all (acc / acc_norm)". Is that equivalent to agg_score?
It is the average of these columns: ['commonsense_qa/acc_norm', 'hellaswag/acc_norm', 'openbookqa/acc_norm', 'piqa/acc_norm', 'siqa/acc_norm', 'winogrande/acc_norm', 'arc/acc_norm', 'mmlu/acc_norm']
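For reference, a minimal pandas sketch of that computation, assuming eval_results.csv contains exactly those columns (the column names below are taken from the answer above; the file path and output column name are just for illustration):

```python
import pandas as pd

# Columns whose unweighted mean gives agg_score (per the answer above)
AGG_COLUMNS = [
    "commonsense_qa/acc_norm",
    "hellaswag/acc_norm",
    "openbookqa/acc_norm",
    "piqa/acc_norm",
    "siqa/acc_norm",
    "winogrande/acc_norm",
    "arc/acc_norm",
    "mmlu/acc_norm",
]

# Load the per-row results and average only the listed columns,
# rather than all metric columns in the file.
df = pd.read_csv("eval_results.csv")
df["agg_score_recomputed"] = df[AGG_COLUMNS].mean(axis=1)

print(df["agg_score_recomputed"].head())
```

Averaging every metric column in the file will generally give a different number, which would explain the mismatch you observed.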
guipenedo changed discussion status to closed