Datasets: leaderboard_dev / czechbench_leaderboard / gemma-2-9b-it_eval_request.json

Commit 38e88b5 (verified) by davidadamczyk: "Add gemma-2-9b-it to eval queue"
{
  "eval_name": "gemma-2-9b-it",
  "precision": "bfloat16",
  "hf_model_id": "https://huggingface.co/google/gemma-2-9b-it",
  "contact_email": "jirkoada@cvut.cz",
  "agree_cs": 0.6586921850079744,
  "anli_cs": 0.5658333333333333,
  "arc_challenge_cs": 0.8208191126279863,
  "arc_easy_cs": 0.9212962962962963,
  "belebele_cs": 0.9039106145251397,
  "ctkfacts_cs": 0.6989247311827957,
  "czechnews_cs": 0.799,
  "fb_comments_cs": 0.766,
  "gsm8k_cs": 0.5064442759666414,
  "klokanek_cs": 0.275990099009901,
  "mall_reviews_cs": 0.6323333333333333,
  "mmlu_cs": 0.5926015473887815,
  "sqad_cs": 0.7532621589561092,
  "subjectivity_cs": 0.828,
  "truthfulqa_cs": 0.6403940886699507
}
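A minimal sketch of how a consumer might parse this eval request and aggregate the per-task accuracies. The record is inlined from the file above so the snippet is self-contained (in practice you would `json.load()` the file from the repo); the unweighted mean computed here is purely illustrative and may differ from the leaderboard's own aggregation.

```python
import json

# Eval request record, copied verbatim from gemma-2-9b-it_eval_request.json.
record = json.loads("""
{"eval_name": "gemma-2-9b-it", "precision": "bfloat16",
 "hf_model_id": "https://huggingface.co/google/gemma-2-9b-it",
 "contact_email": "jirkoada@cvut.cz",
 "agree_cs": 0.6586921850079744, "anli_cs": 0.5658333333333333,
 "arc_challenge_cs": 0.8208191126279863, "arc_easy_cs": 0.9212962962962963,
 "belebele_cs": 0.9039106145251397, "ctkfacts_cs": 0.6989247311827957,
 "czechnews_cs": 0.799, "fb_comments_cs": 0.766,
 "gsm8k_cs": 0.5064442759666414, "klokanek_cs": 0.275990099009901,
 "mall_reviews_cs": 0.6323333333333333, "mmlu_cs": 0.5926015473887815,
 "sqad_cs": 0.7532621589561092, "subjectivity_cs": 0.828,
 "truthfulqa_cs": 0.6403940886699507}
""")

# Per-task accuracies carry the "_cs" (Czech) suffix; the remaining keys
# ("eval_name", "precision", "hf_model_id", "contact_email") are metadata.
scores = {k: v for k, v in record.items() if k.endswith("_cs")}
mean_score = sum(scores.values()) / len(scores)
print(f"{record['eval_name']}: {len(scores)} Czech tasks, mean = {mean_score:.4f}")
```

Filtering on the `_cs` suffix keeps the snippet robust if further task scores are appended to the record, since the four metadata keys never use that suffix.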