---
license: other
---

This repository contains the weights for the LLaMA-30b model. The model is under a non-commercial license (see the LICENSE file). Use this repository only if you have been granted access to the model by filling out this form but have either lost your copy of the weights or run into trouble converting them to the Transformers format.
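For reference, once the original checkpoint has been converted with the `convert_llama_weights_to_hf.py` script shipped in the `transformers` repository, it can be loaded like any other Transformers model. This is a minimal sketch, not part of this repository's official instructions; the local path `./llama-30b-hf` is a hypothetical example, and `device_map="auto"` additionally requires the `accelerate` package.

```python
# Minimal sketch of loading converted LLaMA-30b weights with Transformers.
# Assumes the raw checkpoint was already converted to the HF format and
# saved under ./llama-30b-hf (hypothetical path).
from transformers import LlamaForCausalLM, LlamaTokenizer

tokenizer = LlamaTokenizer.from_pretrained("./llama-30b-hf")
model = LlamaForCausalLM.from_pretrained(
    "./llama-30b-hf",
    device_map="auto",       # shard the 30B parameters across available GPUs
    low_cpu_mem_usage=True,  # avoid materializing a second full copy in CPU RAM
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```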

## Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric              | Value |
|---------------------|------:|
| Avg.                | 49.73 |
| ARC (25-shot)       | 61.43 |
| HellaSwag (10-shot) | 84.73 |
| MMLU (5-shot)       | 58.45 |
| TruthfulQA (0-shot) | 42.27 |
| Winogrande (5-shot) | 80.03 |
| GSM8K (5-shot)      | 14.86 |
| DROP (3-shot)       |  6.33 |