Files changed (1)
  1. README.md +8 -7
README.md CHANGED
@@ -48,13 +48,14 @@ Model evaluation metrics and results.

  | Benchmark | Metric | Llama-2-7b-ultrachat | Llama-2-7b-pruned50-retrained-ultrachat-quant-ds |
  |------------------------------------------------|---------------|-------------|-------------------------------|
- | [MMLU](https://arxiv.org/abs/2009.03300) | 5-shot, top-1 | xxxx | xxxx |
- | [HellaSwag](https://arxiv.org/abs/1905.07830) | 0-shot | xxxx | xxxx |
- | [WinoGrande](https://arxiv.org/abs/1907.10641) | partial score | xxxx | xxxx |
- | [ARC-c](https://arxiv.org/abs/1911.01547) | | xxxx | xxxx |
- | [TruthfulQA](https://arxiv.org/abs/2109.07958) | 5-shot | xxxx | xxxx |
- | [HumanEval](https://arxiv.org/abs/2107.03374) | pass@1 | xxxx | xxxx |
- | [GSM8K](https://arxiv.org/abs/2110.14168) | maj@1 | xxxx | xxxx |
+ | [MMLU](https://arxiv.org/abs/2009.03300) | 5-shot | 46.1% | 36.9% |
+ | [HellaSwag](https://arxiv.org/abs/1905.07830) | 0-shot | 75.9% | 69.0% |
+ | [WinoGrande](https://arxiv.org/abs/1907.10641) | 5-shot | 72.6% | 65.7% |
+ | [ARC-c](https://arxiv.org/abs/1911.01547) | 25-shot | 52.8% | 45.7% |
+ | [TruthfulQA](https://arxiv.org/abs/2109.07958) | 5-shot | 44.8% | 40.5% |
+ | [GSM8K](https://arxiv.org/abs/2110.14168) | 5-shot | 12.4% | 4.5% |
+ | [AlpacaEval](https://arxiv.org/abs/2107.03374) ([Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf) evaluator) | Win rate | 57.6% | 60.6% |
+ | [AlpacaEval](https://arxiv.org/abs/2107.03374) (GPT-4 Turbo evaluator) | Win rate | 60.6% | 60.6% |

  ## Help