---
license: mit
datasets:
- Open-Orca/SlimOrca
- beaugogh/openorca-multiplechoice-10k
language:
- en
metrics:
- accuracy
tags:
- merge
---


This model uses Llama 2 7B as a backbone; it was built by fine-tuning the backbone on several Orca-style datasets and merging the resulting models.


The three fine-tuned models were combined, and the model with the best ARC and MMLU scores was given the highest weight in the merge.
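As a rough illustration of this kind of merge, the sketch below takes a weighted average of three checkpoints' state dicts. The checkpoint names and weight values are hypothetical placeholders, not the exact recipe used for this model.

```python
from transformers import AutoModelForCausalLM

# Hypothetical checkpoints and weights; per the card, the model with the
# best ARC/MMLU scores receives the highest weight.
checkpoints = {
    "your-org/llama2-7b-mc10k-neftune": 0.5,  # hypothetical best performer
    "your-org/llama2-7b-slimorca": 0.3,
    "your-org/llama2-7b-mc10k": 0.2,
}

merged = None
for name, weight in checkpoints.items():
    state = AutoModelForCausalLM.from_pretrained(name).state_dict()
    if merged is None:
        merged = {k: v.float() * weight for k, v in state.items()}
    else:
        for k, v in state.items():
            merged[k] += v.float() * weight

# Load the averaged weights into a fresh Llama 2 7B and save the result.
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
target_dtypes = {k: v.dtype for k, v in model.state_dict().items()}
model.load_state_dict({k: v.to(target_dtypes[k]) for k, v in merged.items()})
model.save_pretrained("merged-model")
```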


First: Llama 2 7B fine-tuned on beaugogh/openorca-multiplechoice-10k using the NEFTune method (see the sketch after this list).


Second: Llama 2 7B fine-tuned on the Open-Orca/SlimOrca dataset.

Third: Llama 2 7B fine-tuned on beaugogh/openorca-multiplechoice-10k (without NEFTune).
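NEFTune works by adding uniform random noise to the token embeddings during fine-tuning, scaled by alpha / sqrt(L * d), where L is the sequence length and d the embedding dimension. Below is a minimal PyTorch sketch of that noise injection via a forward hook; the alpha value is a hypothetical default, not necessarily the one used for this model.

```python
import torch
from torch import nn

def add_neftune_hook(embedding: nn.Embedding, alpha: float = 5.0):
    """Register a forward hook that adds uniform noise to the embedding
    output during training, scaled by alpha / sqrt(seq_len * emb_dim)."""
    def hook(module, inputs, output):
        if module.training:
            seq_len, emb_dim = output.shape[-2], output.shape[-1]
            scale = alpha / (seq_len * emb_dim) ** 0.5
            # Uniform noise in [-scale, scale], as in the NEFTune paper.
            return output + torch.zeros_like(output).uniform_(-scale, scale)
        return output  # inference is unchanged
    return embedding.register_forward_hook(hook)

# Usage: attach to the model's input embeddings before training.
# handle = add_neftune_hook(model.get_input_embeddings(), alpha=5.0)
# ... run fine-tuning ...
# handle.remove()  # detach the hook afterwards
```

Libraries such as trl and recent versions of transformers expose the same behavior through a neftune_noise_alpha option, so a manual hook like this is only needed in custom training loops.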



We will add benchmark results once the official evaluations are available.