Update README.md
README.md
CHANGED
@@ -22,14 +22,20 @@ Highest quality will be Q6/Q8.
This merge was an experiment to test the already established Roleplay, Fiction and Story
generation of "Tiefighter" with some of "Orca 2"'s qualities.

-For Imatrix plus this was a test of high precision in specific areas of the model leading to a slightly larger
+For Imatrix plus this was a test of high precision in specific areas of the model, leading to a slightly larger compressed file.
In addition, the Imatrix process itself used a larger "calibration" file than standard to further enhance quality.

+The process added approximately 310 MB to each compressed file.
+
A blank or standard Alpaca Template for text generation will work.
Currently "CHATML" is untested.

Context length: 4096.

+Please see the original model card for specific details of use, additional credits and tips:
+
+[KoboldAI/LLaMA2-13B-Tiefighter](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter)
+
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
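For readers unfamiliar with mergekit, a merge like this is normally driven by a small YAML recipe. The sketch below is illustrative only: the actual merge method, weights, and the exact Orca 2 repository used for this model are not stated in the card, so every value in it is an assumption.

```python
# Purely illustrative mergekit recipe -- the real method, weights, and the exact
# Orca 2 repository for this merge are not given in the card above.
import subprocess
from pathlib import Path

config = """\
models:
  - model: KoboldAI/LLaMA2-13B-Tiefighter
    parameters:
      weight: 0.5
  - model: microsoft/Orca-2-13b
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16
"""

Path("merge-config.yml").write_text(config)

# mergekit's command-line entry point; see the mergekit README for all options.
subprocess.run(["mergekit-yaml", "merge-config.yml", "./merged-model"], check=True)
```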
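The "Imatrix plus" step mentioned in the diff corresponds to llama.cpp's importance-matrix workflow: an importance matrix is computed from a calibration text and then applied during quantization. A minimal sketch follows; the file names are placeholders, the calibration file shown is not the one actually used for this model, and tool names and flags vary between llama.cpp releases.

```python
# Illustrative sketch of the llama.cpp importance-matrix (imatrix) workflow.
# Assumes the llama.cpp tools are built and on PATH; all file names below are
# placeholders, not the actual files used for this model.
import subprocess

FP16_GGUF = "merged-model-f16.gguf"    # full-precision merge output (placeholder)
CALIBRATION = "calibration-large.txt"  # larger-than-standard calibration text (placeholder)
IMATRIX = "imatrix.dat"

# 1. Compute the importance matrix from the calibration text.
#    (Older llama.cpp builds name this binary "imatrix" rather than "llama-imatrix".)
subprocess.run(
    ["llama-imatrix", "-m", FP16_GGUF, "-f", CALIBRATION, "-o", IMATRIX],
    check=True,
)

# 2. Quantize with the importance matrix applied, e.g. to Q6_K.
#    (Older builds name this binary "quantize" rather than "llama-quantize".)
subprocess.run(
    ["llama-quantize", "--imatrix", IMATRIX, FP16_GGUF, "merged-model-Q6_K.gguf", "Q6_K"],
    check=True,
)
```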
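To illustrate the Alpaca template and 4096-token context noted above, here is a small llama-cpp-python sketch. The GGUF filename is a placeholder for whichever quant is downloaded, and the sampling settings are arbitrary.

```python
# Minimal llama-cpp-python sketch: standard Alpaca-style prompt, 4096-token context.
# The GGUF filename is a placeholder for whichever quant (Q6/Q8, etc.) you use.
from llama_cpp import Llama

llm = Llama(model_path="merged-model-Q6_K.gguf", n_ctx=4096)

prompt = (
    "### Instruction:\n"
    "Write the opening paragraph of a space-opera short story.\n\n"
    "### Response:\n"
)

out = llm(prompt, max_tokens=300, temperature=0.8, stop=["### Instruction:"])
print(out["choices"][0]["text"])
```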