Commit ece98fb by kingbri (parent: 58689ad): Create README.md
---
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- llama
- llama-2
license: llama2
---

# Model Card: Pygmalion-2-13b-SuperCOT

This is a merge between:
- [Pygmalion 2 13b](https://huggingface.co/PygmalionAI/pygmalion-2-13b)
- [Ausboss's Llama2 SuperCOT loras](https://huggingface.co/ausboss/llama2-13b-supercot-loras) at a weight of 1.00

Quantizations provided by me:
- [GGML](https://huggingface.co/royallab/Pygmalion-2-13b-SuperCOT-GGUF)

The merge was performed with a command-line version of [EzTrainer](https://github.com/CoffeeVampir3/ez-trainer) by CoffeeVampire/Blackroot via [zaraki-tools](https://github.com/CoffeeVampir3/ez-trainer) by Zaraki.

The intended objective is to make Pygmalion-2 smarter and less prone to drifting off topic.

The SuperCOT LoRA was merged at a weight of 1.0.
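
Merging at a weight of 1.0 means the LoRA's low-rank update is folded fully into each base weight matrix, W' = W + 1.0 · scale · (B·A). A toy numeric sketch of that arithmetic (small random matrices chosen for illustration, not the real 13B tensors or the EzTrainer code):

```python
import numpy as np

# Toy LoRA merge: W' = W + merge_weight * scale * (B @ A).
rank, d_in, d_out = 4, 16, 16
rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))   # base weight matrix
A = rng.normal(size=(rank, d_in))    # LoRA down-projection
B = rng.normal(size=(d_out, rank))   # LoRA up-projection
lora_alpha = 8
scale = lora_alpha / rank            # standard LoRA scaling factor
merge_weight = 1.0                   # "merged at a weight of 1"
W_merged = W + merge_weight * scale * (B @ A)
assert W_merged.shape == W.shape
```

At weight 1.0 the merged matrix carries the entire LoRA delta; a lower weight would blend in only a fraction of it.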

## Usage

Since this is a merge between Pygmalion-2 and SuperCOT, the following instruction formats should work:

Metharme:

```
<|system|>This is a text adventure game. Describe the scenario to the user and give him three options to pick from on each turn.<|user|>Start!<|model|>
```

Alpaca:

```
### Instruction:
Your instruction or question here.
### Response:
```
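
In code, either format is just a string handed to the tokenizer before generation. A minimal sketch of assembling the Metharme prompt above (the helper function name is my own, not part of the model's tooling):

```python
# Sketch: assemble a Metharme-format prompt string.
# The role tokens <|system|>, <|user|>, <|model|> come from the model card;
# the model generates its reply after the trailing <|model|> token.
def metharme_prompt(system: str, user: str) -> str:
    return f"<|system|>{system}<|user|>{user}<|model|>"

prompt = metharme_prompt(
    "This is a text adventure game. Describe the scenario to the user "
    "and give him three options to pick from on each turn.",
    "Start!",
)
print(prompt)
```

The resulting string can be passed directly to a `transformers` text-generation pipeline or to `tokenizer()` for manual generation.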

## Bias, Risks, and Limitations

The model will exhibit biases similar to those observed on niche roleplaying forums on the Internet, in addition to those of the base model. It is not intended to supply factual information or advice in any form.

## Training Details

This model is a merge and can be reproduced using the tools mentioned above. Please refer to the links provided for model-specific details.