---
license: mit
---

This model is a re-merge using Yi-34B-Llama as the base. The original 200K-context base did not seem to perform well after merging several non-200K-context LoRAs, so the merge was redone on the base model that matches those LoRAs.

### Yi-34b-200K-alpaca-rpv3-scipy-6bpw-hb6-exl2

- base model: [Yi-34B-Llama](https://huggingface.co/chargoddard/Yi-34B-Llama)
- LoRA: [Yi-34b-alpaca-cot-lora](https://huggingface.co/zzlgreat/Yi-34b-alpaca-cot-lora)
- LoRA: [Yi-34B-Spicyboros-3.1-LoRA](https://huggingface.co/LoneStriker/Yi-34B-Spicyboros-3.1-LoRA)
- LoRA: [limarpv3-yi-llama-34b-lora](https://huggingface.co/Doctor-Shotgun/limarpv3-yi-llama-34b-lora)

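Merging a LoRA into a base model folds the adapter's low-rank update into the base weights, so the merged checkpoint needs no adapter at inference time. A minimal sketch of that arithmetic with toy NumPy tensors (all dimensions and the scaling factor here are illustrative, not the actual 34B weights):

```python
import numpy as np

# Hypothetical toy sizes standing in for one transformer weight matrix.
d_out, d_in, r, alpha = 8, 6, 2, 4.0

rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))  # frozen base weight
A = rng.normal(size=(r, d_in))      # LoRA down-projection
B = rng.normal(size=(d_out, r))     # LoRA up-projection

# Fold the adapter into the base: W' = W + (alpha / r) * B @ A
W_merged = W + (alpha / r) * B @ A

# The merged weight reproduces base-plus-adapter output for any input x.
x = rng.normal(size=(d_in,))
y_adapter = W @ x + (alpha / r) * B @ (A @ x)
assert np.allclose(W_merged @ x, y_adapter)
```

This is why the base and LoRA must match: the update `B @ A` is computed relative to the weights the adapter was trained against, so folding it into a different base (here, the 200K-context variant) shifts the result away from what the adapter learned.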
### description

- This is a test quantization for [exllamav2](https://github.com/turboderp/exllamav2); the exllamav2 version must be newer than the [Add Yi support](https://github.com/turboderp/exllamav2/commit/6d24e1ad40d89f64b1bd3ae36e639c74c9f730b2) commit
- 6.0bpw conversion: `python convert.py -i acsr-y34b -c exl2/0000.parquet -o acsr-y34b-4bpw-hb6-exl2 -hb 6 -l 4096 -b 6` (see the [convert doc](https://github.com/turboderp/exllamav2/blob/master/doc/convert.md))
- calibration dataset: [WikiText-2-v1](https://huggingface.co/datasets/wikitext/blob/refs%2Fconvert%2Fparquet/wikitext-2-v1/test/0000.parquet)
- For oobabooga/text-generation-webui, add `--trust-remote-code` to CMD_FLAGS.txt and load the model with the ExLlamav2 loader
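For the last step, the flag can be appended from a shell; the `CMD_FLAGS.txt` path is relative to the text-generation-webui checkout, so adjust it to your install:

```shell
# Append --trust-remote-code to text-generation-webui's extra launch flags.
# Run from inside the text-generation-webui directory (path is an assumption).
echo '--trust-remote-code' >> CMD_FLAGS.txt
```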