---
license: apache-2.0
---

# Sidrap-7B-v2-GPTQ-4bit

Sidrap-7B-v2-GPTQ-4bit is a 4-bit quantized version of Sidrap-7B-v2, one of the best open Indonesian-language LLMs available today. It was quantized with [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ) to produce a smaller model that can run in lower-resource environments with faster inference. Quantization uses a random subset of the original training data to "calibrate" the weights, yielding an optimally compact model with minimal loss of accuracy.
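The resource savings can be estimated with simple arithmetic (a rough sketch; actual on-disk and GPU sizes also depend on group-size metadata and any layers left unquantized):

```python
# Rough memory estimate for a 7B-parameter model at different precisions.
PARAMS = 7_000_000_000

def gigabytes(bits_per_param: float) -> float:
    """Storage for PARAMS weights at the given bit width, in GiB."""
    return PARAMS * bits_per_param / 8 / 1024**3

fp16 = gigabytes(16)  # half-precision baseline
int4 = gigabytes(4)   # 4-bit quantized
print(f"fp16: {fp16:.1f} GiB, 4-bit: {int4:.1f} GiB, ratio: {fp16 / int4:.0f}x")
```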
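Conceptually, 4-bit group quantization maps each group of weights onto 16 integer levels with a per-group scale and offset. The sketch below is a plain round-to-nearest illustration of that idea, not GPTQ itself (GPTQ additionally uses the calibration data to compensate rounding error layer by layer):

```python
import numpy as np

def quantize_4bit(w, group_size=8):
    """Round-to-nearest 4-bit quantization with a per-group scale and offset."""
    g = w.reshape(-1, group_size)
    lo = g.min(axis=1, keepdims=True)
    hi = g.max(axis=1, keepdims=True)
    scale = np.maximum(hi - lo, 1e-8) / 15.0  # 4 bits -> 16 levels (0..15)
    q = np.clip(np.round((g - lo) / scale), 0, 15).astype(np.uint8)
    return q, scale, lo

def dequantize_4bit(q, scale, lo):
    """Reconstruct approximate float weights from 4-bit codes."""
    return (q.astype(np.float32) * scale + lo).reshape(-1)

rng = np.random.default_rng(0)
w = rng.normal(size=64).astype(np.float32)
q, scale, lo = quantize_4bit(w)
w_hat = dequantize_4bit(q, scale, lo)
# Worst-case round-to-nearest error is half a quantization step per group.
print(float(np.abs(w - w_hat).max()))
```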
## Usage

The fastest way to use this model is via [AutoGPTQ-API](https://github.com/anvie/gptq-api):

```bash
python -m gptqapi.server robinsyihab/Sidrap-7B-v2-GPTQ-4bit
```

Or use AutoGPTQ directly:

```python
from transformers import AutoTokenizer, pipeline
from auto_gptq import AutoGPTQForCausalLM

model_id = "robinsyihab/Sidrap-7B-v2-GPTQ-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)

model = AutoGPTQForCausalLM.from_quantized(model_id,
                                           device="cuda:0",
                                           inject_fused_mlp=True,
                                           inject_fused_attention=True,
                                           trust_remote_code=True)

chat = pipeline("text-generation",
                model=model,
                tokenizer=tokenizer,
                device_map="auto")

# Llama-2-style chat prompt with an Indonesian system message
prompt = ("<s>[INST] <<SYS>>\nAnda adalah asisten yang suka membantu, penuh hormat, dan jujur. "
          "Selalu jawab semaksimal mungkin, sambil tetap aman. Jawaban Anda tidak boleh berisi "
          "konten berbahaya, tidak etis, rasis, seksis, beracun, atau ilegal. Harap pastikan bahwa "
          "tanggapan Anda tidak memihak secara sosial dan bersifat positif.\n"
          "Jika sebuah pertanyaan tidak masuk akal, atau tidak koheren secara faktual, jelaskan "
          "alasannya daripada menjawab sesuatu yang tidak benar. Jika Anda tidak mengetahui jawaban "
          "atas sebuah pertanyaan, mohon jangan membagikan informasi palsu.\n"
          "<</SYS>>\n\n"
          "Siapa penulis kitab alfiyah? [/INST]\n")

max_size = 512  # maximum total length (prompt + completion) in tokens
sequences = chat(prompt, num_beams=2, max_length=max_size, top_k=10, num_return_sequences=1)
print(sequences[0]['generated_text'])
```
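The pipeline returns the prompt together with the completion; a small helper (hypothetical, not part of the model card) can strip the Llama-2-style wrapper to keep only the reply:

```python
def extract_reply(generated_text: str) -> str:
    """Return only the assistant's answer from a Llama-2-style transcript."""
    # The model's answer follows the closing [/INST] tag.
    return generated_text.split("[/INST]", 1)[-1].strip()

# Illustrative transcript; real output comes from sequences[0]['generated_text'].
sample = ("<s>[INST] <<SYS>>\n...\n<</SYS>>\n\n"
          "Siapa penulis kitab alfiyah? [/INST]\n"
          "Kitab Alfiyah ditulis oleh Ibnu Malik.")
print(extract_reply(sample))
```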

## License

Sidrap-7B-v2-GPTQ is licensed under the Apache 2.0 License.

## Author

- Robin Syihab ([@anvie](https://x.com/anvie))