Update README.md
README.md CHANGED
@@ -15,35 +15,4 @@ license: apache-2.0

Haary/haryra-7B-gguf is an Indonesian-language LLM.

The model [Haary/haryra-7b-id](https://huggingface.co/Haary/haryra-7b-id) is a quantized version of the base model [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca) in GGUF format.
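
As a side note, a GGUF file can be recognized by its header: the 4-byte magic `GGUF` followed by a little-endian uint32 format version. A minimal sketch for checking a downloaded file (the helper name is ours, not part of any library):

```python
import struct

def read_gguf_header(path):
    """Return (is_gguf, version) for the file at `path`.

    A GGUF file begins with the 4-byte magic b"GGUF" followed by a
    little-endian uint32 format version.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            return (False, None)
        (version,) = struct.unpack("<I", f.read(4))
        return (True, version)
```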

## How to run with Python code

You can use GGUF models from Python via the [ctransformers](https://github.com/marella/ctransformers) library.

### How to load this model in Python code, using ctransformers

#### First, install the ctransformers package

Run one of the following commands, according to your system:

```shell
# Base ctransformers with no GPU acceleration
pip install ctransformers
# Or with CUDA GPU acceleration
pip install ctransformers[cuda]
# Or with AMD ROCm GPU acceleration (Linux only)
CT_HIPBLAS=1 pip install ctransformers --no-binary ctransformers
# Or with Metal GPU acceleration for macOS systems only
CT_METAL=1 pip install ctransformers --no-binary ctransformers
```
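
The four install variants above differ only in an environment flag or pip extra. As an illustration, the choice can be expressed as a small lookup table (this helper is hypothetical and not part of ctransformers):

```python
# Hypothetical mapping from acceleration backend to the pip command
# listed above; the names "none"/"cuda"/"rocm"/"metal" are ours.
INSTALL_COMMANDS = {
    "none": "pip install ctransformers",
    "cuda": "pip install ctransformers[cuda]",
    "rocm": "CT_HIPBLAS=1 pip install ctransformers --no-binary ctransformers",   # Linux only
    "metal": "CT_METAL=1 pip install ctransformers --no-binary ctransformers",    # macOS only
}

def install_command(accel: str = "none") -> str:
    """Return the pip command for the requested acceleration backend."""
    return INSTALL_COMMANDS[accel]
```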

#### Simple example code to run ctransformers

```python
from ctransformers import AutoModelForCausalLM

# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
# Note: this example loads Ichsan2895/Merak-7B-v4-GGUF; substitute the repo and GGUF filename of this model to run haryra-7B instead.
llm = AutoModelForCausalLM.from_pretrained("Ichsan2895/Merak-7B-v4-GGUF", model_file="Merak-7B-v4-model-q5_k_m.gguf", model_type="mistral", gpu_layers=50)

print(llm("AI is going to"))
```
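
The base model Open-Orca/Mistral-7B-OpenOrca was trained with the ChatML prompt format (per its model card; we assume the same holds for this quantized derivative), so wrapping the raw prompt in ChatML markers typically gives better results than passing bare text. A sketch:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Wrap a system and user message in ChatML markers, as used by
    Mistral-7B-OpenOrca (format assumed from the base model card)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# The resulting string would be passed to the llm(...) call above.
prompt = build_chatml_prompt("You are a helpful assistant.", "AI is going to")
```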