macadeliccc committed: Update README.md
tags:
- merge
---

# OmniCorso-7B

This model is a finetune of [flemmingmiguel/MBX-7B-v3](https://huggingface.co/flemmingmiguel/MBX-7B-v3) using the jondurbin/truthy-dpo-v0.1 dataset.

![MBX-v3-orca](MBX-v3-orca.png)

## Code Example

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("macadeliccc/MBX-7B-v3-DPO")
model = AutoModelForCausalLM.from_pretrained("macadeliccc/MBX-7B-v3-DPO")

messages = [
    {"role": "system", "content": "Respond to the user's request like a pirate"},
    {"role": "user", "content": "Can you write me a quicksort algorithm?"}
]
gen_input = tokenizer.apply_chat_template(messages, return_tensors="pt")

# Generate a reply and decode only the newly generated tokens
output = model.generate(gen_input, max_new_tokens=256)
print(tokenizer.decode(output[0][gen_input.shape[-1]:], skip_special_tokens=True))
```

The following models were included in the merge:
* [macadeliccc/MBX-7B-v3-DPO](https://huggingface.co/macadeliccc/MBX-7B-v3-DPO)
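
The card names a merge component but does not show the merge configuration itself. As a purely illustrative sketch, a mergekit config for this kind of merge could look like the following; the merge method (`slerp`), the interpolation parameter, the layer ranges, and the use of `flemmingmiguel/MBX-7B-v3` as the second source and base model are all assumptions, not details taken from this card:

```yaml
# Hypothetical mergekit config (illustrative only; actual method/parameters not published in this card)
slices:
  - sources:
      - model: macadeliccc/MBX-7B-v3-DPO
        layer_range: [0, 32]
      - model: flemmingmiguel/MBX-7B-v3
        layer_range: [0, 32]
merge_method: slerp
base_model: flemmingmiguel/MBX-7B-v3
parameters:
  t: 0.5        # 0.5 blends both models equally; assumed value
dtype: bfloat16
```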