---
license: cc-by-nc-sa-4.0
---
# NeuralCorso-7B
This model is a merge of [macadeliccc/MBX-7B-v3-DPO](https://huggingface.co/macadeliccc/MBX-7B-v3-DPO) and [mlabonne/OmniNeuralBeagle-7B](https://huggingface.co/mlabonne/OmniNeuralBeagle-7B).
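The card does not state which merge method or configuration was used. For illustration only, a hypothetical [mergekit](https://github.com/arcee-ai/mergekit) SLERP config combining these two 7B models (layer ranges and interpolation weights are assumptions, not the actual recipe) might look like:

```yaml
# Hypothetical mergekit config -- the actual merge settings are not published in this card
slices:
  - sources:
      - model: macadeliccc/MBX-7B-v3-DPO
        layer_range: [0, 32]
      - model: mlabonne/OmniNeuralBeagle-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: macadeliccc/MBX-7B-v3-DPO
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```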
## Code Example
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("macadeliccc/NeuralCorso-7B")
model = AutoModelForCausalLM.from_pretrained("macadeliccc/NeuralCorso-7B")

messages = [
    {"role": "system", "content": "Respond to the user's request like a pirate"},
    {"role": "user", "content": "Can you write me a quicksort algorithm?"}
]

# Build the prompt from the chat template, generate, and decode only the new tokens
gen_input = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(gen_input, max_new_tokens=256)
print(tokenizer.decode(outputs[0][gen_input.shape[-1]:], skip_special_tokens=True))
```