MiniMerlin-3B is a supervised fine-tune (SFT) on a custom synthetic French dataset of roughly 2,000 examples, ranging from general question answering and problem solving to code questions. It is a proof of concept (POC).
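
For reference, here is a minimal sketch of what such an SFT run could look like. It is an assumption-laden illustration, not the actual training script: the base model id is a placeholder, the LoRA hyperparameters and target modules are arbitrary, and sft_fr.jsonl stands in for the ~2k synthetic French examples formatted as "[|User|] ... </s>[|Assistant|] ..." strings.

import torch
from torch.utils.data import DataLoader
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
)

base_id = "<base-3B-model>"  # placeholder: the base checkpoint is not named in this card

tokenizer = AutoTokenizer.from_pretrained(base_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16).to("cuda")

# Train a LoRA adapter instead of the full 3B parameters.
model = get_peft_model(
    model,
    LoraConfig(
        r=16,
        lora_alpha=32,
        lora_dropout=0.05,
        target_modules=["q_proj", "v_proj"],  # assumption: depends on the base architecture
        task_type="CAUSAL_LM",
    ),
)

# Hypothetical file holding the ~2k synthetic French examples, one {"text": ...} row each.
dataset = load_dataset("json", data_files="sft_fr.jsonl", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

dataset = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

# Causal-LM collator: pads the batch and sets labels = input_ids (pad positions masked out).
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)
loader = DataLoader(dataset, batch_size=4, shuffle=True, collate_fn=collator)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-4)
model.train()
for batch in loader:  # a single pass over the ~2k examples
    batch = {k: v.to(model.device) for k, v in batch.items()}
    loss = model(**batch).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

model.save_pretrained("minimerlin-3b-lora")  # saves only the LoRA adapter weights

The inference snippet below loads the released MiniMerlin-3B checkpoint directly.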

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the fine-tuned checkpoint in bfloat16 and place it on the available devices.
model = AutoModelForCausalLM.from_pretrained(
    "teilomillet/MiniMerlin-3B",
    revision="0.1",
    return_dict=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

tokenizer = AutoTokenizer.from_pretrained("teilomillet/MiniMerlin-3B")
tokenizer.pad_token = tokenizer.eos_token

# Prompts follow the [|User|] ... </s>[|Assistant|] template used during fine-tuning.
text = "[|User|] Comment faire un bon plat ? </s>[|Assistant|]"
inputs = tokenizer(text, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=800)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
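
The example prompt asks, in French, how to make a good dish; since the fine-tuning data is French, responses are expected in French. Keeping skip_special_tokens=False makes the </s> turn separator visible in the decoded output.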