Merging models to make a potato. Not sure about this one yet; I might delete it later.

Merge of MiniMerlin via task arithmetic using mergekit. There was no goal beyond the merge itself, but I'm interested in the outcome. It may need further fine-tuning.
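For reference, mergekit drives a task-arithmetic merge from a YAML config. A minimal sketch of what such a config can look like — the model ids, weight, and base model below are illustrative placeholders, not the actual recipe used for this merge:

```yaml
# Hypothetical mergekit config for a task-arithmetic merge.
# Task arithmetic subtracts the base model from each fine-tune to get a
# "task vector", scales it by `weight`, and adds it back onto the base.
models:
  - model: teilomillet/MiniMerlin-3B   # placeholder id for the MiniMerlin fine-tune
    parameters:
      weight: 1.0                      # scale of the task vector
merge_method: task_arithmetic
base_model: some-org/base-3b-model     # placeholder: the common base of the fine-tunes
dtype: bfloat16
```

Running `mergekit-yaml config.yml ./output-dir` would then produce the merged checkpoint.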

Then fine-tuned on more French data (Merlin).

I think this is the best French 3B model out there. Give it a try.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the merged model in bfloat16, sharded automatically across available devices.
model = AutoModelForCausalLM.from_pretrained(
    "teilomillet/Potato-3B",
    revision="0.1",
    return_dict=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

tokenizer = AutoTokenizer.from_pretrained("teilomillet/Potato-3B")
tokenizer.pad_token = tokenizer.eos_token  # the model has no dedicated pad token

# The model expects the [|User|] ... </s>[|Assistant|] chat template.
text = "[|User|] Comment faire un bon plat ? </s>[|Assistant|]"
inputs = tokenizer(text, return_tensors="pt").to(0)

outputs = model.generate(**inputs, max_new_tokens=800)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```
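Since the chat template above is plain string formatting, a small helper keeps multi-turn prompts consistent. This is a sketch — the helper name and signature are mine, not part of the model card:

```python
def format_prompt(user_message: str, history=None) -> str:
    """Build a Potato-3B prompt using the [|User|]/[|Assistant|] template.

    `history` is an optional list of (user, assistant) turns that are
    replayed before the new user message.
    """
    parts = []
    for user_turn, assistant_turn in history or []:
        parts.append(f"[|User|] {user_turn} </s>[|Assistant|] {assistant_turn} </s>")
    # The final turn ends at [|Assistant|] so the model completes the answer.
    parts.append(f"[|User|] {user_message} </s>[|Assistant|]")
    return "".join(parts)


print(format_prompt("Comment faire un bon plat ?"))
```

The resulting string can be passed straight to `tokenizer(...)` as in the snippet above.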


Model size: 3.02B params · Tensor type: BF16 · Safetensors

Model tree for teilomillet/Potato-3B: 1 adapter · 1 quantization
