Experiment

Run with:

import os
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Point this at your local matformer checkout before loading the model
os.environ["MATFORMER_ROOT"] = "/path/to/matformer"

tokenizer = AutoTokenizer.from_pretrained("mrinaldi/Gettone-TEST")
model = AutoModelForCausalLM.from_pretrained(
    "CCC-Unito/BAMBINO-0.1",
    trust_remote_code=True,
    dtype=torch.bfloat16,
    device_map="cuda",
)

text = "In un giorno di Autunno"
encoded = tokenizer(text, return_tensors="pt", add_special_tokens=True)
output = model.generate(encoded["input_ids"].to("cuda"))

print(tokenizer.decode(output[0]))