---
license: apache-2.0
language:
- ko
---
## KULLM project
- base model: mistralai/Mistral-7B-Instruct-v0.2
## Datasets
- KULLM dataset
- hand-crafted instruction data
## Implementation Code
```python
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
)
import torch

repo = "ifuseok/sft-solar-10.7b-v2.1-dpo"

# Load the model in half precision and let Accelerate place it on available devices
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)
```
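Once loaded, the model can be prompted like any other causal LM. The snippet below is a minimal generation sketch, not part of the original card: it assumes the tokenizer ships a chat template, and the example prompt and sampling parameters are illustrative; adjust the prompt format to whatever the model actually expects.

```python
# Illustrative example prompt (assumption: the tokenizer provides a chat template)
messages = [
    {"role": "user", "content": "한국의 수도는 어디인가요?"}  # "What is the capital of Korea?"
]

# Build input ids with the chat template and move them to the model's device
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Sample a completion; the generation settings here are placeholder values
with torch.no_grad():
    outputs = model.generate(
        inputs,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
    )

# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```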
Initial upload: 2024/01/28 20:30