---
base_model: llm-jp/llm-jp-3-13b
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** ikedachin
- **License:** apache-2.0
- **Finetuned from model:** llm-jp/llm-jp-3-13b
This Llama-architecture model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
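The card does not include the training script itself. Below is a minimal, hedged sketch of what QLoRA fine-tuning with Unsloth and TRL typically looks like; every hyperparameter (sequence length, LoRA rank/alpha, target modules, batch size, learning rate, epochs) is an illustrative assumption, and the `SFTTrainer` arguments shown match older `trl` releases (newer versions move them into `SFTConfig`):

```python
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import Dataset

# Load the base model in 4-bit (QLoRA-style); max_seq_length is an assumption
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="llm-jp/llm-jp-3-13b",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters; rank, alpha, and target modules are assumptions
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Tiny placeholder dataset; in practice this would be the 5,000-sample
# mix described in the dataset section below, flattened into a "text" column
train_dataset = Dataset.from_list([{"text": "### 指示\nこんにちは\n### 回答\nこんにちは!"}])

# Supervised fine-tuning with TRL
trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=train_dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        output_dir="outputs",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
    ),
)
trainer.train()
```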
### Datasets used
5,000 samples were randomly drawn from the sources below (a hedged sampling sketch follows the list):
- DeL-TaiseiOzaki/Tengentoppa-sft-v1.0
- llm-jp/magpie-sft-v1.0
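The card states only that 5,000 samples were drawn at random. A minimal sketch of one way to do this with the `datasets` library follows; the split names, the seed, and the 2,500/2,500 ratio are assumptions, not part of the card:

```python
from datasets import load_dataset

# Shuffle each source with a fixed seed and take a slice; the split
# names, seed, and per-dataset counts are illustrative assumptions
tengentoppa = load_dataset("DeL-TaiseiOzaki/Tengentoppa-sft-v1.0", split="train").shuffle(seed=42)
magpie = load_dataset("llm-jp/magpie-sft-v1.0", split="train").shuffle(seed=42)

# Combine the two slices into a single 5,000-sample pool
sampled = list(tengentoppa.select(range(2500))) + list(magpie.select(range(2500)))
```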
### Inference code
```python
import torch
from transformers import (
    AutoTokenizer,
    AutoModelForCausalLM,
    BitsAndBytesConfig,
)

HF_TOKEN = "your-token"
model_name = "ikedachin/llm-jp-3-13b-ozaki-ds-5000"

# QLoRA-style 4-bit quantization settings
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=False,
)

# Download the model
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",
    token=HF_TOKEN,
)

# Download the tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True, token=HF_TOKEN)

prompt = "<put your input here>"

# Tokenize the prompt
tokenized_input = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt").to(model.device)

# Inference
with torch.no_grad():
    outputs = model.generate(
        tokenized_input,
        max_new_tokens=300,
        do_sample=False,
        repetition_penalty=1.2,
    )[0]

# Decode only the newly generated tokens back to text
output = tokenizer.decode(outputs[tokenized_input.size(1):], skip_special_tokens=True)
print(output)
```
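With `do_sample=False`, decoding is greedy and therefore deterministic; `repetition_penalty=1.2` penalizes tokens already present in the context, which helps avoid repetition loops in long outputs. Note that `outputs` contains the prompt as well, so only the tokens after `tokenized_input.size(1)` are decoded.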