This model is skt/kogpt2-base-v2 fine-tuned on the AIHub everyday-conversation dataset.
Training code: https://github.com/HeegyuKim/open-domain-dialog
Streamlit Demo: https://heegyukim-open-domain-dialog-st-demo-1tzktp.streamlitapp.com/
Usage example
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

tokenizer = AutoTokenizer.from_pretrained("heegyu/kodialogpt-v0")
model = AutoModelForCausalLM.from_pretrained("heegyu/kodialogpt-v0")
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

generation_args = dict(
    num_beams=4,
    repetition_penalty=2.0,
    no_repeat_ngram_size=4,
    eos_token_id=375,   # token id of "\n", so generation stops at the end of a turn
    max_new_tokens=64,
    do_sample=True,
    top_k=50,
    early_stopping=True
)
```
```python
generator(
    ["0 : **는 게임 좋아하니\n1 :",
     "0 : 어제 강남에서 살인사건 났대 ㅜㅜ 너무 무서워\n1 : 헐 왜? 무슨 일 있었어?\n0 : 사진보니까 막 피흘리는 사람있고 경찰들이 떠서 제압하고 난리도 아니었다던데??\n1 :",
     "0 : 자기야 어제는 나한테 왜 그랬어?\n1 : 뭔 일 있었어?\n0 : 어떻게 나한테 말도 없이 그럴 수 있어? 나 진짜 실망했어\n1 : "],
    **generation_args
)
```
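Each prompt follows a fixed turn format: one line per utterance, prefixed with the speaker id (`0` or `1`), ending with `1 :` so the model completes speaker 1's next turn. A small helper for building such prompts from a turn history might look like this (a sketch; `build_prompt` is a hypothetical helper, not part of the model card):

```python
def build_prompt(history):
    """Format utterances as alternating '0 : ...' / '1 : ...' turns,
    ending with '1 :' so the model generates speaker 1's reply."""
    lines = [f"{i % 2} : {utterance}" for i, utterance in enumerate(history)]
    return "\n".join(lines) + "\n1 :"

prompt = build_prompt(["자기야 어제는 나한테 왜 그랬어?",
                       "뭔 일 있었어?",
                       "어떻게 나한테 말도 없이 그럴 수 있어? 나 진짜 실망했어"])
# prompt can then be passed to generator(prompt, **generation_args)
```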
Result
```
[[{'generated_text': '0 : **는 게임 좋아하니\n1 : 나는 게임을 잘 안 해 키키 '}],
 [{'generated_text': '0 : 어제 강남에서 살인사건 났대 ㅜㅜ 너무 무서워\n1 : 헐 왜? 무슨 일 있었어?\n0 : 사진보니까 막 피흘리는 사람있고 경찰들이 떠서 제압하고 난리도 아니었다던데??\n1 : 아이고 ... 진짜 무섭다... '}],
 [{'generated_text': '0 : 자기야 어제는 나한테 왜 그랬어?\n1 : 뭔 일 있었어?\n0 : 어떻게 나한테 말도 없이 그럴 수 있어? 나 진짜 실망했어\n1 : 뭘 잘못 했길래 그래? '}]]
```
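Note that the pipeline echoes the prompt in `generated_text`. To obtain only the bot's reply, one can strip the prompt prefix and cut at the first newline, since generation stops at the newline eos token. A minimal sketch (the `extract_reply` helper is an assumption, not from the card):

```python
def extract_reply(generated_text, prompt):
    """Strip the echoed prompt, then cut at the first newline
    (the model's eos token), returning only the new reply."""
    continuation = generated_text[len(prompt):]
    return continuation.split("\n", 1)[0].strip()

out = "0 : **는 게임 좋아하니\n1 : 나는 게임을 잘 안 해 키키 "
reply = extract_reply(out, "0 : **는 게임 좋아하니\n1 :")  # '나는 게임을 잘 안 해 키키'
```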
Hyperparameters used for training