t5-large-finetuned-multi_v2
This model is a fine-tuned version of paust/pko-t5-large on an unspecified dataset. It performs intent classification and entity recognition for Korean user utterances.
Model description
| Intent (의도) | Entities (개체) |
|---|---|
| 일상대화 (casual conversation) | – |
| 전화연결 (phone call) | 대상 (target) |
| 장소안내 (place guidance) | 장소 (place), 대상 (target) |
| 날씨예보 (weather forecast) | 날짜 (date), 장소 (place), 대상 (target), 시간 (time), 조건 (condition) |
| 화물추천 (freight recommendation) | 날짜 (date), 시간 (time), 상차 (loading), 하차 (unloading), 기준 (criterion) |
| 긍부정 (sentiment) | 긍정 (positive), 부정 (negative), 중립 (neutral) |

*대상 (target): 상차지 (loading site) / 하차지 (unloading site)
How to use
```python
import requests

API_URL = "https://api-inference.huggingface.co/models/yeye776/t5-large-finetuned-multi_v2"
headers = {"Authorization": "Bearer hf_key"}  # replace hf_key with your Hugging Face access token

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# Instruction prefix: "This is a user utterance related to BroCarry; classify the intent and recognize the entities!"
prompt = "브로캐리에 관련된 이용자의 대화인데 분류 및 인식 해줘! :"
# Example utterance: "Recommend freight with a loading site in Bundang late tomorrow night"
input_text = "내일 심야 상차지가 분당인 화물 추천해줘"

output = query({
    "inputs": prompt + input_text,
    "options": {"wait_for_model": True},  # wait for the model to load instead of returning an error
})
print(output)
```
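The model can also be loaded locally with the transformers library. The following is a minimal sketch based on standard T5 seq2seq usage rather than code from this card; the `max_new_tokens` value is an illustrative assumption.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "yeye776/t5-large-finetuned-multi_v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

prompt = "브로캐리에 관련된 이용자의 대화인데 분류 및 인식 해줘! :"
input_text = "내일 심야 상차지가 분당인 화물 추천해줘"

# Encode the instruction prompt plus the user utterance and generate the prediction.
inputs = tokenizer(prompt + input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)  # max_new_tokens is an illustrative choice
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```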
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0007
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 8
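For reference, a minimal sketch of how these hyperparameters map onto Hugging Face Seq2SeqTrainingArguments; the output_dir and anything not listed above (dataset, trainer wiring, evaluation strategy) are assumptions, not taken from this card.

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameters listed above; output_dir is an assumed placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-large-finetuned-multi_v2",
    learning_rate=7e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=8,  # effective train batch size: 4 * 8 = 32
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.06,
    num_train_epochs=8,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Transformers defaults.
)
```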
Framework versions
- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1