BERT multilingual base (cased) fine-tuned on NSMC
This model is a fine-tuned checkpoint of bert-base-multilingual-cased, trained on NSMC (Naver Sentiment Movie Corpus), a binary sentiment-classification dataset of Korean movie reviews.
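For reference, a fine-tune of this kind can be reproduced with the Hugging Face Trainer. The sketch below is illustrative only: the Hub dataset id (e9t/nsmc), the 128-token truncation, and the hyperparameters are assumptions, not the settings used to produce this checkpoint.

from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# NSMC: Korean movie reviews with binary labels (0 = negative, 1 = positive).
# Dataset id is an assumption; depending on your datasets version you may
# need trust_remote_code=True or a different id.
dataset = load_dataset("e9t/nsmc")

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

def tokenize(batch):
    # Reviews live in the "document" column; padding is left to the collator.
    return tokenizer(batch["document"], truncation=True, max_length=128)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased",
    num_labels=2,
    id2label={0: "negative", 1: "positive"},
    label2id={"negative": 0, "positive": 1},
)

args = TrainingArguments(
    output_dir="bert-base-multilingual-cased-nsmc",
    learning_rate=2e-5,               # illustrative hyperparameters
    per_device_train_batch_size=32,
    num_train_epochs=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,  # enables dynamic padding via DataCollatorWithPadding
)
trainer.train()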
Usage
You can use this model directly with a pipeline for sentiment-analysis:
>>> from transformers import pipeline
>>> classifier = pipeline(
...     "sentiment-analysis", model="sangrimlee/bert-base-multilingual-cased-nsmc"
... )
>>> classifier("흠...포스터보고 초딩영화줄....오버연기조차 가볍지 않구나.")
[{'label': 'negative', 'score': 0.9642567038536072}]
>>> classifier("액션이 없는데도 재미 있는 몇안되는 영화")
[{'label': 'positive', 'score': 0.9970554113388062}]
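If you would rather not use the pipeline, the checkpoint can also be loaded directly with AutoTokenizer and AutoModelForSequenceClassification; a minimal sketch:

>>> import torch
>>> from transformers import AutoModelForSequenceClassification, AutoTokenizer
>>> tokenizer = AutoTokenizer.from_pretrained("sangrimlee/bert-base-multilingual-cased-nsmc")
>>> model = AutoModelForSequenceClassification.from_pretrained("sangrimlee/bert-base-multilingual-cased-nsmc")
>>> inputs = tokenizer("액션이 없는데도 재미 있는 몇안되는 영화", return_tensors="pt")
>>> with torch.no_grad():
...     logits = model(**inputs).logits
>>> probs = logits.softmax(dim=-1)[0]
>>> model.config.id2label[int(probs.argmax())]  # same review as above, so it maps to the positive class
'positive'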