---
license: mit
---

This classification model is based on cointegrated/rubert-tiny2. It is intended to estimate the relevance and specificity of the last message in the context of a dialogue.

The model is pretrained on a corpus of dialogue data from social networks and fine-tuned on tinkoff-ai/context_similarity. Its performance on the validation split of tinkoff-ai/context_similarity (with the best thresholds selected on the validation samples):

| metric  | relevance | specificity |
|---------|-----------|-------------|
| f0.5    | 0.82      | 0.81        |
| roc-auc | 0.74      | 0.8         |
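
The card does not spell out how these thresholds were obtained. The sketch below shows one plausible procedure, a grid search over decision thresholds that maximizes f0.5 on the validation split; it is an illustration rather than the authors' code, and `scores` and `labels` are hypothetical arrays of per-example model scores and binary annotations for one of the two heads.

```python
# Illustrative threshold selection, not the authors' exact procedure.
# `scores`: hypothetical per-example model scores in [0, 1] for one head
# (e.g. relevance); `labels`: the matching binary ground-truth annotations.
import numpy as np
from sklearn.metrics import fbeta_score

def best_f05_threshold(scores: np.ndarray, labels: np.ndarray):
    """Grid-search the decision threshold that maximizes F0.5 on validation data."""
    thresholds = np.linspace(0.0, 1.0, 101)
    f05 = [fbeta_score(labels, scores >= t, beta=0.5, zero_division=0) for t in thresholds]
    best = int(np.argmax(f05))
    return float(thresholds[best]), float(f05[best])
```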

The model can be loaded as follows:

```python
# pip install transformers
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("tinkoff-ai/context_similarity")
model = AutoModelForSequenceClassification.from_pretrained("tinkoff-ai/context_similarity")
# model.cuda()  # optionally move the model to a GPU
```
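
The card stops at loading the model. As a rough illustration of how scoring could look, the sketch below assumes the tokenizer accepts the dialogue context and the last message as a text pair, and that the two output logits, after a sigmoid, correspond to relevance and specificity; neither detail is documented here, so check the exact input format against the model's tokenizer and config.

```python
# Minimal inference sketch; the input format and output order are assumptions,
# not documented in this card.
import torch

context = "привет! как дела?"   # hypothetical dialogue context
response = "норм, у тя как?"    # hypothetical last message to score

inputs = tokenizer(context, response, return_tensors="pt", truncation=True, max_length=128)
with torch.inference_mode():
    logits = model(**inputs).logits           # assumed shape: (1, 2)
    probas = torch.sigmoid(logits)[0].tolist()

relevance, specificity = probas               # assumed label order
print(f"relevance={relevance:.2f}, specificity={specificity:.2f}")
```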