To use this model:
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("LsTam/MQ-classification")

softmax = torch.nn.Softmax(dim=1)
# Label 0 means a bad question, label 1 means a good one
prediction = lambda probs: [int(p[0] < p[1]) for p in probs]

# Concatenate the question and its context with the </s> separator token
text = ["Your question" + " </s> " + "your context"]
inputs = tokenizer(text, return_tensors="pt")
result = model(**inputs)
pred = prediction(softmax(result.logits).tolist())
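
As an illustration only (not part of the original snippet), the same components can score several question/context pairs in one batch; the example pairs and the padding/truncation options below are assumptions, not requirements of the model:

# Minimal batched-inference sketch (assumed usage, not from the original card)
pairs = [
    ("Who wrote the report?", "The report was written by the audit team in 2021."),
    ("What color is the sky?", "The report was written by the audit team in 2021."),
]
batch = tokenizer(
    [q + " </s> " + c for q, c in pairs],
    return_tensors="pt",
    padding=True,
    truncation=True,
)
with torch.no_grad():
    logits = model(**batch).logits
probs = softmax(logits)
good_question_scores = probs[:, 1].tolist()  # probability that the question fits the context
labels = prediction(probs.tolist())          # 0 = bad question, 1 = good question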