norbert3-coarse-absa-full

This model is a fine-tuned version of NorBERT3-large, trained on the full sentence-level NorPaC_absa dataset. It predicts a total of 25 unique, coarse-grained aspect+sentiment labels. This model, along with norbert3-fine-absa-full, represents the models that will be used in practice by NIPH/FHI, as they are trained on the full NorPaC_absa dataset.

Example Usage

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("ltg/norbert3-coarse-absa-full")
model = AutoModelForSequenceClassification.from_pretrained("ltg/norbert3-coarse-absa-full", trust_remote_code=True)

model.eval()

text = "fastlegen lytter til meg, men jeg synes ventetiden er for lang."
# English: "the GP listens to me, but I think the waiting time is too long."

# tokenize input
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True, max_length=512)

# Run inference
with torch.no_grad():
    outputs = model(**inputs)

# Get predictions
threshold = 0.5
probs = torch.sigmoid(outputs.logits).squeeze()
predictions = [model.config.id2label[i] for i, prob in enumerate(probs) if prob > threshold]
print(predictions)
# -> ['staff_pos', 'avail_neg'] (healthcare providers and staff:positive, access and availability:negative)
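The decoding step above treats the task as multi-label classification: a sigmoid is applied to each logit independently, and every label whose probability exceeds the threshold is kept. Stripped of the model itself, that step can be sketched in plain Python (the logits and label map below are made-up illustrative values, not real model output):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic function, applied independently to each logit."""
    return 1.0 / (1.0 + math.exp(-x))

def decode_multilabel(logits, id2label, threshold=0.5):
    """Return every label whose sigmoid probability exceeds the threshold."""
    return [id2label[i] for i, z in enumerate(logits) if sigmoid(z) > threshold]

# Hypothetical logits for a 4-label slice of the label space
id2label = {0: "staff_pos", 1: "staff_neg", 2: "avail_pos", 3: "avail_neg"}
logits = [2.1, -1.3, -0.4, 1.7]

print(decode_multilabel(logits, id2label))  # -> ['staff_pos', 'avail_neg']
```

Note that, unlike softmax decoding, this can return zero labels (no logit clears the threshold) or several labels for the same sentence.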

Class labels

Below is the distribution of coarse-grained aspect labels within NorPaC_absa at the comment level.

Aspect full name                            Short name   Instances
Healthcare providers and staff              staff             1169
Organization of health services             org                502
Access and availability                     avail              353
Environment and facilities                  env                286
Treatment                                   treat              591
Uncategorized / top-level aspects:
    Outcome and impact of treatment / stay  oits               319
    Patient involvement and participation   pip                 68
    General                                 gen               1376
    No aspect / neutral                     no-asp              92
Total                                                         4756
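The model's label strings combine the short names above with a polarity suffix, as in the staff_pos and avail_neg predictions from the usage example. A small helper to expand such strings into readable labels might look like this (a sketch; only the _pos/_neg suffixes seen in the example output are assumed):

```python
# Short-name to full-name mapping, taken from the label table above
ASPECTS = {
    "staff": "healthcare providers and staff",
    "org": "organization of health services",
    "avail": "access and availability",
    "env": "environment and facilities",
    "treat": "treatment",
    "oits": "outcome and impact of treatment / stay",
    "pip": "patient involvement and participation",
    "gen": "general",
    "no-asp": "no aspect / neutral",
}
POLARITIES = {"pos": "positive", "neg": "negative"}

def expand_label(label: str) -> str:
    """Turn e.g. 'avail_neg' into 'access and availability: negative'."""
    # rpartition splits on the last underscore, so hyphenated
    # short names like 'no-asp' are handled correctly.
    aspect, _, polarity = label.rpartition("_")
    return f"{ASPECTS[aspect]}: {POLARITIES[polarity]}"

print(expand_label("staff_pos"))  # -> healthcare providers and staff: positive
```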

Evaluation

As this model is fine-tuned on all of the data splits, no held-out evaluation is currently available. Performance metrics are, however, reported in our paper for a version of the model trained only on the train split.

Citation

Coming.
