BERT Named Entity Recognition - n2c2 2018
This model is a fine-tuned version of FacebookAI/roberta-large on the n2c2 2018 dataset for the paper https://arxiv.org/abs/2409.19467. It achieves the following results on the evaluation set (final epoch, from the training table below):

- Loss: 0.0738
- Precision: 0.8113
- Recall: 0.8082
- F1: 0.8098
- Accuracy: 0.9782
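A fine-tuned token-classification checkpoint like this one is typically loaded through the `transformers` pipeline API. A minimal sketch follows; the repo ID below is a placeholder, since the card does not state the published model name, and running it requires downloading the checkpoint:

```python
from transformers import pipeline

# Hypothetical repo ID -- substitute the actual name under which this
# fine-tuned RoBERTa-large checkpoint is published.
ner = pipeline(
    "token-classification",
    model="your-username/roberta-large-n2c2-2018",
    aggregation_strategy="simple",  # merge subword pieces into whole entity spans
)

# n2c2 2018 Track 2 targets medication-related entities (drug, dosage, frequency, ...).
print(ner("Start aspirin 81 mg daily for cardiac prophylaxis."))
```

`aggregation_strategy="simple"` groups the model's subword predictions back into word-level spans, which is usually what you want when extracting entities from clinical text.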
Training results per epoch:
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 231 | 0.0809 | 0.7947 | 0.7638 | 0.7789 | 0.9764 |
| No log | 2.0 | 462 | 0.0713 | 0.8097 | 0.7929 | 0.8012 | 0.9779 |
| 0.2213 | 3.0 | 693 | 0.0704 | 0.8092 | 0.8046 | 0.8069 | 0.9780 |
| 0.2213 | 4.0 | 924 | 0.0738 | 0.8113 | 0.8082 | 0.8098 | 0.9782 |
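The F1 column is the harmonic mean of the precision and recall columns, so the table is internally consistent. A quick check (values taken from the table; small deviations come from the inputs being rounded to four decimal places):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Final-epoch values from the training table above.
precision, recall = 0.8113, 0.8082
f1 = f1_score(precision, recall)
print(round(f1, 4))  # agrees with the reported 0.8098 to within rounding
```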
Base model: FacebookAI/roberta-large