Fine-tuned Flair Model on German MobIE Dataset with AutoTrain
This Flair model was fine-tuned on the German MobIE NER Dataset using GBERT Base as the backbone language model and the 🚀 AutoTrain library.
Dataset
The German MobIE dataset is a German-language dataset, which is human-annotated with 20 coarse- and fine-grained entity types and entity linking information for geographically linkable entities. The dataset consists of 3,232 social media texts and traffic reports with 91K tokens, and contains 20.5K annotated entities, 13.1K of which are linked to a knowledge base.
The following named entities are annotated:
- location-stop
- trigger
- organization-company
- location-city
- location
- event-cause
- location-street
- time
- date
- number
- duration
- organization
- person
- set
- distance
- disaster-type
- money
- org-position
- percent
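The corpus can be loaded directly with Flair's built-in dataset loader. The sketch below assumes a recent Flair version that ships the `NER_GERMAN_MOBIE` corpus class; the dataset is downloaded on first use.

```python
# Sketch: load the German MobIE corpus with Flair's built-in loader
# (assumes a recent Flair release that includes NER_GERMAN_MOBIE;
# the data is downloaded automatically on first use).
from flair.datasets import NER_GERMAN_MOBIE

corpus = NER_GERMAN_MOBIE()
print(corpus)  # train/dev/test split sizes

# Inspect the annotated entity types via the label dictionary:
label_dict = corpus.make_label_dictionary(label_type="ner")
print(label_dict)
```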
Fine-Tuning
The latest Flair version is used for fine-tuning. Additionally, the model is trained with the FLERT (Schweter and Akbik, 2020) approach, because the MobIE dataset thankfully comes with document boundary markers.
A hyper-parameter search over the following parameters with 5 different seeds per configuration is performed:
- Batch Sizes: [16]
- Learning Rates: [5e-05, 3e-05]
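The full search grid can be sketched as follows; the configuration naming scheme (`bs{batch size}-e{epochs}-lr{learning rate}`) follows the pattern used in the results table, and 10 epochs per run is taken from those configuration names.

```python
# Sketch of the hyper-parameter search grid described above: every
# combination of batch size and learning rate is trained with 5 seeds.
from itertools import product

batch_sizes = [16]
learning_rates = [5e-05, 3e-05]
seeds = [1, 2, 3, 4, 5]
epochs = 10  # taken from the configuration names in the results table

configurations = [
    (f"bs{bs}-e{epochs}-lr{lr}", seed)
    for (bs, lr), seed in product(product(batch_sizes, learning_rates), seeds)
]

# 2 hyper-parameter configurations x 5 seeds = 10 training runs
print(len(configurations))  # 10
```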
All models are trained with the awesome AutoTrain Advanced from Hugging Face. More details can be found in this repository.
Results
A hyper-parameter search with 5 different seeds per configuration is performed, and the micro F1-score on the development set is reported:
Configuration | Seed 1 | Seed 2 | Seed 3 | Seed 4 | Seed 5 | Average |
---|---|---|---|---|---|---|
bs16-e10-lr5e-05 | 0.8446 | 0.8495 | 0.8455 | 0.8419 | 0.8476 | 0.8458 ± 0.0029 |
bs16-e10-lr3e-05 | **0.8392** | 0.8445 | 0.8495 | 0.8381 | 0.8449 | 0.8432 ± 0.0046 |
The result in bold shows the performance of this model.
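The per-configuration averages can be reproduced from the per-seed scores in the table; the snippet below copies those scores and recomputes the mean (the ± spread is not recomputed here, as the exact variant of the deviation used is not stated).

```python
# Reproducing the per-configuration averages from the seed scores above.
from statistics import mean

dev_f1 = {
    "bs16-e10-lr5e-05": [0.8446, 0.8495, 0.8455, 0.8419, 0.8476],
    "bs16-e10-lr3e-05": [0.8392, 0.8445, 0.8495, 0.8381, 0.8449],
}

averages = {name: round(mean(scores), 4) for name, scores in dev_f1.items()}
print(averages)  # {'bs16-e10-lr5e-05': 0.8458, 'bs16-e10-lr3e-05': 0.8432}
```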
The Flair training log and TensorBoard logs are also uploaded to the model hub.
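The fine-tuned model can be used for inference with Flair's standard `SequenceTagger` API. This is a minimal sketch, assuming the model is available on the Hugging Face Hub under the ID `stefan-it/autotrain-flair-mobie-gbert_base-bs16-e10-lr3e-05-1`; the example sentence is illustrative.

```python
# Minimal inference sketch (assumes `flair` is installed and the model
# can be downloaded from the Hugging Face Hub under this ID).
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "stefan-it/autotrain-flair-mobie-gbert_base-bs16-e10-lr3e-05-1"
)

# Illustrative German traffic-report sentence
sentence = Sentence("Der Zug von München Hauptbahnhof nach Augsburg fällt heute aus.")
tagger.predict(sentence)

for entity in sentence.get_spans("ner"):
    print(entity)
```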
Model tree for stefan-it/autotrain-flair-mobie-gbert_base-bs16-e10-lr3e-05-1
Base model: deepset/gbert-base