---
tags:
- generated_from_trainer
- ner
- named-entity-recognition
- span-marker
model-index:
- name: span-marker-bert-base-multilingual-cased-multinerd
results:
- task:
type: token-classification
name: Named Entity Recognition
dataset:
type: Babelscape/multinerd
name: MultiNERD
split: test
revision: 2814b78e7af4b5a1f1886fe7ad49632de4d9dd25
metrics:
- type: f1
value: 0.9270
name: F1
- type: precision
value: 0.9281
name: Precision
- type: recall
value: 0.9259
name: Recall
license: apache-2.0
datasets:
- Babelscape/multinerd
metrics:
- precision
- recall
- f1
pipeline_tag: token-classification
widget:
- text: "Amelia Earthart flog mit ihrer einmotorigen Lockheed Vega 5B über den Atlantik nach Paris."
example_title: "German"
- text: "Amelia Earhart flew her single engine Lockheed Vega 5B across the Atlantic to Paris."
example_title: "English"
- text: "Amelia Earthart voló su Lockheed Vega 5B monomotor a través del Océano Atlántico hasta París."
example_title: "Spanish"
- text: "Amelia Earthart a fait voler son monomoteur Lockheed Vega 5B à travers l'ocean Atlantique jusqu'à Paris."
example_title: "French"
- text: "Amelia Earhart ha volato con il suo monomotore Lockheed Vega 5B attraverso l'Atlantico fino a Parigi."
example_title: "Italian"
- text: "Amelia Earthart vloog met haar één-motorige Lockheed Vega 5B over de Atlantische Oceaan naar Parijs."
example_title: "Dutch"
- text: "Amelia Earthart przeleciała swoim jednosilnikowym samolotem Lockheed Vega 5B przez Ocean Atlantycki do Paryża."
example_title: "Polish"
- text: "Amelia Earhart voou em seu monomotor Lockheed Vega 5B através do Atlântico para Paris."
example_title: "Portuguese"
- text: "Амелия Эртхарт перелетела на своем одномоторном самолете Lockheed Vega 5B через Атлантический океан в Париж."
example_title: "Russian"
- text: "Amelia Earthart flaug eins hreyfils Lockheed Vega 5B yfir Atlantshafið til Parísar."
example_title: "Icelandic"
- text: "Η Amelia Earthart πέταξε το μονοκινητήριο Lockheed Vega 5B της πέρα από τον Ατλαντικό Ωκεανό στο Παρίσι."
example_title: "Greek"
- text: "Amelia Earhartová přeletěla se svým jednomotorovým Lockheed Vega 5B přes Atlantik do Paříže."
example_title: "Czech"
- text: "Amelia Earhart lensi yksimoottorisella Lockheed Vega 5B:llä Atlantin yli Pariisiin."
example_title: "Finnish"
- text: "Amelia Earhart fløj med sin enmotoriske Lockheed Vega 5B over Atlanten til Paris."
example_title: "Danish"
- text: "Amelia Earhart flög sin enmotoriga Lockheed Vega 5B över Atlanten till Paris."
example_title: "Swedish"
- text: "Amelia Earhart fløy sin enmotoriske Lockheed Vega 5B over Atlanterhavet til Paris."
example_title: "Norwegian"
- text: "Amelia Earhart și-a zburat cu un singur motor Lockheed Vega 5B peste Atlantic până la Paris."
example_title: "Romanian"
- text: "Amelia Earhart menerbangkan mesin tunggal Lockheed Vega 5B melintasi Atlantik ke Paris."
example_title: "Indonesian"
- text: "Амелія Эрхарт пераляцела на сваім аднаматорным Lockheed Vega 5B праз Атлантыку ў Парыж."
example_title: "Belarusian"
- text: "Амелія Ергарт перелетіла на своєму одномоторному літаку Lockheed Vega 5B через Атлантику до Парижа."
example_title: "Ukrainian"
- text: "Amelia Earhart preletjela je svojim jednomotornim zrakoplovom Lockheed Vega 5B preko Atlantika do Pariza."
example_title: "Croatian"
- text: "Amelia Earhart lendas oma ühemootoriga Lockheed Vega 5B üle Atlandi ookeani Pariisi ."
example_title: "Estonian"
language:
- de
- en
- es
- fr
- it
- nl
- pl
- pt
- ru
- zh
---
# span-marker-bert-base-multilingual-cased-multinerd
This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on the [Babelscape/multinerd](https://huggingface.co/datasets/Babelscape/multinerd) dataset.
Is your data not (always) capitalized correctly? Then consider using the uncased variant of this model instead for better performance:
[lxyuan/span-marker-bert-base-multilingual-uncased-multinerd](https://huggingface.co/lxyuan/span-marker-bert-base-multilingual-uncased-multinerd).
This model achieves the following results on the evaluation set:
- Loss: 0.0049
- Overall Precision: 0.9242
- Overall Recall: 0.9281
- Overall F1: 0.9261
- Overall Accuracy: 0.9852
Test set results:
- Loss: 0.0052
- Overall Precision: 0.9282
- Overall Recall: 0.9259
- Overall F1: 0.9270
- Overall Accuracy: 0.9851
- Runtime: 2690.97 s (150.75 samples/s, 4.71 steps/s)
This is a replication of Tom's work. Everything remains unchanged, except that we extended training to 3 epochs for a slightly longer run and set `gradient_accumulation_steps` to 2. Please refer to the official [model page](https://huggingface.co/tomaarsen/span-marker-mbert-base-multinerd) to review the original results and training script.
## Results
| **Language** | **Precision** | **Recall** | **F1** |
|--------------|---------------|------------|------------|
| **all** | 92.42 | 92.81 | **92.61** |
| **de** | 95.03 | 95.07 | **95.05** |
| **en** | 95.00 | 95.40 | **95.20** |
| **es** | 92.05 | 91.37 | **91.71** |
| **fr** | 92.37 | 91.41 | **91.89** |
| **it** | 91.45 | 93.15 | **92.29** |
| **nl** | 93.85 | 92.98 | **93.41** |
| **pl** | 93.13 | 92.66 | **92.89** |
| **pt** | 93.60 | 92.50 | **93.05** |
| **ru** | 93.25 | 93.32 | **93.29** |
| **zh** | 89.47 | 88.40 | **88.93** |
- Special thanks to Tom for creating the evaluation script and generating the [results](https://huggingface.co/lxyuan/span-marker-bert-base-multilingual-cased-multinerd/discussions/1).
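
The per-language numbers can be reproduced along these lines. This is a minimal sketch, assuming the MultiNERD test split's `lang` column and evaluation via `span_marker`'s `Trainer`; Tom's linked evaluation script remains the authoritative version.

```python
# Sketch: evaluate the model on per-language slices of the MultiNERD test set.
# Assumes the test split exposes a "lang" column and that span_marker's Trainer
# accepts a (tokens, ner_tags) dataset, as in the linked training script.
from datasets import load_dataset
from span_marker import SpanMarkerModel, Trainer

model = SpanMarkerModel.from_pretrained(
    "lxyuan/span-marker-bert-base-multilingual-cased-multinerd"
)
test = load_dataset("Babelscape/multinerd", split="test")

for lang in ["de", "en", "zh"]:  # extend to all ten languages as needed
    subset = test.filter(lambda example: example["lang"] == lang)
    trainer = Trainer(model=model, eval_dataset=subset)
    metrics = trainer.evaluate()
    # the "eval_overall_f1" key name is assumed from SpanMarker's seqeval metrics
    print(lang, metrics["eval_overall_f1"])
```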
## Label set
| Class | Description | Examples |
|-------|-------------|----------|
| **PER (person)** | People | Ray Charles, Jessica Alba, Leonardo DiCaprio, Roger Federer, Anna Massey. |
| **ORG (organization)** | Associations, companies, agencies, institutions, nationalities and religious or political groups | University of Edinburgh, San Francisco Giants, Google, Democratic Party. |
| **LOC (location)** | Physical locations (e.g. mountains, bodies of water), geopolitical entities (e.g. cities, states), and facilities (e.g. bridges, buildings, airports). | Rome, Lake Paiku, Chrysler Building, Mount Rushmore, Mississippi River. |
| **ANIM (animal)** | Breeds of dogs, cats and other animals, including their scientific names. | Maine Coon, African Wild Dog, Great White Shark, New Zealand Bellbird. |
| **BIO (biological)** | Genus of fungus, bacteria and protoctists, families of viruses, and other biological entities. | Herpes Simplex Virus, Escherichia Coli, Salmonella, Bacillus Anthracis. |
| **CEL (celestial)** | Planets, stars, asteroids, comets, nebulae, galaxies and other astronomical objects. | Sun, Neptune, Asteroid 187 Lamberta, Proxima Centauri, V838 Monocerotis. |
| **DIS (disease)** | Physical, mental, infectious, non-infectious, deficiency, inherited, degenerative, social and self-inflicted diseases. | Alzheimer’s Disease, Cystic Fibrosis, Dilated Cardiomyopathy, Arthritis. |
| **EVE (event)** | Sport events, battles, wars and other events. | American Civil War, 2003 Wimbledon Championships, Cannes Film Festival. |
| **FOOD (food)** | Foods and drinks. | Carbonara, Sangiovese, Cheddar Beer Fondue, Pizza Margherita. |
| **INST (instrument)** | Technological instruments, mechanical instruments, musical instruments, and other tools. | Spitzer Space Telescope, Commodore 64, Skype, Apple Watch, Fender Stratocaster. |
| **MEDIA (media)** | Titles of films, books, magazines, songs and albums, fictional characters and languages. | Forbes, American Psycho, Kiss Me Once, Twin Peaks, Disney Adventures. |
| **PLANT (plant)** | Types of trees, flowers, and other plants, including their scientific names. | Salix, Quercus Petraea, Douglas Fir, Forsythia, Artemisia Maritima. |
| **MYTH (mythological)** | Mythological and religious entities. | Apollo, Persephone, Aphrodite, Saint Peter, Pope Gregory I, Hercules. |
| **TIME (time)** | Specific and well-defined time intervals, such as eras, historical periods, centuries, years and important days. No months and days of the week. | Renaissance, Middle Ages, Christmas, Great Depression, 17th Century, 2012. |
| **VEHI (vehicle)** | Cars, motorcycles and other vehicles. | Ferrari Testarossa, Suzuki Jimny, Honda CR-X, Boeing 747, Fairey Fulmar. |
## Inference Example
```python
# Install the dependency first: pip install span_marker
from span_marker import SpanMarkerModel

model = SpanMarkerModel.from_pretrained("lxyuan/span-marker-bert-base-multilingual-cased-multinerd")

description = (
    "Singapore is renowned for its hawker centers offering dishes "
    "like Hainanese chicken rice and laksa, while Malaysia boasts dishes such as "
    "nasi lemak and rendang, reflecting its rich culinary heritage."
)

entities = model.predict(description)
print(entities)
# [
#  {'span': 'Singapore', 'label': 'LOC', 'score': 0.999988317489624, 'char_start_index': 0, 'char_end_index': 9},
#  {'span': 'Hainanese chicken rice', 'label': 'FOOD', 'score': 0.9894770383834839, 'char_start_index': 66, 'char_end_index': 88},
#  {'span': 'laksa', 'label': 'FOOD', 'score': 0.9224908947944641, 'char_start_index': 93, 'char_end_index': 98},
#  {'span': 'Malaysia', 'label': 'LOC', 'score': 0.9999839067459106, 'char_start_index': 106, 'char_end_index': 114}
# ]
# Missed: "nasi lemak" and "rendang" as FOOD.
```
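Each prediction includes character offsets into the input, so the surface text can be recovered by slicing; a small illustration using `entities` and `description` from the block above:

```python
# Recover each entity's surface text from the character offsets.
for entity in entities:
    span = description[entity["char_start_index"]:entity["char_end_index"]]
    print(f"{span!r} -> {entity['label']} ({entity['score']:.3f})")
```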
#### Quick test on Chinese
```python
from span_marker import SpanMarkerModel

model = SpanMarkerModel.from_pretrained("lxyuan/span-marker-bert-base-multilingual-cased-multinerd")

# Chinese translation of the English description used above
zh_description = "新加坡因其小贩中心提供海南鸡饭和叻沙等菜肴而闻名, 而马来西亚则拥有椰浆饭和仁当等菜肴,反映了其丰富的烹饪传统."

entities = model.predict(zh_description)
print(entities)
# [
#  {'span': '新加坡', 'label': 'LOC', 'score': 0.9282007813453674, 'char_start_index': 0, 'char_end_index': 3},
#  {'span': '马来西亚', 'label': 'LOC', 'score': 0.7439665794372559, 'char_start_index': 27, 'char_end_index': 31}
# ]
# The model only captured the two countries (Singapore and Malaysia);
# all other entities were missed.
```
## Training procedure
The results can be reproduced by running this [script](https://huggingface.co/tomaarsen/span-marker-mbert-base-multinerd/blob/main/train.py) with the hyperparameters listed below.
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
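
For orientation, here is a minimal sketch of how these settings map onto `transformers.TrainingArguments`; the `output_dir` path is illustrative, and the linked training script remains the authoritative reference.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters above; "models/..." is an illustrative path.
args = TrainingArguments(
    output_dir="models/span-marker-mbert-base-multinerd",
    learning_rate=5e-5,
    per_device_train_batch_size=32,   # train_batch_size
    per_device_eval_batch_size=32,    # eval_batch_size
    gradient_accumulation_steps=2,    # effective train batch size: 64
    num_train_epochs=3,
    seed=42,
    lr_scheduler_type="linear",       # default Adam optimizer, betas=(0.9, 0.999), eps=1e-8
    warmup_ratio=0.1,
)
```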
### Training results
| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.0129 | 1.0 | 50436 | 0.0042 | 0.9226 | 0.9169 | 0.9197 | 0.9837 |
| 0.0027 | 2.0 | 100873 | 0.0043 | 0.9255 | 0.9206 | 0.9230 | 0.9846 |
| 0.0015 | 3.0 | 151308 | 0.0049 | 0.9242 | 0.9281 | 0.9261 | 0.9852 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.3
- Tokenizers 0.13.3 |