stefan-it committed on
Commit
2df5569
1 Parent(s): 7df624b

readme: update base model and backbone model


Hi,
This PR updates the base model and backbone model (hmByT5).
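For context, `base_model` is a field in the model card's YAML frontmatter on the Hub. A minimal sketch of the updated metadata block, assembled from the tags visible in the diff below:

```yaml
---
tags:
- flair
- token-classification
- sequence-tagger-model
base_model: hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax
inference: false
widget:
- text: Cp . Eur . Phoen . 240 , 1 , αἷμα ddiov φλέγέι .
---
```

Pointing `base_model` at the hmByT5 checkpoint lets the Hub link this fine-tuned tagger back to the correct backbone.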

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -5,7 +5,7 @@ tags:
 - flair
 - token-classification
 - sequence-tagger-model
-base_model: hmteams/teams-base-historic-multilingual-discriminator
+base_model: hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax
 inference: false
 widget:
 - text: Cp . Eur . Phoen . 240 , 1 , αἷμα ddiov φλέγέι .
@@ -15,7 +15,7 @@ widget:
 
 This Flair model was fine-tuned on the
 [AjMC English](https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-ajmc.md)
-NER Dataset using hmTEAMS as backbone LM.
+NER Dataset using hmByT5 as backbone LM.
 
 The AjMC dataset consists of NE-annotated historical commentaries in the field of Classics,
 and was created in the context of the [Ajax MultiCommentary](https://mromanello.github.io/ajax-multi-commentary/)
@@ -34,7 +34,7 @@ Thus, the inference widget is not working with hmByT5 at the moment on the Model
 This should be fixed in future, when ByT5 fine-tuning is supported in Flair directly.
 
 [1]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
-
+
 # Results
 
 We performed a hyper-parameter search over the following parameters with 5 different seeds per configuration: