julian-schelb committed 6083051 (parent: 5c78dcd): Update README.md

README.md (updated section):
This model is limited by its training dataset of entity-annotated news articles from a specific span of time. It may therefore not generalize well to all use cases in other domains.
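Since the model is meant for downstream named-entity recognition, a minimal usage sketch may help; it is not part of the original README, and the repository ID below is an assumption based on the commit author's namespace, so substitute this model's actual Hugging Face ID:

```python
# Minimal usage sketch (assumed model ID; replace with the real repository ID).
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    pipeline,
)

model_id = "julian-schelb/roberta-ner-multilingual"  # assumption, not from the README

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# "simple" aggregation merges sub-word tokens back into whole entity spans.
ner = pipeline(
    "ner",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("Angela Merkel met Emmanuel Macron in Berlin."))
```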
## Related Papers

* Pan, X., Zhang, B., May, J., Nothman, J., Knight, K., & Ji, H. (2017). Cross-lingual Name Tagging and Linking for 282 Languages. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (pp. 1946–1958). Association for Computational Linguistics.
* Rahimi, A., Li, Y., & Cohn, T. (2019). Massively Multilingual Transfer for NER. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (pp. 151–164). Association for Computational Linguistics.
* Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., & Stoyanov, V. (2019). RoBERTa: A Robustly Optimized BERT Pretraining Approach.
## BibTeX Citation

This model was fine-tuned for the following paper, which you can cite as follows:

```text
@inproceedings{schelbECCEEntitycentricCorpus2022,
	title = {{ECCE}: {Entity}-centric {Corpus} {Exploration} {Using} {Contextual} {Implicit} {Networks}},
	url = {https://dl.acm.org/doi/10.1145/3487553.3524237},
}
```