Update README.md
README.md CHANGED

@@ -74,7 +74,7 @@ thumbnail: https://gsarti.com/publication/it5/featured.png
 ---
 # mT5 Base for News Summarization ✂️🗞️ 🇮🇹
 
-This repository contains the checkpoint for the [mT5 Base](https://huggingface.co/google/mt5-base) model fine-tuned on news summarization on the [Fanpage](https://huggingface.co/datasets/ARTeLab/fanpage) and [Il Post](https://huggingface.co/datasets/ARTeLab/ilpost) corpora as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
+This repository contains the checkpoint for the [mT5 Base](https://huggingface.co/google/mt5-base) model fine-tuned on news summarization on the [Fanpage](https://huggingface.co/datasets/ARTeLab/fanpage) and [Il Post](https://huggingface.co/datasets/ARTeLab/ilpost) corpora as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
 
 A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
 
@@ -103,10 +103,11 @@ If you use this model in your research, please cite our work as:
 
 ```bibtex
 @article{sarti-nissim-2022-it5,
-    title={IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
+    title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
     author={Sarti, Gabriele and Nissim, Malvina},
-    journal={ArXiv preprint
-    url={
-    year={2022}
+    journal={ArXiv preprint 2203.03759},
+    url={https://arxiv.org/abs/2203.03759},
+    year={2022},
+    month={mar}
 }
 ```