Update README.md
README.md
@@ -6,7 +6,7 @@ IndicBART is a multilingual, sequence-to-sequence pre-trained model focusing on
 <li> Trained on large Indic language corpora (452 million sentences and 9 billion tokens) which also includes Indian English content. </li>
 </ul>

-You can read more about IndicBART in this
+You can read more about IndicBART in this <a href="https://arxiv.org/abs/2109.02903">paper</a>.

 For detailed documentation, look here: https://github.com/AI4Bharat/indic-bart/ and https://indicnlp.ai4bharat.org/indic-bart/

@@ -78,7 +78,8 @@ print(decoded_output) # I am happy

 # Fine-tuning on a downstream task

-If you wish to fine-tune this model, then you can do so using the toolkit <a href="https://github.com/prajdabre/yanmtt">YANMTT</a> following the instructions
+1. If you wish to fine-tune this model, you can do so using the toolkit <a href="https://github.com/prajdabre/yanmtt">YANMTT</a>, following the instructions <a href="https://github.com/AI4Bharat/indic-bart">here</a>.
+2. (Untested) Alternatively, you may use the official Hugging Face example scripts for <a href="https://github.com/huggingface/transformers/tree/master/examples/pytorch/translation">translation</a> and <a href="https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization">summarization</a>.

 # Contributors
 <ul>
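Both fine-tuning routes added above expect parallel data in IndicBART's tagged seq2seq format. A minimal sketch of that preprocessing step, assuming the mBART-style `<2xx>` language tag and `</s>` separator convention described on the IndicBART model card (the helper name `make_pair` and the example sentences are illustrative, not from this diff; verify the exact format against the model card before training):

```python
# Sketch: format one parallel sentence pair for IndicBART fine-tuning.
# Assumes the model-card convention: source ends with "</s> <2src_lang>",
# target begins with "<2tgt_lang>". Verify before use.

def make_pair(src_sentence: str, tgt_sentence: str,
              src_lang: str, tgt_lang: str) -> tuple[str, str]:
    """Return (source, target) strings with IndicBART-style language tags."""
    source = f"{src_sentence} </s> <2{src_lang}>"
    target = f"<2{tgt_lang}> {tgt_sentence}"
    return source, target

src, tgt = make_pair("I am happy", "main khush hoon", "en", "hi")
print(src)  # I am happy </s> <2en>
print(tgt)  # <2hi> main khush hoon
```

Pairs formatted this way can then be written to the JSON-lines or TSV layout that your chosen toolkit (YANMTT or the Hugging Face example scripts) reads for translation fine-tuning.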