---
language:
- fa
pipeline_tag: summarization
---
# Model Card for arman-longformer-8k
<!-- Provide a quick summary of what the model is/does. -->
This project applies Longformer's attention mechanism to [alireza7/ARMAN-MSR-persian-base](https://huggingface.co/alireza7/ARMAN-MSR-persian-base) to enable abstractive summarization of long documents. The resulting model accepts up to 8K tokens (rather than 512) and should be fine-tuned for summarization tasks.
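
Below is a minimal usage sketch, assuming the checkpoint loads through the standard `transformers` seq2seq classes; the repository id, input text, and generation parameters are illustrative, and the checkpoint is expected to be fine-tuned on a summarization dataset before producing useful summaries.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repository id for this checkpoint.
model_name = "hofarah/arman-longformer-8k"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# A long Persian document; the extended attention window allows
# inputs of up to 8K tokens instead of the usual 512.
text = "متن طولانی فارسی برای خلاصه‌سازی ..."
inputs = tokenizer(text, max_length=8192, truncation=True, return_tensors="pt")

# Generate an abstractive summary (parameters here are only an example).
summary_ids = model.generate(**inputs, max_new_tokens=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```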