mrm8488 committed
Commit 7d64a99
1 Parent(s): 22b01a0

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED

@@ -23,7 +23,7 @@ tags:
 
 # Legal ⚖️ longformer-base-8192-spanish
 
-`legal-longformer-base-8192` is a BERT-like model started from the RoBERTa checkpoint (**[RoBERTalex](PlanTL-GOB-ES/RoBERTalex)** in this case) and pre-trained for *MLM* on long documents from the [Spanish Legal Domain Corpora](https://zenodo.org/record/5495529/#.Y205lpHMKV5). It supports sequences of length up to **8192**!
+`legal-longformer-base-8192` is a BERT-like model started from the RoBERTa checkpoint (**[RoBERTalex](https://huggingface.co/PlanTL-GOB-ES/RoBERTalex)** in this case) and pre-trained for *MLM* on long documents from the [Spanish Legal Domain Corpora](https://zenodo.org/record/5495529/#.Y205lpHMKV5). It supports sequences of length up to **8192**!
 
 **Longformer** uses a combination of a sliding window (*local*) attention and *global* attention. Global attention is user-configured based on the task to allow the model to learn task-specific representations.
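
The paragraph above says global attention is user-configured per task. As a quick illustration of what that looks like in practice, here is a minimal sketch that loads the model for masked-language modeling with 🤗 Transformers and gives global attention to the first token only. The repo id `mrm8488/legal-longformer-base-8192-spanish` is assumed from the commit author and model name and may differ from the actual one; everything else uses the standard Longformer API.

```python
# Minimal usage sketch (not part of the commit): MLM inference with
# Longformer-style global attention on the first token only.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "mrm8488/legal-longformer-base-8192-spanish"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

text = f"La sentencia fue dictada por el {tokenizer.mask_token} Supremo."
inputs = tokenizer(text, return_tensors="pt")

# Sliding-window (local) attention is applied to every token by default;
# global attention is opt-in per token via global_attention_mask
# (1 = global, 0 = local). Marking only the first token is a common,
# task-agnostic choice.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

with torch.no_grad():
    logits = model(**inputs, global_attention_mask=global_attention_mask).logits

# Decode the most likely filler for the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```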