kiddothe2b committed on
Commit
a329704
1 Parent(s): 1d3ddb6

Update README.md

Files changed (1):
  1. README.md +3 -3
README.md CHANGED
@@ -3,10 +3,10 @@ license: cc-by-nc-sa-4.0
  pipeline_tag: fill-mask
  language: en
  tags:
- - legal
+ - biomedical
  - long-documents
  ---
 
- # Legal Longformer (base)
+ # Biomedical Longformer (base)
 
- This is a derivative model based on the [LexLM (base)](https://huggingface.co/lexlms/roberta-base-cased) RoBERTa model. All model parameters were cloned from the original model, while the positional embeddings were extended by cloning the original embeddings multiple times, following [Beltagy et al. (2020)](https://arxiv.org/abs/2004.05150), using a Python script similar to [this one](https://github.com/allenai/longformer/blob/master/scripts/convert_model_to_long.ipynb).
+ This is a derivative model based on the [microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract) BERT model, developed in "Fine-Tuning Large Neural Language Models for Biomedical Natural Language Processing" by [Tinn et al. (2021)](https://arxiv.org/abs/2112.07869). All model parameters were cloned from the original model, while the positional embeddings were extended by cloning the original embeddings multiple times, following [Beltagy et al. (2020)](https://arxiv.org/abs/2004.05150), using a Python script similar to [this one](https://github.com/allenai/longformer/blob/master/scripts/convert_model_to_long.ipynb).
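The conversion step described in the diff — copying the original position-embedding matrix end-to-end until the new maximum length is reached — can be sketched roughly as follows. This is a minimal NumPy illustration, not the linked conversion script: the real notebook operates on the model's actual embedding weights and also adjusts the config and attention layers, and `extend_position_embeddings` is a hypothetical helper name introduced here for clarity.

```python
import numpy as np

def extend_position_embeddings(pos_emb: np.ndarray, new_max_pos: int) -> np.ndarray:
    """Tile an existing position-embedding matrix to cover a longer
    maximum sequence length, in the spirit of Beltagy et al. (2020).

    pos_emb:     (old_max_pos, hidden_dim) original embedding matrix.
    new_max_pos: target maximum number of positions.
    """
    old_max_pos, _hidden = pos_emb.shape
    reps = int(np.ceil(new_max_pos / old_max_pos))
    # Copy the original embeddings end-to-end, then truncate to the target length.
    return np.tile(pos_emb, (reps, 1))[:new_max_pos]

# Toy example: extend a 512-position matrix to 4096 positions.
emb = np.random.randn(512, 768)
long_emb = extend_position_embeddings(emb, 4096)
```

After tiling, every 512-position block of the new matrix is an exact copy of the original, which gives the longer model a sensible starting point before any further pre-training or fine-tuning.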