Fill-Mask
Transformers
Safetensors
English
mdlm
custom_code
Commit 9e6829b by subbham
1 Parent(s): 0dda33a

Update README.md

Files changed (1)
  1. README.md +1 -4
README.md CHANGED
@@ -8,16 +8,13 @@ datasets:
 metrics:
 - perplexity
 ---
-## Paper
-
-arxiv.org/abs/2406.07524
 
 ## Using MDLM
 To use the pre-trained model for masked language modeling, use the following snippet:
 ```python
 from transformers import AutoModelForMaskedLM, AutoTokenizer
 
-# See the `Caduceus` collection page on the hub for a list of available models.
+# See the `MDLM` collection page on the hub for a list of available models.
 tokenizer = AutoTokenizer.from_pretrained('gpt2')
 model_name = 'kuleshov-group/mdlm-owt'
 model = AutoModelForMaskedLM.from_pretrained(model_name)
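
Once the model and tokenizer are loaded, masked-LM inference follows the usual pattern: tokenize, run a forward pass, and read the logits at the masked position. The sketch below shows only that last post-processing step, using a dummy logits tensor in place of a real `model(**inputs).logits` output so it runs offline; the shapes and the token id are illustrative assumptions, not values from the MDLM model.

```python
import torch

# Dummy stand-in for the (batch, seq_len, vocab_size) logits a masked-LM
# forward pass would return; 50257 matches the 'gpt2' tokenizer vocabulary.
vocab_size = 50257
logits = torch.zeros(1, 5, vocab_size)

mask_index = 2                       # position of the masked token (assumed)
logits[0, mask_index, 1234] = 10.0   # pretend the model favors token id 1234

# Recover the most likely token id at the masked position.
predicted_id = logits[0, mask_index].argmax().item()
print(predicted_id)  # 1234
```

In a real run, `predicted_id` would be passed to `tokenizer.decode` to turn it back into text.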