Tags: Fill-Mask · Transformers · PyTorch · English · roberta · earth science · climate · biology · Inference Endpoints
Commit 556e635 (1 parent: ee299e3)
Committed by Muthukumaran and osanseviero (HF staff)

Fix link to distilled model (#1)


- Fix link to distilled model (67813b75191cc535803186ea4dab426b47279fa8)


Co-authored-by: Omar Sanseviero <osanseviero@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -23,7 +23,7 @@ nasa-smd-ibm-v0.1 (Currently named as Indus) is a RoBERTa-based, Encoder-only tr
 - **Tokenizer**: Custom
 - **Parameters**: 125M
 - **Pretraining Strategy**: Masked Language Modeling (MLM)
-- **Distilled Version**: You can download a distilled version of the model (30 Million Parameters) here: https://drive.google.com/file/d/19s2Vv9WlmlRhh_AhzdP-s__0spQCG8cQ/view?usp=sharing
+- **Distilled Version**: You can download a distilled version of the model (30 Million Parameters) here: https://huggingface.co/nasa-impact/nasa-smd-ibm-distil-v0.1

 ## Training Data
 - Wikipedia English (Feb 1, 2020)
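The repository is tagged as a Fill-Mask task, and this commit points users to the distilled checkpoint on the Hub. As a minimal sketch (not part of the commit), the distilled model could be queried with the standard `transformers` fill-mask pipeline; the model id is taken from the corrected URL above, and the assumption is that its custom tokenizer follows the RoBERTa convention of using `<mask>` as the mask token:

```python
from transformers import pipeline

# Load the distilled 30M-parameter checkpoint linked in this commit.
# Model id taken from the corrected URL in the diff above.
fill_mask = pipeline("fill-mask", model="nasa-impact/nasa-smd-ibm-distil-v0.1")

# RoBERTa-based models typically use "<mask>" as the mask token.
predictions = fill_mask("The Earth's <mask> is composed mostly of nitrogen and oxygen.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict containing the filled token (`token_str`), its probability (`score`), and the completed sequence.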