Iker committed
Commit 2bc36a2
1 Parent(s): b4baaf8

Update README.md

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -39,8 +39,8 @@ We present MedMT5, the first open-source text-to-text multilingual model for the
  <thead>
  <tr>
  <th></th>
- <th>MedMT5-Large (<a href="https://huggingface.co/HiTZ/MedMT5-large">HiTZ/MedMT5-large</a>)</th>
- <th>MedMT5-XL (<a href="https://huggingface.co/HiTZ/MedMT5-xl">HiTZ/MedMT5-xl</a>)</th>
+ <th>MedMT5-Large (<a href="https://huggingface.co/HiTZ/Medical-mT5-large">HiTZ/Medical-mT5-large</a>)</th>
+ <th>MedMT5-XL (<a href="https://huggingface.co/HiTZ/Medical-mT5-xl">HiTZ/Medical-mT5-xl</a>)</th>
  </tr>
  </thead>
  <tbody>
@@ -120,8 +120,8 @@ You can load the model using
  ```python
  from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
 
- tokenizer = AutoTokenizer.from_pretrained("HiTZ/MedMT5-xl")
- model = AutoModelForSeq2SeqLM.from_pretrained("HiTZ/MedMT5-xl")
+ tokenizer = AutoTokenizer.from_pretrained("HiTZ/Medical-mT5-xl")
+ model = AutoModelForSeq2SeqLM.from_pretrained("HiTZ/Medical-mT5-xl")
  ```
 
  The model has been trained using the T5 masked language modeling tasks. You need to finetune the model for your task.
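
For reference, the renamed checkpoints introduced by this commit can be exercised end to end as in the minimal sketch below. It is not part of the commit: the `HiTZ/Medical-mT5-large` variant is assumed (the XL variant works the same way), and the `<extra_id_0>` sentinel input follows the T5 masked language modeling objective the README describes; any downstream task (NER, QA, ...) still requires task-specific fine-tuning.

```python
# Sketch only, not part of the commit: load one of the renamed repos and run
# the span-infilling objective the model was pre-trained on.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# "HiTZ/Medical-mT5-large" is one of the new identifiers this commit points the README at.
tokenizer = AutoTokenizer.from_pretrained("HiTZ/Medical-mT5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("HiTZ/Medical-mT5-large")

# Mask a span with the first sentinel token; the model generates the masked span.
# For downstream tasks you would fine-tune on task data instead of using this directly.
text = "The patient was prescribed <extra_id_0> to lower blood pressure."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```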