---
license: mit
---

# Clinical-T5 Models

We train four T5 variants on the union of MIMIC-III and MIMIC-IV notes:

1. Initialized from T5-Base
2. Initialized from SciFive-Base
3. T5-Base trained from scratch
4. T5-Large trained from scratch

This particular model card describes the T5-Large model trained from scratch on MIMIC notes.

# Model Pretraining

This section describes the pretraining procedure.

## Pretraining Data

### Note Preprocessing

## Pretraining Procedures
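
The details of this model's training run are not documented here, but T5 models are generally pretrained with a span-corruption objective: random contiguous token spans are replaced with sentinel tokens (`<extra_id_0>`, `<extra_id_1>`, ...), and the decoder learns to reconstruct the masked spans. A minimal illustrative sketch on whitespace tokens (all names are hypothetical, not the actual training code):

```python
import random

def span_corrupt(tokens, mask_rate=0.15, mean_span=3, rng=None):
    """T5-style span corruption on a toy token list.

    Returns (input, target): the input has each masked span replaced by a
    sentinel token; the target lists each sentinel followed by the tokens
    it replaced.
    """
    rng = rng or random.Random(0)
    n_mask = max(1, round(len(tokens) * mask_rate))
    masked = set()
    while len(masked) < n_mask:
        start = rng.randrange(len(tokens))
        length = max(1, round(rng.expovariate(1 / mean_span)))
        masked.update(range(start, min(start + length, len(tokens))))
    inp, tgt, sentinel_id, i = [], [], 0, 0
    while i < len(tokens):
        if i in masked:
            sentinel = f"<extra_id_{sentinel_id}>"
            inp.append(sentinel)
            tgt.append(sentinel)
            # Consume the whole contiguous masked span into the target.
            while i < len(tokens) and i in masked:
                tgt.append(tokens[i])
                i += 1
            sentinel_id += 1
        else:
            inp.append(tokens[i])
            i += 1
    return inp, tgt
```

The real T5 pipeline operates on SentencePiece subword IDs rather than whitespace tokens, but the input/target structure is the same.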

### Pretraining Hyperparameters

# How to use the Model
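
As a T5 checkpoint, the model can be loaded with Hugging Face `transformers` and queried with sentinel-token infilling. A minimal sketch, assuming you have a local copy of the weights at `./Clinical-T5-Large` (the path and the `mask_spans` helper are illustrative, not part of this repository):

```python
def mask_spans(text, spans):
    """Replace each (start, end) character span with a T5 sentinel token
    (<extra_id_0>, <extra_id_1>, ...) for span infilling."""
    out, last = [], 0
    for i, (start, end) in enumerate(spans):
        out.append(text[last:start])
        out.append(f"<extra_id_{i}>")
        last = end
    out.append(text[last:])
    return "".join(out)

if __name__ == "__main__":
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    path = "./Clinical-T5-Large"  # assumption: local copy of the weights
    tokenizer = AutoTokenizer.from_pretrained(path)
    model = T5ForConditionalGeneration.from_pretrained(path)

    # Mask "chest pain" and ask the model to fill in the span.
    prompt = mask_spans(
        "Patient admitted with chest pain and shortness of breath.", [(22, 32)]
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=16)
    print(tokenizer.decode(out[0], skip_special_tokens=False))
```

For downstream tasks, the checkpoint can also be fine-tuned like any other `T5ForConditionalGeneration` model.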

# Questions?

If you have any questions about using the models, please email eric@xyla.com.