# Model Card for mpyt5_e15

A model pre-trained not only on natural language but also on Python code.
## Training Details

### Training Data

- Python code (1.05 GB)

### Training Procedure
- MLM (masked language modeling)
- Python vocabulary (https://huggingface.co/kkuramitsu/mt5-pytoken)
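The card does not spell out the masking scheme, but T5-family models are conventionally pre-trained with span corruption: contiguous spans are replaced by sentinel tokens in the input, and the target reconstructs the masked spans. A minimal sketch on a tokenized Python snippet (mask positions fixed here for illustration; real pre-training samples them at random):

```python
# Sketch of T5-style span corruption ("MLM" objective) on a Python snippet.
# The span positions are hard-coded for illustration only.

def span_corrupt(tokens, spans):
    """Replace each (start, end) span with a sentinel; return (input, target)."""
    inp, tgt = [], []
    prev = 0
    for i, (start, end) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        inp.extend(tokens[prev:start])   # keep text up to the span
        inp.append(sentinel)             # sentinel stands in for the span
        tgt.append(sentinel)             # target pairs sentinel with the span
        tgt.extend(tokens[start:end])
        prev = end
    inp.extend(tokens[prev:])
    tgt.append(f"<extra_id_{len(spans)}>")  # closing sentinel
    return inp, tgt

tokens = "def add ( a , b ) : return a + b".split()
inp, tgt = span_corrupt(tokens, [(1, 2), (9, 12)])
print(" ".join(inp))  # def <extra_id_0> ( a , b ) : return <extra_id_1>
print(" ".join(tgt))  # <extra_id_0> add <extra_id_1> a + b <extra_id_2>
```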
### Preprocessing

mT5 + Python

### Speeds, Sizes, Times

- mT5-small (300M parameters)
- max_length = 128
### Model Versions

- epoch 5: https://huggingface.co/Roy029/mpyt5_e5
- epoch 10: https://huggingface.co/Roy029/mpyt5_e10
- epoch 15: this model
- epoch 20: https://huggingface.co/Roy029/mpyt5_e20
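The card does not include a loading recipe; the sketch below assumes the standard `transformers` seq2seq API and the Hub id implied by the version list above (`Roy029/mpyt5_e15`). It requires network access to download the checkpoint, and uses the `max_length = 128` noted in this card:

```python
# Hedged sketch: loading one of the checkpoints above with Hugging Face
# transformers (assumed API; not specified by the card itself).
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "Roy029/mpyt5_e15"  # swap for _e5 / _e10 / _e20 as needed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Fill-in-the-span style prompt, matching the T5 sentinel convention.
text = "def add(a, b): <extra_id_0>"
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```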