# hubert-tiny-wiki-seq128
The fully trained model, including the second phase of training, is available here: SzegedAI/hubert-tiny-wiki
This model was trained from scratch on the Wikipedia subset of Hungarian Webcorpus 2.0 with MLM and SOP tasks.
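To illustrate the SOP (Sentence Order Prediction) objective mentioned above, here is a minimal sketch of how such training pairs can be built from consecutive sentences. This is an assumption-level illustration, not the actual pre-training code; the function name and the Hungarian example sentences are made up for demonstration.

```python
import random

def make_sop_pairs(sentences, seed=0):
    """Build illustrative SOP examples from consecutive sentence pairs:
    label 0 = original order kept, label 1 = order swapped.
    (Hypothetical sketch, not the card's actual training pipeline.)"""
    rng = random.Random(seed)
    pairs = []
    for a, b in zip(sentences, sentences[1:]):
        if rng.random() < 0.5:
            pairs.append(((a, b), 0))  # sentences left in order
        else:
            pairs.append(((b, a), 1))  # sentences swapped
    return pairs

sents = ["Az első mondat.", "A második mondat.", "A harmadik mondat."]
for (x, y), label in make_sop_pairs(sents):
    print(label, x, "|", y)
```

The MLM objective is handled separately by masking tokens in the concatenated pair; only the pair construction is sketched here.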
## Pre-Training Parameters
- Training steps: 500,000
- Sequence length: 128 (the model supports up to 512)
- Batch size: 1024
## Framework versions
- Transformers 4.21.3
- TensorFlow 2.10.0
- Datasets 2.4.0
- Tokenizers 0.12.1
## Acknowledgement