This model uses the LTG-BERT architecture. It was trained on a combination of the BabyLM Dataset, the TinyStories Dataset, and generated data, in accordance with the rules of the Strict-Small track and its 10M-word budget.

The model was trained with a sequence length of 128 tokens.

The hyperparameters used and the evaluation scores will follow in a subsequent update.
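
Since the repository ships custom LTG-BERT modeling code, loading it through `transformers` presumably requires `trust_remote_code=True`. The snippet below is a minimal sketch under that assumption, using a standard masked-language-modeling call; the exact output format of the custom model class may differ.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed loading pattern: the repo contains custom LTG-BERT code,
# so trust_remote_code=True is expected to be required.
model_id = "nikitastheo/BERTtime-Stories-10m-nucleus-1-balanced"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)

# Fill in a masked token; inputs should stay within the 128-token
# sequence length used during training.
text = f"The children played in the {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Pick the most likely token at the mask position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = outputs.logits[0, mask_pos].argmax(-1).item()
print(tokenizer.decode([predicted_id]))
```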
