Tags: Transformers, English, Inference Endpoints

Model Card for omarmomen/babylm_bpe_tokenizer_32k

This model is part of the experiments in my master's thesis titled "Linguistic Structure Induction from Language Models" (https://arxiv.org/abs/2403.09714).

"omarmomen/babylm_bpe_tokenizer_32k" is a RobertaTokenizer pretrained on the BabyLM 10 Training dataset (cased) with 32K tokens.

