Instructions to use omarmomen/babylm_bpe_tokenizer_16k with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use omarmomen/babylm_bpe_tokenizer_16k with Transformers:

```python
# This repository hosts a tokenizer, so load it with AutoTokenizer
# (AutoModel.from_pretrained would fail here, since there are no model weights)
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("omarmomen/babylm_bpe_tokenizer_16k")
```
- Notebooks
- Google Colab
- Kaggle
The repository's special tokens map (`special_tokens_map.json`):

```json
{
  "bos_token": "<s>",
  "eos_token": "</s>",
  "unk_token": "<unk>",
  "sep_token": "</s>",
  "pad_token": "<pad>",
  "cls_token": "<s>",
  "mask_token": {
    "content": "<mask>",
    "single_word": false,
    "lstrip": true,
    "rstrip": false,
    "normalized": false
  }
}
```
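The special tokens map above follows the RoBERTa-style convention, where `<s>` doubles as both the beginning-of-sequence and classification token, and `</s>` serves as both end-of-sequence and separator. A minimal offline sketch of what this layout implies (it only parses the JSON shown above, without downloading the tokenizer):

```python
import json

# The special tokens map from special_tokens_map.json, verbatim
special_tokens = json.loads("""{
  "bos_token": "<s>", "eos_token": "</s>", "unk_token": "<unk>",
  "sep_token": "</s>", "pad_token": "<pad>", "cls_token": "<s>",
  "mask_token": {"content": "<mask>", "single_word": false,
                 "lstrip": true, "rstrip": false, "normalized": false}
}""")

# RoBERTa-style layout: cls doubles as bos, sep doubles as eos
assert special_tokens["cls_token"] == special_tokens["bos_token"] == "<s>"
assert special_tokens["sep_token"] == special_tokens["eos_token"] == "</s>"

# "lstrip": true means the mask token absorbs a preceding space, so
# "Hello <mask>" and "Hello<mask>" tokenize the same way
print(special_tokens["mask_token"]["content"])
```

Because `cls_token` and `bos_token` share the same string (as do `sep_token` and `eos_token`), sequences encoded for classification and for generation use the same delimiters.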