---
license: mit
datasets:
  - omarmomen/babylm_10M
language:
  - en
metrics:
  - perplexity
library_name: transformers
---

# Model Card for omarmomen/roberta_base_32k_final

This model is part of the experiments in the paper "Increasing The Performance of Cognitively Inspired Data-Efficient Language Models via Implicit Structure Building", published at the BabyLM workshop at CoNLL 2023 (https://aclanthology.org/2023.conll-babylm.29/).

omarmomen/roberta_base_32k_final is a baseline RobertaModel.

The model is pretrained on the BabyLM 10M dataset using a custom RobertaTokenizer with a 32k-token vocabulary (https://huggingface.co/omarmomen/babylm_tokenizer_32k).
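
The model can be loaded with the standard transformers AutoClasses. Below is a minimal usage sketch, assuming the repository exposes the usual RoBERTa encoder weights; the example sentence is illustrative:

```python
from transformers import AutoModel, AutoTokenizer
import torch

# Load the tokenizer and encoder from the Hub (repo name from this card).
tokenizer = AutoTokenizer.from_pretrained("omarmomen/roberta_base_32k_final")
model = AutoModel.from_pretrained("omarmomen/roberta_base_32k_final")

# Illustrative input sentence.
inputs = tokenizer("The children played in the park.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual embeddings: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```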

Preprint: https://arxiv.org/abs/2310.20589