# AraRoBERTa-EGY
---
license: apache-2.0
language:
  - ar
---

The AraRoBERTa models are mono-dialectal Arabic language models, each trained on data from a single country-level dialect; AraRoBERTa-EGY covers Egyptian Arabic. All variants use the RoBERTa-base configuration. More details are available in the [paper](https://aclanthology.org/2022.wanlp-1.24).
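A minimal usage sketch with the `transformers` library, assuming the Hub model id is `reemalyami/AraRoBERTa-EGY` (adjust if your repo id differs):

```python
# Minimal sketch: fill-mask inference with AraRoBERTa-EGY.
# The model id "reemalyami/AraRoBERTa-EGY" is assumed here.
from transformers import pipeline

fill = pipeline("fill-mask", model="reemalyami/AraRoBERTa-EGY")

# RoBERTa-style tokenizers use "<mask>" as the mask token.
for pred in fill("الطقس اليوم <mask>."):
    print(pred["token_str"], round(pred["score"], 3))
```

For fine-tuning on a downstream classification task, the model can likewise be loaded with `AutoModelForSequenceClassification.from_pretrained(...)`.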

The following are the seven dialectal variants of AraRoBERTa:

When using the model, please cite our paper:

```bibtex
@inproceedings{alyami-al-zaidy-2022-weakly,
    title = "Weakly and Semi-Supervised Learning for {A}rabic Text Classification using Monodialectal Language Models",
    author = "AlYami, Reem and Al-Zaidy, Rabah",
    booktitle = "Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP)",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, United Arab Emirates (Hybrid)",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.wanlp-1.24",
    pages = "260--272",
}
```

## Contact

Reem AlYami: reem.yami@kfupm.edu.sa | yami.m.reem@gmail.com