Citation

If you use this model, please cite the following paper:

@inproceedings{yang-language-models,
    title = {Training language models with low resources: RoBERTa, BART and ELECTRA experimental models for Hungarian},
    booktitle = {Proceedings of the 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2021)},
    year = {2021},
    publisher = {IEEE},
    address = {Online},
    author = {Yang, Zijian Győző and Váradi, Tamás},
    pages = {279--285}
}