PolitiBETO: A Spanish BERT adapted to a language domain of Political Tweets

PolitiBETO is a BERT model tailored for political tasks on social-media corpora. It is the result of domain-adaptive pretraining on top of BETO, a BERT model pretrained on Spanish. This model is intended to be fine-tuned for downstream tasks.
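As a minimal sketch, the model can be loaded with the `transformers` library and prepared for fine-tuning like any BERT checkpoint. Note that the Hub repository identifier below is a placeholder assumption (this card does not state it), and the two-label classification head is only an illustrative downstream setup:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder repo id -- replace with the actual Hub identifier of PolitiBETO
model_name = "nlp-cimat/politibeto"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# Attach a fresh classification head for an illustrative downstream task,
# e.g. binary political-stance classification (num_labels is task-dependent)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize a sample Spanish tweet and run a forward pass
inputs = tokenizer("Ejemplo de tuit sobre política", return_tensors="pt")
outputs = model(**inputs)
```

The resulting `model` can then be fine-tuned with the standard `Trainer` API or a custom PyTorch training loop on labeled tweets.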

Citation

NLP-CIMAT at PoliticEs 2022: PolitiBETO, a Domain-Adapted Transformer for Multi-class Political Author Profiling

To cite this work in a publication, please use the following:

@inproceedings{PolitiBeto2022,
  title     = {{NLP-CIMAT} at {P}olitic{E}s 2022: {P}oliti{BETO}, a {D}omain-{A}dapted {T}ransformer for {M}ulti-class {P}olitical {A}uthor {P}rofiling},
  author    = {Emilio Villa-Cueva and Ivan Gonz{\'a}lez-Franco and Fernando Sanchez-Vega and Adri{\'a}n Pastor L{\'o}pez-Monroy},
  booktitle = {Proceedings of the Iberian Languages Evaluation Forum (IberLEF 2022)},
  series    = {{CEUR} Workshop Proceedings},
  publisher = {CEUR-WS},
  year      = {2022}
}