gpt2-large-serbian-upos

Model Description

This is a GPT-2 model in Serbian (Cyrillic and Latin) for POS-tagging and dependency-parsing, derived from jerteh/gpt2-orao. Every word is tagged with its UPOS (Universal Part-Of-Speech) tag and FEATS (morphological features).

How to Use

from transformers import pipeline
nlp = pipeline("upos", "KoichiYasuoka/gpt2-large-serbian-upos", trust_remote_code=True, aggregation_strategy="simple")
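
For a quick check, the pipeline object can be called directly on raw text (a minimal sketch; the Serbian sentence below is an arbitrary example, and the exact output fields depend on the model's custom pipeline code):

# Tag an example sentence; the result lists each word span with its UPOS tag.
print(nlp("Београд је главни град Србије."))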

or

import esupar
nlp = esupar.load("KoichiYasuoka/gpt2-large-serbian-upos")
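
Once loaded with esupar, the model can be applied to raw text and the result printed as a CoNLL-U-style dependency tree (a sketch following esupar's usual usage; the sentence is the same arbitrary example):

# Parse an example sentence; printing shows tokens, UPOS/FEATS, and dependency relations.
doc = nlp("Београд је главни град Србије.")
print(doc)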

See Also

esupar: tokenizer, POS-tagger, and dependency-parser with BERT/RoBERTa/DeBERTa/GPT models (https://github.com/KoichiYasuoka/esupar)
