# gpt2-small-serbian-upos

## Model Description
This is a GPT-2 model for Serbian (Cyrillic and Latin), derived from gpt2-vrabac and fine-tuned for POS-tagging and dependency-parsing. Every word is tagged with its UPOS (Universal Part-Of-Speech) tag and FEATS.
## How to Use
```py
from transformers import pipeline
nlp = pipeline("upos", "KoichiYasuoka/gpt2-small-serbian-upos", trust_remote_code=True, aggregation_strategy="simple")
```
or
```py
import esupar
nlp = esupar.load("KoichiYasuoka/gpt2-small-serbian-upos")
```
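Dependency parses in the esupar ecosystem are conventionally rendered in CoNLL-U format (ten tab-separated columns per token). Below is a minimal, hedged sketch of post-processing such output; the sentence, tags, and heads in `sample` are an illustrative hand-written fragment, not actual output of this model.

```python
def parse_conllu(text):
    """Parse CoNLL-U lines into dicts with id, form, upos, head, deprel."""
    rows = []
    for line in text.strip().splitlines():
        if not line.strip() or line.startswith("#"):
            continue  # skip blank sentence separators and comment lines
        cols = line.split("\t")
        # CoNLL-U columns: ID FORM LEMMA UPOS XPOS FEATS HEAD DEPREL DEPS MISC
        rows.append({"id": cols[0], "form": cols[1], "upos": cols[3],
                     "head": cols[6], "deprel": cols[7]})
    return rows

# Illustrative CoNLL-U fragment for "Београд је град" (NOT real model output)
sample = (
    "1\tБеоград\t_\tPROPN\t_\t_\t3\tnsubj\t_\t_\n"
    "2\tје\t_\tAUX\t_\t_\t3\tcop\t_\t_\n"
    "3\tград\t_\tNOUN\t_\t_\t0\troot\t_\t_\n"
)

for token in parse_conllu(sample):
    print(token["form"], token["upos"], token["deprel"])
```

The same loop works on whatever CoNLL-U string your pipeline produces; only the hardcoded `sample` is an assumption here.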
## See Also

esupar: a tokenizer, POS-tagger, and dependency-parser built on BERT/RoBERTa/DeBERTa/GPT models
Base model: jerteh/gpt2-vrabac