KoichiYasuoka committed
Commit 540ba7e
1 Parent(s): 49c5ab4
dependency-parsing

README.md CHANGED
@@ -5,6 +5,7 @@ tags:
 - "japanese"
 - "token-classification"
 - "pos"
+- "dependency-parsing"
 datasets:
 - "universal_dependencies"
 license: "cc-by-sa-4.0"
@@ -17,7 +18,7 @@ widget:
 
 ## Model Description
 
-This is a RoBERTa model pre-trained on Japanese 青空文庫 texts for POS-tagging, derived from [roberta-small-japanese-aozora](https://huggingface.co/KoichiYasuoka/roberta-small-japanese-aozora). Every long-unit-word is tagged by [UPOS](https://universaldependencies.org/u/pos/) (Universal Part-Of-Speech).
+This is a RoBERTa model pre-trained on Japanese 青空文庫 texts for POS-tagging and dependency-parsing, derived from [roberta-small-japanese-aozora](https://huggingface.co/KoichiYasuoka/roberta-small-japanese-aozora). Every long-unit-word is tagged by [UPOS](https://universaldependencies.org/u/pos/) (Universal Part-Of-Speech).
 
 ## How to Use
 
@@ -30,6 +31,14 @@ nlp=lambda x:[(x[t["start"]:t["end"]],t["entity_group"]) for t in pipeline(x)]
 print(nlp("国境の長いトンネルを抜けると雪国であった。"))
 ```
 
+or
+
+```py
+import esupar
+nlp=esupar.load("KoichiYasuoka/roberta-small-japanese-luw-upos")
+print(nlp("国境の長いトンネルを抜けると雪国であった。"))
+```
+
 ## See Also
 
 [esupar](https://github.com/KoichiYasuoka/esupar): Tokenizer POS-tagger and Dependency-parser with BERT/RoBERTa models