Commit 231d924
Parent(s): e74edb9
Update README.md (#2)
- Update README.md (1594a591e2b53ffd77b4f5939fdb0439e91e964e)
Co-authored-by: Mofetoluwa Adeyemi <Mofe@users.noreply.huggingface.co>
README.md CHANGED
@@ -1,20 +1,22 @@
-Hugging Face's logo
 ---
 language:
 - om
 - am
 - rw
 - rn
 - ha
 - ig
-- pcm
 - so
 - sw
 - ti
 - yo
+- pcm
 - multilingual
-
+license: mit
+datasets:
+- castorini/afriberta-corpus
 ---
+
 # afriberta_large
 ## Model description
 AfriBERTa large is a pretrained multilingual language model with around 126 million parameters.
@@ -63,6 +65,4 @@ For information on training procedures, please refer to the AfriBERTa [paper]()
 url = "https://aclanthology.org/2021.mrl-1.11",
 pages = "116--126",
 }
 ```
-
-
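The updated card metadata now declares an MIT license and points to the castorini/afriberta-corpus dataset. A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under a repo id such as castorini/afriberta_large (the repo id is an assumption; only the card title afriberta_large and the castorini organization appear above):

```
# Minimal sketch: load the model card's checkpoint with the transformers auto classes.
# The repo id below is a hypothetical guess based on the card title; adjust as needed.
from transformers import AutoTokenizer, AutoModel

model_id = "castorini/afriberta_large"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a short Swahili sentence (sw is one of the declared languages)
# and inspect the contextual embeddings.
inputs = tokenizer("Habari ya asubuhi", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```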