Commit 0552c3d · 1 Parent(s): 4dacbda
add link to official release
README.md CHANGED
@@ -28,9 +28,9 @@ counterparts.
 
 The model was trained using a self-supervised masked language modeling task. We do whole word masking with a maximum of 80 predictions. The model was trained for 1000K steps, with a batch size of 4096, and a max sequence length of 512.
 
-Original model
+Original model on TFHub: https://tfhub.dev/google/MuRIL/1
 
-
+*Official release now on HuggingFace (March 2021)* https://huggingface.co/google/muril-base-cased
 
 License: Apache 2.0
 
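A minimal sketch (not part of the commit) of how the newly linked HuggingFace release could be loaded for the masked language modeling task the README describes; the model ID `google/muril-base-cased` comes from the link added above, the `transformers` calls are standard library API, and the example sentence is purely illustrative.

```python
# Sketch: load the HuggingFace release linked in this commit and run fill-mask.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "google/muril-base-cased"  # ID from the link added in this commit

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Use the model's masked-LM head to fill in a masked token.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill(f"Delhi is the capital of {tokenizer.mask_token}."))  # example sentence is illustrative
```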