Muthukumaran committed • Commit f8e5108 • Parent: 8afbc45
Update README.md
README.md CHANGED
@@ -1,4 +1,4 @@
-This model is further trained on top of scibert-base using masked language modeling loss (MLM). The corpus is roughly 270,000 earth science-based publications.
+This model is further trained on top of scibert-base using masked language modeling loss (MLM). The corpus consists of abstracts from roughly 270,000 earth science publications.
 
 The tokenizer used is AutoTokenizer, which is trained on the same corpus.
 
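The masked language modeling objective mentioned in the README can be sketched as follows. This is a minimal illustration of the input-corruption step, not this model's actual training code; the 15% mask rate, the `[MASK]` token string, and the `mask_tokens` helper name are assumptions (BERT-style MLM also mixes in random-token and keep-as-is replacements, omitted here for brevity):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Replace each token with mask_token with probability mask_prob.

    Returns (masked_tokens, labels): labels holds the original token at
    masked positions (where the MLM loss is computed) and None elsewhere
    (positions that contribute no loss).
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)    # model must predict the original token
        else:
            masked.append(tok)
            labels.append(None)   # unmasked position, no loss term
    return masked, labels

tokens = "the corpus contains earth science abstracts".split()
masked, labels = mask_tokens(tokens, seed=1)
```

During training, the model receives `masked` as input and the cross-entropy loss is computed only at the positions where `labels` is not `None`.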