julien-c (HF staff) committed
Commit 1c70896 • 1 Parent(s): 8662f9e

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/dbmdz/electra-small-turkish-cased-discriminator/README.md

Files changed (1): README.md (added, +79 lines)
---
language: tr
license: mit
---

# 🤗 + 📚 dbmdz Turkish ELECTRA model

In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources a cased ELECTRA small model for Turkish 🎉

# Turkish ELECTRA model

We release a small ELEC**TR**A model for Turkish that was trained on the same data as *BERTurk*.

> ELECTRA is a new method for self-supervised language representation learning. It can be used to
> pre-train transformer networks using relatively little compute. ELECTRA models are trained to
> distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to
> the discriminator of a GAN.

More details about ELECTRA can be found in the [ICLR paper](https://openreview.net/forum?id=r1xMH1BtvB)
or in the [official ELECTRA repository](https://github.com/google-research/electra) on GitHub.
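
To make the discriminator objective above a bit more concrete, here is a minimal, illustrative sketch (not part of the original card) that asks the discriminator which tokens in a hand-corrupted Turkish sentence look replaced. It assumes a recent Transformers version that exposes `ElectraForPreTraining` and returns output objects; the example sentence is made up.

```python
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

model_name = "dbmdz/electra-small-turkish-cased-discriminator"
tokenizer = ElectraTokenizerFast.from_pretrained(model_name)
discriminator = ElectraForPreTraining.from_pretrained(model_name)

# A hand-corrupted sentence: "Elma" ("apple") replaces "Ankara".
sentence = "Elma Türkiye'nin başkentidir."

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = discriminator(**inputs).logits

# Positive logits mean the discriminator flags a token as "fake" (replaced).
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
flags = (logits[0] > 0).tolist()
for token, is_fake in zip(tokens, flags):
    print(f"{token}\t{'replaced' if is_fake else 'original'}")
```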

## Stats

The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish [OSCAR corpus](https://traces1.inria.fr/oscar/),
a recent Wikipedia dump, various [OPUS corpora](http://opus.nlpl.eu/) and a
special corpus provided by [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/).

The final training corpus has a size of 35GB and 4,404,976,662 tokens.

Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train a cased model
on a TPU v3-8 for 1M steps.

## Model weights

[Transformers](https://github.com/huggingface/transformers)-compatible weights
for both PyTorch and TensorFlow are available.

| Model | Downloads |
| ------------------------------------------------- | --------- |
| `dbmdz/electra-small-turkish-cased-discriminator` | [`config.json`](https://cdn.huggingface.co/dbmdz/electra-small-turkish-cased-discriminator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-small-turkish-cased-discriminator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-small-turkish-cased-discriminator/vocab.txt) |

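If you prefer to fetch individual files programmatically instead of via the CDN links above, a small sketch using the separate `huggingface_hub` library (an assumption on our side; the original card only lists the direct links) could look like this:

```python
from huggingface_hub import hf_hub_download

repo_id = "dbmdz/electra-small-turkish-cased-discriminator"

# Download the files listed in the table above into the local cache
# and get back their local paths.
config_path = hf_hub_download(repo_id=repo_id, filename="config.json")
weights_path = hf_hub_download(repo_id=repo_id, filename="pytorch_model.bin")
vocab_path = hf_hub_download(repo_id=repo_id, filename="vocab.txt")

print(config_path, weights_path, vocab_path, sep="\n")
```
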
## Usage

With Transformers >= 2.8, our ELECTRA small cased model can be loaded like this:

```python
from transformers import AutoModel, AutoTokenizer

# Load the WordPiece tokenizer and the discriminator's encoder.
# AutoModel is used instead of AutoModelWithLMHead because this
# checkpoint is a discriminator and has no language-modeling head.
tokenizer = AutoTokenizer.from_pretrained("dbmdz/electra-small-turkish-cased-discriminator")
model = AutoModel.from_pretrained("dbmdz/electra-small-turkish-cased-discriminator")
```
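
As a quick follow-up (again a sketch, assuming a recent Transformers version that returns output objects), the loaded encoder can be used to obtain contextual token embeddings for downstream tasks:

```python
import torch

# Encode an illustrative Turkish sentence and run it through the encoder
# loaded above ("This model was trained on Turkish texts.").
inputs = tokenizer("Bu model Türkçe metinler üzerinde eğitildi.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual embeddings with shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```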

## Results

For results on PoS tagging or NER tasks, please refer to
[this repository](https://github.com/stefan-it/turkish-bert/electra).

# Hugging Face model hub

All models are available on the [Hugging Face model hub](https://huggingface.co/dbmdz).

# Contact (Bugs, Feedback, Contribution and more)

For questions about our ELECTRA models, just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗

# Acknowledgments

Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us with
additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
us with the Turkish NER dataset for evaluation.

This research was supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️

Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗