julien-c (HF staff) committed on
Commit 798ecaa • 1 Parent(s): 153b3b9

Migrate model card from transformers-repo

Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/dbmdz/distilbert-base-turkish-cased/README.md

Files changed (1):
  1. README.md ADDED (+77 -0)
---
language: tr
license: mit
---

# 🤗 + 📚 dbmdz Distilled Turkish BERT model

In this repository, the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources a cased, distilled BERT model for Turkish 🎉

# 🇹🇷 DistilBERTurk

DistilBERTurk is a community-driven, cased, distilled BERT model for Turkish.

DistilBERTurk was trained on 7GB of the original training data that was used
for training [BERTurk](https://github.com/stefan-it/turkish-bert/tree/master#stats),
using the cased version of BERTurk as the teacher model.

*DistilBERTurk* was trained with the official Hugging Face implementation from
[here](https://github.com/huggingface/transformers/tree/master/examples/distillation)
for 5 days on 4 RTX 2080 Ti GPUs.

More details about distillation can be found in the paper
["DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter"](https://arxiv.org/abs/1910.01108)
by Sanh et al. (2019).
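
For intuition, the core of this kind of distillation is a soft-target loss that pushes the
student's output distribution toward the teacher's. The sketch below is ours and only
illustrates that one term (the training script linked above combines it with additional
losses); the temperature value and tensor shapes are illustrative assumptions.

```python
# Illustrative sketch of a soft-target distillation loss (not the exact training code).
import torch
import torch.nn.functional as F

def soft_target_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scaling by T^2 keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Toy example with random logits over a small vocabulary:
student_logits = torch.randn(8, 128)
teacher_logits = torch.randn(8, 128)
print(soft_target_loss(student_logits, teacher_logits))
```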

## Model weights

Currently, only PyTorch-[Transformers](https://github.com/huggingface/transformers)-compatible
weights are available. If you need access to TensorFlow checkpoints,
please raise an issue in the [BERTurk](https://github.com/stefan-it/turkish-bert) repository!

| Model                                 | Downloads                                                                                                                                                                                                                                                                                        |
| ------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `dbmdz/distilbert-base-turkish-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased/vocab.txt) |
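
If you prefer to fetch these files manually rather than through `from_pretrained()`
(see the Usage section below), a minimal sketch using `requests` and the CDN URLs from
the table above could look like this; the local file names are our own choice.

```python
# Hypothetical manual download of the files listed above; from_pretrained()
# normally handles this automatically.
import requests

base_url = "https://cdn.huggingface.co/dbmdz/distilbert-base-turkish-cased"

for filename in ("config.json", "pytorch_model.bin", "vocab.txt"):
    response = requests.get(f"{base_url}/{filename}")
    response.raise_for_status()
    with open(filename, "wb") as f:
        f.write(response.content)
```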
36
+
37
+ ## Usage
38
+
39
+ With Transformers >= 2.3 our DistilBERTurk model can be loaded like:
40
+
41
+ ```python
42
+ from transformers import AutoModel, AutoTokenizer
43
+
44
+ tokenizer = AutoTokenizer.from_pretrained("dbmdz/distilbert-base-turkish-cased")
45
+ model = AutoModel.from_pretrained("dbmdz/distilbert-base-turkish-cased")
46
+ ```
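
As a quick follow-up, the loaded model can be used as a feature extractor; the example
sentence and the steps below are our own illustration, not part of the original card:

```python
import torch

# Encode a Turkish example sentence and extract the final hidden states.
input_ids = tokenizer.encode("Merhaba dünya!", return_tensors="pt")
with torch.no_grad():
    outputs = model(input_ids)

last_hidden_state = outputs[0]  # (batch_size, sequence_length, hidden_size)
print(last_hidden_state.shape)
```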
47
+
48
+ ## Results
49
+
50
+ For results on PoS tagging or NER tasks, please refer to
51
+ [this repository](https://github.com/stefan-it/turkish-bert).
52
+
53
+ For PoS tagging, DistilBERTurk outperforms the 24-layer XLM-RoBERTa model.
54
+
55
+ The overall performance difference between DistilBERTurk and the original
56
+ (teacher) BERTurk model is ~1.18%.
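
The numbers in the linked repository come from fine-tuning on the downstream tasks;
purely as an illustration (the label count below is a placeholder, not the evaluated
setup), DistilBERTurk can be given a token-classification head like this:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Placeholder num_labels: set it to the size of your PoS or NER tag set.
tokenizer = AutoTokenizer.from_pretrained("dbmdz/distilbert-base-turkish-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "dbmdz/distilbert-base-turkish-cased", num_labels=9
)
```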
57
+
58
+ # Huggingface model hub
59
+
60
+ All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
61
+
62
+ # Contact (Bugs, Feedback, Contribution and more)
63
+
64
+ For questions about our BERT models just open an issue
65
+ [here](https://github.com/dbmdz/berts/issues/new) πŸ€—
66
+
67
+ # Acknowledgments
68
+
69
+ Thanks to [Kemal Oflazer](http://www.andrew.cmu.edu/user/ko/) for providing us
70
+ additional large corpora for Turkish. Many thanks to Reyyan Yeniterzi for providing
71
+ us the Turkish NER dataset for evaluation.
72
+
73
+ Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
74
+ Thanks for providing access to the TFRC ❀️
75
+
76
+ Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
77
+ it is possible to download both cased and uncased models from their S3 storage πŸ€—