ksabeh committed
Commit f18b07a
1 Parent(s): 4a4b244

Training in progress epoch 0

README.md ADDED
@@ -0,0 +1,53 @@
+ ---
+ license: apache-2.0
+ tags:
+ - generated_from_keras_callback
+ model-index:
+ - name: ksabeh/distilbert-base-uncased-mlm-electronics-attribute-correction-qa-mlm
+   results: []
+ ---
+ 
+ <!-- This model card has been generated automatically according to the information Keras had access to. You should
+ probably proofread and complete it, then remove this comment. -->
+ 
+ # ksabeh/distilbert-base-uncased-mlm-electronics-attribute-correction-qa-mlm
+ 
+ This model is a fine-tuned version of [ksabeh/distilbert-base-uncased-mlm-electronics](https://huggingface.co/ksabeh/distilbert-base-uncased-mlm-electronics) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Train Loss: 0.1703
+ - Validation Loss: 0.0730
+ - Epoch: 0
+ 
+ ## Model description
+ 
+ More information needed
+ 
+ ## Intended uses & limitations
+ 
+ More information needed
+ 
+ ## Training and evaluation data
+ 
+ More information needed
+ 
+ ## Training procedure
+ 
+ ### Training hyperparameters
+ 
+ The following hyperparameters were used during training:
+ - optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 36794, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
+ - training_precision: float32
+ 
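The optimizer entry above is the raw Keras serialization. As a rough sketch (the exact training script is not part of this commit), it corresponds to the following TensorFlow 2.6 construction:

```python
# Sketch only: rebuilds the serialized optimizer config listed in the card above.
# decay_steps=36794 is copied verbatim; the batch size and epoch count that
# produced it are not recorded in this commit.
import tensorflow as tf

lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=36794,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```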
+ ### Training results
+ 
+ | Train Loss | Validation Loss | Epoch |
+ |:----------:|:---------------:|:-----:|
+ | 0.1703     | 0.0730          | 0     |
+ 
+ 
+ ### Framework versions
+ 
+ - Transformers 4.18.0
+ - TensorFlow 2.6.3
+ - Datasets 2.1.0
+ - Tokenizers 0.12.1
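The card's "Intended uses & limitations" section is still a placeholder. As a minimal, hedged loading sketch, the checkpoint added in this commit is a TensorFlow DistilBERT question-answering head (see config.json below) and can be used roughly as follows; the question/context strings are purely illustrative:

```python
# Minimal sketch: load the TF checkpoint from this repository and run
# extractive question answering. The example inputs are hypothetical.
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering, pipeline

model_id = "ksabeh/distilbert-base-uncased-mlm-electronics-attribute-correction-qa-mlm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForQuestionAnswering.from_pretrained(model_id)

qa = pipeline("question-answering", model=model, tokenizer=tokenizer, framework="tf")
answer = qa(
    question="What is the screen size?",
    context="This laptop has a 15.6 inch display, 16 GB of RAM, and a 512 GB SSD.",
)
print(answer)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```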
config.json ADDED
@@ -0,0 +1,23 @@
+ {
+   "_name_or_path": "ksabeh/distilbert-base-uncased-mlm-electronics",
+   "activation": "gelu",
+   "architectures": [
+     "DistilBertForQuestionAnswering"
+   ],
+   "attention_dropout": 0.1,
+   "dim": 768,
+   "dropout": 0.1,
+   "hidden_dim": 3072,
+   "initializer_range": 0.02,
+   "max_position_embeddings": 512,
+   "model_type": "distilbert",
+   "n_heads": 12,
+   "n_layers": 6,
+   "pad_token_id": 0,
+   "qa_dropout": 0.1,
+   "seq_classif_dropout": 0.2,
+   "sinusoidal_pos_embds": false,
+   "tie_weights_": true,
+   "transformers_version": "4.18.0",
+   "vocab_size": 30522
+ }
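For reference, the architecture fields above can be inspected without downloading the weights; a small sketch:

```python
# Sketch: load only the configuration added in this commit and print
# the architecture fields shown above.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained(
    "ksabeh/distilbert-base-uncased-mlm-electronics-attribute-correction-qa-mlm"
)
print(cfg.model_type, cfg.n_layers, cfg.n_heads, cfg.dim)  # distilbert 6 12 768
```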
logs/train/events.out.tfevents.1652858501.314657b40b1d.33.0.v2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a686f98cb22b9f878d0db168644e0467ec29a4bb8d9540886a8a15d1368b17d7
+ size 1469388
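This file, like the other binary artifacts below, is stored as a Git LFS pointer (version, sha256 oid, size) rather than the payload itself. A small sketch, assuming the file has already been pulled locally, checks a download against the values recorded above:

```python
# Sketch: verify a locally fetched LFS file against the oid/size recorded
# in its pointer. Path and expected values are taken from this commit.
import hashlib
import os

path = "logs/train/events.out.tfevents.1652858501.314657b40b1d.33.0.v2"
expected_oid = "a686f98cb22b9f878d0db168644e0467ec29a4bb8d9540886a8a15d1368b17d7"
expected_size = 1469388

assert os.path.getsize(path) == expected_size
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == expected_oid
```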
logs/train/events.out.tfevents.1652858514.314657b40b1d.profile-empty ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:446841e91a150a5df05e9e48c712a73ddc44199507af0518ef9401672497da1e
+ size 40
logs/train/plugins/profile/2022_05_18_07_21_54/314657b40b1d.input_pipeline.pb ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d6204af42b784f191e925de9d2f1dc4b957b59745d173b48f5d55a1384a481d3
+ size 2798
logs/train/plugins/profile/2022_05_18_07_21_54/314657b40b1d.kernel_stats.pb ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b5820910b506e618a4a7a782154761bbc63c9acf67ba97c8db955906683baeb0
+ size 311772
logs/train/plugins/profile/2022_05_18_07_21_54/314657b40b1d.memory_profile.json.gz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5b3ff7f7e6aed691d31fc85f8052cb5806842d5cf2a69a1fb6242a11c714a82f
+ size 35683
logs/train/plugins/profile/2022_05_18_07_21_54/314657b40b1d.overview_page.pb ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:adc6f799e803e51f4cb8e68371491195e213540ba0b122c3bbb14fbc88e44f59
+ size 5733
logs/train/plugins/profile/2022_05_18_07_21_54/314657b40b1d.tensorflow_stats.pb ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:626e66895058f8af23ee4b11eb5e6ce96c35d875554ac45e94158fbb1882c7df
+ size 189783
logs/train/plugins/profile/2022_05_18_07_21_54/314657b40b1d.trace.json.gz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5a72718fa3095bd7b4ffffac6fcfa1a626b9c8e82ed0dc706f7da02075c5ac7b
+ size 148360
logs/train/plugins/profile/2022_05_18_07_21_54/314657b40b1d.xplane.pb ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:902551e5709a5535907927ff61ce5c94f53fd5accd7e9fc1efff957f43f24343
+ size 1136889
logs/validation/events.out.tfevents.1652872174.314657b40b1d.33.1.v2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:830908874e6696a25a2413d742857f466b043f7884687185ebef2028aeff4772
+ size 195
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tf_model.h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d194289099abeff2e407da998e0562e95bc403bf611c53ae623328d7437a070b
+ size 265583688
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "ksabeh/distilbert-base-uncased-mlm-electronics", "tokenizer_class": "DistilBertTokenizer"}
vocab.txt ADDED
The diff for this file is too large to render. See raw diff