Adding model, graphs and metadata.
- README.md +12 -12
- eval/eval_metrics.json +4 -0
- eval/evaluate_timing.json +1 -0
- eval/nbest_predictions.json.tgz +0 -0
- eval/predictions.json +0 -0
- eval/sparsity_report.json +1 -0
- eval/speed_report.json +1 -0
- model_card/density_info.js +4 -4
- model_card/pruning_info.js +4 -4
- model_info.json +297 -0
- training/data_args.json +16 -0
- training/model_args.json +7 -0
- training/sparse_args.json +29 -0
- training/training_args.bin +3 -0
README.md
CHANGED

@@ -4,8 +4,8 @@ thumbnail:
 license: mit
 tags:
 - question-answering
+-
+-
 datasets:
 - squad
 metrics:

@@ -19,26 +19,26 @@ widget:

 ## BERT-base uncased model fine-tuned on SQuAD v1

+This model was created using the [nn_pruning](https://github.com/huggingface/nn_pruning) python library: the **linear layers contain 15.0%** of the original weights.

 The model contains **34.0%** of the original weights **overall** (the embeddings account for a significant part of the model, and they are not pruned by this method).

+With a simple resizing of the linear matrices it ran **2.32x as fast as bert-base-uncased** on the evaluation.
 This is possible because the pruning method leads to structured matrices: to visualize them, hover below on the plot to see the non-zero/zero parts of each matrix.

+<div class="graph"><script src="/madlag/bert-base-uncased-squadv1-x2.32-f86.6-d15-hybrid-v1/raw/main/model_card/density_info.js" id="1ff1ba08-69d3-4a20-9f29-494033c72860"></script></div>

+In terms of accuracy, its **F1 is 86.64**, compared with 88.5 for bert-base-uncased, an **F1 drop of 1.86**.

 ## Fine-Pruning details
+This model was fine-tuned from the HuggingFace [model](https://huggingface.co/bert-base-uncased) checkpoint on [SQuAD1.1](https://rajpurkar.github.io/SQuAD-explorer), and distilled from the model [bert-large-uncased-whole-word-masking-finetuned-squad](https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad).
 This model is case-insensitive: it does not make a difference between english and English.

 A side-effect of the block pruning is that some of the attention heads are completely removed: 63 heads were removed out of a total of 144 (43.8%).
 Here is a detailed view of how the remaining heads are distributed in the network after pruning.
+<div class="graph"><script src="/madlag/bert-base-uncased-squadv1-x2.32-f86.6-d15-hybrid-v1/raw/main/model_card/pruning_info.js" id="e092ee84-28af-4821-8127-11914f68e306"></script></div>

 ## Details of the SQuAD1.1 dataset

@@ -60,7 +60,7 @@ GPU driver: 455.23.05, CUDA: 11.1

 ### Results

+**PyTorch model file size**: `368MB` (original BERT: `420MB`)

 | Metric | # Value | # Original ([Table 2](https://www.aclweb.org/anthology/N19-1423.pdf)) | Variation |
 | ------ | --------- | --------- | --------- |

@@ -84,11 +84,11 @@ qa_pipeline = pipeline(
     tokenizer="madlag/bert-base-uncased-squadv1-x2.32-f86.6-d15-hybrid-v1"
 )

+print("bert-base-uncased parameters: 165.0M")
+print(f"Parameters count (includes only head pruning, not feed forward pruning)={int(qa_pipeline.model.num_parameters() / 1E6)}M")
 qa_pipeline.model = optimize_model(qa_pipeline.model, "dense")

+print(f"Parameters count after complete optimization={int(qa_pipeline.model.num_parameters() / 1E6)}M")
 predictions = qa_pipeline({
     'context': "Frédéric François Chopin, born Fryderyk Franciszek Chopin (1 March 1810 – 17 October 1849), was a Polish composer and virtuoso pianist of the Romantic era who wrote primarily for solo piano.",
     'question': "Who is Frederic Chopin?",
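For convenience, here is a minimal, self-contained sketch of the usage shown in the hunk above. The import lines are not part of this diff; in particular, the location of `optimize_model` inside the nn_pruning package is an assumption.

```python
# Sketch of the README usage above; import paths are assumptions, not shown in the diff.
from transformers import pipeline
from nn_pruning.inference_model_patcher import optimize_model  # assumed import location

model_name = "madlag/bert-base-uncased-squadv1-x2.32-f86.6-d15-hybrid-v1"
qa_pipeline = pipeline("question-answering", model=model_name, tokenizer=model_name)

print("bert-base-uncased parameters: 165.0M")
print(f"Parameters count (includes only head pruning, not feed forward pruning)="
      f"{int(qa_pipeline.model.num_parameters() / 1E6)}M")

# Resize the linear matrices (drop the all-zero rows/columns left by block pruning);
# this is what produces the reported 2.32x evaluation speedup.
qa_pipeline.model = optimize_model(qa_pipeline.model, "dense")

print(f"Parameters count after complete optimization={int(qa_pipeline.model.num_parameters() / 1E6)}M")

predictions = qa_pipeline({
    "context": "Frédéric François Chopin, born Fryderyk Franciszek Chopin (1 March 1810 – 17 October 1849), "
               "was a Polish composer and virtuoso pianist of the Romantic era who wrote primarily for solo piano.",
    "question": "Who is Frederic Chopin?",
})
print(predictions)
```
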
eval/eval_metrics.json
ADDED
@@ -0,0 +1,4 @@
{
    "exact_match": 78.74172185430463,
    "f1": 86.57976286123308
}

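These are the standard SQuAD v1.1 metrics. A sketch of how such numbers are typically recomputed from `eval/predictions.json` with the `squad` metric of the datasets library; the assumption that the predictions file maps question ids to predicted answer strings is mine, not something stated in this commit.

```python
import json

from datasets import load_dataset, load_metric

# Assumption: predictions.json maps SQuAD question ids to predicted answer strings.
with open("predictions.json") as f:
    raw_predictions = json.load(f)

validation = load_dataset("squad", split="validation")
metric = load_metric("squad")

predictions = [{"id": qid, "prediction_text": text} for qid, text in raw_predictions.items()]
references = [{"id": ex["id"], "answers": ex["answers"]} for ex in validation]

# Returns a dict with "exact_match" and "f1", like the file above.
print(metric.compute(predictions=predictions, references=references))
```
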
eval/evaluate_timing.json
ADDED
@@ -0,0 +1 @@
{"eval_elapsed_time": 86.37038454785943, "cuda_eval_elapsed_time": 78.39602122497558}

eval/nbest_predictions.json.tgz
ADDED
Binary file (6.63 MB).

eval/predictions.json
ADDED
The diff for this file is too large to render.

eval/sparsity_report.json
ADDED
@@ -0,0 +1 @@
{"total": 108893186, "nnz": 37412084, "linear_total": 84934656, "linear_nnz": 13502464, "layers": {"0": {"total": 7087872, "nnz": 993834, "linear_total": 7077888, "linear_nnz": 988160, "linear_attention_total": 2359296, "linear_attention_nnz": 677888, "linear_dense_total": 4718592, "linear_dense_nnz": 310272}, "1": {"total": 7087872, "nnz": 1131132, "linear_total": 7077888, "linear_nnz": 1125376, "linear_attention_total": 2359296, "linear_attention_nnz": 689152, "linear_dense_total": 4718592, "linear_dense_nnz": 436224}, "2": {"total": 7087872, "nnz": 1637570, "linear_total": 7077888, "linear_nnz": 1631232, "linear_attention_total": 2359296, "linear_attention_nnz": 1087488, "linear_dense_total": 4718592, "linear_dense_nnz": 543744}, "3": {"total": 7087872, "nnz": 1761552, "linear_total": 7077888, "linear_nnz": 1755136, "linear_attention_total": 2359296, "linear_attention_nnz": 1189888, "linear_dense_total": 4718592, "linear_dense_nnz": 565248}, "4": {"total": 7087872, "nnz": 1701216, "linear_total": 7077888, "linear_nnz": 1694720, "linear_attention_total": 2359296, "linear_attention_nnz": 1104896, "linear_dense_total": 4718592, "linear_dense_nnz": 589824}, "5": {"total": 7087872, "nnz": 1338767, "linear_total": 7077888, "linear_nnz": 1332736, "linear_attention_total": 2359296, "linear_attention_nnz": 818176, "linear_dense_total": 4718592, "linear_dense_nnz": 514560}, "6": {"total": 7087872, "nnz": 1331200, "linear_total": 7077888, "linear_nnz": 1325056, "linear_attention_total": 2359296, "linear_attention_nnz": 882688, "linear_dense_total": 4718592, "linear_dense_nnz": 442368}, "7": {"total": 7087872, "nnz": 1175442, "linear_total": 7077888, "linear_nnz": 1169408, "linear_attention_total": 2359296, "linear_attention_nnz": 846848, "linear_dense_total": 4718592, "linear_dense_nnz": 322560}, "8": {"total": 7087872, "nnz": 905581, "linear_total": 7077888, "linear_nnz": 899584, "linear_attention_total": 2359296, "linear_attention_nnz": 732160, "linear_dense_total": 4718592, "linear_dense_nnz": 167424}, "9": {"total": 7087872, "nnz": 539287, "linear_total": 7077888, "linear_nnz": 534016, "linear_attention_total": 2359296, "linear_attention_nnz": 449536, "linear_dense_total": 4718592, "linear_dense_nnz": 84480}, "10": {"total": 7087872, "nnz": 560943, "linear_total": 7077888, "linear_nnz": 555520, "linear_attention_total": 2359296, "linear_attention_nnz": 434176, "linear_dense_total": 4718592, "linear_dense_nnz": 121344}, "11": {"total": 7087872, "nnz": 496838, "linear_total": 7077888, "linear_nnz": 491520, "linear_attention_total": 2359296, "linear_attention_nnz": 334848, "linear_dense_total": 4718592, "linear_dense_nnz": 156672}}, "total_sparsity": 65.64331950026698, "linear_sparsity": 84.10252700617285, "pruned_heads": {"0": [0, 2, 4, 5, 6, 7, 11], "1": [0, 2, 3, 5, 6, 7, 8], "2": [8, 4, 7], "3": [2, 4, 6, 7], "4": [1, 2, 11], "5": [1, 2, 5, 6, 7, 11], "6": [10, 2, 3, 7], "7": [1, 3, 6, 7, 11], "8": [0, 8, 3, 4], "9": [1, 4, 5, 7, 9, 10], "10": [1, 2, 4, 5, 6, 7, 8], "11": [0, 2, 5, 7, 8, 10, 11]}}
eval/speed_report.json
ADDED
@@ -0,0 +1 @@
{"timings": {"eval_elapsed_time": 23.629751751199365, "cuda_eval_elapsed_time": 16.665313415527343}, "metrics": {"exact_match": 78.77010406811732, "f1": 86.63938864881486}}

model_card/density_info.js
CHANGED

@@ -16,9 +16,9 @@
+      var element = document.getElementById("1ff1ba08-69d3-4a20-9f29-494033c72860");
       if (element == null) {
+        console.warn("Bokeh: autoload.js configured with elementid '1ff1ba08-69d3-4a20-9f29-494033c72860' but no matching script tag was found.")
       }

@@ -115,8 +115,8 @@
   (function(root) {
     function embed_document(root) {
+      var docs_json = '{"daf61672-5257-4351-9597-a90ab6a6c90b": {...}}';
       [serialized Bokeh document for the weight-density plot: title "Transformer Layers", x-axis "Layer", y-axis "Parameters (M)", per-layer bar series for the query, key, value and fully connected matrices, with hover tooltips showing the density images under model_card/images/]
+      var render_items = [{"docid":"daf61672-5257-4351-9597-a90ab6a6c90b","root_ids":["1095"],"roots":{"1095":"1ff1ba08-69d3-4a20-9f29-494033c72860"}}];
       root.Bokeh.embed.embed_items(docs_json, render_items);
     }

model_card/pruning_info.js
CHANGED
@@ -16,9 +16,9 @@
|
|
16 |
|
17 |
|
18 |
|
19 |
-
var element = document.getElementById("
|
20 |
if (element == null) {
|
21 |
-
console.warn("Bokeh: autoload.js configured with elementid '
|
22 |
}
|
23 |
|
24 |
|
@@ -115,8 +115,8 @@
|
|
115 |
(function(root) {
|
116 |
function embed_document(root) {
|
117 |
|
118 |
-
var docs_json = '{"
|
119 |
-
var render_items = [{"docid":"
|
120 |
root.Bokeh.embed.embed_items(docs_json, render_items);
|
121 |
|
122 |
}
|
|
|
16 |
|
17 |
|
18 |
|
19 |
+
var element = document.getElementById("e092ee84-28af-4821-8127-11914f68e306");
|
20 |
if (element == null) {
|
21 |
+
console.warn("Bokeh: autoload.js configured with elementid 'e092ee84-28af-4821-8127-11914f68e306' but no matching script tag was found.")
|
22 |
}
|
23 |
|
24 |
|
|
|
115 |
(function(root) {
|
116 |
function embed_document(root) {
|
117 |
|
118 |
+
var docs_json = '{"d36ff262-33e2-4648-9ad4-5ce94b947cc1":{"roots":{"references":[{"attributes":{"label":{"value":"pruned"},"renderers":[{"id":"1043"}]},"id":"1057","type":"LegendItem"},{"attributes":{"above":[{"id":"1055"}],"below":[{"id":"1012"}],"center":[{"id":"1014"},{"id":"1018"},{"id":"1037"}],"left":[{"id":"1015"}],"outline_line_color":null,"plot_height":400,"renderers":[{"id":"1028"},{"id":"1043"}],"title":{"id":"1002"},"toolbar":{"id":"1019"},"toolbar_location":null,"x_range":{"id":"1004"},"x_scale":{"id":"1008"},"y_range":{"id":"1006"},"y_scale":{"id":"1010"}},"id":"1001","subtype":"Figure","type":"Plot"},{"attributes":{"source":{"id":"1024"}},"id":"1029","type":"CDSView"},{"attributes":{"axis_label":"Layer index","formatter":{"id":"1033"},"minor_tick_line_color":null,"ticker":{"id":"1013"}},"id":"1012","type":"CategoricalAxis"},{"attributes":{"data":{"active":[5,5,9,8,9,6,8,7,8,6,5,5],"layers":["0","1","2","3","4","5","6","7","8","9","10","11"],"pruned":[7,7,3,4,3,6,4,5,4,6,7,7]},"selected":{"id":"1052"},"selection_policy":{"id":"1053"}},"id":"1039","type":"ColumnDataSource"},{"attributes":{"start":0},"id":"1006","type":"DataRange1d"},{"attributes":{"items":[{"id":"1038"},{"id":"1054"}],"location":null},"id":"1037","type":"Legend"},{"attributes":{},"id":"1008","type":"CategoricalScale"},{"attributes":{},"id":"1052","type":"Selection"},{"attributes":{},"id":"1053","type":"UnionRenderers"},{"attributes":{"axis":{"id":"1012"},"grid_line_color":null,"ticker":null},"id":"1014","type":"Grid"},{"attributes":{},"id":"1035","type":"Selection"},{"attributes":{},"id":"1013","type":"CategoricalTicker"},{"attributes":{},"id":"1036","type":"UnionRenderers"},{"attributes":{"fields":[]},"id":"1020","type":"Stack"},{"attributes":{"bottom":{"expr":{"id":"1020"}},"fill_alpha":{"value":0.1},"fill_color":{"value":"#0000ff"},"line_alpha":{"value":0.1},"line_color":{"value":"#0000ff"},"top":{"expr":{"id":"1021"}},"width":{"value":0.9},"x":{"field":"layers"}},"id":"1027","type":"VBar"},{"attributes":{"data_source":{"id":"1024"},"glyph":{"id":"1026"},"hover_glyph":null,"muted_glyph":null,"name":"active","nonselection_glyph":{"id":"1027"},"selection_glyph":null,"view":{"id":"1029"}},"id":"1028","type":"GlyphRenderer"},{"attributes":{"fields":["active"]},"id":"1022","type":"Stack"},{"attributes":{"text":"Pruned Transformer 
Heads"},"id":"1002","type":"Title"},{"attributes":{"active_drag":"auto","active_inspect":"auto","active_multi":null,"active_scroll":"auto","active_tap":"auto"},"id":"1019","type":"Toolbar"},{"attributes":{"factors":["0","1","2","3","4","5","6","7","8","9","10","11"],"range_padding":0.1},"id":"1004","type":"FactorRange"},{"attributes":{},"id":"1031","type":"BasicTickFormatter"},{"attributes":{"label":{"value":"active"},"renderers":[{"id":"1028"}]},"id":"1056","type":"LegendItem"},{"attributes":{"items":[{"id":"1056"},{"id":"1057"}],"location":[10,0],"orientation":"horizontal"},"id":"1055","type":"Legend"},{"attributes":{"label":{"value":"active"},"renderers":[{"id":"1028"}]},"id":"1038","type":"LegendItem"},{"attributes":{"source":{"id":"1039"}},"id":"1044","type":"CDSView"},{"attributes":{"bottom":{"expr":{"id":"1022"}},"fill_color":{"value":"#ffcccc"},"line_color":{"value":"#ffcccc"},"top":{"expr":{"id":"1023"}},"width":{"value":0.9},"x":{"field":"layers"}},"id":"1041","type":"VBar"},{"attributes":{"fields":["active"]},"id":"1021","type":"Stack"},{"attributes":{"axis":{"id":"1015"},"dimension":1,"ticker":null},"id":"1018","type":"Grid"},{"attributes":{"bottom":{"expr":{"id":"1022"}},"fill_alpha":{"value":0.1},"fill_color":{"value":"#ffcccc"},"line_alpha":{"value":0.1},"line_color":{"value":"#ffcccc"},"top":{"expr":{"id":"1023"}},"width":{"value":0.9},"x":{"field":"layers"}},"id":"1042","type":"VBar"},{"attributes":{"fields":["active","pruned"]},"id":"1023","type":"Stack"},{"attributes":{},"id":"1033","type":"CategoricalTickFormatter"},{"attributes":{"data_source":{"id":"1039"},"glyph":{"id":"1041"},"hover_glyph":null,"muted_glyph":null,"name":"pruned","nonselection_glyph":{"id":"1042"},"selection_glyph":null,"view":{"id":"1044"}},"id":"1043","type":"GlyphRenderer"},{"attributes":{"label":{"value":"pruned"},"renderers":[{"id":"1043"}]},"id":"1054","type":"LegendItem"},{"attributes":{},"id":"1010","type":"LinearScale"},{"attributes":{},"id":"1016","type":"BasicTicker"},{"attributes":{"axis_label":"Heads count","formatter":{"id":"1031"},"minor_tick_line_color":null,"ticker":{"id":"1016"}},"id":"1015","type":"LinearAxis"},{"attributes":{"bottom":{"expr":{"id":"1020"}},"fill_color":{"value":"#0000ff"},"line_color":{"value":"#0000ff"},"top":{"expr":{"id":"1021"}},"width":{"value":0.9},"x":{"field":"layers"}},"id":"1026","type":"VBar"},{"attributes":{"data":{"active":[5,5,9,8,9,6,8,7,8,6,5,5],"layers":["0","1","2","3","4","5","6","7","8","9","10","11"],"pruned":[7,7,3,4,3,6,4,5,4,6,7,7]},"selected":{"id":"1035"},"selection_policy":{"id":"1036"}},"id":"1024","type":"ColumnDataSource"}],"root_ids":["1001"]},"title":"Bokeh Application","version":"2.2.3"}}';
var render_items = [{"docid":"d36ff262-33e2-4648-9ad4-5ce94b947cc1","root_ids":["1001"],"roots":{"1001":"e092ee84-28af-4821-8127-11914f68e306"}}];
root.Bokeh.embed.embed_items(docs_json, render_items);
}
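The `pruning_info.js` file above is a serialized Bokeh document (version 2.2.3) that draws the stacked bar chart of active vs. pruned attention heads per layer shown in the README. As a hedged illustration only (not the script that generated this file), a similar chart can be rebuilt from the same per-layer counts with the Bokeh Python API:

```python
# Hypothetical re-creation of the "Pruned Transformer Heads" chart from the
# per-layer counts serialized above (assumes Bokeh 2.x, matching version 2.2.3).
from bokeh.models import ColumnDataSource
from bokeh.plotting import figure, show

layers = [str(i) for i in range(12)]
source = ColumnDataSource(data={
    "layers": layers,
    "active": [5, 5, 9, 8, 9, 6, 8, 7, 8, 6, 5, 5],  # heads kept in each layer
    "pruned": [7, 7, 3, 4, 3, 6, 4, 5, 4, 6, 7, 7],  # heads removed in each layer
})

p = figure(x_range=layers, plot_height=400, title="Pruned Transformer Heads",
           x_axis_label="Layer index", y_axis_label="Heads count")
p.vbar_stack(["active", "pruned"], x="layers", width=0.9,
             color=["#0000ff", "#ffcccc"], source=source,
             legend_label=["active", "pruned"])
show(p)
```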
model_info.json
ADDED
@@ -0,0 +1,297 @@
{
  "checkpoint_path": "/data_2to/devel_data/nn_pruning/output/squad_test4/hp_od-__data_2to__devel_data__nn_pruning__output__squad4___es-steps_nte20_ls250_stl50_est5000_rn-__data_2to__devel_data__nn_pruning__output__squad4___dpm-sigmoied_threshold:1d_alt_ap--17cd29ad8a563746/checkpoint-110000",
  "config": {
    "_name_or_path": "/tmp/tmpspdgp5f3",
    "architectures": ["BertForQuestionAnswering"],
    "attention_probs_dropout_prob": 0.1,
    "gradient_checkpointing": false,
    "hidden_act": "gelu",
    "hidden_dropout_prob": 0.1,
    "hidden_size": 768,
    "initializer_range": 0.02,
    "intermediate_size": 3072,
    "layer_norm_eps": 1e-12,
    "max_position_embeddings": 512,
    "model_type": "bert",
    "num_attention_heads": 12,
    "num_hidden_layers": 12,
    "pad_token_id": 0,
    "position_embedding_type": "absolute",
    "pruned_heads": {
      "0": [0, 2, 4, 5, 6, 7, 11],
      "1": [0, 2, 3, 5, 6, 7, 8],
      "10": [1, 2, 4, 5, 6, 7, 8],
      "11": [0, 2, 5, 7, 8, 10, 11],
      "2": [4, 7, 8],
      "3": [2, 4, 6, 7],
      "4": [1, 2, 11],
      "5": [1, 2, 5, 6, 7, 11],
      "6": [2, 3, 7, 10],
      "7": [1, 3, 6, 7, 11],
      "8": [0, 3, 4, 8],
      "9": [1, 4, 5, 7, 9, 10]
    },
    "transformers_version": "4.4.2",
    "type_vocab_size": 2,
    "use_cache": true,
    "vocab_size": 30522
  },
  "eval_metrics": {
    "exact_match": 78.77010406811732,
    "f1": 86.63938864881486,
    "main_metric": 86.63938864881486
  },
  "model_args": {
    "cache_dir": null,
    "config_name": null,
    "model_name_or_path": "bert-base-uncased",
    "tokenizer_name": null,
    "use_fast_tokenizer": true
  },
  "sparse_args": {
    "ampere_pruning_method": "disabled",
    "attention_block_cols": 32,
    "attention_block_rows": 32,
    "attention_lambda": 1.0,
    "attention_output_with_dense": 0,
    "attention_pruning_method": "sigmoied_threshold",
    "bias_mask": true,
    "dense_block_cols": 1,
    "dense_block_rows": 1,
    "dense_lambda": 1.0,
    "dense_pruning_method": "sigmoied_threshold:1d_alt",
    "distil_alpha_ce": 0.1,
    "distil_alpha_teacher": 0.9,
    "distil_teacher_name_or_path": "bert-large-uncased-whole-word-masking-finetuned-squad",
    "distil_temperature": 2.0,
    "final_ampere_temperature": 20.0,
    "final_finetune": false,
    "final_threshold": 0.1,
    "final_warmup": 10,
    "initial_ampere_temperature": 0.0,
    "initial_threshold": 0,
    "initial_warmup": 1,
    "mask_init": "constant",
    "mask_scale": 0.0,
    "mask_scores_learning_rate": 0.01,
    "regularization": "l1",
    "regularization_final_lambda": 20
  },
  "speed": {
    "cuda_eval_elapsed_time": 16.665313415527343,
    "eval_elapsed_time": 23.629751751199365
  },
  "speedup": 2.3158516160525413,
  "stats": {
    "layers": {
      "0": {
        "linear_attention_nnz": 677888,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 310272,
        "linear_dense_total": 4718592,
        "linear_nnz": 988160,
        "linear_total": 7077888,
        "nnz": 993834,
        "total": 7087872
      },
      "1": {
        "linear_attention_nnz": 689152,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 436224,
        "linear_dense_total": 4718592,
        "linear_nnz": 1125376,
        "linear_total": 7077888,
        "nnz": 1131132,
        "total": 7087872
      },
      "10": {
        "linear_attention_nnz": 434176,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 121344,
        "linear_dense_total": 4718592,
        "linear_nnz": 555520,
        "linear_total": 7077888,
        "nnz": 560943,
        "total": 7087872
      },
      "11": {
        "linear_attention_nnz": 334848,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 156672,
        "linear_dense_total": 4718592,
        "linear_nnz": 491520,
        "linear_total": 7077888,
        "nnz": 496838,
        "total": 7087872
      },
      "2": {
        "linear_attention_nnz": 1087488,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 543744,
        "linear_dense_total": 4718592,
        "linear_nnz": 1631232,
        "linear_total": 7077888,
        "nnz": 1637570,
        "total": 7087872
      },
      "3": {
        "linear_attention_nnz": 1189888,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 565248,
        "linear_dense_total": 4718592,
        "linear_nnz": 1755136,
        "linear_total": 7077888,
        "nnz": 1761552,
        "total": 7087872
      },
      "4": {
        "linear_attention_nnz": 1104896,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 589824,
        "linear_dense_total": 4718592,
        "linear_nnz": 1694720,
        "linear_total": 7077888,
        "nnz": 1701216,
        "total": 7087872
      },
      "5": {
        "linear_attention_nnz": 818176,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 514560,
        "linear_dense_total": 4718592,
        "linear_nnz": 1332736,
        "linear_total": 7077888,
        "nnz": 1338767,
        "total": 7087872
      },
      "6": {
        "linear_attention_nnz": 882688,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 442368,
        "linear_dense_total": 4718592,
        "linear_nnz": 1325056,
        "linear_total": 7077888,
        "nnz": 1331200,
        "total": 7087872
      },
      "7": {
        "linear_attention_nnz": 846848,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 322560,
        "linear_dense_total": 4718592,
        "linear_nnz": 1169408,
        "linear_total": 7077888,
        "nnz": 1175442,
        "total": 7087872
      },
      "8": {
        "linear_attention_nnz": 732160,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 167424,
        "linear_dense_total": 4718592,
        "linear_nnz": 899584,
        "linear_total": 7077888,
        "nnz": 905581,
        "total": 7087872
      },
      "9": {
        "linear_attention_nnz": 449536,
        "linear_attention_total": 2359296,
        "linear_dense_nnz": 84480,
        "linear_dense_total": 4718592,
        "linear_nnz": 534016,
        "linear_total": 7077888,
        "nnz": 539287,
        "total": 7087872
      }
    },
    "linear_nnz": 13502464,
    "linear_sparsity": 84.10252700617285,
    "linear_total": 84934656,
    "nnz": 37412084,
    "pruned_heads": {
      "0": [0, 2, 4, 5, 6, 7, 11],
      "1": [0, 2, 3, 5, 6, 7, 8],
      "10": [1, 2, 4, 5, 6, 7, 8],
      "11": [0, 2, 5, 7, 8, 10, 11],
      "2": [8, 4, 7],
      "3": [2, 4, 6, 7],
      "4": [1, 2, 11],
      "5": [1, 2, 5, 6, 7, 11],
      "6": [10, 2, 3, 7],
      "7": [1, 3, 6, 7, 11],
      "8": [0, 8, 3, 4],
      "9": [1, 4, 5, 7, 9, 10]
    },
    "total": 108893186,
    "total_sparsity": 65.64331950026698
  },
  "training_args": {
    "_n_gpu": -1,
    "adafactor": false,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "dataloader_drop_last": false,
    "dataloader_num_workers": 0,
    "dataloader_pin_memory": true,
    "ddp_find_unused_parameters": null,
    "debug": false,
    "deepspeed": null,
    "disable_tqdm": false,
    "do_eval": 1,
    "do_predict": false,
    "do_train": 1,
    "eval_accumulation_steps": null,
    "eval_steps": 5000,
    "evaluation_strategy": "steps",
    "fp16": false,
    "fp16_backend": "auto",
    "fp16_full_eval": false,
    "fp16_opt_level": "O1",
    "gradient_accumulation_steps": 1,
    "greater_is_better": null,
    "group_by_length": false,
    "ignore_data_skip": false,
    "label_names": null,
    "label_smoothing_factor": 0.0,
    "learning_rate": 3e-05,
    "length_column_name": "length",
    "load_best_model_at_end": false,
    "local_rank": -1,
    "logging_dir": "/data_2to/devel_data/nn_pruning/output/squad4/",
    "logging_first_step": false,
    "logging_steps": 250,
    "logging_strategy": "steps",
    "lr_scheduler_type": "linear",
    "max_grad_norm": 1.0,
    "max_steps": -1,
    "metric_for_best_model": null,
    "mp_parameters": "",
    "no_cuda": false,
    "num_train_epochs": 20,
    "optimize_model_before_eval": "disabled",
    "output_dir": "/data_2to/devel_data/nn_pruning/output/squad4/",
    "overwrite_output_dir": 1,
    "past_index": -1,
    "per_device_eval_batch_size": 8,
    "per_device_train_batch_size": 16,
    "per_gpu_eval_batch_size": null,
    "per_gpu_train_batch_size": null,
    "prediction_loss_only": false,
    "remove_unused_columns": true,
    "report_to": null,
    "run_name": "/data_2to/devel_data/nn_pruning/output/squad4/",
    "save_steps": 5000,
    "save_strategy": "steps",
    "save_total_limit": 50,
    "seed": 17,
    "sharded_ddp": "",
    "skip_memory_metrics": false,
    "tpu_metrics_debug": false,
    "tpu_num_cores": null,
    "warmup_ratio": 0.0,
    "warmup_steps": 5400,
    "weight_decay": 0.0
  }
}
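`model_info.json` gathers the numbers the model card is derived from: the pruned attention heads, the per-layer non-zero weight counts, the evaluation metrics and the measured speedup. A minimal sanity-check sketch that re-derives the headline figures from this file (the relative path is an assumption; adjust it to wherever the repository is cloned):

```python
# Minimal sketch: re-derive the model-card figures from model_info.json.
import json

with open("model_info.json") as f:
    info = json.load(f)

config, stats = info["config"], info["stats"]
removed = sum(len(v) for v in stats["pruned_heads"].values())
total_heads = config["num_hidden_layers"] * config["num_attention_heads"]
print(f"pruned heads: {removed}/{total_heads} ({100 * removed / total_heads:.1f}%)")  # 63/144 (43.8%)

print(f"linear density: {100 * stats['linear_nnz'] / stats['linear_total']:.1f}%")
print(f"overall density: {100 * stats['nnz'] / stats['total']:.1f}%")
print(f"F1: {info['eval_metrics']['f1']:.2f}  speedup: x{info['speedup']:.2f}")
```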
training/data_args.json
ADDED
@@ -0,0 +1,16 @@
{
  "dataset_cache_dir": "dataset_cache",
  "dataset_config_name": null,
  "dataset_name": "squad",
  "doc_stride": 128,
  "max_answer_length": 30,
  "max_seq_length": 384,
  "n_best_size": 20,
  "null_score_diff_threshold": 0.0,
  "overwrite_cache": 0,
  "pad_to_max_length": true,
  "preprocessing_num_workers": null,
  "train_file": null,
  "validation_file": null,
  "version_2_with_negative": false
}
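These arguments mirror the usual SQuAD preprocessing in the transformers question-answering examples: contexts longer than `max_seq_length` are split into overlapping windows of stride `doc_stride`. A hedged sketch of how such arguments are typically consumed (the actual training script may differ in details; the question/context strings are placeholders):

```python
# Sketch of the standard SQuAD feature creation these arguments parameterize.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased", use_fast=True)

question = "Where is the Eiffel Tower?"
context = "The Eiffel Tower is a wrought-iron lattice tower located in Paris, France. " * 40
features = tokenizer(
    question,
    context,
    max_length=384,                 # max_seq_length
    stride=128,                     # doc_stride: overlap between windows
    truncation="only_second",       # only the context is split into windows
    return_overflowing_tokens=True,
    return_offsets_mapping=True,
    padding="max_length",           # pad_to_max_length
)
print(len(features["input_ids"]), "overlapping feature window(s)")
```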
training/model_args.json
ADDED
@@ -0,0 +1,7 @@
{
  "cache_dir": null,
  "config_name": null,
  "model_name_or_path": "bert-base-uncased",
  "tokenizer_name": null,
  "use_fast_tokenizer": true
}
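These arguments select the starting checkpoint and tokenizer. An illustrative sketch of how they map onto `from_pretrained` calls (not the training script itself):

```python
# Illustrative mapping of model_args.json onto transformers loading calls.
from transformers import AutoConfig, AutoModelForQuestionAnswering, AutoTokenizer

name = "bert-base-uncased"                                       # model_name_or_path
config = AutoConfig.from_pretrained(name)                        # config_name is null -> fall back to the model name
tokenizer = AutoTokenizer.from_pretrained(name, use_fast=True)   # use_fast_tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(name, config=config)
```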
training/sparse_args.json
ADDED
@@ -0,0 +1,29 @@
{
  "ampere_pruning_method": "disabled",
  "attention_block_cols": 32,
  "attention_block_rows": 32,
  "attention_lambda": 1.0,
  "attention_output_with_dense": 0,
  "attention_pruning_method": "sigmoied_threshold",
  "bias_mask": true,
  "dense_block_cols": 1,
  "dense_block_rows": 1,
  "dense_lambda": 1.0,
  "dense_pruning_method": "sigmoied_threshold:1d_alt",
  "distil_alpha_ce": 0.1,
  "distil_alpha_teacher": 0.9,
  "distil_teacher_name_or_path": "bert-large-uncased-whole-word-masking-finetuned-squad",
  "distil_temperature": 2.0,
  "final_ampere_temperature": 20.0,
  "final_finetune": false,
  "final_threshold": 0.1,
  "final_warmup": 10,
  "initial_ampere_temperature": 0.0,
  "initial_threshold": 0,
  "initial_warmup": 1,
  "mask_init": "constant",
  "mask_scale": 0.0,
  "mask_scores_learning_rate": 0.01,
  "regularization": "l1",
  "regularization_final_lambda": 20
}
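Among these, the `distil_*` entries describe the distillation from the `bert-large-uncased-whole-word-masking-finetuned-squad` teacher. As a hedged illustration of how the three values combine (a generic distillation loss, not necessarily nn_pruning's exact implementation, which also adds the pruning regularization term):

```python
# Generic distillation-loss sketch using the values from sparse_args.json.
import torch.nn.functional as F

def distil_loss(student_logits, teacher_logits, ce_loss,
                alpha_ce=0.1, alpha_teacher=0.9, temperature=2.0):
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha_ce * ce_loss + alpha_teacher * kd
```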
training/training_args.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:74c002be5cf5e3a341af2e459e1439660eaa6e6270a2d3077579a48757266f4a
size 1903
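`training_args.bin` is stored via git-lfs; it holds the torch-serialized `TrainingArguments` object saved by the Trainer, whose readable values appear in the `training_args` block of `model_info.json` above. It can be inspected with something like:

```python
# Inspect the pickled TrainingArguments (needs a transformers version close to
# the 4.4.x that wrote it; the relative path is an assumption).
import torch

training_args = torch.load("training/training_args.bin")
print(training_args.learning_rate, training_args.num_train_epochs)  # e.g. 3e-05, 20
```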