TomatenMarc committed on
Commit
e5c3138
1 Parent(s): cb9dda6

Add new SentenceTransformer model.

.gitattributes CHANGED
@@ -35,3 +35,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
 pytorch_model.bin filter=lfs diff=lfs merge=lfs -text
 .git/lfs/objects/3e/cd/3ecd82469cb960c9ac4d7a041589441ffbde220cd2b4d5d13db88be8b9e5b462 filter=lfs diff=lfs merge=lfs -text
+ .git/lfs/objects/8f/b7/8fb7a64f61f366e6a68665ad8dc9ce6ef2969d2c1d541fac6d0287fe3afd5628 filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -5,6 +5,10 @@ tags:
 - feature-extraction
 - sentence-similarity
 - transformers
+ - argument-mining
+ - Twitter
+ metrics:
+ - macro-F1
 license: cc-by-sa-4.0
 language:
 - en
@@ -54,8 +58,8 @@ Relevant Argument Properties (WRAP) into the embedding space.
 
 WRAPresentations, to some degree, captures the semantics of the critical components of an argument (inference and information), as defined by the
 [Cambridge Dictionary](https://dictionary.cambridge.org).
- It encodes *inference* as *a guess that one makes or an opinion formed based on available information*, and it also leverages the definition of
- *information* as *facts or details about a person, company, product, etc.*.
+ It encodes *inference* as *a guess that you make or an opinion that you form based on the information that you have*, and it also leverages the
+ definition of *information* as *facts or details about a person, company, product, etc.*.
 
 Consequently, it has also learned the semantics of:
 
@@ -102,7 +106,7 @@ print(embeddings)
 ## Usage (HuggingFace Transformers)
 
 Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model,
- then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
+ then you have to apply the right pooling operation on top of the contextualized word embeddings.
 
 ```python
 from transformers import AutoTokenizer, AutoModel
@@ -144,14 +148,23 @@ efficient identification and analysis of argumentative content and non-argumenta
 
 ## Training
 
- The WRAPresentations model underwent fine-tuning with 1,219 golden tweets from the TACO dataset, covering six topics.
+ The WRAPresentations model underwent fine-tuning with 1,219 golden tweets from the [TACO](https://doi.org/10.5281/zenodo.8030026) dataset, covering six topics.
 Five topics were chosen for optimization, representing 925 tweets (75.88%) covering #brexit (33.3%), #got (17%), #lotrrop (18.8%), #squidgame (17.1%),
 and #twittertakeover (13.8%). The model used a stratified 60/40 split for training/testing on optimization data.
- Additionally, 294 golden tweets (24.12%) related to the topic of abortion were chosen as the holdout-set for final evaluation.
-
- During fine-tuning, we formed tweet pairs by matching each tweet with all remaining tweets in the same data split (training, testing, holdout) with
- similar or dissimilar class labels. This process created 307,470 pairs for training and 136,530 pairs for testing. An additional 86,142 pairs were
- used for final evaluation with the holdout data.
+ Additionally, 294 golden tweets (24.12%) related to the topic of #abortion were chosen as the holdout set for final evaluation.
+
+ Before fine-tuning, we built a copy of the dataset by creating an augmentation of each tweet. The augmentation consisted of replacing all the
+ topic words and entities in a tweet and then randomly masking 10% of its words, which were then filled in using
+ [BERTweet-base](https://huggingface.co/vinai/bertweet-base) as a `fill-mask` model. We chose to mask 10% of the words because this resulted in the
+ smallest possible average cosine distance between the tweets and their augmentations, 0.08, i.e., augmentations that stay semantically very close
+ to the originals, making the augmentation during extended pre-training itself a regularizing factor against overfitting on the later test data.
+ During fine-tuning, we formed pairs by matching each tweet with all remaining tweets in the same data split (training, testing, holdout)
+ with similar or dissimilar class labels. For the training and testing sets during fine-tuning, we used the augmentations, and for the
+ holdout tweets, we used their original text to test the fine-tuning process and the usefulness of the augmentations on real tweets.
+ For all pairs, we chose the largest possible set such that similar and dissimilar pairs are equally represented while covering all tweets
+ of the respective data split.
+ This process created 307,470 pairs for training, 136,530 pairs for testing, and an additional 86,142 pairs for final evaluation with the
+ holdout data.
 
 The model was trained with the parameters:
 
@@ -200,16 +213,16 @@ WRAPresentation model using the
 | Model | Accuracy | Precision | Recall | F1 | Support |
 |---------------------|----------|-----------|--------|--------|---------|
 | vinai/bertweet-base | 60.62% | 50.08% | 99.89% | 66.71% | 86,142 |
- | WRAPresentations | 72.25% | 65.45% | 88.21% | 75.14% | 86,142 |
+ | WRAPresentations | 71.32% | 66.22% | 84.05% | 74.08% | 86,142 |
 
- An evaluation was conducted on previously unseen data from the holdout topic abortion, resulting in the model achieving a sophisticated macro
- F1-score of 75.14%. The recall, which stands at 88.21%, indicates the model's ability to capture subtle tweet patterns and class-specific features for
- Reason, Statement, Notification, and None. Despite having a lower precision of 65.45%, the model's primary focus is on prioritizing recall to capture
+ An evaluation was conducted on previously unseen data from the holdout topic #abortion, resulting in the model achieving a solid macro-F1
+ score of 74.08%. The recall of 84.05% indicates the model's ability to capture subtle tweet patterns and class-specific features for
+ Reason, Statement, Notification, and None. Despite its lower precision of 66.22%, the model's primary focus is on prioritizing recall to capture
 relevant instances. Fine-tuning precision can be addressed in a subsequent classification phase, when using this model
 for `AutoModelForSequenceClassification`. In contrast, the baseline model (*vinai/bertweet-base*) achieved an exceptional recall of 99.89%, but it
 comes with a precision trade-off (50.08%), possibly indicating overfitting. However, WRAPresentations demonstrated its ability to effectively
 distinguish between tweets of the argument framework, capturing intra-class semantics while discerning inter-class semantics.
- This is indicated by its better F1 score of 75.14%, showcasing a superior balance between recall and precision.
+ This is indicated by its better F1 score of 74.08%, showcasing a superior balance between recall and precision.
 As a result, WRAPresentations proves to be more suitable for argument mining on Twitter, as it achieves a more reliable performance in identifying
 relevant instances in the data.

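The pooling step that the README's Usage (HuggingFace Transformers) section refers to is typically mean pooling over the contextualized token embeddings, with padding positions masked out. A minimal sketch, shown with NumPy on toy tensors rather than real model output; the helper name `mean_pooling` is an assumption, not the model card's exact snippet:

```python
import numpy as np

def mean_pooling(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    # Zero out padding positions, then average the remaining token vectors.
    mask = attention_mask[..., None].astype(float)   # (batch, tokens, 1)
    summed = (token_embeddings * mask).sum(axis=1)   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)   # avoid division by zero
    return summed / counts

# Toy batch: one sequence of three tokens, the last of which is padding.
emb = np.array([[[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]])
mask = np.array([[1, 1, 0]])
print(mean_pooling(emb, mask))  # [[2. 3.]]
```

With real inputs, `token_embeddings` would be the transformer's last hidden state and `attention_mask` would come from the tokenizer.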
binary_classification_evaluation_baseline-heldout_results.csv CHANGED
@@ -1,2 +1,2 @@
  epoch,steps,cossim_accuracy,cossim_accuracy_threshold,cossim_f1,cossim_precision,cossim_recall,cossim_f1_threshold,cossim_ap,manhattan_accuracy,manhattan_accuracy_threshold,manhattan_f1,manhattan_precision,manhattan_recall,manhattan_f1_threshold,manhattan_ap,euclidean_accuracy,euclidean_accuracy_threshold,euclidean_f1,euclidean_precision,euclidean_recall,euclidean_f1_threshold,euclidean_ap,dot_accuracy,dot_accuracy_threshold,dot_f1,dot_precision,dot_recall,dot_f1_threshold,dot_ap
- -1,-1,0.606235997012696,0.7620757818222046,0.6671155668611901,0.5007862812640408,0.9988797610156833,0.2958066463470459,0.6323701642846189,0.6153286034353995,72.18704223632812,0.6850335895837661,0.5444897060442585,0.9233756534727409,87.33160400390625,0.6237689041144366,0.6188946975354742,3.8260855674743652,0.6802268387246109,0.5552144341583223,0.8778939507094847,4.539368152618408,0.6216050521813652,0.5073562359970127,15.957334518432617,0.6667997405060133,0.5006744604316546,0.9979088872292756,9.347124099731445,0.5036378539343036
+ -1,-1,0.606235997012696,0.7620757818222046,0.6671155668611901,0.5007862812640408,0.9988797610156833,0.2958066463470459,0.632370163297349,0.6153286034353995,72.18704223632812,0.6850335895837661,0.5444897060442585,0.9233756534727409,87.33160400390625,0.6237689103378412,0.6188946975354742,3.8260855674743652,0.6802268387246109,0.5552144341583223,0.8778939507094847,4.539368152618408,0.6216050521813652,0.5073562359970127,15.957334518432617,0.6667997405060133,0.5006744604316546,0.9979088872292756,9.347124099731445,0.5036378545091198
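For context on what this CSV encodes: each metric family (`cossim_*`, `manhattan_*`, `euclidean_*`, `dot_*`) reports how well a simple threshold on the pairwise similarity or distance separates similar from dissimilar pairs, and the `*_accuracy_threshold` columns record the threshold at which that accuracy is maximized. A rough sketch of the cosine-similarity accuracy computation; the helper `cossim_accuracy` and the toy embeddings are hypothetical, not the evaluator's actual code:

```python
import numpy as np

def cossim_accuracy(emb_a: np.ndarray, emb_b: np.ndarray,
                    labels: np.ndarray, threshold: float) -> float:
    # L2-normalize so the row-wise dot product equals cosine similarity,
    # then predict "similar" (1) whenever similarity exceeds the threshold.
    a = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    b = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    sims = (a * b).sum(axis=1)
    return float(((sims > threshold).astype(int) == labels).mean())

# Toy pairs: the first pair is colinear (similar), the second orthogonal.
a = np.array([[1.0, 0.0], [1.0, 0.0]])
b = np.array([[2.0, 0.0], [0.0, 3.0]])
labels = np.array([1, 0])
print(cossim_accuracy(a, b, labels, threshold=0.76))  # 1.0
```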
binary_classification_evaluation_fine-tune-heldout_results.csv CHANGED
@@ -1,2 +1,2 @@
  epoch,steps,cossim_accuracy,cossim_accuracy_threshold,cossim_f1,cossim_precision,cossim_recall,cossim_f1_threshold,cossim_ap,manhattan_accuracy,manhattan_accuracy_threshold,manhattan_f1,manhattan_precision,manhattan_recall,manhattan_f1_threshold,manhattan_ap,euclidean_accuracy,euclidean_accuracy_threshold,euclidean_f1,euclidean_precision,euclidean_recall,euclidean_f1_threshold,euclidean_ap,dot_accuracy,dot_accuracy_threshold,dot_f1,dot_precision,dot_recall,dot_f1_threshold,dot_ap
- -1,-1,0.7224607916355489,0.7997592687606812,0.7514314798320398,0.6544940707081902,0.8820761762509335,0.468674898147583,0.7656845929850138,0.7232449589245705,122.13343048095703,0.7540637796252636,0.6449113870317309,0.9076923076923077,204.0776824951172,0.7707895435659883,0.7226661687826736,6.091953277587891,0.7526959588167692,0.6506627473394845,0.8926811053024645,9.70598030090332,0.7693129191511483,0.7146191187453323,64.32061767578125,0.7447954113342601,0.6508978189840543,0.8703510082150859,41.77745819091797,0.7333139687176903
+ -1,-1,0.7131814787154593,0.7389853000640869,0.7407480541705748,0.6621752816922126,0.8404779686333085,0.6548126935958862,0.764091696227668,0.7105675877520538,107.62960815429688,0.7439052101931505,0.6579385587137525,0.8557132188200149,159.94927978515625,0.7620236140757088,0.7092233009708738,7.13208532333374,0.7394660662592474,0.6494350282485876,0.8584764749813294,8.149255752563477,0.7595757602227923,0.7085324869305452,58.26499938964844,0.7354351731298515,0.6582293418296604,0.8331590739357729,52.617462158203125,0.7348401379960027
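The README's Training section above describes augmenting each tweet by masking 10% of its words and letting BERTweet-base fill the masks back in. The masking half of that procedure might look like this sketch; `mask_words` is a hypothetical helper, and the actual fill-mask step is only indicated in the trailing comment:

```python
import random

def mask_words(text: str, fraction: float = 0.1,
               mask_token: str = "<mask>", seed: int = 0) -> str:
    # Replace a random `fraction` of the whitespace-separated words
    # with the fill-mask model's mask token (at least one word).
    words = text.split()
    k = max(1, round(len(words) * fraction))
    rng = random.Random(seed)
    for idx in rng.sample(range(len(words)), k):
        words[idx] = mask_token
    return " ".join(words)

masked = mask_words("one two three four five six seven eight nine ten")
print(masked.count("<mask>"))  # 1

# The masked text would then be completed with something like
# transformers.pipeline("fill-mask", model="vinai/bertweet-base") - not run here.
```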
config.json CHANGED
@@ -22,7 +22,7 @@
  "position_embedding_type": "absolute",
  "tokenizer_class": "BertweetTokenizer",
  "torch_dtype": "float32",
- "transformers_version": "4.31.0",
+ "transformers_version": "4.32.1",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 64001
config_sentence_transformers.json CHANGED
@@ -1,7 +1,7 @@
  {
  "__version__": {
  "sentence_transformers": "2.2.2",
- "transformers": "4.31.0",
+ "transformers": "4.32.1",
  "pytorch": "2.0.1+cu118"
  }
  }
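The pair formation described in the README's Training section, where each tweet is matched with every other tweet in its split and similar and dissimilar pairs are kept equally represented, can be sketched as follows. The helper name and the downsampling used for balancing are assumptions, not necessarily the authors' exact procedure:

```python
import random
from itertools import combinations

def build_balanced_pairs(texts, labels, seed=0):
    # Pair every tweet with every other tweet in the split; a pair is
    # "similar" (label 1) when both tweets share a class, else "dissimilar" (0).
    similar, dissimilar = [], []
    for i, j in combinations(range(len(texts)), 2):
        same = int(labels[i] == labels[j])
        (similar if same else dissimilar).append((texts[i], texts[j], same))
    # Keep the largest balanced set by downsampling the larger side.
    n = min(len(similar), len(dissimilar))
    rng = random.Random(seed)
    return rng.sample(similar, n) + rng.sample(dissimilar, n)

pairs = build_balanced_pairs(["t1", "t2", "t3", "t4"],
                             ["Reason", "Reason", "None", "None"])
print(len(pairs), sum(p[2] for p in pairs))  # 4 2
```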
eval/binary_classification_evaluation_fine-tune-test_results.csv CHANGED
@@ -1,31 +1,31 @@
  epoch,steps,cossim_accuracy,cossim_accuracy_threshold,cossim_f1,cossim_precision,cossim_recall,cossim_f1_threshold,cossim_ap,manhattan_accuracy,manhattan_accuracy_threshold,manhattan_f1,manhattan_precision,manhattan_recall,manhattan_f1_threshold,manhattan_ap,euclidean_accuracy,euclidean_accuracy_threshold,euclidean_f1,euclidean_precision,euclidean_recall,euclidean_f1_threshold,euclidean_ap,dot_accuracy,dot_accuracy_threshold,dot_f1,dot_precision,dot_recall,dot_f1_threshold,dot_ap
- 0,1000,0.7664317941291149,0.486582487821579,0.7803905319314238,0.7292483422874014,0.8392469225199131,0.4408458471298218,0.8359886478046229,0.764565810728012,189.147216796875,0.7799080846656742,0.7286581678318295,0.838912716537626,193.72543334960938,0.8393917121665209,0.7655127276778254,8.783966064453125,0.7800096990734833,0.7198718552718364,0.8511112348911045,9.466306686401367,0.8391931466456664,0.7670166545981173,36.696746826171875,0.7787188919580557,0.7347150515056446,0.828329527098535,34.698734283447266,0.8128973550679109
- 0,2000,0.7393750348131232,0.5573605298995972,0.735912426918771,0.7304705720949376,0.7414359717038935,0.4102650582790375,0.8076656543778584,0.7434551328468779,199.37057495117188,0.7488441027070165,0.6868043776285522,0.8232050353701331,220.32708740234375,0.8211288020557157,0.7424107391522308,8.511497497558594,0.7400637331344498,0.7211607072068501,0.7599844037208267,9.90682601928711,0.8160996854852789,0.7356848437587032,39.38921356201172,0.7262602993343414,0.7372528765315199,0.7155907090736924,35.134132385253906,0.7802658061437526
- 0,3000,0.786456302567816,0.7374271154403687,0.7835469018694181,0.7803436274238992,0.7867765833008411,0.4987448751926422,0.8528158825569336,0.7859132178465994,146.7528076171875,0.7966924691850955,0.7410566670660117,0.8613602183479084,207.1959228515625,0.8659377973406632,0.7862891995766724,7.01038122177124,0.7862018654577667,0.7385579323510696,0.8404166434579179,9.810394287109375,0.859054868144066,0.7848270484041664,58.307464599609375,0.7802126519446089,0.7902193784277879,0.7704561911658219,44.82365036010742,0.8401202917713289
- 0,4000,0.7870829387846042,0.5717326402664185,0.7865554398821032,0.7643271934203957,0.8101153010638891,0.47935885190963745,0.8528562703075022,0.7898540633877347,189.68896484375,0.7965073721635471,0.7504803192275482,0.8485489890269036,204.52267456054688,0.8630952748863835,0.7867765833008411,8.835192680358887,0.7864319242511868,0.7604629990896086,0.8142371748454297,9.588624954223633,0.8571658565213038,0.7853979836239069,49.12046813964844,0.7791595317631865,0.7853989813242784,0.7730184370300228,45.11640930175781,0.8364191012827793
- 0,5000,0.7808026513674595,0.5821325778961182,0.7869083690314492,0.7408882986572229,0.8390241185317218,0.5424697399139404,0.8576738380657107,0.7815685400768674,170.3080291748047,0.7900655545488462,0.7573698462167271,0.8257115802372863,184.01321411132812,0.8605206322801829,0.7773770400490169,8.039061546325684,0.7895610789325732,0.7441161248983414,0.8409179524313485,8.928276062011719,0.8591663824472929,0.7778783490224475,53.91118621826172,0.7809087376559339,0.7524854744996772,0.8115635269871331,49.15049362182617,0.817047450871158
- 0,-1,0.7717373140979223,0.5294360518455505,0.7723978113936273,0.7181881299528359,0.8354592547206595,0.4505099058151245,0.8461767046437717,0.7718347908427561,191.7567901611328,0.7739183267189782,0.7221538684229573,0.8336768228151283,206.53079223632812,0.8519611394734723,0.7679914220464547,8.787954330444336,0.7706761375058635,0.7427729985274742,0.8007575335598507,9.45853042602539,0.8473283809397187,0.7697320782041999,47.89080810546875,0.7718713009792317,0.7464488266819294,0.7990865036484153,41.52181625366211,0.8107323333475928
- 1,1000,0.7610148721662118,0.47341352701187134,0.7708736052045699,0.7393427950389976,0.8052136133236785,0.4720102548599243,0.8393056786258988,0.7629783323121484,207.37928771972656,0.778014996875651,0.7304326570520655,0.8322285968918843,207.80984497070312,0.8458845400300564,0.7596919734863254,9.571844100952148,0.7671932773109243,0.7416361225922171,0.7945747228875397,9.65745735168457,0.8408814907414719,0.7610009469169499,48.85084915161133,0.7655571635311142,0.7377201590826921,0.7955773408344009,42.320281982421875,0.7889608097627274
- 1,2000,0.7617668356263577,0.4831611216068268,0.768227188179508,0.7251953704619646,0.816688018715535,0.447949081659317,0.841306812489196,0.7660975881468278,206.78155517578125,0.7777397710827523,0.7370337123478955,0.8232050353701331,209.63221740722656,0.8465489166235076,0.7623099203475743,9.669540405273438,0.7684107879937484,0.7383391092285971,0.80103603854509,9.81333065032959,0.8418975448599488,0.7622542193505264,44.03800964355469,0.7635499079882333,0.7374484315404375,0.791566869046956,40.96443176269531,0.7916606936932462
- 1,3000,0.7644126329861305,0.32492801547050476,0.764449053102658,0.7493179264965495,0.7802038656491951,0.29830923676490784,0.8328818262411662,0.7600818804656604,227.51390075683594,0.7586775704200741,0.7616267647141374,0.7557511279451902,228.03140258789062,0.8336382918546894,0.7614047791455467,10.477073669433594,0.7653250104711335,0.7432230298895216,0.7887818191945636,10.87778377532959,0.8332309655383566,0.7651924469448003,27.151779174804688,0.7641630781276216,0.7672525127744398,0.7610984236617836,27.068164825439453,0.804377275736198
- 1,4000,0.7659861861527322,0.354583740234375,0.763364736049947,0.7639609483960949,0.762769453573219,0.3361058831214905,0.8362531142900129,0.7672394585863087,227.73187255859375,0.7662296130476496,0.7650701096765237,0.7673926363281903,228.43576049804688,0.8383852543413032,0.7676154403163816,10.506753921508789,0.7660727592981506,0.7459299717670651,0.7873335932713196,10.817108154296875,0.8364420469430429,0.7644822592324403,32.58717346191406,0.7630628494931043,0.7641390867197319,0.7619896396145491,28.88344383239746,0.791439876728664
- 1,5000,0.7680053472957166,0.41714152693748474,0.7577735459580468,0.7925802037403071,0.7258953935275442,0.41644296050071716,0.8336291357428097,0.7651228206984905,221.26913452148438,0.7606010761381694,0.7613864701942398,0.759817300729683,225.83990478515625,0.8357898417051105,0.767141981841475,10.097867965698242,0.7572964892675211,0.790677940418826,0.7266195064891662,10.11166763305664,0.8331590369308912,0.7684370300228374,36.36254119873047,0.758169554599372,0.7929818159703217,0.7262853005068791,36.30207443237305,0.7807937708671169
- 1,-1,0.7680331977942405,0.42106810212135315,0.7577179550506299,0.7919944118808273,0.7262853005068791,0.4176786541938782,0.833636826630449,0.7651088954492286,218.78744506835938,0.7607916428402071,0.7619925683792921,0.7595944967414917,225.50914001464844,0.835775475062599,0.7670166545981173,10.075724601745605,0.7570001161845009,0.7909559939301972,0.7258396925304963,10.078554153442383,0.8331679444742818,0.7684231047735754,36.70421600341797,0.7581744958636831,0.7931254752851711,0.7261738985127834,36.548641204833984,0.7795522122685896
- 2,1000,0.7698852559460815,0.45917731523513794,0.7593709454353622,0.777949811574304,0.7416587756920849,0.42607033252716064,0.8260349945603962,0.770734696151061,213.33164978027344,0.7615102756093676,0.7932772094988081,0.7321896061939509,213.5850830078125,0.8277628265717094,0.7703587144209881,9.798596382141113,0.7612718418791178,0.7836291796898036,0.740154848771793,10.001861572265625,0.8264172100812509,0.7694953489667465,40.91225051879883,0.7576349794358721,0.7952847519902021,0.723388848660391,39.9620246887207,0.7706383596974287
- 2,2000,0.7663203921350192,0.4643022418022156,0.7558404805221057,0.7847805229071719,0.7289589483651757,0.4442211389541626,0.8257776933109247,0.7687573107558625,211.79931640625,0.7573733189416261,0.7951120911429621,0.7230546426781039,212.55064392089844,0.8270907892385867,0.769314320726341,9.791139602661133,0.7578818487909399,0.7950215589737317,0.7240572606249652,9.844770431518555,0.8247968913117689,0.764927867208823,41.210693359375,0.7543912476161798,0.7774789419240431,0.7326352141703336,38.983665466308594,0.7525271090538921
- 2,3000,0.7628947808165766,0.466765820980072,0.7455698841468862,0.8011392729134664,0.6972093800479029,0.4552168548107147,0.8245430158227223,0.7608199186765443,209.82913208007812,0.7435288565065412,0.7902806082458066,0.7019996657940177,212.08395385742188,0.8250487096677921,0.7629922575614103,9.727481842041016,0.744505289544756,0.8018513161700896,0.6948142371748455,9.79130744934082,0.8236187301873519,0.7633821645407453,40.188594818115234,0.7475506445672191,0.7935812818668084,0.7065671475519412,39.79478454589844,0.7663008201127002
- 2,4000,0.7655127276778254,0.4727761149406433,0.751514796709891,0.7919185687847008,0.7150336991032139,0.46665799617767334,0.8256477131993547,0.7635353422826269,210.45877075195312,0.7487110382001406,0.7896558116542051,0.7118030412744388,211.28128051757812,0.827269046116811,0.7630479585584582,9.724624633789062,0.7477670073505726,0.788378411757441,0.7111346293098646,9.776246070861816,0.8241057263805148,0.7646772127221078,42.401153564453125,0.7535693557748113,0.78657539225783,0.7232217456692475,41.692344665527344,0.7660479312291009
- 2,5000,0.7624073970924079,0.47480508685112,0.7462254395036195,0.7946136420840675,0.7033921907202139,0.4719405770301819,0.8260047209317134,0.761293377151451,209.33734130859375,0.7438429506246282,0.7981106784962022,0.6964852670862809,211.26072692871094,0.8280162151640978,0.7622681445997883,9.641020774841309,0.7443402425183681,0.8024341554510915,0.6940901242132235,9.752893447875977,0.8247934752022039,0.7636049685289367,43.243228912353516,0.7480436329143941,0.7994677817905341,0.7028351807497354,42.909053802490234,0.769666046201166
- 2,-1,0.7624491728401939,0.4863702952861786,0.7464539007092199,0.7949896141499339,0.7035035927143096,0.47535228729248047,0.8259985722960899,0.7613769286470228,208.98150634765625,0.744395808462661,0.7905462513695414,0.7033364897231661,211.2559814453125,0.8280516075885992,0.7623795465938841,9.617895126342773,0.7443840151336013,0.7929970401158764,0.7013869548264914,9.760398864746094,0.8247848981030891,0.7635771180304127,43.704925537109375,0.748010767000917,0.7975274378705689,0.7042834066729794,43.20008850097656,0.7703023924808915
- 3,1000,0.7468807441653206,0.5050731897354126,0.7435443173887367,0.7175794000310864,0.7714588091126832,0.37026703357696533,0.7977073238610843,0.7473263521417033,214.29537963867188,0.74179991799918,0.686915666081344,0.8062162312705398,228.963134765625,0.8000499172101474,0.7465326129337715,9.728931427001953,0.7454636754746685,0.7270032030071207,0.7648860914610371,10.620023727416992,0.7970713194440227,0.7473820531387512,48.07101821899414,0.7426577414493069,0.7245846912010174,0.7616554336322621,34.42692565917969,0.7280201261344894
- 3,2000,0.7458781262184593,0.720092236995697,0.7463349836976211,0.6588350673716528,0.8606361053862864,0.33686375617980957,0.7983174089777652,0.7465883139308194,151.50115966796875,0.7379131043447827,0.6693727606694181,0.8220910154291762,231.5017547607422,0.7976659280207756,0.746351584693366,7.285037040710449,0.7399749769358119,0.6773559149487518,0.8153511947863866,10.80156135559082,0.7973673654366333,0.7448476577730742,67.33314514160156,0.7461720403336238,0.6746353322528363,0.8346794407619896,31.5607967376709,0.7233088294933525
- 3,3000,0.7482314933437308,0.6788135170936584,0.7427231408954645,0.6561445474349664,0.8556230156519802,0.34727874398231506,0.8052624958972235,0.74779981061661,165.0928955078125,0.7345990948793393,0.697956625297731,0.7753021779089846,221.5665283203125,0.8039966803444085,0.748175792346683,8.29865837097168,0.7363489223954341,0.7510426320667285,0.7222191277223863,10.29574966430664,0.8046418523555745,0.7481061661003732,64.63654327392578,0.7439664980271599,0.657890230327939,0.8559572216342672,31.514062881469727,0.7251012559777354
- 3,4000,0.7483568205870885,0.6468403339385986,0.7433181305477907,0.6639362214726883,0.8442600122542193,0.36670345067977905,0.8100511634526301,0.7484821478304462,173.53329467773438,0.7363483403864783,0.6788887321957411,0.8044337993650086,228.00086975097656,0.8086621316724223,0.7483985963348744,8.187524795532227,0.736533729151205,0.6915910702496516,0.7877235002506545,10.532373428344727,0.8084176294386471,0.7478555116136579,55.885475158691406,0.7435897435897436,0.6776011362074494,0.8238177463376595,34.712608337402344,0.728199161046412
- 3,5000,0.7487745780649474,0.6286755204200745,0.7385057058781311,0.7619893954204805,0.7164262240294101,0.44652101397514343,0.8121020662991324,0.7481061661003732,179.2566680908203,0.735515468543662,0.6814758499441685,0.798863699660224,225.1893310546875,0.8101274009317954,0.7488024285634712,8.48324203491211,0.7376079452863357,0.6838936873914389,0.8004790285746115,10.483926773071289,0.8106823978460422,0.747813735865872,59.67631149291992,0.7426636290868789,0.6656659087397073,0.8398039324903915,35.37864303588867,0.728731711855164
- 3,-1,0.7486910265693756,0.6259658336639404,0.7381954585803052,0.7595757218621096,0.7179858519467498,0.4472625255584717,0.8120948663631773,0.748189717595945,170.84100341796875,0.7354717900704124,0.6816442003661175,0.7985294936779368,224.99334716796875,0.8101333321025699,0.7488302790619952,8.504928588867188,0.7363695660514905,0.6808409999290495,0.801760151506712,10.484764099121094,0.8106463067737346,0.7479947641062775,59.872596740722656,0.7422178509958199,0.6643923053220055,0.8406951484431572,35.47296905517578,0.7286414023745557
- 4,1000,0.7501392524926196,0.4743134081363678,0.7407087543541468,0.7665038131553861,0.7165933270205537,0.46892672777175903,0.8117229912824062,0.7502367292374533,206.0076141357422,0.7381509867512106,0.772424288540557,0.7067899515401326,206.4764404296875,0.8105447519629301,0.7483568205870885,9.838419914245605,0.7386380232882351,0.705792798964756,0.7746894669414582,10.05587387084961,0.8099282781580126,0.7487328023171614,43.34110641479492,0.741543366711503,0.670706979678277,0.8291093410572049,38.03354263305664,0.7267622020485439
- 4,2000,0.7494012142817357,0.4709709882736206,0.7406259861720744,0.7636062470421202,0.7189884698936111,0.4665965735912323,0.811504972240535,0.749637943519189,206.9192352294922,0.7386116191969341,0.771376591873863,0.7085166824486159,207.0909881591797,0.8110615845624107,0.7487049518186375,8.368803977966309,0.7382044173488024,0.698753824341467,0.7823762045340612,10.136795043945312,0.8099362130031903,0.7473681278894893,43.181793212890625,0.7401300775901415,0.7585218967432614,0.7226090347017211,42.75250244140625,0.7536420262577872
- 4,3000,0.7456970979780538,0.6712411642074585,0.7533495660860334,0.6828131012471989,0.8401381384726787,0.43679845333099365,0.8202066972566091,0.7461009302066507,154.74600219726562,0.740471767942684,0.7426394374877441,0.7383167158692141,211.21031188964844,0.8179704159239546,0.7456553222302679,7.134251594543457,0.7422293019646286,0.7260474655728099,0.7591488887651089,10.091227531433105,0.8141645843929346,0.7449312092686459,42.60676574707031,0.7561614824619458,0.7205125618000202,0.7955216398373531,41.880393981933594,0.761104694810306
- 4,4000,0.747423828886537,0.7292289733886719,0.7538784906415775,0.6843850801871791,0.8390798195287695,0.44124242663383484,0.8217723055013232,0.7481340165988971,149.73931884765625,0.742925222510337,0.7478201980868535,0.7380939118810227,210.6751708984375,0.8195341124481053,0.7479669136077536,7.131768226623535,0.7438380760650743,0.7269875033235842,0.7614883306411184,10.072151184082031,0.8155202589056434,0.746351584693366,42.63652038574219,0.7589479112944816,0.7065828011715561,0.8196958725561188,41.66033172607422,0.7617849838312489
- 4,5000,0.7481200913496352,0.7397788763046265,0.7532934131736527,0.6822434130248113,0.8408622514343007,0.4417516589164734,0.8223604832904836,0.749637943519189,209.40219116210938,0.7444073172113043,0.7570992454351708,0.732133905196903,210.01272583007812,0.8202572748893633,0.7480922408511113,7.069145202636719,0.7441329266424322,0.7442676845067283,0.7439982175680945,10.023733139038086,0.8162677044161655,0.747785885367348,42.620033264160156,0.7599622089637231,0.7098474408258988,0.8176906366623963,41.85185241699219,0.7626665612311252
- 4,-1,0.7481340165988971,0.739820659160614,0.7532840159179651,0.6822279968365157,0.8408622514343007,0.44175392389297485,0.8223617033677332,0.749637943519189,209.40499877929688,0.7444033813346926,0.7571506754615894,0.7320782041998551,210.01052856445312,0.8202571976199776,0.7480922408511113,7.069060325622559,0.7441471804799242,0.7442404657770845,0.7440539185651424,10.023887634277344,0.8162683949424445,0.747827661115134,42.61126708984375,0.759996894168802,0.7098240185650745,0.8178020386564919,41.85026550292969,0.7626676236249696
+ 0,1000,0.740154848771793,0.585610032081604,0.7512134477787623,0.7152751542626873,0.7909541580794296,0.5636146068572998,0.7862586453830283,0.7320364284520693,162.85134887695312,0.7506087738358372,0.6803703619858285,0.8370188826379992,178.11680603027344,0.7825977157621367,0.7328440929092631,8.462957382202148,0.7480418089832814,0.6939864671685886,0.811229321004846,9.021198272705078,0.7786076253783956,0.7421461594162535,51.69240951538086,0.7458258031451461,0.7230648535564853,0.7700662841864869,48.888954162597656,0.7438784524600425
+ 0,2000,0.7328301676600011,0.6296051740646362,0.739300023880914,0.7059389885476842,0.7759705898735587,0.5345308780670166,0.7887225712892582,0.728109508160196,163.58560180664062,0.7392852632908196,0.6762776083541262,0.815239792792291,187.81973266601562,0.7859110126313795,0.7315768952264246,7.947263717651367,0.7440591830170473,0.6914560367297161,0.8053250153177742,9.377279281616211,0.7863726621383399,0.729223528101153,56.419029235839844,0.7399514107714141,0.6511854360711261,0.8567370355929371,36.074989318847656,0.7545647266175324
+ 0,3000,0.7333593271319556,0.5852109789848328,0.7366814937415951,0.6875693941588221,0.793349300952487,0.4697726368904114,0.7814915169482928,0.7342923188325071,167.56234741210938,0.7405934896616665,0.6981285598047193,0.7885590152063722,185.62295532226562,0.7844438220700205,0.7312009134963516,8.143726348876953,0.7367706050469709,0.6824149474134042,0.8005347295716594,9.543863296508789,0.7819653120118226,0.730685679273659,51.036773681640625,0.729976677184799,0.7193380921479559,0.7409346627304628,43.77399444580078,0.7682029379615493
+ 0,4000,0.7319528769564975,0.6113885641098022,0.7304305217166424,0.7143199424966057,0.7472845763939174,0.5032544136047363,0.7796168548865824,0.7295438088341781,183.77114868164062,0.7423264834222205,0.6704241798217052,0.8315044839302623,200.60665893554688,0.7859687567213028,0.7313540912382331,7.762547492980957,0.7386498803843737,0.7100752884343603,0.7696206762101042,9.466588973999023,0.7804174238485684,0.7320364284520693,53.00003433227539,0.7226950455378582,0.7376939811457578,0.7082938784604245,49.99421691894531,0.7545644434872574
+ 0,5000,0.7008717206037988,0.6893185377120972,0.7238735865468251,0.6202031041952344,0.869158357934607,0.505645751953125,0.7573581149708617,0.697181529549379,156.41075134277344,0.7252543940795559,0.5998032715217312,0.9170612153957556,199.6271209716797,0.7576633856761417,0.699715924915056,7.274563789367676,0.721679105785066,0.6214721087887992,0.860413301398095,9.24429702758789,0.7572170773271093,0.6952319946527042,63.3360710144043,0.7178262039278989,0.5849278310738526,0.9288698267698992,39.34496307373047,0.7060530230032297
+ 0,-1,0.7016933103102545,0.5540434122085571,0.7219917012448133,0.6517000089710236,0.8092797861081713,0.49931254982948303,0.7624222182491369,0.7033086392246422,163.49075317382812,0.7302173477786924,0.607896881250463,0.9141647635492676,203.9537811279297,0.7638903054769688,0.6997855511613658,8.051408767700195,0.7222059254913465,0.6070019723865878,0.8913830557566981,9.920435905456543,0.7609098207552563,0.6990753634490058,51.86427688598633,0.722648392485766,0.6104018277113291,0.8854787500696263,38.65266418457031,0.7213601486505429
+ 1,1000,0.6843563749791122,0.622389554977417,0.7175578929688649,0.5758137340791158,0.9518743385506601,0.4198741316795349,0.728622577138776,0.6873503035704339,183.5062713623047,0.7161725373565888,0.5691027555759793,0.965743886815574,214.1572265625,0.7309284856131453,0.6852058151840918,8.175722122192383,0.712980311293007,0.5711458804983985,0.9485322787277892,9.99151611328125,0.7289892547982773,0.6832284297888932,53.4877815246582,0.7173183420036919,0.5753221844611192,0.9523756475240907,35.72834777832031,0.6721446578082373
+ 1,2000,0.7341948420876734,0.6994354724884033,0.7351262826582051,0.679920477137177,0.8000891215952766,0.46387240290641785,0.7749941661774506,0.7303653985406339,155.98574829101562,0.736097204252686,0.6174899488476566,0.9111012087116359,210.04220581054688,0.7764608189443374,0.7341530663398875,8.792206764221191,0.7397617882199381,0.6360320641282565,0.8839191221522865,9.918052673339844,0.7768380980149778,0.7378293321450454,54.189537048339844,0.729577001152137,0.7187719582725258,0.7407118587422715,45.27055358886719,0.7493830297182804
+ 1,3000,0.7296552108282738,0.5739662647247314,0.7263482156068819,0.7024692321702704,0.7519077591488887,0.502313494682312,0.7743028481845646,0.7257422157856626,172.76417541503906,0.7305597908952298,0.6459385431824017,0.8406951484431572,192.86375427246094,0.7786140728743274,0.7280120314153623,9.018473625183105,0.7278951323369792,0.7273890310379353,0.7284019383946972,9.034345626831055,0.7743923653074898,0.7301147440539185,52.54202651977539,0.7242862080884893,0.7191850407754194,0.7294602573386063,45.99448013305664,0.7453817335824302
+ 1,4000,0.7402523255166268,0.6452406048774719,0.7435705436277247,0.7273405247036889,0.7605414136913051,0.6225917339324951,0.7840243237284034,0.7407814849885813,165.77940368652344,0.7478685131339017,0.6705996729859914,0.8452626302010806,186.43260192871094,0.7861603683958884,0.7381913886258564,6.82960319519043,0.7469631569803958,0.7179961464354528,0.7783657327466161,8.420555114746094,0.7836666708347142,0.7375508271598061,56.555137634277344,0.7407787993510005,0.719871761181479,0.7629365565643625,54.43528747558594,0.7451984157246527
+ 1,5000,0.7230546426781039,0.536429762840271,0.7236885335669697,0.7116471918582736,0.736144376984348,0.5119767189025879,0.7659816565081021,0.7185707124157522,174.91607666015625,0.7234270345079709,0.6809389209782475,0.7715702111067788,188.4012451171875,0.7674257316915875,0.7213557622681446,8.596382141113281,0.718048780487805,0.7186697913179333,0.7174288419762713,8.912203788757324,0.7641920973694406,0.7250598785718264,48.78872299194336,0.7264762346514116,0.6759536768408181,0.7851612543864536,41.008968353271484,0.7442765183688369
+ 1,-1,0.7278031526764329,0.5447660684585571,0.7325829851518163,0.7163503766202657,0.7495683172728792,0.5393775701522827,0.7817619381380104,0.7300451178076087,173.03179931640625,0.7377627420898732,0.6970785737294646,0.7834902244750181,185.53854370117188,0.7820923898251559,0.7308527822648025,8.732171058654785,0.7319959272407056,0.7233764821059502,0.7408232607363672,8.958128929138184,0.7811526259842299,0.7298780148164652,55.8602180480957,0.7309920305204152,0.6303764254717933,0.8698267698991812,39.615943908691406,0.7576507133048633
+ 2,1000,0.7313123154904473,0.6461310386657715,0.7261163882519925,0.7119306321254617,0.740878961733415,0.5153070688247681,0.7716201179420012,0.7292096028518911,155.5853271484375,0.7341913213649892,0.7169360617883589,0.7522976661282237,181.00506591796875,0.7742673399169445,0.7307692307692307,7.668848991394043,0.7267108526816574,0.7078393578538801,0.7466161644293433,9.155574798583984,0.7719036011298094,0.7290982008577953,62.3431396484375,0.7222334888122761,0.7018050920938543,0.7438868155739988,43.685516357421875,0.7622692399615848
+ 2,2000,0.7227065114465548,0.5908321142196655,0.7150135263179577,0.7052417716375458,0.7250598785718264,0.4846171736717224,0.7719227147151037,0.7227204366958169,180.50222778320312,0.7273562748902277,0.6713794241010416,0.7935164039436305,198.63790893554688,0.773027577465341,0.7243914666072523,9.024002075195312,0.7205255336583198,0.7186192758400975,0.7224419317105776,9.362994194030762,0.7726321451566962,0.7201581908316159,52.466888427734375,0.7121150473518064,0.7175209929599367,0.7067899515401326,45.56163024902344,0.7381867489399621
+ 2,3000,0.7205063220631649,0.5153660178184509,0.7233309094826426,0.7002294295546981,0.7480086893555394,0.47842174768447876,0.7726502799886763,0.7227343619450788,185.091064453125,0.7239453237771278,0.65895848090853,0.8031526764329081,198.99757385253906,0.7730456413359823,0.719768283852281,8.75425910949707,0.7234372908579841,0.6964230491701886,0.7526318721105107,9.477849006652832,0.7717109752260815,0.7238483818860357,46.414215087890625,0.7165663812940387,0.7359736942544259,0.6981562969977163,46.414215087890625,0.7366198661418574
+ 2,4000,0.7096724781373587,0.6618224382400513,0.7143908279709428,0.6873825399685916,0.7436083105887595,0.5073532462120056,0.7553558726272488,0.7123322007463934,183.67529296875,0.7185965280878219,0.6798350228582787,0.7620453406115969,192.9994354248047,0.7607150504711715,0.7098256558792403,7.929637908935547,0.7171552087726697,0.6807286193264275,0.7577006628418649,9.43260383605957,0.7558975464780855,0.7107447223305298,57.237831115722656,0.7079392592034365,0.6447347349637391,0.7848827494012143,43.33351135253906,0.7251238179883864
+ 2,5000,0.7255611875452571,0.5437341928482056,0.715160713317795,0.696437054631829,0.7349189550492954,0.47667479515075684,0.7567678482268225,0.7220102489834568,185.57192993164062,0.711844839528977,0.7386642707397424,0.6869046955940511,186.80551147460938,0.7609719241458824,0.723388848660391,8.853292465209961,0.715018031586365,0.7095181945320427,0.7206037988079986,9.505727767944336,0.7535932561250493,0.7251295048181362,48.92876434326172,0.7170105967651981,0.7179315351538504,0.7160920180471231,44.10511779785156,0.7190622136628879
+ 2,-1,0.7221216509775525,0.5830212235450745,0.7139959432048681,0.7028981596416428,0.7254497855511614,0.49401402473449707,0.7577789632915699,0.725213056313708,183.59326171875,0.7167360251940165,0.7236954176367043,0.709909207374812,191.8802490234375,0.7617706717510708,0.7214671642622403,8.792418479919434,0.7097350745193001,0.7355401529636711,0.6856792736589985,9.025744438171387,0.7570492454895487,0.7195594051133515,55.14214324951172,0.7116433948009067,0.7150247413405308,0.7082938784604245,46.18491744995117,0.7221163569754776
+ 3,1000,0.7138779034144711,0.667630672454834,0.7080084370960059,0.6922483301844115,0.7245028686013479,0.49316561222076416,0.7572260109745176,0.71571603631705,187.70257568359375,0.715323837233353,0.7083981974600574,0.7223862307135298,191.52281188964844,0.7602624755127021,0.7163844482816243,7.673393249511719,0.7175615788931259,0.6675775444652189,0.7756363838912717,9.71194076538086,0.7584603065381312,0.7116359382832953,57.26898193359375,0.7012660350465331,0.703724478348665,0.6988247089622904,46.01648712158203,0.7149009507877254
+ 3,2000,0.7034061159694759,0.645792543888092,0.6999796596379416,0.6821004783424509,0.7188213669024676,0.5049311518669128,0.7487459523533433,0.7043112571715033,171.255126953125,0.7112607557052002,0.643979766958721,0.7942405169052527,200.30978393554688,0.7513133974056816,0.7039492006906923,8.17646598815918,0.7082396663071586,0.6638571532971183,0.7589817857739654,9.540943145751953,0.7502180801022875,0.705912660836629,57.01820373535156,0.6982961421830594,0.6822735365237956,0.7150894001002618,45.76020050048828,0.692171491602241
+ 3,3000,0.7149362223583802,0.6349054574966431,0.6997366856366263,0.7115628239087873,0.6882972205202473,0.5243378281593323,0.74569492687835,0.7127638834735142,167.7770233154297,0.7039425533383967,0.6981292284094333,0.7098535063777641,189.87074279785156,0.7520055862079771,0.7148387456135464,7.772121429443359,0.702222978337303,0.7151602656656079,0.6897454464434913,9.073543548583984,0.7461157073864344,0.7128613602183479,55.63344192504883,0.6957652088508868,0.7226522652265227,0.6708071074472233,49.71984100341797,0.7184657851538883
+ 3,4000,0.7158552888096696,0.5480663180351257,0.703342939481268,0.728667821102287,0.6797192669748788,0.5273720026016235,0.7453813351098914,0.7165376260235058,183.96432495117188,0.7057195970775151,0.7312091259294652,0.6819473068567927,189.3191680908203,0.7524368587933294,0.7164540745279341,8.950374603271484,0.7072083052524413,0.7288741761002572,0.6867932935999554,9.20567798614502,0.7461612088563488,0.7138918286637331,56.630958557128906,0.6992818928275112,0.7188823529411764,0.6807218849217401,44.9284553527832,0.7240756559330876
+ 3,5000,0.7083356542082103,0.5160317420959473,0.7045579274728059,0.7120630315442159,0.6972093800479029,0.504380464553833,0.7456199629999113,0.7085863086949257,189.6381072998047,0.7156636611448942,0.6758750433189762,0.7604300116972094,200.63095092773438,0.7508556795035048,0.7090458419205704,8.91309928894043,0.710638182655078,0.6914120126448894,0.7309641842588982,9.557706832885742,0.7475660668550255,0.7067203252938228,47.12469482421875,0.6979566775662902,0.7161485041169748,0.6806661839246922,46.63441848754883,0.7068809977995912
+ 3,-1,0.7080432239737091,0.49873578548431396,0.7075335626317543,0.7046798165644511,0.7104105163482426,0.4846397340297699,0.7455931627042371,0.7091711691639281,188.5823974609375,0.7179420282292781,0.6752552021986651,0.766390018381329,202.57958984375,0.7507781252042124,0.7068735030357043,9.246362686157227,0.7097935189751928,0.6806943982819011,0.7414916727009413,9.675313949584961,0.7474166262261993,0.7069570545312761,45.76421356201172,0.6987580299785867,0.7167877225866917,0.6816131008745057,45.31532669067383,0.7064645183331485
+ 4,1000,0.6943825544477246,0.4930197298526764,0.6938075334025949,0.6126472094214029,0.7997549156129895,0.3941698670387268,0.7405568577999365,0.6975296607809279,195.56838989257812,0.7046135092879667,0.6720076829761423,0.740544755751128,202.88018798828125,0.7438774771563635,0.6977663900183814,9.705812454223633,0.7013844544379986,0.6847229222484548,0.7188770678995154,9.76524543762207,0.7417766817151012,0.6942293767058431,58.167266845703125,0.69301861413633,0.5675704488113429,0.8896563248482148,29.745956420898438,0.6942912327930107
+ 4,2000,0.7030579847379268,0.5139841437339783,0.7035393976019448,0.6940357152535024,0.7133069681947307,0.4846697151660919,0.7441474052149798,0.7052024731242689,189.02398681640625,0.7089247774345337,0.6663073040452797,0.7573664568595778,202.80690002441406,0.7470481948205908,0.7012198518353479,9.050704002380371,0.7068246921625329,0.689534606521336,0.7250041775747786,9.516852378845215,0.7449539817473834,0.6984905029800034,54.68560028076172,0.6931623513288631,0.6181849927975904,0.7888375201916115,36.2747802734375,0.7026699316622068
+ 4,3000,0.706622848548989,0.5015596151351929,0.7011823414329865,0.7140794640794641,0.68874282849663,0.4988675117492676,0.7445471790877778,0.7076393917451123,193.09793090820312,0.7143663450810527,0.6755877752675091,0.7578677658330084,201.8223876953125,0.7489774100379399,0.7079735977273993,9.55426025390625,0.7111704010236585,0.6955479816806902,0.7275107224419317,9.62564754486084,0.7461480480834116,0.7028212555004735,45.977439880371094,0.6971972287773052,0.6374229872856912,0.7693421712248649,38.79158020019531,0.7053584501526011
+ 4,4000,0.6990475129504818,0.5282589197158813,0.694938652285541,0.6473571617431435,0.7500696262463098,0.4708825945854187,0.7377003766587753,0.7024313485211385,190.80166625976562,0.7063629222309505,0.6665019518703366,0.7512950481813625,200.47503662109375,0.7419852358176189,0.6996602239180081,9.462997436523438,0.70463525628798,0.689767392232181,0.7201581908316159,9.520181655883789,0.7391906915771685,0.6963738650921851,47.430747985839844,0.6992821194083234,0.6458706937234545,0.7623238455968362,40.27260971069336,0.7056118166622416
+ 4,5000,0.7005653651200356,0.5047475695610046,0.6955575138665699,0.6504435503417519,0.7473959783880132,0.46212443709373474,0.7389561149537538,0.7068317272879184,192.73989868164062,0.7078305018551064,0.6645481434403188,0.7571436528713864,202.312255859375,0.7436432743841523,0.7028212555004735,8.905183792114258,0.7042218982875359,0.6892818092114036,0.7198239848493289,9.600275039672852,0.7404873009797531,0.6990753634490058,46.9439697265625,0.699538423955668,0.6497098230278713,0.757644961844817,39.5091667175293,0.706242117687969
+ 4,-1,0.7005375146215117,0.504567563533783,0.6955755195167166,0.6506439017291975,0.7471731743998218,0.4619932770729065,0.7389557923053753,0.7069292040327522,192.78004455566406,0.7079010063400727,0.6646295255090816,0.7571993538684343,202.333984375,0.7436463307181385,0.7027655545034256,8.898887634277344,0.7044252076630003,0.6817814609230448,0.7286247423828887,9.639416694641113,0.7404959118025523,0.6990614381997438,46.920677185058594,0.6993571612239651,0.6495199885370397,0.7574778588536735,39.490272521972656,0.7062346801358627
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:8fb7a64f61f366e6a68665ad8dc9ce6ef2969d2c1d541fac6d0287fe3afd5628
+ oid sha256:fb34d868f70c3c727c58e27b9f30c5db25f89cbbfb2ab5734e550f3fe4645895
  size 539666601