---
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:26147930
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: '[YEAR_RANGE] 2020-2024 [TEXT] Vitamin B-6 Prevents Heart Failure
    with Preserved Ejection Fraction Through Downstream of Kinase 3 in a Mouse Model.'
  sentences:
  - '[YEAR_RANGE] 2020-2024 [TEXT] Colorectal cancer (CRC) is a complex and genetically
    heterogeneous disease presenting a specific metastatic pattern, with the liver
    being the most common site of metastasis. Around 20%-25% of patients with CRC
    will develop exclusively hepatic metastatic disease throughout their disease history.
    With its specific characteristics and therapeutic options, liver-limited disease
    (LLD) should be considered as a specific entity. The identification of these patients
    is particularly relevant in view of the growing interest in liver transplantation
    in selected patients with advanced CRC. Identifying why some patients will develop
    only LLD remains a challenge, mainly because of a lack of a systemic understanding
    of this complex and interlinked phenomenon given that cancer has traditionally
    been investigated according to distinct physiological compartments. Recently,
    multidisciplinary efforts and new diagnostic tools have made it possible to study
    some of these complex issues in greater depth and may help identify targets and
    specific treatment strategies to benefit these patients. In this review we analyze
    the underlying biology and available tools to help clinicians better understand
    this increasingly common and specific disease.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] PURPOSE: Secondary breast cancer is a frequent
    late adverse event of mediastinal Hodgkin lymphoma radiotherapy. Secondary breast
    cancers overwhelmingly correspond to ductal carcinoma and develop from the glandular
    mammary tissue. In addition, during childhood, radiation overexposure of the glandular
    tissue may lead to a late breast hypotrophy at adult age. The aim of this study
    was to evaluate the radiation exposure to the glandular tissue in patients treated
    for mediastinal Hodgkin lymphoma with intensity-modulated proton therapy, in order
    to evaluate the potential dosimetric usefulness of its delineation for breast
    sparing. MATERIALS AND METHODS: Sixteen consecutive intermediate-risk mediastinal
    female patients with Hodgkin lymphoma treated with consolidation radiation with
    deep inspiration breath hold intensity-modulated proton therapy to the total dose
    of 30Gy were included. Breasts were delineated according to the European Society
    for Radiotherapy and Oncology guidelines for treatment optimization ("clinical
    organ at risk"). The glandular tissue ("glandular organ at risk") was retrospectively
    contoured on the initial simulation CT scans based on Hounsfield unit (HU) values,
    using a range between -80HU and 500HU. RESULTS: The mean and maximum doses delivered
    to the glandular organ at risk were significantly lower than the mean and maximum
    doses delivered to the clinical organ at risk, but were statistically correlated.
    Glandular organ at risk volumes were significantly smaller. CONCLUSION: Optimizing
    the treatment plans on the clinical breast contours will systematically lead to
    overestimation of the dose received to the glandular tissue and, consequently,
    to an indistinct and involuntary improved glandular tissue sparing. As such, our
    findings do not support the consideration of the glandular tissue as an additional
    organ at risk when planning intensity-modulated proton therapy for mediastinal
    Hodgkin lymphoma in female patients.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] BACKGROUND: There is an urgent need to develop
    an efficient therapeutic strategy for heart failure with preserved ejection fraction
    (HFpEF), which is mediated by phenotypic changes in cardiac macrophages. We previously
    reported that vitamin B-6 inhibits macrophage-mediated inflammasome activation.
    OBJECTIVES: We sought to examine whether the prophylactic use of vitamin B-6 prevents
    HFpEF. METHODS: HFpEF model was elicited by a combination of high-fat diet and
    Nω-nitro-l-arginine methyl ester supplement in mice. Cardiac function was assessed
    using conventional echocardiography and Doppler imaging. Immunohistochemistry
    and immunoblotting were used to detect changes in the macrophage phenotype and
    myocardial remodeling-related molecules. RESULTS: Co-administration of vitamin
    B-6 with HFpEF mice mitigated HFpEF phenotypes, including diastolic dysfunction,
    cardiac macrophage phenotypic shifts, fibrosis, and hypertrophy. Echocardiographic
    improvements were observed, with the E/E'' ratio decreasing from 42.0 to 21.6
    and the E/A ratio improving from 2.13 to 1.17. The exercise capacity also increased
    from 295.3 to 657.7 min. However, these beneficial effects were negated in downstream
    of kinase (DOK) 3-deficient mice. Mechanistically, vitamin B-6 increased DOK3
    protein concentrations and inhibited macrophage phenotypic changes, which were
    abrogated by an AMP-activated protein kinase inhibitor. CONCLUSIONS: Vitamin B-6
    increases DOK3 signaling to lower risk of HFpEF by inhibiting phenotypic changes
    in cardiac macrophages.'
- source_sentence: '[YEAR_RANGE] 2020-2024 [TEXT] Resolving phylogenetic relationships
    and taxonomic revision in the Pseudogastromyzon (Cypriniformes, Gastromyzonidae)
    genus: molecular and morphological evidence for a new genus, Labigastromyzon.'
  sentences:
  - '[YEAR_RANGE] 2020-2024 [TEXT] Bats contain a diverse spectrum of viral species
    in their bodies. The RNA virus family Paramyxoviridae tends to infect several
    vertebrate species, which are accountable for a variety of devastating infections
    in both humans and animals. Viruses of this kind include measles, mumps, and Hendra.
    Some synonymous codons are favoured over others in mRNAs during gene-to-protein
    synthesis process. Such phenomenon is termed as codon usage bias (CUB). Our research
    emphasized many aspects that shape the CUB of genes in the Paramyxoviridae family
    found in bats. Here, the nitrogenous base A occurred the most. AT was found to
    be abundant in the coding sequences of the Paramyxoviridae family. RSCU data revealed
    that A or T ending codons occurred more frequently than predicted. Furthermore,
    3 overrepresented codons (CAT, AGA, and GCA) and 7 underrepresented codons (CCG,
    TCG, CGC, CGG, CGT, GCG and ACG) were detected in the viral genomes. Correspondence
    analysis, neutrality plot, and parity plots highlight the combined impact of mutational
    pressure and natural selection on CUB. The neutrality plot of GC12 against GC3
    yielded a regression coefficient value of 0.366, indicating that natural selection
    had a significant (63.4 %) impact. Moreover, RNA editing analysis was done, which
    revealed the highest frequency of C to T mutations. The results of our research
    revealed the pattern of codon usage and RNA editing sites in Paramyxoviridae genomes.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] OBJECTIVE: The preoperative inclination angle of
    mandibular incisors was crucial for surgical and postoperative stability while
    the effect of proclined mandibular incisors on skeletal stability has not been
    investigated. This study aimed to evaluate the effects of differences in presurgical
    mandibular incisor inclination on skeletal stability after orthognathic surgery
    in patients with skeletal Class III malocclusion. METHODS: A retrospective cohort
    study of 80 consecutive patients with skeletal Class III malocclusion who underwent
    bimaxillary orthognathic surgery was conducted. According to incisor mandibular
    plane angle (IMPA), patients were divided into 3 groups: retroclined inclination
    (IMPA < 87°), normal inclination (87° ≤ IMPA < 93°) and proclined inclination
    (IMPA ≥ 93°). Preoperative characteristics, surgical changes and postoperative
    stability were compared based on lateral cephalograms obtained 1 week before surgery
    (T0), 1 week after surgery (T1), and at 6 to 12 months postoperatively (T2). RESULTS:
    The mandible demonstrated a forward and upward relapse in all three groups. No
    significant differences in skeletal relapse were observed in the 3 groups of patients.
    However, the proclined inclination group showed a negative overbite tendency postoperatively
    compared with the other two groups and a clinically significant mandibular relapse
    pattern. Proclined IMPA both pre- and postoperatively was correlated with mandibular
    relapse. CONCLUSION: Sufficient presurgical mandibular incisor decompensation
    was of crucial importance for the maintenance of skeletal stability in patients
    with skeletal Class III malocclusion who subsequently underwent orthognathic surgery.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] The Pseudogastromyzon genus, consisting of species
    predominantly distributed throughout southeastern China, has garnered increasing
    market attention in recent years due to its ornamental appeal. However, the overlapping
    diagnostic attributes render the commonly accepted criteria for interspecific
    identification unreliable, leaving the phylogenetic relationships among Pseudogastromyzon
    species unexplored. In the present study, we undertake molecular phylogenetic
    and morphological examinations of the Pseudogastromyzon genus. Our phylogenetic
    analysis of mitochondrial genes distinctly segregated Pseudogastromyzon species
    into two clades: the Pseudogastromyzon clade and the Labigastromyzon clade. A
    subsequent morphological assessment revealed that the primary dermal ridge (specifically,
    the second ridge) within the labial adhesive apparatus serves as an effective
    and precise interspecific diagnostic characteristic. Moreover, the distributional
    ranges of Pseudogastromyzon and Labigastromyzon are markedly distinct, exhibiting
    only a narrow area of overlap. Considering the morphological heterogeneity of
    the labial adhesive apparatus and the substantial division within the molecular
    phylogeny, we advocate for the elevation of the Labigastromyzon subgenus to the
    status of a separate genus. Consequently, we have ascertained the validity of
    the Pseudogastromyzon and Labigastromyzon species, yielding a total of six valid
    species. To facilitate future research, we present comprehensive descriptions
    of the redefined species and introduce novel identification keys.'
- source_sentence: '[YEAR_RANGE] 2020-2024 [TEXT] PCa-RadHop: A transparent and lightweight
    feed-forward method for clinically significant prostate cancer segmentation.'
  sentences:
  - '[YEAR_RANGE] 2020-2024 [TEXT] According to the importance of time in treatment
    of thrombosis disorders, faster than current treatments are required. For the
    first time, this research discloses a novel strategy for rapid dissolution of
    blood clots by encapsulation of a fibrinolytic (Reteplase) into a Thrombin sensitive
    shell formed by polymerization of acrylamide monomers and bisacryloylated peptide
    as crosslinker. Degradability of the peptide units in exposure to Thrombin, creates
    the Thrombin-sensitive Reteplase nanocapsules (TSRNPs) as a triggered release
    system. Accelerated thrombolysis was achieved by combining three approaches including:
    deep penetration of TSRNPs into the blood clots, changing the clot dissolution
    mechanism by altering the distribution pattern of TSRNPs to 3D intra-clot distribution
    (based on the distributed intra-clot thrombolysis (DIT) model) instead of peripheral
    and unidirectional distribution of unencapsulated fibrinolytics and, enzyme-stimulated
    release of the fibrinolytic. Ex-vivo study was carried out by an occluded tube
    model that mimics in-vivo brain stroke as an emergency situation where faster
    treatment in short time is a golden key. In in vivo, efficacy of the developed
    formulation was confirmed by PET scan and laser Doppler flowmetry (LDF). As the
    most important achievements, 40.0 ± 0.7 (n = 3) % and 37.0 ± 0.4 (n = 3) % reduction
    in the thrombolysis time (faster reperfusion) were observed by ex-vivo and in-vivo
    experiments, respectively. Higher blood flow and larger digestion mass of clot
    at similar times in comparison to non-encapsulated Reteplase were observed that
    means more effective thrombolysis by the developed strategy.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] Prostate Cancer is one of the most frequently occurring
    cancers in men, with a low survival rate if not early diagnosed. PI-RADS reading
    has a high false positive rate, thus increasing the diagnostic incurred costs
    and patient discomfort. Deep learning (DL) models achieve a high segmentation
    performance, although require a large model size and complexity. Also, DL models
    lack of feature interpretability and are perceived as "black-boxes" in the medical
    field. PCa-RadHop pipeline is proposed in this work, aiming to provide a more
    transparent feature extraction process using a linear model. It adopts the recently
    introduced Green Learning (GL) paradigm, which offers a small model size and low
    complexity. PCa-RadHop consists of two stages: Stage-1 extracts data-driven radiomics
    features from the bi-parametric Magnetic Resonance Imaging (bp-MRI) input and
    predicts an initial heatmap. To reduce the false positive rate, a subsequent stage-2
    is introduced to refine the predictions by including more contextual information
    and radiomics features from each already detected Region of Interest (ROI). Experiments
    on the largest publicly available dataset, PI-CAI, show a competitive performance
    standing of the proposed method among other deep DL models, achieving an area
    under the curve (AUC) of 0.807 among a cohort of 1,000 patients. Moreover, PCa-RadHop
    maintains orders of magnitude smaller model size and complexity.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] OBJECTIVE: To evaluate rates of remission, recovery,
    relapse, and recurrence in suicidal youth who participated in a clinical trial
    comparing Dialectical Behavior Therapy (DBT) and Individual and Group Supportive
    Therapy (IGST). METHOD: Participants were 173 youth, aged 12 to 18 years, with
    repetitive self-harm (including at least 1 prior suicide attempt [SA]) and elevated
    suicidal ideation (SI). Participants received 6 months of DBT or IGST and were
    followed for 6 months post-treatment. The sample was 95% female, 56.4% White,
    and 27.49% Latina. Remission was defined as absence of SA or nonsuicidal self-injury
    (NSSI) across one 3-month interval; recovery was defined across 2 or more consecutive
    intervals. Relapse and recurrence were defined as SA or NSSI following remission
    or recovery. Cross-tabulation with χ2 was used for between-group contrasts. RESULTS:
    Over 70% of the sample reported remission of SA at each treatment and follow-up
    interval. There were significantly higher rates of remission and recovery and
    lower rates of relapse and recurrence for SA in DBT than for IGST. Across treatments
    and time points, SA had higher remission and recovery rates and lower relapse
    and recurrence rates than NSSI. There were no significant differences in NSSI
    remission between conditions; however, participants receiving DBT had significantly
    higher NSSI recovery rates than those receiving IGST for the 3- to 9-month, 3-
    to 12-month, and 6- to 12-month intervals. CONCLUSION: Results showed higher percentages
    of SA remission and recovery for DBT as compared to IGST. NSSI was less likely
    to remit than SA. PLAIN LANGUAGE SUMMARY: This study examined rates of remission,
    recovery, relapse, and recurrence of suicide attempts (SA) and nonsuicidal self-injury
    (NSSI) among the participants in the CARES Study, a randomized clinical trial
    of 6 months of Dialectical Behavior Therapy or Individual and Group Supportive
    Therapy. 173 youth aged 12 to 18 years participated in the study and were followed
    for 6 months post treatment. Over 70% of the sample reported remission of SA at
    each treatment and follow-up interval. There were significantly higher rates of
    remission and recovery and lower rates of relapse and recurrence for SA among
    participants who received Dialectical Behavioral Therapy. Across both treatments,
    remission and recovery rates were lower and relapse and recurrence rates were
    higher for NSSI than for SA. These results underscore the value of Dialectical
    Behavioral Therapy as a first line treatment for youth at high risk for suicide.
    DIVERSITY & INCLUSION STATEMENT: We worked to ensure race, ethnic, and/or other
    types of diversity in the recruitment of human participants. CLINICAL TRIAL REGISTRATION
    INFORMATION: Collaborative Adolescent Research on Emotions and Suicide (CARES);
    https://www. CLINICALTRIALS: gov/; NCT01528020.'
- source_sentence: '[YEAR_RANGE] 2020-2024 [TEXT] Predicting Recovery After Concussion
    in Pediatric Patients: A Meta-Analysis.'
  sentences:
  - '[YEAR_RANGE] 2020-2024 [TEXT] OBJECTIVE: The authors examined licensing requirements
    for select children''s behavioral health care providers. METHODS: Statutes and
    regulations as of October 2021 were reviewed for licensed clinical social workers,
    licensed professional counselors, and licensed marriage and family therapists
    for all 50 U.S. states and the District of Columbia. RESULTS: All jurisdictions
    had laws regarding postgraduate training and license portability. No jurisdiction
    included language about specialized postgraduate training related to serving children
    and families or cultural competence. Other policies that related to the structure,
    composition, and authority of licensing boards varied across states and licensure
    types. CONCLUSIONS: In their efforts to address barriers to licensure, expand
    the workforce, and ensure that children have access to high-quality and culturally
    responsive care, states could consider their statutes and regulations.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] Magnetic Resonance Imaging (MRI) plays a pivotal
    role in the accurate measurement of brain subcortical structures in macaques,
    which is crucial for unraveling the complexities of brain structure and function,
    thereby enhancing our understanding of neurodegenerative diseases and brain development.
    However, due to significant differences in brain size, structure, and imaging
    characteristics between humans and macaques, computational tools developed for
    human neuroimaging studies often encounter obstacles when applied to macaques.
    In this context, we propose an Anatomy Attentional Fusion Network (AAF-Net), which
    integrates multimodal MRI data with anatomical constraints in a multi-scale framework
    to address the challenges posed by the dynamic development, regional heterogeneity,
    and age-related size variations of the juvenile macaque brain, thus achieving
    precise subcortical segmentation. Specifically, we generate a Signed Distance
    Map (SDM) based on the initial rough segmentation of the subcortical region by
    a network as an anatomical constraint, providing comprehensive information on
    positions, structures, and morphology. Then we construct AAF-Net to fully fuse
    the SDM anatomical constraints and multimodal images for refined segmentation.
    To thoroughly evaluate the performance of our proposed tool, over 700 macaque
    MRIs from 19 datasets were used in this study. Specifically, we employed two manually
    labeled longitudinal macaque datasets to develop the tool and complete four-fold
    cross-validations. Furthermore, we incorporated various external datasets to demonstrate
    the proposed tool''s generalization capabilities and promise in brain development
    research. We have made this tool available as an open-source resource at https://github.com/TaoZhong11/Macaque_subcortical_segmentation
    for direct application.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] CONTEXT: Prognostic prediction models (PPMs) can
    help clinicians predict outcomes. OBJECTIVE: To critically examine peer-reviewed
    PPMs predicting delayed recovery among pediatric patients with concussion. DATA
    SOURCES: Ovid Medline, Embase, Ovid PsycInfo, Web of Science Core Collection,
    Cumulative Index to Nursing and Allied Health Literature, Cochrane Library, Google
    Scholar. STUDY SELECTION: The study had to report a PPM for pediatric patients
    to be used within 28 days of injury to estimate risk of delayed recovery at 28
    days to 1 year postinjury. Studies had to have at least 30 participants. DATA
    EXTRACTION: The Critical Appraisal and Data Extraction for Systematic Reviews
    of Prediction Modeling Studies checklist was completed. RESULTS: Six studies of
    13 PPMs were included. These studies primarily reflected male patients in late
    childhood or early adolescence presenting to an emergency department meeting the
    Concussion in Sport Group concussion criteria. No study authors used the same
    outcome definition nor evaluated the clinical utility of a model. All studies
    demonstrated high risk of bias. Quality of evidence was best for the Predicting
    and Preventing Postconcussive Problems in Pediatrics (5P) clinical risk score.
    LIMITATIONS: No formal PPM Grading of Recommendations, Assessment, Development,
    and Evaluations (GRADE) process exists. CONCLUSIONS: The 5P clinical risk score
    may be considered for clinical use. Rigorous external validations, particularly
    in other settings, are needed. The remaining PPMs require external validation.
    Lack of consensus regarding delayed recovery criteria limits these PPMs.'
- source_sentence: '[YEAR_RANGE] 2020-2024 [TEXT] Intraoperative Monitoring of the
    External Urethral Sphincter Reflex: A Novel Adjunct to Bulbocavernosus Reflex
    Neuromonitoring for Protecting the Sacral Neural Pathways Responsible for Urination,
    Defecation and Sexual Function.'
  sentences:
  - '[YEAR_RANGE] 2020-2024 [TEXT] Early menarche has been associated with adverse
    health outcomes, such as depressive symptoms. Discovering effect modifiers across
    these conditions in the pediatric population is a constant challenge. We tested
    whether movement behaviours modified the effect of the association between early
    menarche and depression symptoms among adolescents. This cross-sectional study
    included 2031 females aged 15-19 years across all Brazilian geographic regions.
    Data were collected using a self-administered questionnaire; 30.5% (n = 620) reported
    having experienced menarche before age 12 years (that is, early menarche). We
    used the Patient Health Questionnaire (PHQ-9) to evaluate depressive symptoms.
    Accruing any moderate-vigorous physical activity during leisure time, limited
    recreational screen time, and having good sleep quality were the exposures investigated.
    Adolescents who experienced early menarche and met one (B: -4.45, 95% CI: (-5.38,
    -3.51)), two (B: -6.07 (-7.02, -5.12)), or three (B: -6.49 (-7.76, -5.21)), and
    adolescents who experienced not early menarche and met one (B: -5.33 (-6.20; -4.46)),
    two (B: -6.12 (-6.99; -5.24)), or three (B: -6.27 (-7.30; -5.24)) of the movement
    behaviour targets had lower PHQ-9 scores for depression symptoms than adolescents
    who experienced early menarche and did not meet any of the movement behaviours.
    The disparities in depressive symptoms among the adolescents (early menarche versus
    not early menarche) who adhered to all three target behaviours were not statistically
    significant (B: 0.41 (-0.19; 1.01)). Adherence to movement behaviours modified
    the effect of the association between early menarche and depression symptoms.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] PURPOSE: Intraoperative bulbocavernosus reflex
    neuromonitoring has been utilized to protect bowel, bladder, and sexual function,
    providing a continuous functional assessment of the somatic sacral nervous system
    during surgeries where it is at risk. Bulbocavernosus reflex data may also provide
    additional functional insight, including an evaluation for spinal shock, distinguishing
    upper versus lower motor neuron injury (conus versus cauda syndromes) and prognosis
    for postoperative bowel and bladder function. Continuous intraoperative bulbocavernosus
    reflex monitoring has been utilized to provide the surgeon with an ongoing functional
    assessment of the anatomical elements involved in the S2-S4 mediated reflex arc
    including the conus, cauda equina and pudendal nerves. Intraoperative bulbocavernosus
    reflex monitoring typically includes the electrical activation of the dorsal nerves
    of the genitals to initiate the afferent component of the reflex, followed by
    recording the resulting muscle response using needle electromyography recordings
    from the external anal sphincter. METHODS: Herein we describe a complementary
    and novel technique that includes recording electromyography responses from the
    external urethral sphincter to monitor the external urethral sphincter reflex.
    Specialized foley catheters embedded with recording electrodes have recently become
    commercially available that provide the ability to perform intraoperative external
    urethral sphincter muscle recordings. RESULTS: We describe technical details and
    the potential utility of incorporating external urethral sphincter reflex recordings
    into existing sacral neuromonitoring paradigms to provide redundant yet complementary
    data streams. CONCLUSIONS: We present two illustrative neurosurgical oncology
    cases to demonstrate the utility of the external urethral sphincter reflex technique
    in the setting of the necessary surgical sacrifice of sacral nerve roots.'
  - '[YEAR_RANGE] 2020-2024 [TEXT] BACKGROUND: Limited data are available on the appropriate
    choice of blood pressure management strategy for patients with acute basilar artery
    occlusion assessed by the standard deviation (SD). Multivariate logistic models
    were used to investigate the association between BPV, the primary outcome (futile
    recanalization, 90-day modified Rankin Scale score 3-6), and the secondary outcome
    (30-day mortality). Subgroup analysis was performed as a sensitivity test. RESULTS:
    Futile recanalization occurred in 60 (56 %) patients, while 26 (24 %) patients
    died within 30 days. In the fully adjusted model, MAP SD was associated with a
    higher risk of futile recanalization (OR adj=1.36, per 1 mmHg increase, 95 % CI:
    1.09-1.69, P=0.006) and 30-day mortality (OR adj=1.56, per 1 mmHg increase, 95
    % CI: 1.20-2.04, P=0.001). A significant interaction between MAP SD and the lack
    of hypertension history on futile recanalization (P<0.05) was observed. CONCLUSIONS:
    Among recanalized acute BAO ischemic patients, higher blood pressure variability
    during the first 24 h after MT was associated with worse outcomes. This association
    was stronger in patients without a history of hypertension.'
---

# SentenceTransformer

This is a [sentence-transformers](https://www.SBERT.net) model trained on a parquet dataset of 26,147,930 (anchor, positive) text pairs. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
<!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
- **Maximum Sequence Length:** 1024 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - parquet
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
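
The listed modules can also be inspected programmatically. A minimal sketch; the printed values reflect the configuration shown above:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("pankajrajdeo/UMLS-Pubmed-ST-TCE-Epoch-4")

# Module 0 is the Transformer wrapper, module 1 the mean-pooling layer
print(model[0].max_seq_length)                   # 1024
print(model[1].get_pooling_mode_str())           # "mean"
print(model.get_sentence_embedding_dimension())  # 384
```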

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("pankajrajdeo/UMLS-Pubmed-ST-TCE-Epoch-4")
# Run inference
sentences = [
    '[YEAR_RANGE] 2020-2024 [TEXT] Intraoperative Monitoring of the External Urethral Sphincter Reflex: A Novel Adjunct to Bulbocavernosus Reflex Neuromonitoring for Protecting the Sacral Neural Pathways Responsible for Urination, Defecation and Sexual Function.',
    '[YEAR_RANGE] 2020-2024 [TEXT] PURPOSE: Intraoperative bulbocavernosus reflex neuromonitoring has been utilized to protect bowel, bladder, and sexual function, providing a continuous functional assessment of the somatic sacral nervous system during surgeries where it is at risk. Bulbocavernosus reflex data may also provide additional functional insight, including an evaluation for spinal shock, distinguishing upper versus lower motor neuron injury (conus versus cauda syndromes) and prognosis for postoperative bowel and bladder function. Continuous intraoperative bulbocavernosus reflex monitoring has been utilized to provide the surgeon with an ongoing functional assessment of the anatomical elements involved in the S2-S4 mediated reflex arc including the conus, cauda equina and pudendal nerves. Intraoperative bulbocavernosus reflex monitoring typically includes the electrical activation of the dorsal nerves of the genitals to initiate the afferent component of the reflex, followed by recording the resulting muscle response using needle electromyography recordings from the external anal sphincter. METHODS: Herein we describe a complementary and novel technique that includes recording electromyography responses from the external urethral sphincter to monitor the external urethral sphincter reflex. Specialized foley catheters embedded with recording electrodes have recently become commercially available that provide the ability to perform intraoperative external urethral sphincter muscle recordings. RESULTS: We describe technical details and the potential utility of incorporating external urethral sphincter reflex recordings into existing sacral neuromonitoring paradigms to provide redundant yet complementary data streams. CONCLUSIONS: We present two illustrative neurosurgical oncology cases to demonstrate the utility of the external urethral sphincter reflex technique in the setting of the necessary surgical sacrifice of sacral nerve roots.',
    '[YEAR_RANGE] 2020-2024 [TEXT] Early menarche has been associated with adverse health outcomes, such as depressive symptoms. Discovering effect modifiers across these conditions in the pediatric population is a constant challenge. We tested whether movement behaviours modified the effect of the association between early menarche and depression symptoms among adolescents. This cross-sectional study included 2031 females aged 15-19 years across all Brazilian geographic regions. Data were collected using a self-administered questionnaire; 30.5% (n = 620) reported having experienced menarche before age 12 years (that is, early menarche). We used the Patient Health Questionnaire (PHQ-9) to evaluate depressive symptoms. Accruing any moderate-vigorous physical activity during leisure time, limited recreational screen time, and having good sleep quality were the exposures investigated. Adolescents who experienced early menarche and met one (B: -4.45, 95% CI: (-5.38, -3.51)), two (B: -6.07 (-7.02, -5.12)), or three (B: -6.49 (-7.76, -5.21)), and adolescents who experienced not early menarche and met one (B: -5.33 (-6.20; -4.46)), two (B: -6.12 (-6.99; -5.24)), or three (B: -6.27 (-7.30; -5.24)) of the movement behaviour targets had lower PHQ-9 scores for depression symptoms than adolescents who experienced early menarche and did not meet any of the movement behaviours. The disparities in depressive symptoms among the adolescents (early menarche versus not early menarche) who adhered to all three target behaviours were not statistically significant (B: 0.41 (-0.19; 1.01)). Adherence to movement behaviours modified the effect of the association between early menarche and depression symptoms.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
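
Beyond pairwise similarity, the model can be used for semantic search over a corpus. A minimal sketch, assuming queries and documents follow the same `[YEAR_RANGE] ... [TEXT] ...` input format as the training samples; the query and two-entry corpus below are hypothetical examples:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("pankajrajdeo/UMLS-Pubmed-ST-TCE-Epoch-4")

# Hypothetical query and corpus, formatted like the training data
query = "[YEAR_RANGE] 2020-2024 [TEXT] Predicting recovery after concussion in children."
corpus = [
    "[YEAR_RANGE] 2020-2024 [TEXT] Prognostic prediction models for delayed recovery in pediatric concussion.",
    "[YEAR_RANGE] 2020-2024 [TEXT] Solubility of mesalazine in aqueous polyethylene glycol mixtures.",
]

query_embedding = model.encode(query, convert_to_tensor=True)
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

# Rank corpus entries by cosine similarity to the query
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=len(corpus))
for hit in hits[0]:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")
```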

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### parquet

* Dataset: parquet
* Size: 26,147,930 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                              | positive                                                                              |
  |:--------|:------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|
  | type    | string                                                                              | string                                                                                |
  | details | <ul><li>min: 16 tokens</li><li>mean: 45.85 tokens</li><li>max: 137 tokens</li></ul> | <ul><li>min: 31 tokens</li><li>mean: 307.52 tokens</li><li>max: 1024 tokens</li></ul> |
* Samples:
  | anchor                                                                                       | positive                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        |
  |:---------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>[YEAR_RANGE] 1880-1884 [TEXT] ADDRESS OF COL. GARRICK MALLERY, U. S. ARMY.</code>      | <code>[YEAR_RANGE] 1880-1884 [TEXT] It may be conceded that after man had all his present faculties, he did not choose between the adoption of voice and gesture, and never with those faculties, was in a state where the one was used, to the absolute exclusion of the other. The epoch, however, to which our speculations relate is that in which he had not reached the present symmetric development of his intellect and of his bodily organs, and the inquiry is: Which mode of communication was earliest adopted to his single wants and informed intelligence? With the voice he could imitate distinictively but few sounds of nature, while with gesture he could exhibit actions, motions, positions, forms, dimensions, directions and distances, with their derivations and analogues. It would seem from this unequal division of capacity that oral speech remained rudimentary long after gesture had become an efficient mode of communication. With due allowance for all purely imitative sounds, and for the spontaneous action of vocal organs under excitement, it appears that the connection between ideas and words is only to be explained by a compact between speaker and hearer which supposes the existence of a prior mode of communication. This was probably by gesture. At least we may accept it as a clew leading out of the labyrinth of philological confusion, and regulating the immemorial quest of man's primitive speech.</code> |
  | <code>[YEAR_RANGE] 1880-1884 [TEXT] How TO OBTAIN THE BRAIN OF THE CAT.</code>               | <code>[YEAR_RANGE] 1880-1884 [TEXT] How to obtain the Brain of the Cat, (Wilder).-Correction: Page 158, second column, line 7, "grains," should be "grams;" page 159, near middle of 2nd column, "successily," should be "successively;" page 161, the number of Flower's paper is 3.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                    |
  | <code>[YEAR_RANGE] 1880-1884 [TEXT] DOLBEAR ON THE NATURE AND CONSTITUTION OF MATTER.</code> | <code>[YEAR_RANGE] 1880-1884 [TEXT] Mr. Dopp desires to make the following correction in his paper in the last issue: "In my article on page 200 of "Science", the expression and should have been and being the velocity of light.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                      |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
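
For further fine-tuning with the same objective, the loss can be reconstructed from these parameters. A minimal sketch using the `sentence_transformers.losses` API:

```python
from sentence_transformers import SentenceTransformer, losses
from sentence_transformers.util import cos_sim

model = SentenceTransformer("pankajrajdeo/UMLS-Pubmed-ST-TCE-Epoch-4")

# In-batch negatives: for each anchor, every other positive in the batch
# acts as a negative. scale=20.0 and cosine similarity match the values above.
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=cos_sim)
```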

### Evaluation Dataset

#### parquet

* Dataset: parquet
* Size: 26,147,930 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                             | positive                                                                             |
  |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                               |
  | details | <ul><li>min: 15 tokens</li><li>mean: 31.78 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 303.03 tokens</li><li>max: 835 tokens</li></ul> |
* Samples:
  | anchor                                                                                                                                                                    | positive                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                    |
  |:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>[YEAR_RANGE] 2020-2024 [TEXT] Solubility and thermodynamics of mesalazine in aqueous mixtures of poly ethylene glycol 200/600 at 293.2-313.2K.</code>               | <code>[YEAR_RANGE] 2020-2024 [TEXT] In this study, the solubility of mesalazine was investigated in binary solvent mixtures of poly ethylene glycols 200/600 and water at temperatures ranging from 293.2K to 313.2K. The solubility of mesalazine was determined using a shake-flask method, and its concentrations were measured using a UV-Vis spectrophotometer. The obtained solubility data were analyzed using mathematical models including the van't Hoff, Jouyban-Acree, Jouyban-Acree-van't Hoff, mixture response surface, and modified Wilson models. The experimental data obtained for mesalazine dissolution encompassed various thermodynamic properties, including ΔG°, ΔH°, ΔS°, and TΔS°. These properties offer valuable insights into the energetic aspects of the dissolution process and were calculated based on the van't Hoff equation.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   |
  | <code>[YEAR_RANGE] 2020-2024 [TEXT] Safety and efficacy of remimazolam versus propofol during EUS: a multicenter randomized controlled study.</code>                      | <code>[YEAR_RANGE] 2020-2024 [TEXT] BACKGROUND AND AIMS: Propofol, a widely used sedative in GI endoscopic procedures, is associated with cardiorespiratory suppression. Remimazolam is a novel ultrashort-acting benzodiazepine sedative with rapid onset and minimal cardiorespiratory depression. This study compared the safety and efficacy of remimazolam and propofol during EUS procedures. METHODS: A multicenter randomized controlled study was conducted between October 2022 and March 2023 in patients who underwent EUS procedures. Patients were randomly assigned to receive either remimazolam or propofol as a sedative agent. The primary endpoint was cardiorespiratory adverse events.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                         |
  | <code>[YEAR_RANGE] 2020-2024 [TEXT] Ultrasound-Guided Vs Non-Guided Prolotherapy for Internal Derangement of Temporomandibular Joint. A Randomized Clinical Trial.</code> | <code>[YEAR_RANGE] 2020-2024 [TEXT] OBJECTIVES: This randomized clinical trial study aims to compare ultrasound-guided versus non-guided Dextrose 10% injections in patients suffering from internal derangement in the temporomandibular joint (TMJ). MATERIAL AND METHODS: The study population included 22 patients and 43 TMJs suffering from unilateral or bilateral TMJ painful clicking, magnetic resonance imaging (MRI) proved disc displacement with reduction (DDWR), refractory to or failed conservative treatment. The patients were divided randomly into two groups (non-guided and ultrasound (US)-guided groups). The procedure involved injection of 2 mL solution of a mixture of 0.75 mL 0.9% normal saline solution, 0.3 mL 2% lidocaine and 0.75 mL dextrose 10% using a 25G needle in the joint and 1 mL intramuscular injection to the masseter muscle at the most tender point. The Visual Analogue Score (VAS) was used to compare joint pain intensity over four different periods, beginning with pre-injection, 1-, 2-, and 6-months postinjection. RESULTS: Twenty-two patients 5 males (n = 5/22, 22.7%) and 17 females (n = 17/22, 77.2%) were included in this study. The mean age was 27.3 ± 7.4 years (30.2 ± 7.0) for the non-guided group and 24.3 ± 6.9 for the US-guided group. The dextrose injection reduced intensity over time in both groups with statistically significant improvement (P value <.05) at 2 and 6 months in both groups. There was no statistically significant difference in VAS assessment between both groups. CONCLUSION: Intra-articular injection of dextrose 10% for patients with painful clicking and DDWR resulted in reduced pain intensity in both US-guided and non-guided groups with significant symptomatic improvement over time in both groups. US guidance allowed accurate anatomical localization and safe procedure with a single joint puncture.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
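
The `[YEAR_RANGE] <range> [TEXT]` prefix visible in every anchor and positive above is the input convention the model was trained with, so queries and documents should carry the same prefix at inference time. Below is a minimal sketch of how a loss with the listed parameters could be constructed in Sentence Transformers; the model path is a placeholder, not the actual training script.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.util import cos_sim

# Placeholder path: substitute this model's actual checkpoint or base model.
model = SentenceTransformer("path/to/model")

# Mirrors the parameters listed above: scale=20.0 with cosine similarity.
loss = MultipleNegativesRankingLoss(model=model, scale=20.0, similarity_fct=cos_sim)

# Inputs follow the card's prefix convention shown in the examples above.
query = "[YEAR_RANGE] 2020-2024 [TEXT] Safety and efficacy of remimazolam versus propofol during EUS."
embedding = model.encode(query)
print(embedding.shape)
```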

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `learning_rate`: 2e-05
- `num_train_epochs`: 5
- `max_steps`: 970330
- `log_level`: info
- `fp16`: True
- `dataloader_num_workers`: 16
- `load_best_model_at_end`: True
- `resume_from_checkpoint`: True
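
The non-default values above map directly onto a `SentenceTransformerTrainingArguments` object. A sketch under the assumption of a generic `output_dir` (everything else is taken from the list):

```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",          # placeholder, not from the card
    eval_strategy="steps",
    per_device_train_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=5,
    max_steps=970_330,            # when > 0, this overrides num_train_epochs
    log_level="info",
    fp16=True,
    dataloader_num_workers=16,
    load_best_model_at_end=True,
    resume_from_checkpoint=True,  # resume from the latest checkpoint in output_dir
)
```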

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: 970330
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: info
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 16
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: True
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
<details><summary>Click to expand</summary>

| Epoch  | Step   | Training Loss | Validation Loss |
|:------:|:------:|:-------------:|:---------------:|
| 0.0000 | 1      | 4.7032        | -               |
| 0.0052 | 1000   | 0.6304        | -               |
| 0.0103 | 2000   | 0.1763        | -               |
| 0.0155 | 3000   | 0.1602        | -               |
| 0.0206 | 4000   | 0.1494        | -               |
| 0.0258 | 5000   | 0.1122        | -               |
| 0.0309 | 6000   | 0.1225        | -               |
| 0.0361 | 7000   | 0.1059        | -               |
| 0.0412 | 8000   | 0.1002        | -               |
| 0.0464 | 9000   | 0.0988        | -               |
| 0.0515 | 10000  | 0.1148        | -               |
| 0.0567 | 11000  | 0.1034        | -               |
| 0.0618 | 12000  | 0.0758        | -               |
| 0.0670 | 13000  | 0.1056        | -               |
| 0.0721 | 14000  | 0.1123        | -               |
| 0.0773 | 15000  | 0.0702        | -               |
| 0.0824 | 16000  | 0.1633        | -               |
| 0.0876 | 17000  | 0.0736        | -               |
| 0.0928 | 18000  | 0.1132        | -               |
| 0.0979 | 19000  | 0.0695        | -               |
| 0.1031 | 20000  | 0.1339        | -               |
| 0.1082 | 21000  | 0.0761        | -               |
| 0.1134 | 22000  | 0.1311        | -               |
| 0.1185 | 23000  | 0.0664        | -               |
| 0.1237 | 24000  | 0.0807        | -               |
| 0.1288 | 25000  | 0.0641        | -               |
| 0.1340 | 26000  | 0.1327        | -               |
| 0.1391 | 27000  | 0.0721        | -               |
| 0.1443 | 28000  | 0.139         | -               |
| 0.1494 | 29000  | 0.0694        | -               |
| 0.1546 | 30000  | 0.1446        | -               |
| 0.1597 | 31000  | 0.0651        | -               |
| 0.1649 | 32000  | 0.1079        | -               |
| 0.1700 | 33000  | 0.109         | -               |
| 0.1752 | 34000  | 0.0741        | -               |
| 0.1804 | 35000  | 0.144         | -               |
| 0.1855 | 36000  | 0.0693        | -               |
| 0.1907 | 37000  | 0.0762        | -               |
| 0.1958 | 38000  | 0.1255        | -               |
| 0.2010 | 39000  | 0.0764        | -               |
| 0.2061 | 40000  | 0.1253        | -               |
| 0.2113 | 41000  | 0.0861        | -               |
| 0.2164 | 42000  | 0.0722        | -               |
| 0.2216 | 43000  | 0.1178        | -               |
| 0.2267 | 44000  | 0.0727        | -               |
| 0.2319 | 45000  | 0.0827        | -               |
| 0.2370 | 46000  | 0.0996        | -               |
| 0.2422 | 47000  | 0.0657        | -               |
| 0.2473 | 48000  | 0.0836        | -               |
| 0.2525 | 49000  | 0.0913        | -               |
| 0.2576 | 50000  | 0.0747        | -               |
| 0.2628 | 51000  | 0.0649        | -               |
| 0.2679 | 52000  | 0.0671        | -               |
| 0.2731 | 53000  | 0.0905        | -               |
| 0.2783 | 54000  | 0.0806        | -               |
| 0.2834 | 55000  | 0.0574        | -               |
| 0.2886 | 56000  | 0.0667        | -               |
| 0.2937 | 57000  | 0.0634        | -               |
| 0.2989 | 58000  | 0.0662        | -               |
| 0.3040 | 59000  | 0.0607        | -               |
| 0.3092 | 60000  | 0.0762        | -               |
| 0.3143 | 61000  | 0.0474        | -               |
| 0.3195 | 62000  | 0.0531        | -               |
| 0.3246 | 63000  | 0.0579        | -               |
| 0.3298 | 64000  | 0.0704        | -               |
| 0.3349 | 65000  | 0.0688        | -               |
| 0.3401 | 66000  | 0.0544        | -               |
| 0.3452 | 67000  | 0.0424        | -               |
| 0.3504 | 68000  | 0.0551        | -               |
| 0.3555 | 69000  | 0.0717        | -               |
| 0.3607 | 70000  | 0.0812        | -               |
| 0.3659 | 71000  | 0.0882        | -               |
| 0.3710 | 72000  | 0.0357        | -               |
| 0.3762 | 73000  | 0.0448        | -               |
| 0.3813 | 74000  | 0.0542        | -               |
| 0.3865 | 75000  | 0.0456        | -               |
| 0.3916 | 76000  | 0.1029        | -               |
| 0.3968 | 77000  | 0.054         | -               |
| 0.4019 | 78000  | 0.0673        | -               |
| 0.4071 | 79000  | 0.0357        | -               |
| 0.4122 | 80000  | 0.0601        | -               |
| 0.4174 | 81000  | 0.0751        | -               |
| 0.4225 | 82000  | 0.044         | -               |
| 0.4277 | 83000  | 0.0489        | -               |
| 0.4328 | 84000  | 0.0648        | -               |
| 0.4380 | 85000  | 0.0308        | -               |
| 0.4431 | 86000  | 0.0415        | -               |
| 0.4483 | 87000  | 0.0468        | -               |
| 0.4535 | 88000  | 0.0719        | -               |
| 0.4586 | 89000  | 0.0577        | -               |
| 0.4638 | 90000  | 0.0465        | -               |
| 0.4689 | 91000  | 0.0357        | -               |
| 0.4741 | 92000  | 0.0413        | -               |
| 0.4792 | 93000  | 0.0482        | -               |
| 0.4844 | 94000  | 0.0471        | -               |
| 0.4895 | 95000  | 0.083         | -               |
| 0.4947 | 96000  | 0.0313        | -               |
| 0.4998 | 97000  | 0.0366        | -               |
| 0.5050 | 98000  | 0.034         | -               |
| 0.5101 | 99000  | 0.0366        | -               |
| 0.5153 | 100000 | 0.0292        | -               |
| 0.5204 | 101000 | 0.0423        | -               |
| 0.5256 | 102000 | 0.0451        | -               |
| 0.5307 | 103000 | 0.0243        | -               |
| 0.5359 | 104000 | 0.0315        | -               |
| 0.5411 | 105000 | 0.0288        | -               |
| 0.5462 | 106000 | 0.0232        | -               |
| 0.5514 | 107000 | 0.0533        | -               |
| 0.5565 | 108000 | 0.0474        | -               |
| 0.5617 | 109000 | 0.0312        | -               |
| 0.5668 | 110000 | 0.0381        | -               |
| 0.5720 | 111000 | 0.0407        | -               |
| 0.5771 | 112000 | 0.0411        | -               |
| 0.5823 | 113000 | 0.0285        | -               |
| 0.5874 | 114000 | 0.0344        | -               |
| 0.5926 | 115000 | 0.0471        | -               |
| 0.5977 | 116000 | 0.0311        | -               |
| 0.6029 | 117000 | 0.0671        | -               |
| 0.6080 | 118000 | 0.0406        | -               |
| 0.6132 | 119000 | 0.0342        | -               |
| 0.6183 | 120000 | 0.0393        | -               |
| 0.6235 | 121000 | 0.0288        | -               |
| 0.6286 | 122000 | 0.0407        | -               |
| 0.6338 | 123000 | 0.0385        | -               |
| 0.6390 | 124000 | 0.0463        | -               |
| 0.6441 | 125000 | 0.0419        | -               |
| 0.6493 | 126000 | 0.0505        | -               |
| 0.6544 | 127000 | 0.0426        | -               |
| 0.6596 | 128000 | 0.0422        | -               |
| 0.6647 | 129000 | 0.034         | -               |
| 0.6699 | 130000 | 0.0266        | -               |
| 0.6750 | 131000 | 0.0205        | -               |
| 0.6802 | 132000 | 0.0412        | -               |
| 0.6853 | 133000 | 0.0374        | -               |
| 0.6905 | 134000 | 0.0338        | -               |
| 0.6956 | 135000 | 0.0287        | -               |
| 0.7008 | 136000 | 0.0364        | -               |
| 0.7059 | 137000 | 0.0342        | -               |
| 0.7111 | 138000 | 0.0406        | -               |
| 0.7162 | 139000 | 0.0333        | -               |
| 0.7214 | 140000 | 0.0408        | -               |
| 0.7266 | 141000 | 0.0439        | -               |
| 0.7317 | 142000 | 0.0327        | -               |
| 0.7369 | 143000 | 0.028         | -               |
| 0.7420 | 144000 | 0.0267        | -               |
| 0.7472 | 145000 | 0.0286        | -               |
| 0.7523 | 146000 | 0.0231        | -               |
| 0.7575 | 147000 | 0.0291        | -               |
| 0.7626 | 148000 | 0.0365        | -               |
| 0.7678 | 149000 | 0.0345        | -               |
| 0.7729 | 150000 | 0.0291        | -               |
| 0.7781 | 151000 | 0.0364        | -               |
| 0.7832 | 152000 | 0.0364        | -               |
| 0.7884 | 153000 | 0.0291        | -               |
| 0.7935 | 154000 | 0.0379        | -               |
| 0.7987 | 155000 | 0.0343        | -               |
| 0.8038 | 156000 | 0.0355        | -               |
| 0.8090 | 157000 | 0.0334        | -               |
| 0.8142 | 158000 | 0.0289        | -               |
| 0.8193 | 159000 | 0.0366        | -               |
| 0.8245 | 160000 | 0.0357        | -               |
| 0.8296 | 161000 | 0.0265        | -               |
| 0.8348 | 162000 | 0.0231        | -               |
| 0.8399 | 163000 | 0.0177        | -               |
| 0.8451 | 164000 | 0.022         | -               |
| 0.8502 | 165000 | 0.0227        | -               |
| 0.8554 | 166000 | 0.0179        | -               |
| 0.8605 | 167000 | 0.0238        | -               |
| 0.8657 | 168000 | 0.0225        | -               |
| 0.8708 | 169000 | 0.0219        | -               |
| 0.8760 | 170000 | 0.0254        | -               |
| 0.8811 | 171000 | 0.0239        | -               |
| 0.8863 | 172000 | 0.0267        | -               |
| 0.8914 | 173000 | 0.0255        | -               |
| 0.8966 | 174000 | 0.0234        | -               |
| 0.9018 | 175000 | 0.0261        | -               |
| 0.9069 | 176000 | 0.0235        | -               |
| 0.9121 | 177000 | 0.0267        | -               |
| 0.9172 | 178000 | 0.0232        | -               |
| 0.9224 | 179000 | 0.0197        | -               |
| 0.9275 | 180000 | 0.0189        | -               |
| 0.9327 | 181000 | 0.0219        | -               |
| 0.9378 | 182000 | 0.0226        | -               |
| 0.9430 | 183000 | 0.021         | -               |
| 0.9481 | 184000 | 0.0214        | -               |
| 0.9533 | 185000 | 0.0219        | -               |
| 0.9584 | 186000 | 0.021         | -               |
| 0.9636 | 187000 | 0.0195        | -               |
| 0.9687 | 188000 | 0.0188        | -               |
| 0.9739 | 189000 | 0.0205        | -               |
| 0.9790 | 190000 | 0.0199        | -               |
| 0.9842 | 191000 | 0.0315        | -               |
| 0.9893 | 192000 | 0.0214        | -               |
| 0.9945 | 193000 | 0.0169        | -               |
| 0.9997 | 194000 | 0.0182        | -               |
| 1.0000 | 194066 | -             | 0.0006          |
| 1.0048 | 195000 | 0.2355        | -               |
| 1.0100 | 196000 | 0.0796        | -               |
| 1.0151 | 197000 | 0.0853        | -               |
| 1.0203 | 198000 | 0.0829        | -               |
| 1.0254 | 199000 | 0.0628        | -               |
| 1.0306 | 200000 | 0.0698        | -               |
| 1.0357 | 201000 | 0.0601        | -               |
| 1.0409 | 202000 | 0.0581        | -               |
| 1.0460 | 203000 | 0.0577        | -               |
| 1.0512 | 204000 | 0.0697        | -               |
| 1.0563 | 205000 | 0.0515        | -               |
| 1.0615 | 206000 | 0.0553        | -               |
| 1.0666 | 207000 | 0.0613        | -               |
| 1.0718 | 208000 | 0.0712        | -               |
| 1.0769 | 209000 | 0.043         | -               |
| 1.0821 | 210000 | 0.1127        | -               |
| 1.0873 | 211000 | 0.0437        | -               |
| 1.0924 | 212000 | 0.0737        | -               |
| 1.0976 | 213000 | 0.0437        | -               |
| 1.1027 | 214000 | 0.0916        | -               |
| 1.1079 | 215000 | 0.0454        | -               |
| 1.1130 | 216000 | 0.088         | -               |
| 1.1182 | 217000 | 0.0442        | -               |
| 1.1233 | 218000 | 0.0505        | -               |
| 1.1285 | 219000 | 0.0414        | -               |
| 1.1336 | 220000 | 0.0904        | -               |
| 1.1388 | 221000 | 0.0466        | -               |
| 1.1439 | 222000 | 0.0965        | -               |
| 1.1491 | 223000 | 0.0459        | -               |
| 1.1542 | 224000 | 0.0992        | -               |
| 1.1594 | 225000 | 0.0435        | -               |
| 1.1645 | 226000 | 0.0594        | -               |
| 1.1697 | 227000 | 0.0857        | -               |
| 1.1749 | 228000 | 0.049         | -               |
| 1.1800 | 229000 | 0.0995        | -               |
| 1.1852 | 230000 | 0.0471        | -               |
| 1.1903 | 231000 | 0.0499        | -               |
| 1.1955 | 232000 | 0.0866        | -               |
| 1.2006 | 233000 | 0.0496        | -               |
| 1.2058 | 234000 | 0.0854        | -               |
| 1.2109 | 235000 | 0.0589        | -               |
| 1.2161 | 236000 | 0.0461        | -               |
| 1.2212 | 237000 | 0.0814        | -               |
| 1.2264 | 238000 | 0.0489        | -               |
| 1.2315 | 239000 | 0.0551        | -               |
| 1.2367 | 240000 | 0.0695        | -               |
| 1.2418 | 241000 | 0.043         | -               |
| 1.2470 | 242000 | 0.0533        | -               |
| 1.2521 | 243000 | 0.0556        | -               |
| 1.2573 | 244000 | 0.0608        | -               |
| 1.2625 | 245000 | 0.0426        | -               |
| 1.2676 | 246000 | 0.0439        | -               |
| 1.2728 | 247000 | 0.0638        | -               |
| 1.2779 | 248000 | 0.0549        | -               |
| 1.2831 | 249000 | 0.0377        | -               |
| 1.2882 | 250000 | 0.0383        | -               |
| 1.2934 | 251000 | 0.0472        | -               |
| 1.2985 | 252000 | 0.0448        | -               |
| 1.3037 | 253000 | 0.0387        | -               |
| 1.3088 | 254000 | 0.0528        | -               |
| 1.3140 | 255000 | 0.0331        | -               |
| 1.3191 | 256000 | 0.0342        | -               |
| 1.3243 | 257000 | 0.0362        | -               |
| 1.3294 | 258000 | 0.0436        | -               |
| 1.3346 | 259000 | 0.0524        | -               |
| 1.3397 | 260000 | 0.0353        | -               |
| 1.3449 | 261000 | 0.0274        | -               |
| 1.3500 | 262000 | 0.0368        | -               |
| 1.3552 | 263000 | 0.0486        | -               |
| 1.3604 | 264000 | 0.0536        | -               |
| 1.3655 | 265000 | 0.0595        | -               |
| 1.3707 | 266000 | 0.024         | -               |
| 1.3758 | 267000 | 0.0243        | -               |
| 1.3810 | 268000 | 0.0393        | -               |
| 1.3861 | 269000 | 0.029         | -               |
| 1.3913 | 270000 | 0.0722        | -               |
| 1.3964 | 271000 | 0.0366        | -               |
| 1.4016 | 272000 | 0.0375        | -               |
| 1.4067 | 273000 | 0.0289        | -               |
| 1.4119 | 274000 | 0.0247        | -               |
| 1.4170 | 275000 | 0.0695        | -               |
| 1.4222 | 276000 | 0.0283        | -               |
| 1.4273 | 277000 | 0.0328        | -               |
| 1.4325 | 278000 | 0.0457        | -               |
| 1.4376 | 279000 | 0.0204        | -               |
| 1.4428 | 280000 | 0.0277        | -               |
| 1.4480 | 281000 | 0.0255        | -               |
| 1.4531 | 282000 | 0.0536        | -               |
| 1.4583 | 283000 | 0.0411        | -               |
| 1.4634 | 284000 | 0.0289        | -               |
| 1.4686 | 285000 | 0.0244        | -               |
| 1.4737 | 286000 | 0.0292        | -               |
| 1.4789 | 287000 | 0.0334        | -               |
| 1.4840 | 288000 | 0.0315        | -               |
| 1.4892 | 289000 | 0.0408        | -               |
| 1.4943 | 290000 | 0.0379        | -               |
| 1.4995 | 291000 | 0.0243        | -               |
| 1.5046 | 292000 | 0.0228        | -               |
| 1.5098 | 293000 | 0.0235        | -               |
| 1.5149 | 294000 | 0.0187        | -               |
| 1.5201 | 295000 | 0.0256        | -               |
| 1.5252 | 296000 | 0.031         | -               |
| 1.5304 | 297000 | 0.0156        | -               |
| 1.5356 | 298000 | 0.0216        | -               |
| 1.5407 | 299000 | 0.0185        | -               |
| 1.5459 | 300000 | 0.0146        | -               |
| 1.5510 | 301000 | 0.0302        | -               |
| 1.5562 | 302000 | 0.0346        | -               |
| 1.5613 | 303000 | 0.0211        | -               |
| 1.5665 | 304000 | 0.0211        | -               |
| 1.5716 | 305000 | 0.0239        | -               |
| 1.5768 | 306000 | 0.0265        | -               |
| 1.5819 | 307000 | 0.018         | -               |
| 1.5871 | 308000 | 0.0204        | -               |
| 1.5922 | 309000 | 0.0288        | -               |
| 1.5974 | 310000 | 0.0193        | -               |
| 1.6025 | 311000 | 0.0443        | -               |
| 1.6077 | 312000 | 0.0251        | -               |
| 1.6128 | 313000 | 0.0209        | -               |
| 1.6180 | 314000 | 0.0245        | -               |
| 1.6232 | 315000 | 0.0179        | -               |
| 1.6283 | 316000 | 0.026         | -               |
| 1.6335 | 317000 | 0.025         | -               |
| 1.6386 | 318000 | 0.0291        | -               |
| 1.6438 | 319000 | 0.028         | -               |
| 1.6489 | 320000 | 0.0351        | -               |
| 1.6541 | 321000 | 0.0279        | -               |
| 1.6592 | 322000 | 0.0285        | -               |
| 1.6644 | 323000 | 0.0239        | -               |
| 1.6695 | 324000 | 0.0171        | -               |
| 1.6747 | 325000 | 0.0131        | -               |
| 1.6798 | 326000 | 0.0252        | -               |
| 1.6850 | 327000 | 0.0244        | -               |
| 1.6901 | 328000 | 0.0234        | -               |
| 1.6953 | 329000 | 0.0185        | -               |
| 1.7004 | 330000 | 0.0248        | -               |
| 1.7056 | 331000 | 0.0243        | -               |
| 1.7107 | 332000 | 0.0282        | -               |
| 1.7159 | 333000 | 0.0225        | -               |
| 1.7211 | 334000 | 0.0256        | -               |
| 1.7262 | 335000 | 0.03          | -               |
| 1.7314 | 336000 | 0.0227        | -               |
| 1.7365 | 337000 | 0.0192        | -               |
| 1.7417 | 338000 | 0.0178        | -               |
| 1.7468 | 339000 | 0.0187        | -               |
| 1.7520 | 340000 | 0.0156        | -               |
| 1.7571 | 341000 | 0.0186        | -               |
| 1.7623 | 342000 | 0.0241        | -               |
| 1.7674 | 343000 | 0.0252        | -               |
| 1.7726 | 344000 | 0.0201        | -               |
| 1.7777 | 345000 | 0.0251        | -               |
| 1.7829 | 346000 | 0.0258        | -               |
| 1.7880 | 347000 | 0.0216        | -               |
| 1.7932 | 348000 | 0.0274        | -               |
| 1.7983 | 349000 | 0.0244        | -               |
| 1.8035 | 350000 | 0.0243        | -               |
| 1.8087 | 351000 | 0.024         | -               |
| 1.8138 | 352000 | 0.0182        | -               |
| 1.8190 | 353000 | 0.0233        | -               |
| 1.8241 | 354000 | 0.024         | -               |
| 1.8293 | 355000 | 0.0177        | -               |
| 1.8344 | 356000 | 0.0149        | -               |
| 1.8396 | 357000 | 0.0113        | -               |
| 1.8447 | 358000 | 0.0142        | -               |
| 1.8499 | 359000 | 0.0147        | -               |
| 1.8550 | 360000 | 0.0109        | -               |
| 1.8602 | 361000 | 0.0155        | -               |
| 1.8653 | 362000 | 0.0144        | -               |
| 1.8705 | 363000 | 0.0131        | -               |
| 1.8756 | 364000 | 0.0171        | -               |
| 1.8808 | 365000 | 0.0156        | -               |
| 1.8859 | 366000 | 0.0168        | -               |
| 1.8911 | 367000 | 0.0167        | -               |
| 1.8963 | 368000 | 0.0161        | -               |
| 1.9014 | 369000 | 0.0168        | -               |
| 1.9066 | 370000 | 0.0151        | -               |
| 1.9117 | 371000 | 0.0178        | -               |
| 1.9169 | 372000 | 0.0153        | -               |
| 1.9220 | 373000 | 0.0133        | -               |
| 1.9272 | 374000 | 0.0121        | -               |
| 1.9323 | 375000 | 0.0141        | -               |
| 1.9375 | 376000 | 0.0151        | -               |
| 1.9426 | 377000 | 0.0142        | -               |
| 1.9478 | 378000 | 0.0141        | -               |
| 1.9529 | 379000 | 0.014         | -               |
| 1.9581 | 380000 | 0.0144        | -               |
| 1.9632 | 381000 | 0.0123        | -               |
| 1.9684 | 382000 | 0.0128        | -               |
| 1.9735 | 383000 | 0.0132        | -               |
| 1.9787 | 384000 | 0.0135        | -               |
| 1.9839 | 385000 | 0.0155        | -               |
| 1.9890 | 386000 | 0.0214        | -               |
| 1.9942 | 387000 | 0.0111        | -               |
| 1.9993 | 388000 | 0.0121        | -               |
| 2.0000 | 388132 | -             | 0.0005          |
| 2.0045 | 389000 | 0.1779        | -               |
| 2.0096 | 390000 | 0.0634        | -               |
| 2.0148 | 391000 | 0.0613        | -               |
| 2.0199 | 392000 | 0.0741        | -               |
| 2.0251 | 393000 | 0.0496        | -               |
| 2.0302 | 394000 | 0.056         | -               |
| 2.0354 | 395000 | 0.048         | -               |
| 2.0405 | 396000 | 0.0458        | -               |
| 2.0457 | 397000 | 0.0457        | -               |
| 2.0508 | 398000 | 0.057         | -               |
| 2.0560 | 399000 | 0.04          | -               |
| 2.0611 | 400000 | 0.0435        | -               |
| 2.0663 | 401000 | 0.0484        | -               |
| 2.0714 | 402000 | 0.0519        | -               |
| 2.0766 | 403000 | 0.0405        | -               |
| 2.0818 | 404000 | 0.0955        | -               |
| 2.0869 | 405000 | 0.0331        | -               |
| 2.0921 | 406000 | 0.0607        | -               |
| 2.0972 | 407000 | 0.0335        | -               |
| 2.1024 | 408000 | 0.0771        | -               |
| 2.1075 | 409000 | 0.0346        | -               |
| 2.1127 | 410000 | 0.073         | -               |
| 2.1178 | 411000 | 0.0348        | -               |
| 2.1230 | 412000 | 0.0396        | -               |
| 2.1281 | 413000 | 0.0317        | -               |
| 2.1333 | 414000 | 0.0766        | -               |
| 2.1384 | 415000 | 0.0366        | -               |
| 2.1436 | 416000 | 0.0796        | -               |
| 2.1487 | 417000 | 0.0367        | -               |
| 2.1539 | 418000 | 0.0819        | -               |
| 2.1590 | 419000 | 0.0344        | -               |
| 2.1642 | 420000 | 0.0435        | -               |
| 2.1694 | 421000 | 0.0764        | -               |
| 2.1745 | 422000 | 0.0389        | -               |
| 2.1797 | 423000 | 0.0675        | -               |
| 2.1848 | 424000 | 0.0521        | -               |
| 2.1900 | 425000 | 0.0405        | -               |
| 2.1951 | 426000 | 0.0704        | -               |
| 2.2003 | 427000 | 0.0404        | -               |
| 2.2054 | 428000 | 0.0703        | -               |
| 2.2106 | 429000 | 0.0461        | -               |
| 2.2157 | 430000 | 0.0378        | -               |
| 2.2209 | 431000 | 0.0655        | -               |
| 2.2260 | 432000 | 0.0391        | -               |
| 2.2312 | 433000 | 0.044         | -               |
| 2.2363 | 434000 | 0.0576        | -               |
| 2.2415 | 435000 | 0.0337        | -               |
| 2.2466 | 436000 | 0.0409        | -               |
| 2.2518 | 437000 | 0.0453        | -               |
| 2.2570 | 438000 | 0.0498        | -               |
| 2.2621 | 439000 | 0.0327        | -               |
| 2.2673 | 440000 | 0.0347        | -               |
| 2.2724 | 441000 | 0.0496        | -               |
| 2.2776 | 442000 | 0.0442        | -               |
| 2.2827 | 443000 | 0.0299        | -               |
| 2.2879 | 444000 | 0.031         | -               |
| 2.2930 | 445000 | 0.0378        | -               |
| 2.2982 | 446000 | 0.0339        | -               |
| 2.3033 | 447000 | 0.0297        | -               |
| 2.3085 | 448000 | 0.0406        | -               |
| 2.3136 | 449000 | 0.0277        | -               |
| 2.3188 | 450000 | 0.0271        | -               |
| 2.3239 | 451000 | 0.0275        | -               |
| 2.3291 | 452000 | 0.033         | -               |
| 2.3342 | 453000 | 0.0447        | -               |
| 2.3394 | 454000 | 0.0268        | -               |
| 2.3446 | 455000 | 0.0205        | -               |
| 2.3497 | 456000 | 0.029         | -               |
| 2.3549 | 457000 | 0.038         | -               |
| 2.3600 | 458000 | 0.0419        | -               |
| 2.3652 | 459000 | 0.0475        | -               |
| 2.3703 | 460000 | 0.0179        | -               |
| 2.3755 | 461000 | 0.0178        | -               |
| 2.3806 | 462000 | 0.0302        | -               |
| 2.3858 | 463000 | 0.0233        | -               |
| 2.3909 | 464000 | 0.0599        | -               |
| 2.3961 | 465000 | 0.0277        | -               |
| 2.4012 | 466000 | 0.0229        | -               |
| 2.4064 | 467000 | 0.0295        | -               |
| 2.4115 | 468000 | 0.0181        | -               |
| 2.4167 | 469000 | 0.057         | -               |
| 2.4218 | 470000 | 0.0203        | -               |
| 2.4270 | 471000 | 0.0248        | -               |
| 2.4321 | 472000 | 0.0382        | -               |
| 2.4373 | 473000 | 0.0151        | -               |
| 2.4425 | 474000 | 0.0212        | -               |
| 2.4476 | 475000 | 0.0131        | -               |
| 2.4528 | 476000 | 0.0473        | -               |
| 2.4579 | 477000 | 0.034         | -               |
| 2.4631 | 478000 | 0.0222        | -               |
| 2.4682 | 479000 | 0.0189        | -               |
| 2.4734 | 480000 | 0.0223        | -               |
| 2.4785 | 481000 | 0.0242        | -               |
| 2.4837 | 482000 | 0.0247        | -               |
| 2.4888 | 483000 | 0.0293        | -               |
| 2.4940 | 484000 | 0.0372        | -               |
| 2.4991 | 485000 | 0.0178        | -               |
| 2.5043 | 486000 | 0.0152        | -               |
| 2.5094 | 487000 | 0.0201        | -               |
| 2.5146 | 488000 | 0.0135        | -               |
| 2.5197 | 489000 | 0.0194        | -               |
| 2.5249 | 490000 | 0.0239        | -               |
| 2.5301 | 491000 | 0.0116        | -               |
| 2.5352 | 492000 | 0.0163        | -               |
| 2.5404 | 493000 | 0.0142        | -               |
| 2.5455 | 494000 | 0.0101        | -               |
| 2.5507 | 495000 | 0.0218        | -               |
| 2.5558 | 496000 | 0.0255        | -               |
| 2.5610 | 497000 | 0.0178        | -               |
| 2.5661 | 498000 | 0.0145        | -               |
| 2.5713 | 499000 | 0.0178        | -               |
| 2.5764 | 500000 | 0.0195        | -               |
| 2.5816 | 501000 | 0.0131        | -               |
| 2.5867 | 502000 | 0.0149        | -               |
| 2.5919 | 503000 | 0.0213        | -               |
| 2.5970 | 504000 | 0.013         | -               |
| 2.6022 | 505000 | 0.0351        | -               |
| 2.6073 | 506000 | 0.0197        | -               |
| 2.6125 | 507000 | 0.0133        | -               |
| 2.6177 | 508000 | 0.0201        | -               |
| 2.6228 | 509000 | 0.0133        | -               |
| 2.6280 | 510000 | 0.0189        | -               |
| 2.6331 | 511000 | 0.0191        | -               |
| 2.6383 | 512000 | 0.0227        | -               |
| 2.6434 | 513000 | 0.0199        | -               |
| 2.6486 | 514000 | 0.0281        | -               |
| 2.6537 | 515000 | 0.0216        | -               |
| 2.6589 | 516000 | 0.0219        | -               |
| 2.6640 | 517000 | 0.0185        | -               |
| 2.6692 | 518000 | 0.0131        | -               |
| 2.6743 | 519000 | 0.0104        | -               |
| 2.6795 | 520000 | 0.019         | -               |
| 2.6846 | 521000 | 0.0179        | -               |
| 2.6898 | 522000 | 0.0187        | -               |
| 2.6949 | 523000 | 0.0138        | -               |
| 2.7001 | 524000 | 0.0194        | -               |
| 2.7053 | 525000 | 0.018         | -               |
| 2.7104 | 526000 | 0.0222        | -               |
| 2.7156 | 527000 | 0.018         | -               |
| 2.7207 | 528000 | 0.0174        | -               |
| 2.7259 | 529000 | 0.0254        | -               |
| 2.7310 | 530000 | 0.0178        | -               |
| 2.7362 | 531000 | 0.0147        | -               |
| 2.7413 | 532000 | 0.0128        | -               |
| 2.7465 | 533000 | 0.0145        | -               |
| 2.7516 | 534000 | 0.0123        | -               |
| 2.7568 | 535000 | 0.0134        | -               |
| 2.7619 | 536000 | 0.0181        | -               |
| 2.7671 | 537000 | 0.0207        | -               |
| 2.7722 | 538000 | 0.0163        | -               |
| 2.7774 | 539000 | 0.0201        | -               |
| 2.7825 | 540000 | 0.0214        | -               |
| 2.7877 | 541000 | 0.0169        | -               |
| 2.7928 | 542000 | 0.0224        | -               |
| 2.7980 | 543000 | 0.0194        | -               |
| 2.8032 | 544000 | 0.0197        | -               |
| 2.8083 | 545000 | 0.0195        | -               |
| 2.8135 | 546000 | 0.0127        | -               |
| 2.8186 | 547000 | 0.018         | -               |
| 2.8238 | 548000 | 0.0182        | -               |
| 2.8289 | 549000 | 0.0138        | -               |
| 2.8341 | 550000 | 0.0109        | -               |
| 2.8392 | 551000 | 0.0082        | -               |
| 2.8444 | 552000 | 0.0105        | -               |
| 2.8495 | 553000 | 0.0104        | -               |
| 2.8547 | 554000 | 0.0081        | -               |
| 2.8598 | 555000 | 0.0111        | -               |
| 2.8650 | 556000 | 0.0104        | -               |
| 2.8701 | 557000 | 0.0098        | -               |
| 2.8753 | 558000 | 0.0123        | -               |
| 2.8804 | 559000 | 0.0119        | -               |
| 2.8856 | 560000 | 0.0119        | -               |
| 2.8908 | 561000 | 0.0122        | -               |
| 2.8959 | 562000 | 0.012         | -               |
| 2.9011 | 563000 | 0.0123        | -               |
| 2.9062 | 564000 | 0.0117        | -               |
| 2.9114 | 565000 | 0.013         | -               |
| 2.9165 | 566000 | 0.0118        | -               |
| 2.9217 | 567000 | 0.0097        | -               |
| 2.9268 | 568000 | 0.0085        | -               |
| 2.9320 | 569000 | 0.0099        | -               |
| 2.9371 | 570000 | 0.0111        | -               |
| 2.9423 | 571000 | 0.011         | -               |
| 2.9474 | 572000 | 0.0103        | -               |
| 2.9526 | 573000 | 0.0099        | -               |
| 2.9577 | 574000 | 0.0106        | -               |
| 2.9629 | 575000 | 0.0088        | -               |
| 2.9680 | 576000 | 0.0096        | -               |
| 2.9732 | 577000 | 0.0092        | -               |
| 2.9784 | 578000 | 0.0102        | -               |
| 2.9835 | 579000 | 0.0111        | -               |
| 2.9887 | 580000 | 0.018         | -               |
| 2.9938 | 581000 | 0.0082        | -               |
| 2.9990 | 582000 | 0.009         | -               |
| 3.0000 | 582198 | -             | 0.0005          |
| 3.0041 | 583000 | 0.1405        | -               |
| 3.0093 | 584000 | 0.0599        | -               |
| 3.0144 | 585000 | 0.0529        | -               |
| 3.0196 | 586000 | 0.0627        | -               |
| 3.0247 | 587000 | 0.0428        | -               |
| 3.0299 | 588000 | 0.0477        | -               |
| 3.0350 | 589000 | 0.0396        | -               |
| 3.0402 | 590000 | 0.0384        | -               |
| 3.0453 | 591000 | 0.0386        | -               |
| 3.0505 | 592000 | 0.0481        | -               |
| 3.0556 | 593000 | 0.0331        | -               |
| 3.0608 | 594000 | 0.0366        | -               |
| 3.0660 | 595000 | 0.0399        | -               |
| 3.0711 | 596000 | 0.042         | -               |
| 3.0763 | 597000 | 0.0368        | -               |
| 3.0814 | 598000 | 0.0837        | -               |
| 3.0866 | 599000 | 0.0272        | -               |
| 3.0917 | 600000 | 0.0532        | -               |
| 3.0969 | 601000 | 0.0266        | -               |
| 3.1020 | 602000 | 0.0691        | -               |
| 3.1072 | 603000 | 0.0276        | -               |
| 3.1123 | 604000 | 0.0629        | -               |
| 3.1175 | 605000 | 0.0294        | -               |
| 3.1226 | 606000 | 0.0324        | -               |
| 3.1278 | 607000 | 0.0259        | -               |
| 3.1329 | 608000 | 0.066         | -               |
| 3.1381 | 609000 | 0.0307        | -               |
| 3.1432 | 610000 | 0.0696        | -               |
| 3.1484 | 611000 | 0.0302        | -               |
| 3.1536 | 612000 | 0.0716        | -               |
| 3.1587 | 613000 | 0.0274        | -               |
| 3.1639 | 614000 | 0.0278        | -               |
| 3.1690 | 615000 | 0.0766        | -               |
| 3.1742 | 616000 | 0.0324        | -               |
| 3.1793 | 617000 | 0.0582        | -               |
| 3.1845 | 618000 | 0.0441        | -               |
| 3.1896 | 619000 | 0.0331        | -               |
| 3.1948 | 620000 | 0.0624        | -               |
| 3.1999 | 621000 | 0.0339        | -               |
| 3.2051 | 622000 | 0.059         | -               |
| 3.2102 | 623000 | 0.0379        | -               |
| 3.2154 | 624000 | 0.0339        | -               |
| 3.2205 | 625000 | 0.0556        | -               |
| 3.2257 | 626000 | 0.0319        | -               |
| 3.2308 | 627000 | 0.0373        | -               |
| 3.2360 | 628000 | 0.0475        | -               |
| 3.2411 | 629000 | 0.0297        | -               |
| 3.2463 | 630000 | 0.0321        | -               |
| 3.2515 | 631000 | 0.0381        | -               |
| 3.2566 | 632000 | 0.0439        | -               |
| 3.2618 | 633000 | 0.0261        | -               |
| 3.2669 | 634000 | 0.0292        | -               |
| 3.2721 | 635000 | 0.0404        | -               |
| 3.2772 | 636000 | 0.0385        | -               |
| 3.2824 | 637000 | 0.0252        | -               |
| 3.2875 | 638000 | 0.0255        | -               |
| 3.2927 | 639000 | 0.0305        | -               |
| 3.2978 | 640000 | 0.0283        | -               |
| 3.3030 | 641000 | 0.0245        | -               |
| 3.3081 | 642000 | 0.0271        | -               |
| 3.3133 | 643000 | 0.0297        | -               |
| 3.3184 | 644000 | 0.022         | -               |
| 3.3236 | 645000 | 0.0218        | -               |
| 3.3287 | 646000 | 0.0269        | -               |
| 3.3339 | 647000 | 0.0386        | -               |
| 3.3391 | 648000 | 0.021         | -               |
| 3.3442 | 649000 | 0.0161        | -               |
| 3.3494 | 650000 | 0.0231        | -               |
| 3.3545 | 651000 | 0.032         | -               |
| 3.3597 | 652000 | 0.0339        | -               |
| 3.3648 | 653000 | 0.0407        | -               |
| 3.3700 | 654000 | 0.0146        | -               |
| 3.3751 | 655000 | 0.0151        | -               |
| 3.3803 | 656000 | 0.0236        | -               |
| 3.3854 | 657000 | 0.0184        | -               |
| 3.3906 | 658000 | 0.0518        | -               |
| 3.3957 | 659000 | 0.0213        | -               |
| 3.4009 | 660000 | 0.017         | -               |
| 3.4060 | 661000 | 0.027         | -               |
| 3.4112 | 662000 | 0.0142        | -               |
| 3.4163 | 663000 | 0.0492        | -               |
| 3.4215 | 664000 | 0.0158        | -               |
| 3.4267 | 665000 | 0.0192        | -               |
| 3.4318 | 666000 | 0.0341        | -               |
| 3.4370 | 667000 | 0.0114        | -               |
| 3.4421 | 668000 | 0.0171        | -               |
| 3.4473 | 669000 | 0.0107        | -               |
| 3.4524 | 670000 | 0.0368        | -               |
| 3.4576 | 671000 | 0.0306        | -               |
| 3.4627 | 672000 | 0.0192        | -               |
| 3.4679 | 673000 | 0.0151        | -               |
| 3.4730 | 674000 | 0.0181        | -               |
| 3.4782 | 675000 | 0.0197        | -               |
| 3.4833 | 676000 | 0.0204        | -               |
| 3.4885 | 677000 | 0.0245        | -               |
| 3.4936 | 678000 | 0.0316        | -               |
| 3.4988 | 679000 | 0.0142        | -               |
| 3.5039 | 680000 | 0.012         | -               |
| 3.5091 | 681000 | 0.0166        | -               |
| 3.5143 | 682000 | 0.0103        | -               |
| 3.5194 | 683000 | 0.0154        | -               |
| 3.5246 | 684000 | 0.0195        | -               |
| 3.5297 | 685000 | 0.0093        | -               |
| 3.5349 | 686000 | 0.0127        | -               |
| 3.5400 | 687000 | 0.0101        | -               |
| 3.5452 | 688000 | 0.0085        | -               |
| 3.5503 | 689000 | 0.0167        | -               |
| 3.5555 | 690000 | 0.0205        | -               |
| 3.5606 | 691000 | 0.0151        | -               |
| 3.5658 | 692000 | 0.0109        | -               |
| 3.5709 | 693000 | 0.014         | -               |
| 3.5761 | 694000 | 0.0149        | -               |
| 3.5812 | 695000 | 0.0107        | -               |
| 3.5864 | 696000 | 0.0112        | -               |
| 3.5915 | 697000 | 0.0168        | -               |
| 3.5967 | 698000 | 0.0101        | -               |
| 3.6018 | 699000 | 0.0283        | -               |
| 3.6070 | 700000 | 0.0156        | -               |
| 3.6122 | 701000 | 0.0105        | -               |
| 3.6173 | 702000 | 0.0167        | -               |
| 3.6225 | 703000 | 0.0106        | -               |
| 3.6276 | 704000 | 0.0144        | -               |
| 3.6328 | 705000 | 0.0162        | -               |
| 3.6379 | 706000 | 0.0179        | -               |
| 3.6431 | 707000 | 0.0161        | -               |
| 3.6482 | 708000 | 0.0232        | -               |
| 3.6534 | 709000 | 0.017         | -               |
| 3.6585 | 710000 | 0.018         | -               |
| 3.6637 | 711000 | 0.0157        | -               |
| 3.6688 | 712000 | 0.0101        | -               |
| 3.6740 | 713000 | 0.0085        | -               |
| 3.6791 | 714000 | 0.0143        | -               |
| 3.6843 | 715000 | 0.0152        | -               |
| 3.6894 | 716000 | 0.0153        | -               |
| 3.6946 | 717000 | 0.0117        | -               |
| 3.6998 | 718000 | 0.0147        | -               |
| 3.7049 | 719000 | 0.015         | -               |
| 3.7101 | 720000 | 0.0188        | -               |
| 3.7152 | 721000 | 0.0141        | -               |
| 3.7204 | 722000 | 0.0143        | -               |
| 3.7255 | 723000 | 0.0214        | -               |
| 3.7307 | 724000 | 0.0144        | -               |
| 3.7358 | 725000 | 0.0121        | -               |
| 3.7410 | 726000 | 0.0104        | -               |
| 3.7461 | 727000 | 0.0114        | -               |
| 3.7513 | 728000 | 0.0105        | -               |
| 3.7564 | 729000 | 0.0096        | -               |
| 3.7616 | 730000 | 0.0146        | -               |
| 3.7667 | 731000 | 0.018         | -               |
| 3.7719 | 732000 | 0.0141        | -               |
| 3.7770 | 733000 | 0.0166        | -               |
| 3.7822 | 734000 | 0.0182        | -               |
| 3.7874 | 735000 | 0.015         | -               |
| 3.7925 | 736000 | 0.0184        | -               |
| 3.7977 | 737000 | 0.0162        | -               |
| 3.8028 | 738000 | 0.0166        | -               |
| 3.8080 | 739000 | 0.017         | -               |
| 3.8131 | 740000 | 0.01          | -               |
| 3.8183 | 741000 | 0.0143        | -               |
| 3.8234 | 742000 | 0.0147        | -               |
| 3.8286 | 743000 | 0.0109        | -               |
| 3.8337 | 744000 | 0.0088        | -               |
| 3.8389 | 745000 | 0.0064        | -               |
| 3.8440 | 746000 | 0.0084        | -               |
| 3.8492 | 747000 | 0.0079        | -               |
| 3.8543 | 748000 | 0.0064        | -               |
| 3.8595 | 749000 | 0.0085        | -               |
| 3.8646 | 750000 | 0.0082        | -               |
| 3.8698 | 751000 | 0.0077        | -               |
| 3.8750 | 752000 | 0.0096        | -               |
| 3.8801 | 753000 | 0.0095        | -               |
| 3.8853 | 754000 | 0.0093        | -               |
| 3.8904 | 755000 | 0.0095        | -               |
| 3.8956 | 756000 | 0.0097        | -               |
| 3.9007 | 757000 | 0.01          | -               |
| 3.9059 | 758000 | 0.0091        | -               |
| 3.9110 | 759000 | 0.01          | -               |
| 3.9162 | 760000 | 0.0099        | -               |
| 3.9213 | 761000 | 0.0082        | -               |
| 3.9265 | 762000 | 0.0066        | -               |
| 3.9316 | 763000 | 0.0073        | -               |
| 3.9368 | 764000 | 0.0082        | -               |
| 3.9419 | 765000 | 0.0092        | -               |
| 3.9471 | 766000 | 0.0079        | -               |
| 3.9522 | 767000 | 0.008         | -               |
| 3.9574 | 768000 | 0.0081        | -               |
| 3.9625 | 769000 | 0.007         | -               |
| 3.9677 | 770000 | 0.0076        | -               |
| 3.9729 | 771000 | 0.0072        | -               |
| 3.9780 | 772000 | 0.008         | -               |
| 3.9832 | 773000 | 0.0082        | -               |
| 3.9883 | 774000 | 0.0163        | -               |
| 3.9935 | 775000 | 0.0066        | -               |
| 3.9986 | 776000 | 0.0068        | -               |
| 4.0000 | 776264 | -             | 0.0005          |

</details>

### Framework Versions
- Python: 3.12.2
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.5.0
- Accelerate: 1.0.1
- Datasets: 3.0.2
- Tokenizers: 0.19.1
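
For a faithful reproduction it can help to verify the local environment against these versions. A small sketch; the expected pins are taken directly from the list above:

```python
# Quick environment check against the versions listed above.
expected = {
    "sentence_transformers": "3.2.1",
    "transformers": "4.44.2",
    "torch": "2.5.0",
    "accelerate": "1.0.1",
    "datasets": "3.0.2",
    "tokenizers": "0.19.1",
}
for name, version in expected.items():
    installed = __import__(name).__version__
    # startswith tolerates local build suffixes such as "2.5.0+cu124"
    status = "OK" if installed.startswith(version) else f"mismatch (found {installed})"
    print(f"{name}: {status}")
```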

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
