Input (string, 251–41.6k chars) | Output (string, 137–9.7k chars) | input_ids (sequence, 157–2.05k) | attention_mask (sequence, 157–2.05k) | labels (sequence, 157–2.05k)
---|---|---|---|---
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses longtext generation with a specific task of being given a prefix of a review and needing to add the next five sentences coherently the paper proposes adding two discriminators one trained to maximize a cosine similarity between source sentences and target sentences dcoherence and one trained to maximize a cosine similarity between two consecutive sentences on some automatic metrics like bleu and perplexity an mle model with these discriminators performs a little bit better than without this paper does not include any manual evaluation which is critical for evaluating the quality of generated output especially for evaluating coherence and cohesion this paper uses the task setup and dataset from learning to write with cooperative discriminators holtzman et al acl 2018 that paper also includes many specified aspects to improve the coherence from the abstract of that paper human evaluation demonstrates that text generated by our model is preferred over that of baselines by a large margin significantly enhancing the overall coherence style and information of the generations but this paper does not compare against the method described in holtzman et al or any other prior work does not include any human evaluations even though they were the main measure of evaluation in prior work this paper states that to the best of our knowledge this paper is the first attempt to explicitly capture crosssentence linguistic properties ie coherence and cohesion for long text generation there is much past work in the nlp community on these for example see modeling local coherence an entitybased approach by barzilay and lapata 2005 which has 500 citations it has been widely studied in the area of summarization for example using cohesion and coherence models for text summarization mani et al aaai 1998 and followup work and in more recent work the learning to write paper that the dataset and task follow from addresses several linguistically informed crosssentence issues like repetition and entailment the cosine similarity metric in the model is not very well suited to the tasks of coherence and cohesion as it is symmetric while natural language isnt the pair john went to the store to buy some milk when he got there they were all out and when he got there they were all out john went to the store to buy some milk would have identical scores according to a cosine similarity metric while the first ordering is much more coherent than the second the conclusion says we showed a significant improvement how was significance determined here docsepthe paper proposes a method for improving the quality of text generation by optimizing for coherence and cohesion the authors develop two discriminatorsa coherence discriminator which takes as input all of the sentence embeddings ie averaged word embeddings of the document and assigns a score and a cohesion discriminator which takes as input the word embeddings of two consecutive sentences and assigns a score in the former the score is the cosine similarity between the encodings of the first and second half of the document in the latter the score is the cosine similarity between the encodings of the two sentences both discriminators use cnns to encode the inputs the discriminators are trained to rank true text over randomly drawn negative samples which consist of randomly permuted sentence orderings andor random combinations of firstsecond half of documents this discriminators are then used to train a text generation model the output of the text generation model 
is scored by various automatic metrics including nll ppl bleu and number of unique ngrams in the outputs the improvements over a genericallytrained generation model are very small overall i did not find this paper to be convincing the initial motivation is goodwe need to find a way to capture richer linguistic properties of text and to encourage nlg to produce such properties however the discriminators presented do not actually capture the nuances that they purport to capture as i understand it these models are just being trained to incentivize high cosine similarity between the words in the firstsecond half of a document or sentencefollowing sentence that is not reflective of the definitions of coherence and cohesion which should reflect deeper discourse and even syntactic structure rather these are just models which capture topical similarity and naively at that moreover training this model to discriminate real text from randomly perturbed text seems problematic since 1 randomly shuffled text should be trivially easy to distinguish from real text in terms of topical similarity and 2 these negative samples are not i dont think at all reflective of the types of texts that the discriminators actually need to discriminate ie automatically generated texts thus even ignoring the fact that i disagree with the authors on exactly what the discriminators areshould be doing it is still not clear to me that the discriminators are well trained to do the thing the authors want them to do i have various other concerns about the claims the approach and the evaluation a list of more specific questionscomments for the authors is below there are a lot of unsubstantiated claims and speculation about the linguistic properties that these discriminators capture and no motivation of analysis as to how they are capturing it claims like the following definitely need to be removed learn to inspect the higherlevel role of t such as but not limited to whether it supports the intent of s transitions smoothly against s or avoids redundancy such as grammar of each of the sentences and the logical flow between arbitrary two consecutive sentences you only use automated metrics despite acknowledging that there is no good way to evaluate generation why not use human eval this is not difficult to carry out and when you are arguing about such subtle properties of language human eval is essential there is no reason that bleu for example would be sensitive to coherence or cohesion so why would this be a good way to evaluate a model aimed to capture exactly those things also related to human eval there should be an intrinsic evaluation of the discriminators do they correlate with human judgments of coherence and cohesion you cannot take it for granted that they capture these things i very much believe they do not so present some evidence that the models do what you claim they do the reported improvements are minuscule to the extent that i would read them as no difference the only metric where there is a real difference is on number of unique ngrams generated cross inputs which is presumably because its just learning being encouraged to spit out words that were in the input id like to see the baseline of just copying the input as the output you mention several times that these models will pick up on redundancy it is not clear to me how they could do that arent they simply using a cosine similarity between feature vectors perhaps i am missing something but i dont see how this could learn to disincentivize redundancy but 
simultaneously encourage topical similarity could you explain this claim docsepthe idea of training discriminators to determine coherence and cohesion and training those discriminators as part of an nlg system using policy gradients is an interesting one however there are two major problems with the papers as it stands 1 it completely ignores the decades of nlg literature on this topic before the neural revolution in nlp 2 the presentation of the paper is confusing in a number of respects some details below to claim that this is the first paper to capture crosssentence linguistic properties for text generation is the sort of comment that is likely to make experienced nlg researchers very grumpy a good place to start looking at the extensive literature on this topic is the following paper modeling local coherence an entitybased approach barzilay and lapata 2007 one aspect in which the presentation is muddled is the order of the results tables table 2 is far too early in the paper i had no idea at that point why the retrieval results were being presented or what the numbers meant you also have cohesion in the table before the cohesion section in 32 likewise table 1 which is on p2 and gives examples of system output is far too early perhaps the biggest confusion for me was the difference between cohesion and coherence and in particular how they are modeled the intro does a good job of describing the two concepts and making the contrast between local and global coherence but when i was reading 31 i kept thinking this was describing cohesion t that follows s in the data sounds local no and then 32 seems to suggest that coherence and cohesion essentially are being modeled in the same way except shuffling happens on the word level i suppose what i was expecting was some attempt at a global model for coherence which goes beyond just looking at consecutive sentence pairs i wonder why you didnt try a sequence model of sentences eg bidirectional lstm these are so standard now it seems odd not to have them do you describe the decoding procedure greedy beam at test time anywhere i liked table 4 and found the example pairs with the scores to be useful qualitative analysis based on automated nlp metrics we showed a significant improvement which metrics not clear to me that the improvements in table 3 are significant minor presentation points followed by a logically sound sentence might want to rephrase this since you dont mean logical soundness in a technical sense here i dont think the comment in the conclusion about being convinced the architecture generalizes well to unseen texts is irrelevant without some evidence
### Summary: | this paper attempts at modeling coherence of generated text and proposes two kinds of discriminators that tries to measure whether a piece of text is coherent or not however the paper misses several related critical references and also lacks extensive evaluation especially manual evaluation there is consensus between the reviewers that this paper needs more work before it is accepted to a conference such as iclr | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, … 824, 347, 17857, 32888, 209 (input_ids: ~1.9k token IDs) ] | [
1, 1, 1, … 1 (attention_mask: ~1.9k ones, same length as input_ids) ] | [
30003, 310, 1677, 2278, … 347, 17857, 32888, 209 (labels: duplicates input_ids for this row, ~1.9k token IDs) ] |
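The three numeric columns above appear to be the tokenized form of the two text columns: input_ids holds the token IDs for the prompt, review, and summary concatenated together, attention_mask is all ones because the row is unpadded, and labels is a plain copy of input_ids (the usual causal-language-modeling target). Below is a minimal sketch of how such a row could be produced; the tokenizer name and the exact concatenation scheme are assumptions, not details taken from the dataset.

```python
# Hypothetical reconstruction of one dataset row. The tokenizer is an assumption
# (chosen only because the IDs look like a GPT-NeoX-style vocabulary), not
# something stated by the dataset.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed tokenizer

def build_row(input_text: str, output_text: str) -> dict:
    # The Input column already ends with "### Summary:", so the Output text is
    # appended right after it before tokenizing.
    enc = tokenizer(input_text + " " + output_text)
    return {
        "Input": input_text,
        "Output": output_text,
        "input_ids": enc["input_ids"],            # token IDs for prompt + review + summary
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),         # causal-LM targets; a plain copy here
    }
```

In many instruction-tuning pipelines the prompt portion of labels would be masked to -100 so that only the summary contributes to the loss; the row above does not appear to do that, since its labels match input_ids throughout.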
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
contributions this paper presents sbevnet a neural network architecture to estimate the birdseye view bev layout of an urban driving scene given an image captured by a stereo camera sbevnet performs an inverse perspective mapping ipm to obtain an initial feature volume which is further processed to generate the bev layout the system is trained endtoend in a supervised learning setup strengths s1 the problem considered here is very relevant to perception groups in the autonomous driving community this area has only recently seen work crop up approaches like monolayout a monooccupancy b and pseudolidar c are closely related to this submission s2 the paper is easy to follow and provides a majority of the details needed to understand and assess the approach s3 the authors also seem to provide code and promise a public release which might help ensure reproducibility weaknesses i see a few major and a number of other minor concerns that impact my perception of this paper im hoping the discussion period helps address some of these and im open to revising my score in light of evidence contrary to the following claims it appears that this paper uses monolayout a monooccupancy b and pseudolidar c as primary baselines much of my review stems from my understanding of a b c and my surprise at a few contradictory trends observed in this paper problem setup it is unclear from reading the paper and supplementary material if the problem setup is infact amodal layout estimation ie if scene points outside of the camera view are predicted in the bev layout approaches like schulter et al 2016 and mani et al 2020 operate in this amodal setup while others such as pseudolidar c and lu et al 2019 only predict points that are visible in the input image does this approach for instance hallucianate hidden intersections and roads it seems not since a visibility mask is explicitly employed in the loss function cf fig 1 and eq 12 13 monolayout baseline the primary baseline considered in this paper is monolayout mani et al 2020 upon examining the monolayout a paper i find a surprising and troubling trend this paper reports very poor performances of monolayout on the kitti dataset the original monolayout paper reports miou for the car class to be around 2608 while the current submission reports 243 cf table 2 ive noted that monolayout makes its code and models publicly available its publicly available pretrained models claim an miou of 3018 for the car class as highlighted on their github page also other baselines like monooccupancy have surprisingly low scores in this paper an order of magnitude compared to scores reported in the monolayout paper i wonder if there is something different in the experiment andor training protocols employed in the current work as opposed to those in the monooccupancy and monolayout papers for example the monooccupancy baseline as reported in the monolayout paper achieves an miou of about 2416 for the car class monolayout paper table 1 while the same baseline has a dismal performance miou of 711 for car class in table 2 of the current manuscript the fact that this performance gap is not explained in the paper makes it hard to analyze the merits of the proposed approach save for a single sentence the results of monolayout and monooccupancy are inferior due to lack of any camera geometry priors in the network ive not found any other discussion of this performance gapdiscrepancy i also find it a tad weird and unexplained that the performance of various baselines do not seem to follow a set 
patterntrend across the carla and kitti datasets in the monolayout paper i notice that changing the dataset from kitti to argoverse does change absolute miou scores a bit but preserves the ranking of various baselines ie monolayout oft monooccupancy on both kitti and argoverse in the current submission the trends seem to be changing across the two datasets cf tables 1 2 yet another set of baselines that seem to underperform here are the pseudolidar variants in the monolayout paper cf supplementary material table 5 pseudolidar is evaluated on the kitti dataset and the reported miou for vehicles is 59 whereas in this paper the best performance on this class achieved by a pseudolidar model is 4564 further the monolayout papers version of the stereo pseudolidar baseline seems to perform quite competetively miou 590 to sbevnet ensemble miou 6017 for car cf table 2 this seems to indicate that welltuned baselines could perhaps achieve better performance in appendix a2 the authors seem to indicate that they used a very different process to train monolayout ie using random images from the train set as opposed to using openstreetmap andor adversarial training i suspect this might have resulted in a performance gap i feel that oft d could be cited and used as a baseline particularly to measure layout estimation accuracy for the car class qualitative results unfortunately there seems to be a dearth of qualitative result figures to get a better sense of the approach in particular monolayout and monooccupancy seem to obtain crisp reconstructions of cars cf monolayout paper while in figure 2 cars are splayed throughout the image in the sbevnet results this is also surprising in my opinion these results do not adequately substantiate the impressive reported miou missing map metric other papers such as monolayout and oft seem to report the map mean average precision metric in addition to the miou metric because map often turns out to be a more accurate estimate of prediction performance due to integrating over various recall values in practice this leads to lessthanperfect predictions being scored well and this could explain the splayedout results in fig 2 scoring a high miou evaluating map would be a stricter criteria and will allow an additional point of comparison with prior art minor remarks the following remarks have had no impact on my assessment of the paper and as such i dont expect the authors to respond to these concurrent approaches such as f can be cited and discussed the paper could be structured better for instance input image sizes and baselines could be moved over to the main paper rather than being listed in the appendix references a mani kaustubh et al monolayout amodal scene layout from a single image the ieee winter conference on applications of computer vision 2020 b lu chenyang marinus jacobus gerardus van de molengraft and gijs dubbelman monocular semantic occupancy grid mapping with convolutional variational encoderdecoder networks ieee robotics and automation letters 42 2019 445452 c wang yan et al pseudolidar from visual depth estimation bridging the gap in 3d object detection for autonomous driving proceedings of the ieee conference on computer vision and pattern recognition 2019 d roddick thomas alex kendall and roberto cipolla orthographic feature transform for monocular 3d object detection arxiv preprint arxiv181108188 2018 e a parametric topview representation of complex road scenes cvpr 2019 f lift splat shoot encoding images from arbitrary camera rigs by implicitly 
unprojecting to 3d eccv 2020docsepthe paper proposed to estimate the semantic layout in the bird eyes view from a pair of stereo images the main noveltycontribution lies in how to organize and exploit the information from the stereo images the proposed framework builds upon inverse perspective mapping and projected stereo feature volume the performance was evaluated on the kitti and carla datasets given a pair of stereo images there are various options to exploit the image information where this paper provides a framework by exploiting the stereo information in the bird eyes view a principled question is what is the real superiority of estimating the layout in the bird eyes view from the application view the semantic estimation from the cameraview already provide much information which the stereo images could further improve the performance from the applicationss perspective i would like to see discussions and experiments in showing the superiority in using the bird eyes representation in the ablation studies the paper already provide different variants of the network architecture in exploiting the stereo image information i believe there are multitask learning based framework form this task where the semantic layout estimation and stereo estimation are jointly estimated and optimized whether that pipeline will provide extra benefit in section 344 the paper claimed that we pass the concatenated stereo bev feature map and ipm bev feature map to a unet ronneberger et al 2015 network to generate the semantic map c however the loss evaluation applies to the ipm features and stereo features separately namely mathcalciipm and mathcalcistereo if two estimations are made as the network output which one will be used for performance evaluation the other following question is if two separate estimations are made as the network outputs and compared with the ground truth for loss evaluation whether a consistency loss between these two estimations will further constrain the network learning the paper conducted experiments on the kitti and carla dataset it is well understood that the cityscape dataset has been widely in evaluating semantic segmentation where the stereo images are available i would to see more evaluation on these realimage dataset rather than synthetic dataset such as the carla dataset the paper title and abstract should highlight semantic and bird eyes view as the paper proposed to learn the semantic layout in the bird eyes view the current title did reflect these properties all in all taking all the above comments into consideration i would like to hear from the authors response which could lead to updated rating in either directionsdocsepinteresting problem but the paper can be improved this work aims to directly estimate the world layout in front of the vehicle from a pair of stereo cameras it is based on cost volume but it does not explicitly predict the depth values of each pixel instead it warps the cost volume features to the bird eye view bev and do semantic segmentation from bev using unet i think the problem is interesting and i believe it has never been mentioned andor addressed before moreover i think the motivation is also valid as the bev semantic segmentation from camera sensors can be one important perception input for navigation and planning i like the idea of skipping the explicit 3d reconstruction and directly shoot for the final goal i believe we usually get better performance when we directly minimize the loss we want to minimize moreover it could potentially 
introduce some inspiration to other works eg the direct extension 3d pointvolume semantic segmentation though i like this work i also has several concerns 1 is ipm feature really important i only see it is effective on the synthetic dataset carla but not on kitti what is the possible reason my guess is that the ground estimation is very bad for the realworld data i am also curious what is the performance if only ipm feature is used 2 in introduction this paper claims estimating accurate depth is not sufficient due to occlusion however i dont see how this work could handle occlusion instead the occluded part is masked out during training please explain this statement 3 what is the range of the layout estimation from carla it is 39m and from figure 5 and figure 6 it is 35m if it is the case the short range of the estimation makes it hard to act as a major component in the perception system the best use case is for short range detection and system redundancy but actually i can imagine that it would not get very good result in long range as there is always a tradeoff between baseline for accuracy and camera overlap for coverage in stereo estimation 4 what is the image resolution for the inference time test it seems quite slow if the resolution is 512x288 for carla or 640x256 for kitti 5 for the experiment i think it is better to report mean pm std with multiple trainings as there are training noise other suggestions and clarifications 1 when ipm is first introduced in page 2 it is better to explain it in a short sentence the current version is not clear and there are typos 2 i believe this work is based on binocular stereo pairs correct me if i am wrong so please explicitly say that in the paper also using leftright image instead of referencetarget image is less misleading 3 for disparity feature volume it is better to use the prevalent name cost volume it is called cost volume in the introduction but later called disparity feature volume i think it is better to be consistent 4 it is unclear how ipm feature are obtained from predetermined parameters or ground estimation i think predetermined parameters will not work very well because ground is not always a perfect plane 5 it is unclear what is the ensemble method used here if it just takes the best of several models i will not be convinced after rebuttal i still think this work has a interesting task setup though it indeed has many faults after reading the responses and other reviewers 1 it seems that ipm is not really useful in practice 2 it is also not sufficient to large occlusion and thus there is no explanation for its advatange over estimating accurate depth 3 range is short and latency is high 4 after reading reviewer1s comments i think it could use the same experimental setting as the existing methods for a fair comparison the other methods might be not properly trained with the new setting 5 it is still not clear how to emsemble several models with different trained weights in this work thus i am changin my rating to 6 and i will not fight for this workdocsepthe paper proposes an endtoend network for layout estimation from stereo images the approach is built off previous stereo matching networks which built and process a 3d disparity volume the stereo estimate is used to project image features into a birdseyeview representation which is processed using a unet which predicts a semantic scene layout the approach is evaluated on the kitti and carla generated datasets strengths this is the first work to attempt semantic layout estimation 
from stereo images the approach is geometrically grounded and can properly leverage stereo information to improve layout estimation the approach performs well on the two datasets evaluated since this paper is focused on a new problem there are not existing works to directly compare to however the paper provides reasonable baselines by modifying existing networks for this task avoids the need for an intermediate representation ie point cloud by directly mapping features from the disparity volume into birdseyeview coordinates plots in the appendix are interesting weaknesses while the task itself is new closely related forms of the problem have been studied for example 3d object detection from monocularstereo and monocular layout estimation it would have been helpful to see results on the closely related task of 3d object detection to better compare against prior works the ipm module appears to be very sensitive to the accuracy of the ground plane in the synthetic carla dataset where a ground plane can be accurately computed there seems to be a large advantage of using the ipm module on realworld data like kitti the use of the ipm module gives very limited improvement in performance of the stereoonly baseline the task is closely related to 3d object detection which has been using similar components the core components of the approach have been used in various forms in prior work the paper orthographic feature transform for monocular 3d object detection roddick 2019 uses a very similar method to project image features into a birdseyeview representation
### Summary: | this paper addresses the problem of estimating a birdseyedview overhead semantic layout estimate of a scene given an input pair of stereo images of the scene the authors present an endtoend trainable deep network that fuses features derived from the stereo images and projects these features into an overhead coordinate frame which is passed through a unet style model to generate the final top view semantic segmentation map the model is trained in a fully supervised manner experiments are performed on the carla and kitti datasets while r2 was positive they still had some concerns after reading the rebuttal and the other reviews specifically they were not convinced about the value of the ipm module this concern was also shared by r4 especially in light of the relationship to roddick et al bmvc 2019 r1 had concerns about the experiments specifically the quantitative comparisons to monolayout the authors addressed these comments but it is still not clear if the differences can be attributed to the number of classes how they are weighted or the training split used r3 had questions about the utility of bev predictions in general however as stated by r2 there is a lot of value in approaching the problem in this way in conclusion while there were some positive comments from the reviewers there were also several significant concerns with no reviewer willing to champion the paper there is not enough support to justify accepting the paper in its current form | [
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper considers the problem of dp isotonic regression for general loss functions an inefficient algorithm is proposed to achieve a utility with a logarithmic dependency on the alphabet size for pure dp an efficient algorithm is provided for ell1 and ell2 loss functions strengths 1 this paper considers the problem of dp isotonic regression for general loss functions an inefficient algorithm is proposed to achieve a utility with a logarithmic dependency on the alphabet size for pure dp the result is tight if the paper considers the minimax setting i think this is a very good result for an initial work in this area 2 an efficient algorithm is provided for ell1 and ell2 loss functions weakness 1 my most complaints are on the presentation first the paper does not distinguish between the minimax setting and the instance based setting specifically i was quite confused when i first looked at the discussion about the tightness if the paper explicitly defined the two concepts it could easily say the algorithm is minimax optimal but not instance optimal furthermore it would be much better if the authors could define the isotonic regression and the width in the introduction making it easier for the readers to evaluate the results 1 it remains interesting to see the effects of different loss functions both to the sample complexity and computation complexity minor 1 no hyperlinks for theorems na docsepthis paper considers the problem of private isotonic regression in which given a dataset d consisting of n samples the goal is to output a monotone function f that minimizes the empirical risk lfd1nsumi ellfxiyi the authors give a pure dp algorithm for the most general version of the problem which considers a poset x and a lipschitz loss function ell and they obtain an expected excess empirical risk of widthxlogxn for a fully ordered set this algorithm is efficient and the idea is to privately choose a maximal point alpha via the exponential mechanism and then recursively obtain the final function f by gluing together the functions obtained from recursing on the two partitions of m created via alpha the authors note that a simple implementation of assigning the unnormalized empirical risk as the score function results in a large error loss so instead they use a clipped version of the loss function as the score function resulting in reasonably low sensitivity the more general dp algorithm has a similar flavor except now one has to privately choose multiple maximal points alpha which leads to a less efficient algorithm due to multiple calls to the exponential mechanism the authors also obtain a nearmatching lower bound of widthxlogxn they achieve this by reducing to a known dp lower bound for dp algorithms that output a binary vector that is close to the input they also show that while there is a gap between the demonstrated upper and lower bounds there are posets that tightly realize each bound originality the contributions are original and would be of much interest to the overall machine learning community it would be good to discuss existing dp algorithms on the closely related topic of simple linear regression in the related work section see daniel alabi audra mcmillan jayshree sarathy adam d smith salil p vadhan differentially private simple linear regression proc priv enhancing technol 20222 184204 2022 quality the methods used are standard techniques in dp such as exponential mechanism composition for the upper bounds and reductions from known dp problems for the lower bounds i am fairly 
confident that the work is sound although i have not checked every single detail clarity the paper is mostly wellwritten just for completeness it would be good to explicitly state the running time of the dp algorithm for the general posets significance isotonic regression is an important primitive in the machine learning toolbox this work advances the state of the art on dp machine learning algorithms by adding this problem to the dp machine learning toolbox the algorithms and proofs presented are relatively straightforward and easy to follow and the authors also leave a set of intriguing open questions which may lead to further understanding of the complexity of machine learning tasks under dp constraints see in comments above no potential for negative societal impact docsepthe paper studies the problem of differentially private isotonic regression it first introduces an algorithm and its excess risk it then studies a lower bound for solving this problem privately and shows that the gap between the two bounds is tight in the sense that for each bound there exist posets for which each bound is tight and thus the gap cannot be closed the algorithm runs in near linear time for totally ordered sets with ell1 and ell22 losses privacy is guaranteed by relying on the exponential mechanism to iteratively select threshold functions on smaller partitions of the domain then it applies standard composition to allocate the privacy budget across iterations strengths the specific problem of isotonic regression has not been studied before in the privacy literature the paper provides a clear characterization of the problem introducing upper and lower bounds and assumptions that allow for improvement or tightness of the results the paper is clearly written and well organized weaknesses the paper could provide more tangible intuition on the results for example in what settings this would be a meaningful practical algorithm and in what settings it is still a first attempt that needs improvements to be applied either a discussion or small synthetic experiments could help understand these results especially given that there is no previous work on the area this would give an intuition on the price of privacy the ease of tuning clipping etc widthx can be an extremely large quantity and make this algorithm impractical docsepthis paper is the first to deal with dp isotonic regression where the domain is some partially ordered set x and the goal is to find a monotonic f: x -> [0,1] that minimizes a certain empirical loss the paper first discusses the totallyordered set x case and then the partially ordered set case by implementing a generalization of the totallyordered algorithm this is the first paper to tackle this problem strengths first to deal with this problem poses upper and lower bounds that depend on both logx and widthx the maxlength of an antichain in x weakness upper and lower bounds dont match yet fully that is of course expected from a first paper i think this is an interesting paper that is likely to instigate followup works on this version of erm and many other variants of constrained erms a clear accept first paper to tackle a new problem and as such the painting isnt complete yet
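since the reviews above all hinge on the exponential mechanism as the private selection primitive, a minimal generic sketch of that primitive is given below; the scores and sensitivity here are placeholders rather than the clipped empirical-risk score function defined in the paper:

```python
import numpy as np

def exponential_mechanism(scores, epsilon, sensitivity, rng=None):
    """Privately select an index with probability proportional to
    exp(epsilon * score / (2 * sensitivity))."""
    rng = np.random.default_rng() if rng is None else rng
    logits = epsilon * np.asarray(scores, dtype=float) / (2.0 * sensitivity)
    logits -= logits.max()                 # shift for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

in the algorithm described above this selection is applied recursively: a split point is chosen privately, the domain is partitioned around it, the two parts are solved with the remaining budget, and standard composition accounts for the total privacy cost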
### Summary:
most reviewers found the paper well written with no serious doubts regarding the correctness we hope authors incorporate the comments from the reviewers in their final revision to improve the presentation
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the problem of neural network compression using analysis in the highdimensional ntk regime their main results show that under this regime the spectral properties of both ntk and ck matrices are independent of the distribution of the weights up to normalization and centering instead they depend on a number of parameters that define the activation functions at each layer this finding informs a new compression technique where a new compressed network can match the activation parameters at each layer to enjoy the same spectral properties as the original net this ntklc technique is evaluated on synthetic data by qualitatively comparing the distribution of eigenvalues and on real data by comparing test accuracy with naive baselines strengths the paper has a clear motivation in the field of neural network compression a relevant problem that is lacking theory it is clearly written with thorough theoretical results and experiments on both synthetic and realworld data weaknesses 1 the claim in line 49 seems to be a central theme of the paper but has no followup discussion on its meaning and implications 2 theoretical claims are presented in the asymptotic regime of infinite n and p assumption 1 3 a particular gmm distribution is chosen for the input data of the studied model without justification for why it is the relevant distribution to be analyzing 4 the results in figure 2 rely on a qualitative measure of closeness to evaluate the method instead of a metric that can be quantified and compared many of the markings in the top histograms are barely noticeable and require magnification to be seen 5 the results in figure 3 are compared with naive baselines instead of competitive state of the art methods the authors have addressed the limitations of using ntk theory to explain the behavior of modern neural networks docsepthe authors showed asymptotic eigenspectral equivalence conditions for fullyconnected ntk given gmm data and certain assumptions based thereon they proposed a net compression scheme with sparse and lowprecision random weights and demonstrated with examples results linking ntk and random matrix theory with dnn compression is of timely interest to the field though i cannot say i followed all proofs the main ideas and motivations are well presented a lack of comprehensive experimental comparison with baseline approaches is limiting the practical significance of the findings see above docsepthis paper characterizes the asymptotic spectral equivalence between ntks of dense and quantized networks it shows that under certain assumptions of data highdim gaussian mixture data and network architectures wide mlps quantized networks have the same ntk eigenspectra of unquantized ones this finding allows the authors to perform model quantization with little performance degradation the paper is very well written the authors crafted their paper with immense care and taste for mathematical detail the main results of the paper theorem 1 and theorem 2 are novel and subsume previous studies 2 32 as special cases overall i think this is a highquality paper one weakness of this paper is in its numerical evaluation as i detailed below the baselines used for the model pruning randomly removing weights seem to be too brutal and too weak it is beneficial to incorporate more realistic baselines such as magnitudebased pruning one limitation of the paper as mentioned in my questions above is its lack of natural baselines for model pruning in the experiment sections i encourage the authors to 
consider incorporating them
### Summary: | in the paper the authors provide theorems that establish that for gmm input data the ntk matrices of dense and quantized dnns have the same eigenspectra in the asymptotic limit of high input data dimension and sample size these results motivate network compression algorithms which demonstrate good empirical performance even outside the regime for which the proofs are established the theorems provide a novel extension that contains previous studies as special cases the baseline comparisons included in the paper are somewhat limited in nature and the authors should reevaluate their choice to use the word lossless with quotes and instead use a more accurate term that does not require quotes | [
30003, 310, 1677, 2278, 273, … ]  (input_ids: token-ID sequence, truncated)
[ 1, 1, 1, … ]  (attention_mask: all ones, same length as input_ids)
[ 30003, 310, 1677, 2278, 273, … ]  (labels: token-ID sequence, truncated)
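The three numeric columns above (input_ids, attention_mask, labels) are simply the tokenized form of the Input/Output text in each row. A minimal sketch of how such columns are typically produced is shown below; the tokenizer name, concatenation scheme, and maximum length are assumptions for illustration, not details taken from this dataset.

```python
# Minimal sketch (assumed, not this dataset's actual build script) of how the
# input_ids / attention_mask / labels columns can be derived from the text fields.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # hypothetical tokenizer choice

def encode_row(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    # Concatenate the prompt (Input) and the target summary (Output) for causal LM training.
    full_text = input_text + " " + output_text
    enc = tokenizer(full_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],            # token-ID sequence
        "attention_mask": enc["attention_mask"],  # 1 for every kept token
        "labels": list(enc["input_ids"]),         # labels mirror input_ids for LM loss
    }
```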
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the authors have proposed a new approach to determine the optimized subset of weights instead of simply conduct full weights updating in order to better update the weights they measure each weights contribution to the analytical upper bound on the loss reduction from two sides global and locally after evaluation a weight will be updated only if it has a large contribution to the loss reduction given the newly collected data samples the experimental results show that their method can achieve a high inference accuracy while updating a rather small number of weights strength the idea is easy to follow and seems applicable to be adopted paper is well structured and written in general weakness 1 lack of explanations 1 from reward measurement side motivation side in the introduction the authors did not explain why they pick the loss as the weight measurement criteria instead of others eg accuracy while they report the accuracy in the evaluation part as one evaluation results 2 from the update algorithm side the paper did mention their weights updating method is determined via both global and local contributions and they talked in 31 it turns out experimentally that a simple sum of both contributions leads to sufficiently good and robust final results however it is not convincing that those two facts can have the equal impacts on the final prediction 3 from the updating setting side it seems that the defined updating ratio is one important factor as discussed in section2 not enough contents are provided in the paper to describe how to calculate this ratio 4 reinitialize mechanism reinitialize is also another important factor in the weight updating as discussed in section 32 trained from the last round for a long sequence of rounds thus we propose to reinitialize the weights after a certain number of rounds however the computation of how many rounds the network needs to be reinitialized seems not plausible 2 evaluation 1 lack of comparison it would be good if authors can apply their method on some recent works or models which can also show others how flexible their method can be adopted or applied 2 there is no contents in the paper showing how authors decide their experiment settings for example why authors always select k weight changing ratio as very small 001 005 01 02 instead of 05 3 in fig2 it is curious why authors apply different settings on different datasets when comparing their methods 4 for section 42 it would be good if the authors can also try other initialization ways for example using the average weights in each round window instead of directly using the latest round weights 5 in table 1 it seems full updating still can beat the combined method however in fig2 authors did not explain why dpu has better performance than other settings even compare with the full update 6 in fig3 while dpu with reinit can achieve best performance than others there is no explain about why it did not perform well in the first few rounds 7 the authors did not mentioned how many runs which they have conduct their experiments to provide the results 3 some parts need to be further improved for example 1 fig3 it would be good if authors can add some texts information for 1000 5000 2 section3 is a little bit hard to follow need to be reorganized 3 related work can be further improved to better cover most recent worksdocsepsummary the paper proposes a weightwise partial updating paradigm which adaptively selects a subset of weights to update at each training iteration while achieving comparable 
performance to full training experimental results demonstrate the effectiveness of the proposed partial updating method strengths 1 the paper is well written 2 the process of upperbounding the loss difference is clear 3 experiments are conducted on various datasets with various net structures to support the proposed method weakness my major concern is about novelty and contribution although the paper show some application scenarios of partial updating i still think that pruning would be more proper furthermore the metric of global contribution and local contribution is quite like choosing two similar weight norms to select topk weight which is very similar to pruning tasks so i suggest rejecting this paper the authors rebuttal and the revised version have not fully addressed my concerns it is not surprise that partial updating outperforms pruning by a large margin as the inference of small updating still uses the whole weights of the network comparing to pruning the technical contribution of this work is limited so i would like to keep my original rating docsepsummary this paper presents a method to reduce the bandwidth required to update dnn models on edge devices the key insight is that model updates typically incorporate new data training samples and that after doing so a minority of weights capture the majority of change due to retraining the authors propose a method by which to identify this weight subset and compare the relative size and test accuracy of that update to that of other solutions such as sending the entire network or sending a random subset of the weights on each retraining round experiments with a number of existing data sets and models illustrate that the approach reduces update size more than 77 while maintaining reasonable test accuracy pros paper is sufficiently motivated as edge devices use more and larger models and as their count increases the relevance of partial update techniques to accomodate model decay will remain the proposed technique provides a sister technique to pruning not identifying nodes with greatest weights to retain but identifying weights with the greatest changes to retain the policy is informed by choosing weights that minimize the difference in loss between the fully retrained network and its partially updated version the paper is rounded out by practical items such as encoding weight ids a policy to determine when to retrain the network from scratch reinitialization and avoiding sending updates when validation error does not significantly change the evaluation looks at a variety of angles the ratio of initial training set to update size different data sets and architectures and compares to a random partial update strategy as well as a simplified version of their approach gcpu cons the overall presentation is difficult to parse the technique owes much to pruning methods and methodologies the technical approach choosing weights iterative rewinding follows from recent work on pruning it would be great to have that discussion in related work moving it out of section 31 and section 4 ultimately existing pruning techniques can reduce networks by 90 by amdahls law this implies that these techniques reduce communication by 710 not 7099 equally important does the technique work well on pruned networks unimportant updates may not be as available in such networks on the other hand if you do the comparison and all updates are important then over the course of the lifetime of the installed nn using dpu instead of pruning would be the winner experiments 
in key graphs arent clear is there reinitialization in figure 2 figure 3 performance never falls relative to full updating during reinitialization while the text s32 makes it seem that the nodes reset all weights using only 1 of the weights would impact test accuracy relative to full updating suggestions questions overall i found the work interesting useful and complete aside from eval sec questions above it would be useful to introduce a metric that combines update size with accuracy loss at the beginning of the paper the evaluation does this but consider pulling it forward and defining it explicitly each round incurs a communication cost in bytes and experiences some accuracy so for example one can capture changes in accuracy per byte ie model improvement by update size since you are comparing to other techniques that can reduce the bandwidth similarly we want to optimize this ratio some networks work very well with small k but how low can you go ie how does one choose k perhaps the accuracybytes metric could be informative it would be interesting to discuss on why winning lottery ticket theory gets us 8090 reductions but this technique admits 99 reductions by retaining information in the rest of the network the startup procedure is not clear the graphs and discussion in s32 make it sound like the entire network reinitializes can we be clear about what the first network looks like id assume all the weights but if we start from random values sending the seed the first round only updates ki weights can the test performance of the network with only 1 of its weights be 65 figure 2 similarly if dpu is reinitializing why is the test accuracy monotically increasing the installed network would go back to ground zero clearly im missing something or your measuring the performance of the network at the server wf and not the installed network wr similarly table 2 should have a column for the number of rounds that required no communication dpu wont send updates if validation error doesnt decrease significantly it isnt clear whether you gave the same benefit to full updating writing terminology notation overall the presentation is difficult to get through for instance section 3 has many awkward constructions it seems like theres a simple picture here similar to the train prune retrain flow of pruning work it seems deeply analogous with the exception that rewinding replaces pruning the evaluation section refers to alg 1 and alg 2 but section 30 refers only to step 1 and step 2 are there better words than step this section also refers to the second step as an optimization step you end p1 by saying youre optimizing eq 1 in step 2 then you say step 1 optimizes the same equation the last sentence of the s30p3 reiterates what was said in p1 im sorry but its a bit of a slog the use of notation is consistent there are a couple of things that felt like speed bumps i kept wanting to parse delta w and delta d using delta as a variable like ki at the end of section 2 introducing a new form of wr as w was confusing do we need w sometimes you use l0 norms s2 eq 2 and other times you use summation s31 you use curly braces s4p1 for the sizes of the initial training and updates it looks like a set not a configuration the text says the two sizes represent the available data samples but here its just a configuration its not the set of samples at all and it wouldnt be bc not all updates r are present nits please learn the difference between that and which remove in order to capitalize figure and section some references 
use name style other use indices but the bibliography is all by name confused docsepthis paper proposes a deep partial updating paradigm that can reduces the computational and communication cost on the edge devices by only updating most important weights in each round instead of a full update it also proposes metrics to select weights by global and local contributions and the experiment results show the efficacy of the proposed methods in summary the method proposed in this paper looks practical and easy to implement but the theoretical justification needs further clarification im not sure about the significance of this paper as im not an expert in this area so i prefer to leave this to other reviewers to decide in general the paper is well written and easy to follow and the motivation is sound however the justification of the global and local contributions need to be clarified further the inequality of eq3 can hold only if f is lsmooth and convex which indicates the loss function is assumed lsmooth and convex so whats the justification of the definition of global and local contributions when the loss is nonconvex which is the most common case in the experiments without the theoretical justification the global contribution that selects weights with largest values is basically as the same as pruning and the local contribution basically measures the changed loss caused by the update of a weight although they may be still practical but the novelty is limited the experiment results show that the proposed method can obtain similar performance with the full updating but costs much less communication overhead it seems a very practical method in this area and the paper provides an interesting empirical study the simple combination of global and local contributions outperforms each individual contribution im wondering if authors have tried more other ways to combine them and why this way is better one minor comment regarding the structure of the paper as the initialization strategy plays an role in this method it would be better to put the experimental results of comparing different initializations to the formal content and the appendix can be put after the bibliography in one file feedback to the authors response as the authors have addressed some of my main concerns and provided nice extra experimental results i will raise my score to 6
### Summary: | the paper proposes an approach to selectively update the weights of neural networks in federated learning this is an interesting and important problem as several reviewers pointed out this is highly related to pruning although with a different objective it is an interesting paper but is a marginal case in the end due to the weakness on presentation and evaluation | [
7171, 253, 5678, 1332, 2299, … ]  (input_ids: token-ID sequence, truncated)
[ 1, 1, 1, … ]  (attention_mask: all ones, same length as input_ids)
[ 7171, 253, 5678, 1332, 2299, … ]  (labels: token-ID sequence, truncated)
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper uses cyclegan to map neuronal activities of mice as measured by calcium traces pre and postlearning the main contributions are 1 empirical results of using cyclegan to learn the pre and postlearning mapping look promising 2 using both attention mask which is for gating residual concatenations and gradcam to help with interpretation 3 sorting neurons with an autoencoders reconstruction error essentially sorting them based on their importance strength extensive ablation studies although most are not in the main text but in the appendix using an autoencoder to sort neurons without bias seem to work better than sorting them by firing rate using paired synthetic data to show the effectiveness of applying cyclegan and for real data cycled reconstruction and other distribution metrics are promising interpretations from both attention masks and gradcam comply well with experiment settings weakness figures are not explained clearly what are 689 in the right columns of attention mask figures like 3 4 5 the writing is not very effective for example the second to last paragraph in section 24 can be simplified to a much shorter one with the addition of an equation describing it an equation paired with the module figure will also be easier to understand than this long paragraph question suggestion is the model shared among all mice or does each have its individual model namely is this learned mapping more universal or more individual formulation wise selfattention can also be applied among different neurons just like in graph attention networks and this might eliminate the need for presorting essentially disentangling spatial and temporal information modeling the spatial relationship as a graph with a learned adjacency matrix in this way neurons will be permutation invariantequivariant and the sorting is not needed and the whole model can be learned end to end and lastly its not a 9page paper at all there are too many places in the experiment section referring to appendix sections that require one to look into them for getting a full context the same is true for the model architecture part i feel im forced to read a 32page paper empirically applying cyclegan to reveal the mapping of pre and poslearning neuronal activities shows good results although the architecture or method novelty is not significant and there is some unclarity of the writing it can be a good starting point for further explorations docsepthis paper presents a new method for learning the transformation in neural population activity that takes place during task learning the method is based on cyclegan but includes additional modifications related to neural data and the manner in which it is collected the paper also presents visual interrogations of the learned model to better understand details of the learning process strengths the paper proposes a novel and creative analysis method for understanding learningrelated transformations in neural activity i am not aware of work like this in the neuroscience literature inclusion of selfattention is a nice way get a better handle on these potentially complex transformations is there more that one can say about the masks extracted in fig 4 the gradcam localization maps are also very useful for understanding what the model learns in particular the positional attention maps in fig 5 are really cool it would be neat to investigate these neuronspositions in more detail especially if you could show this model uncovered some aspect of the data that would have been difficult to find with 
traditional methods concerns intro in other words given the neural recordings of a novice animal can we translate the neuronal activities that correspond to the animal with expertlevel performance and vice versa this is a super interesting question and the answer this paper appears to give is yes however im left scratching my head about what exactly one can learn from this translation im not suggesting a new analysis though ultimately the usefulness of this method will depend on whether or not it can uncover new insightsguide new experiments but a more thorough discussion on how these translations can be used would make this a more compelling introduction 22 i found this section hard to follow perhaps moving fig b1 or something similar to the main text would help with this even more useful but less general would be an explanation of the cyclegan in the context of the neural data for example let x and y be the neural activity before and after learning respectively the ganbased framework consists of a generator g xy that maps novice neural activity to expert neural activity and a second generator f yx that maps expert neural activity to novice neural activity this would make it easier for me at least to have an intuitive understanding of what the different losses correspond to 23 again describing maex fx etc would be a lot clearer in the context of the neural activity 231 why does this ordering process work i understand that in order to use 2d convolutions there must be some nonrandom ordering to the cells but its a bit bizarre to me that reconstrutcion quality from an autoencoder would be meaningful in this way is it possible to motivate this choice better another useful baseline would be to just use 1d convolutions in time and remove the spatial structure of course this means you can no longer use architectures out of the box but also removes a poorly understood aspect of the preprocessing 232 what is the motivation for this spatiotemporal transformation while useful to show that the model can handle this it seems fundamentally different from the types of transformations present in the neural data 24 there are a lot of details here that are important to document but distract from the main point of the paper perhaps move most of this to supplemental and use the extra space for a modelprocess diagram 31 the raw mae numbers are difficult to interpret as presented in the text maybe one part of table e1 could be moved to the main text also i think presenting figure 2 first is a faster way to get an intuition for what the model is doing and how well it is working minor typo second paragraph in introduction prince et al exteneded the framework to work with table a2 day 4 day 1 rewards 23 where do the splits 3000 200 200 come from is this the total number of segments how did you arrive at this number is this related to the stride of the sliding window seems this would result in highly correlated data samples is that an issue 32 would be nice to see parts of fig f1 in the main text maybe show a single neuron and present more in the supplemental the paper addresses an interesting neuroscience problem from a unique perspective however i am still unclear exactly what the method is learning and how it can be used to gain additional insights into the data my initial assessment is to not accept this paper docsepthe paper proposes to learn a mapping for neural activity in the mouse visual cortex the mapping is from neural activity before learning to neural activity after learning and this is achieved 
using cyclegan the paper also performs additional analysis to interpret the weights learned by the generator and discriminator networks as well as assess the quality of the networks reconstruction of the neural activity the approach mapping prelearning neural activity to postlearning activity using gans is interesting as is the methodology to sort neurons based on an autoencoder reconstruction loss it was also good to see the paper systematically exploring how to choose a loss function for training the gans a nontrivial issue for gan training in general and not always addressed in gan papers however there are some major concerns about the paper 1 the overall motivation of why one would want to map prelearning to postlearning neural activity was not clear although the introduction and discussion briefly discuss interpretability for neural learning it was not clear how this analysis contributes to that 2 using gradcam maps to visualise regions of interest for the discriminator was interesting but this analysis does not seem to lead to any deeper understanding about the neural activity while the discriminators learn that the activity around the reward regions are critical to distinguishing pre and postlearning activity figure 5 it is not clear why the generators dont do this moreover it is possible that this effect would vanish or other regions of interest might appear if the networks were conditioned on information about the stimulus or trials it is also not clear how this analysis is generally applicable or what insights can be gained if it were applied to neural data from a different task 3 reordering the neurons in the data using an autoencoder reconstruction loss appears to be a critical preprocessing step in the training pipeline however the choice of an autoencoder over other approaches to reordering are not clearly motivated although this does lead to better reconstruction of the neural data from the gans it appears to make the subsequent step with the cyclegan redundant if you can accurately reconstruct neural data from the autoencoder then why train an additional set of adversarial networks to do the same thing again 4 it would be nice to have an estimate of the compute resources and time required to train all the networks in the pipeline autoencoder cyclegan and perform the posthoc analysis with gradcam etc in the light of doubts about motivation and benefits of the method it would also be relevant to know how computationally expensive it is to implement 5 there was an overall lack of clarity and particularly in the methods and results section all equations are inline and not numbered and therefore references to terms in the equation are hard to keep in mind while trying to understand section 22 and 23 extensive details about architecture and training frequently detract from understanding what steps the training pipeline consist of perhaps these should be moved to the appendix there was no explanation of how to interpret the gradcam maps in figures 4 and 5 or what the colours numbers mean and the explanation of their overlap with reward regions appeared to be handwaving it would be nice to see the pre and postlearning neural activity for precisely those neurons that the discriminator assigns attention to in figure 4 the description of the results was confusing and many of the plots that the conclusions here rely on are in the supplementary section making it hard for a reader to follow any reasoning based on these plots minor comments 1 heatmaps in figures 2 and 4 are hard to 
interpret without colourbars 2 neural activity in figure 1 is barely visible due to the colour scheme 3 the explanation for how the plots in figure 5 are generated appears only in the caption with no elucidation in the text 4 there does not appear to be information on recording time per trial in the main paper the motivation for the methods presented in the paper is not clear the conclusions are not very convincing and consequently it is hard to judge the contributions of the paper a lack of clarity in the methods and results section exacerbates this
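To make the generator/cycle formulation suggested in the reviews above concrete, here is a minimal, hedged Python sketch of a cycle-consistency loss for mapping pre-learning (novice) activity to post-learning (expert) activity and back. The function names, toy array shapes, and the use of an L1 penalty are illustrative assumptions only; this is not the paper's implementation.

```python
# Illustrative sketch only: G maps novice (pre-learning) activity toward expert-like
# activity, F maps expert (post-learning) activity back toward novice-like activity.
import numpy as np

def cycle_consistency_loss(G, F, x, y):
    """L1 cycle loss: F(G(x)) should recover x and G(F(y)) should recover y."""
    x_cycled = F(G(x))   # novice -> expert -> novice
    y_cycled = G(F(y))   # expert -> novice -> expert
    return np.mean(np.abs(x_cycled - x)) + np.mean(np.abs(y_cycled - y))

# Toy usage with identity "generators" standing in for trained networks
# (assumed layout: 8 time segments x 100 neurons of calcium activity).
x = np.random.randn(8, 100)  # pre-learning activity
y = np.random.randn(8, 100)  # post-learning activity
print(cycle_consistency_loss(lambda a: a, lambda a: a, x, y))  # 0.0 for identity maps
```

In a full CycleGAN setup this term would be added to the adversarial losses of the two discriminators; the sketch only spells out the cycle term that the reviewer asks to have explained in terms of the neural data.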
### Summary: | this paper received 2 marginally below and 1 marginally above ratings we discussed the paper with the reviewers and there was broad consensus that 1 the paper lacked clarity 2 multiple modeling choices were debatable eg ordering or embedding of neurons and convolution over neurons and not sufficiently justified and these choices will critically impact the conclusions drawn from the analysis 3 we were not convinced by the relevance of the synthetic data to reflect a meaningful biological process 4 we did not see any meaningful knowledge gained for biology from this whole analysis my recommendation is thus to reject this paper | [
[input_ids / attention_mask / labels: raw token-ID and mask sequences for this row omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper theoretically and empirically analyzes the link between connectivity patterns of deep networks and their convergence based on this analysis the authors propose a couple of trainingfree metrics effective width and effective depth and use them to do trainingfree nas the paper presents a thorough link between gradient descent convergence and structural properties of deep networks by proving a link between number of unique paths number of incoming paths at certain nodes in the network and the least eigenvalue of a neural network gaussian process nngp variance essentially the authors look at how the nngp variance changes as the data propagates through the model and relate it to how many unique paths it goes through and how many incoming edges land at the last layer in the dag topology used to formulate this problem these principles are then baked into their effective depth and effective width metrics that do not require any training or forwardbackward passes to compute some empirical results are shown for this convergence the remaining paper looks at trainingfree nas by cutting down the search space results are shown on cifar10100 imagenet16120 and imagenet the paper has following strengths 1 this is a principled approach to study an important problem ie understanding how deep network architecture topologies and connectivity patterns relate to training convergence knowing this is obviously very valuable because a we can design better models directly without a long timeconsuming search b in future better understanding of how architectures impact accuracy can enable us to create even more novel models indeed it is a hard problem so coming up with an intuitive method for this problem is very hard 2 the proof seems very intuitive and the resulting metrics effective depth and effective width are very simple to compute and do not require any training or even forwardbackward passes one can just look at the architecture connectivity and tell with reasonable confidence if it is a good or bad architecture 3 empirical results for training convergence and nasbench architectures etc present some interesting insights although they also bring up some questions see below 4 authors have shown experiments all the way to imagenet which is nice despite the strengths there are some weaknesses and questions addressing these would likely make the paper much stronger 1 the empirical validation of the theory section 51 is done using only three architecture topologies this is not that expensive particularly for mnist and cifar10 can the authors look at many different deep nets with different connectivity patterns and somehow present more data on convergence 2 experiments section could in general be stronger for instance the imagenet result on tenas is not good enough we see some accuracy improvement but we also see model size increase compared to vanilla tenas moreover since the trainingfree nas itself is very cheap further improvements in search time do not make a significant difference in figure 5 each point represents a subset of models that achieved a similar accuracy were there any interesting patterns in terms of their number of parametersmacs example did the whiteblack circles contain models of a certain size in terms of parameter counts were the models deeper but narrower within a circle or shallower but wider 3 the very first paper that pioneered the creation of such connectivitybased metrics was reference r1 cited by the authors this paper needs to be discussed in much more detail in section 2223 as 
there are many interesting synergies for instance r1 showed the link between connectivity patterns and training convergence too r1 showed very concrete training convergence curves and showed that their proposed metric correlated very well with convergence the metric proposed by r1 is also trainingfree indeed the present study is much more finegrained more theoretical and is more generally applicable than r1 4 other weaknesses involve things like there is no characterization of differences in kernel sizes etc but this is not critical and can be a good future work 5 proof for equation 14 in appendix b has a typo kiil should have a square on sigma right below proof of lemma 32 honestly this paper could be much stronger if there was a bit more focus on empirical results for justifying the theory eg if section 51 was much stronger r1 bhardwaj kartikeya guihong li and radu marculescu how does topology influence gradient propagation and model performance of deep networks with densenettype skip connections proceedings of the ieeecvf conference on computer vision and pattern recognition 2021 authors could comment on role of kernel sizes etc and how the connectivity alone does not take this into account in the present study docsepthe authors proposed a formal analysis framework to estimate the the upper bound of training loss convergence for dags with variety of network topologies based on that they proposed a plugandplay method to speech up previously reported nas methods by apply a filtration the results demonstrate the proposed method to search better networks and faster to my best knowledge this is perhaps the first theoretical study directly focusing on the finegrained nn topological connectivity rather than a general nn function despite some prior works exploring part of that eg 50 for nn widthdepth 69 for skip connection it is an important yet understudied direction and this work appears to lay a good foundation i have gone over the details of the derivation for the convergence analysis of dnn regarding the connectivity patterns and it seems sound for the bound estimation the authors chose a simplified theoretical model of dags mse loss using nngp kernel the key step is to estimate the flow of nngp variance and mean through unidirectional information paths in the specific topology the theoretical results are examined by a series of simulations in sec 51 and sec 52 the notions of effective depth and effective width are clearly defined and the authors also gave clear guidelines how to use them in nas it is shown to accelerate two latest nas methods and outperform more across multiple benchmarks the experiments validate the effectiveness of the proposed method in general i think this is a cool and wellpolished paper one question is the authors demonstrated their downselection technique to two trainingfree nas approaches naswot and tenas which are already very fast and therefore show only nonsignificant reduction of search time i wonder why not trying the proposed approach on more costly and accurate search methods and see if the accuracysearch efficiency gain is still favorable current discussion is sufficient docsepthis paper studied how a wide networks nngp kernel can depict the optimization dynamics of a particular dnn topology by propagating the nngp kernel spectrum and showing the topology to affect the bound of convergence rate based on this observation the authors created two notions of efficient depth and effective width that can be pluggedin existing nas methods to filter out 
unpromising connectivity patterns for speedup strength this is an important new piece of work towards connecting deep learning theory and nas prior arts already adapted theoretical properties of general dnns but never established their correlations with the concrete nn architecture topology except some empirical correlation observations this paper is the first to theoretically justify the optimization implication of general dnn topology which is likely to become a milestone for future work in this frontier throughout the paper the theory and application aspects are tightly coupled and the story is coherent their claim is theoretically sound a nonsurprising yet nice adaptation of nngp proofs in the experiments multiple benchmarks are reported with error bars section 53 is very helpful in understanding how the effective widthdepth principles are used to improve nas in a plugin fashion i especially like how the authors can choose d and m in a principled justified way not adhoc overall the writing is very good clear and easy to follow its a mature paper weakness nngp is a rough characterization of optimization dynamics hence its preciseness in comparing architectures is limited more importantly as mentioned on lines 280 although our d and m are only inspired from the optimization perspective not the complexity or generalization our method mainly filters out bad architectures at a coarse level but does not promote elites in a finegrained way i was not sure whether this optimizationonly selection bias will lead us to missing architectures that are excellent in complexitygeneralization hence offsetting their perhaps mediocre optimization behavior the authors applied their principles to accelerating tenas and naswot those are two earliest trainingfree nas approaches could the authors also try more recent ones and see improvements eg zennas or zerocost proxy nas moreover how about nontrainingfree nas methods can they be accelerated by such prefiltering too in tables 1 and 2 the flops of searched architectures could be included too no particular negative impact limitation was discussed by authors
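As a concrete illustration of the kind of training-free, connectivity-only quantity these reviews refer to (numbers of unique input-to-output paths and of incoming paths at a node in the architecture DAG), here is a minimal Python sketch. The graph, node names, and the specific statistic are illustrative assumptions, not the paper's exact effective-depth/effective-width definitions.

```python
# Count input-to-output paths in an architecture DAG by dynamic programming
# over a topological order; paths[v] is the number of distinct paths reaching v.
from collections import defaultdict, deque

def path_counts(edges, source):
    succ, indeg, nodes = defaultdict(list), defaultdict(int), set()
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
        nodes.update((u, v))
    queue = deque(n for n in nodes if indeg[n] == 0)
    topo = []
    while queue:
        n = queue.popleft()
        topo.append(n)
        for m in succ[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                queue.append(m)
    paths = {n: 0 for n in nodes}
    paths[source] = 1
    for n in topo:
        for m in succ[n]:
            paths[m] += paths[n]
    return paths

# Toy cell: the input feeds two parallel ops plus a skip connection to the output.
edges = [("in", "a"), ("in", "b"), ("a", "out"), ("b", "out"), ("in", "out")]
print(path_counts(edges, "in")["out"])  # 3 unique paths from "in" to "out"
```

Such counts can be read directly off an architecture without any training or forward/backward passes, which is the property the reviewers highlight.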
### Summary: | this paper studies the relationship between connectivity of a deep network and its convergence both theoretically and empirically the paper also studies simpler metrics such as effective depth and width to guide the architecture search overall this is an impressive theoretical paper supported by empirical evidence all three reviewers find the paper a valuable contribution to an important theoretical problem in deep learning after reading the rebuttals reviewer rabp recommended to accept this paper in its current form reviewer d7qw felt that all the concerns had been well addressed and increased the score by one reviewer 6d9f agreed with the authors response
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
28055,
285,
45190,
3537,
13505,
253,
3048,
875,
17769,
6127,
273,
3676,
6928,
285,
616,
14940,
1754,
327,
436,
1783,
253,
4477,
12661,
247,
4564,
273,
3733,
4924,
17082,
3576,
4871,
285,
3576,
6864,
285,
897,
731,
281,
513,
3733,
4924,
13332,
253,
2929,
10262,
247,
11080,
3048,
875,
11786,
18499,
14940,
285,
8350,
3607,
273,
3676,
6928,
407,
18597,
247,
3048,
875,
1180,
273,
4451,
11865,
1180,
273,
19363,
11865,
387,
2176,
7632,
275,
253,
2990,
285,
253,
1878,
25023,
273,
247,
11454,
2990,
305,
12064,
1232,
295,
1251,
81,
11041,
9093,
253,
4477,
1007,
387,
849,
253,
295,
1251,
81,
11041,
2544,
347,
253,
941,
8641,
684,
949,
253,
1566,
285,
14588,
352,
281,
849,
1142,
4451,
11865,
352,
4566,
949,
285,
849,
1142,
19363,
9297,
2659,
387,
253,
1390,
3828,
275,
253,
31398,
18080,
908,
281,
36803,
436,
1895,
841,
9241,
403,
840,
30363,
715,
616,
3576,
6864,
285,
3576,
4871,
17082,
326,
513,
417,
2430,
667,
3733,
390,
3579,
2135,
1034,
11999,
281,
11897,
690,
16774,
1543,
403,
2011,
323,
436,
14940,
253,
5780,
2929,
4453,
387,
3733,
4924,
13332,
407,
9968,
1066,
253,
3186,
2317,
1543,
403,
2011,
327,
260,
338,
274,
6903,
361,
4440,
257,
292,
1036,
8193,
285,
4440,
257,
292,
253,
2929,
556,
1563,
20544,
50276,
18,
436,
310,
247,
3505,
74,
6216,
2746,
281,
1263,
271,
1774,
1895,
26332,
4685,
849,
3676,
2990,
10336,
1755,
5970,
285,
17769,
6127,
14588,
281,
3733,
14940,
8958,
436,
310,
9090,
1077,
9865,
984,
247,
359,
476,
2216,
1805,
3210,
3587,
1293,
247,
1048,
673,
33136,
3186,
270,
275,
2852,
1805,
4685,
273,
849,
35615,
3486,
7200,
476,
8046,
441,
281,
2794,
1014,
625,
4460,
3210,
6296,
352,
310,
247,
1892,
1895,
594,
3551,
598,
342,
271,
27350,
1332,
323,
436,
1895,
310,
1077,
1892,
50276,
19,
253,
4737,
3133,
1077,
27350,
285,
253,
4795,
17082,
3576,
6864,
285,
3576,
4871,
403,
1077,
2969,
281,
11897,
285,
513,
417,
2430,
667,
3733,
390,
1014,
3579,
2135,
1034,
11999,
581,
476,
816,
1007,
387,
253,
10336,
17769,
285,
2028,
342,
5272,
7162,
604,
352,
310,
247,
1175,
390,
3076,
10336,
50276,
20,
16774,
1543,
323,
3733,
14940,
285,
13332,
31591,
35615,
3966,
1246,
690,
4722,
16039,
3738,
597,
671,
3324,
598,
690,
3533,
923,
2708,
50276,
21,
4477,
452,
2011,
4679,
512,
253,
1039,
281,
4440,
257,
292,
534,
310,
5322,
50276,
3229,
3784,
253,
20544,
627,
403,
690,
32213,
285,
3533,
15974,
841,
651,
2779,
1056,
253,
2929,
1199,
10046,
50276,
18,
253,
16774,
12820,
273,
253,
3762,
2593,
8319,
310,
2218,
970,
760,
1264,
10336,
1755,
5970,
436,
310,
417,
326,
8214,
3782,
323,
278,
79,
382,
285,
260,
338,
274,
740,
476,
253,
4477,
1007,
387,
1142,
1027,
3676,
37507,
342,
1027,
17769,
6127,
285,
10380,
1246,
625,
941,
327,
14940,
50276,
19,
4679,
2593,
812,
275,
2087,
320,
10046,
323,
4227,
253,
4440,
257,
292,
906,
327,
3578,
284,
310,
417,
1175,
2217,
359,
923,
690,
7200,
7756,
533,
359,
671,
923,
1566,
1979,
2572,
2429,
281,
26724,
3578,
284,
25761,
1580,
253,
3733,
4924,
13332,
3139,
310,
1077,
11142,
2007,
11701,
275,
3186,
673,
513,
417,
1056,
247,
1534,
3064,
275,
4677,
608,
1016,
1127,
6125,
247,
8578,
273,
3210,
326,
6786,
247,
2074,
7200,
497,
627,
667,
4722,
6127,
275,
2426,
273,
616,
1180,
273,
3602,
12432,
84,
1650,
858,
253,
3168,
11958,
14240,
3831,
3210,
273,
247,
2176,
1979,
275,
2426,
273,
4764,
9372,
497,
253,
3210,
12861,
533,
39937,
1561,
247,
9096,
390,
3091,
1017,
533,
14200,
50276,
20,
253,
1077,
806,
2929,
326,
33536,
35842,
253,
8869,
273,
824,
17769,
3169,
17082,
369,
3806,
391,
18,
11106,
407,
253,
4477,
436,
2929,
3198,
281,
320,
5469,
275,
1199,
625,
2508,
275,
2593,
374,
20360,
347,
627,
403,
1142,
4722,
26455,
447,
323,
4227,
391,
18,
2692,
253,
3048,
875,
17769,
6127,
285,
3733,
14940,
1512,
391,
18,
2692,
1077,
11859,
3733,
14940,
9191,
285,
2692,
326,
616,
4081,
7982,
9578,
1077,
973,
342,
14940,
253,
7982,
4081,
407,
391,
18,
310,
671,
3733,
4924,
6296,
253,
1246,
1263,
310,
1199,
625,
4030,
72,
11273,
625,
10527,
285,
310,
625,
3839,
7763,
685,
391,
18,
50275,
21,
643,
32213,
6388,
1841,
751,
627,
310,
642,
14846,
273,
3910,
275,
10295,
9552,
3966,
533,
436,
310,
417,
4619,
285,
476,
320,
247,
1175,
2852,
789,
50276,
22,
4737,
323,
5150,
1638,
275,
30762,
270,
556,
247,
1745,
80,
25130,
300,
943,
452,
247,
6278,
327,
40009,
987,
2708,
4737,
273,
18057,
4567,
20509,
436,
2929,
812,
320,
1199,
10046,
604,
627,
369,
247,
2372,
625,
2770,
327,
16774,
1543,
323,
816,
5411,
253,
3762,
24088,
604,
2593,
8319,
369,
1199,
10046,
50276,
83,
18,
270,
10984,
88,
1432,
465,
435,
74,
2364,
66,
1149,
6356,
543,
632,
285,
1985,
86,
2304,
4939,
14573,
849,
1057,
18080,
4833,
11786,
18634,
285,
1566,
3045,
273,
3676,
6928,
342,
12006,
257,
292,
881,
17049,
10291,
10061,
273,
253,
26332,
70,
886,
39985,
8059,
327,
4382,
8113,
285,
3102,
8981,
43425,
50276,
43355,
812,
4385,
327,
2554,
273,
10295,
9552,
3966,
285,
849,
253,
17769,
3815,
1057,
417,
1379,
436,
715,
2395,
275,
253,
1246,
1263,
5474,
339,
431,
248,
4477,
4081,
247,
7473,
1783,
7792,
281,
6642,
253,
253,
5170,
3033,
273,
3733,
2957,
14940,
323,
277,
3544,
342,
5235,
273,
2990,
1755,
5970,
1754,
327,
326,
597,
4081,
247,
10358,
395,
1993,
1332,
281,
6519,
598,
3786,
2361,
13332,
3082,
407,
4647,
247,
24077,
253,
1543,
7568,
253,
4081,
1332,
281,
3186,
1805,
6928,
285,
7938,
50275,
936,
619,
1682,
3640,
436,
310,
4931,
253,
806,
10527,
1263,
3587,
13654,
327,
253,
4030,
72,
11273,
48257,
17597,
17769,
2581,
685,
247,
2087,
48257,
1159,
5747,
690,
2720,
2987,
18216,
629,
273,
326,
24088,
2456,
323,
48257,
4871,
16719,
10447,
323,
17049,
4602,
352,
310,
271,
1774,
2568,
762,
14091,
728,
3884,
285,
… (remainder of the row's input_ids token sequence omitted) ] | [
(attention_mask sequence, all 1s, omitted) ] | [
(labels token sequence omitted) ] |
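The bracketed columns collapsed above hold the tokenized form of the row's prompt and summary text: token ids, an attention mask of 1s, and labels that mirror the ids. As a rough sketch only — the tokenizer actually used for these ids is not identified here, so the checkpoint and length below are assumptions — such columns are typically produced like this:

```python
# Illustrative sketch: turning a prompt/summary pair into the
# input_ids / attention_mask / labels columns shown in each row.
# The checkpoint and max length are assumptions, not the dataset's real settings.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed checkpoint

def encode_row(prompt_text: str, summary_text: str, max_len: int = 2048):
    full = prompt_text + " " + summary_text
    enc = tokenizer(full, truncation=True, max_length=max_len)
    input_ids = enc["input_ids"]
    attention_mask = enc["attention_mask"]  # 1 for every real (non-padding) token
    labels = list(input_ids)                # labels simply mirror input_ids for LM training
    return {"input_ids": input_ids, "attention_mask": attention_mask, "labels": labels}

row = encode_row("### Review: ...", "### Summary: ...")
print(len(row["input_ids"]), row["attention_mask"][:5])
```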
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the author proposed a transformerbased encoderdecoder framework for labelfree text style transfer the described task under the unsupervised setup is important and instructive for the text style transfer domain the model architecture is well demonstrated and the writing is easy to follow up the experiment results show satisfying performance even comparing with stateoftheart supervised methods however i have some concerns that may lead to the weakness of the paper 1 about the assumption the author claimed the method is labelfree however the unsupervised model is based on an assumption that two adjacent sentences should have the same style with the assumption the training of the model is actually weaksupervised because in each step the paired sentences are provided with the same style this assumption is actually utilizing the contextlevel supervision instead of the sentencelevel labels this idea is also previously used in 1 2 about the framework the model adds the exacted style vector to all the hidden states of the encoder how can the author guarantee that the encoder will not extract the style information of the input also is it possible that the style vectors still contain the content information from context 3 about the style vector the model changes the style of the sentence by adding a direction from the source style vector to the target style vector the approach may work under the assumption that the style vector space is linear to the semantic meanings but there is no regularizer or training loss to guarantee the linearity assumption of the style extractor why didnt the author directly replace the sentence style vector with the target style vector 4 about the dataset the model is only evaluated on one dataset it could be more solid if the author conduct experiment on other commonlyused style transfer datasets such as yelp and personalitycaptions 2 besides the split amazon review dataset only has two sentiment classes as positive and negative it could be more persuasive if the model is tested on other datasets with multiple sentiments to verify the effectiveness of the proposed restyling strategy 5 about the evaluation the author only reported the performance of content and style preservation acc and selfbleu the sentence generation quality is expected to report by testing the bleu score of the generated sentences 6 in figure 4 there is no clear pattern between positive and negative sentence embedding the difference in embedding space is mainly caused by different topics which in my understanding are the content of sentences this means the style vectors cannot eliminate the content information and also failed to separate sentences with different sentiments reference 1 zhang et al improving the dialogue generation consistency via selfsupervised learning 2019 2 shuster et al engaging image captioning via personality 2018docsepthis paper tackles the problem of extracting and modeling text writing style without using labeled data traditionally modeling text style requires either paired sentences supervised or two pools of unpaired sentences socalled unsupervised this paper exploits 1 language model pretraining and 2 free supervision signals in text to achieve modeling text styles without labeled data for the 1st point the authors correctly hypothesize that a large pretrained language model eg t5 already knows about style information and one can isolate the style information using the right finetun signal for the 2nd point the authors assume that text style eg sentiment is 
slowmoving and consistent for adjacent sentences i guess its a similar signal is exploit by next sentence prediction in bert and cbow in word2vec and this is used as the free supervision signal to their model finetune in the experiments section the authors test their model on transfer learning tasks the experiments fig 2 3 seem to suggest that at at given high content preservation score 50 the proposed model is not as accurate as other supervised models but with low content preservation the model can steadily improve accuracy by modifier more words fig 2 in figure 2 the textsettr accuracy has almost inverse linear response wrt to the content preservation score but in figure 3 the plot for textsettr stopped at 3050 what would happen if the modification percentage is higher would textsettr get closer to 100 accuracy another small issue with figure 3 i believe the task is binary pos vs neg it might be more useful to plot the accuracy from 50100 instead of from 0 100 since 50 is the practical lower bound performance docsep reasons for score this paper proposes a novel approach to the labelfree style transfer task where an input is corrupted via different strategies and fed into an autoencoder which is additionally conditioned on its prior adjacent context sentence via a style encoder which adds its mean pooled hidden state to the former before decoding both encoders are initialized to and leverage the strength of pretrained t5 model additionally the amount of additiondeletion of tokens is tunable at both training and inference time the overall idea is quite compelling but the papers argument could be improved greatly with revisions to its existing experimental setup and more evaluation overall to better and more thoroughly back its claims pros pros 1 the authors propose a novel approach to the labelfree style transfer task that is based on evaluating how training under different combinations of 3 noising strategies noise back translation and noisy back translation on input texts can be used in conjunction with an autoencoder and style encoder over the prior sentence context to then do inference given an input text and a small number of examplars of the source and target styles the idea is laid out fairly clearly both for training and inference though certain particulars there and in the experiments section were a little unclear and could have benefited from some formal notation see next section 2 the quantitative results on the amazon dataset for their best model on both the full data and few shot regimes are quite impressive compared with the other labelfree style transfer paper they compare against xu 2020 3 the few qualitative examples shown are impressive particularly the american british ones 4 the tuning hyper parameter is a useful addition though itd be interesting to see how dataset dependent it is cons 1 overall the writing was a little unclear at certain spots and could have benefitted greatly from some equations explicitly stating the setup for instance i was unclear if the context representation was added which the text suggests or concated to the noisy encoding before being decoded the later is suggested by figure 1 especially since the 4 float values for the tuning rate ranges are said to be prepended similarly the sampling strategy used as opposed to greedy decoding 2 doing quantitative evaluation on only one dataset amazon and then only showing examples of how the model does qualitatively over another dataset english common crawl c4 without doing any human eval is a little 
disappointing the idea is novel enough where even just doing some more automated evals would be suffice for me for instance why werent automated metrics given for the english common crawl dataset those results and having information on training set size and the average token size of each example for c4 should be given also the authors compare against lample 19 for the pospos and negneg setup for the amazon data why not show the results for the syelp data as well does the 2040 adddelete tuning work better there as well or is it dataset dependent 3 there are two issues with your use of the amazon dataset first it doesnt really provide apples to oranges comparison against the prior papers as they traintest on the same data from li 18 which has 270 k training examples whereas the work here generates 236 m training examples it seems you should either see how those papers do with that much data or limit your dataset to be of at least comparable size to be fair second the amazon test set is only of size 500 so assessing results on that alone seems insuffice 4 the paper hypothesizes that style is a slow moving feature consistent over large spans of text hence the use of only the prior adjacent sentence as context the paper shows that using just an adjacent sentence gives promising results but doesnt show that its necessarily better than just using examplars or using a leading paragraph to derive the style from i dont think this is exactly necessarily to address here but for future work it would be nice to see such a comparison additionally how would using a 1000 examplars as opposed to 100 at inference time affect performance a graph showing how accuracy and content preservation were affected by that would be interesting to gain better understanding similarly showing how just the nbt strategy did alone as opposed to n nbt would be interesting 5 i didnt find the multiple aspect umap embedding visualization particularly convincing for how well the embeddings separate the sentiment aspect as there is substantial overlap within each category particularly software i dont know if this is particularly necessary for your argument in my opinion especially compared with evals on other datasets but if so then itd be interesting to have quantitative numbers for those separations and compared with how it differs from just taking the t5 embeddings and doing the same umap 6 the replace noise strategy feels pretty arbitrary is there any motivation behind using that as opposed to using a lm or another strategy to replace tokens 7 a citation for using self bleu as opposed to multibleu in the evaluation procedure section would be helpful additionally a citation of ke wang hang hua xiaojun wan controllable unsupervised text attribute transfer via editing entangled latent representation neurips 19 particularly for its tunable aspect could be an addition to the related work section 8 this is nitpicky and probably for future work but the use of examplars doesnt necessarily limit the user to a predefined set of styles like the unsupervised case does however it would be interesting to see what would happen given out of domain examplars for either the source or target classes at inference time questions during rebuttal period please address and clarify the cons above possible prior citation docsepthis paper proposes a method for textstyle transfer where they dont need label information of the interested style the extend t5 model to develop their architecture which models style extract a style vector from arbitrary text and 
use this vector to condition the decoder to perform style transfer however the current presentation of the paper is hard to follow which raises the following concerns 1 as they need to provide two sentences which have to be chronologically adjacent sentences this is not always possible to obtain hence they randomly select sentence pairs but then the two sentences may not bear the same style how are the authors accounting for this 2 for inference they need sentence exemplars of both styles this contradicts their previous claim they have not compared with 3 how is noise introduction helpful for generating style corrupted sentences they do not use any heuristic and from a single sentence there can be multiple variations of the corrupted version are all of those taken at training time then the style it is learning is possibly not the intended one as different corrupted sentences might need different styles to reconstruct the sentence back 4 the model portion is extremely cryptic what is back translation etc should at least be explained in one line 5 due to the unreadability of the model description i cannot provide judgement on the result section
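For reference, the exemplar-based style shift that several of the reviews above refer to can be sketched as follows; this is a minimal illustration assuming mean-pooled sentence encodings, and every name in it is hypothetical rather than the paper's actual implementation:

```python
import numpy as np

def style_vector(encode, sentences):
    # Mean-pool the encodings of a few exemplar sentences into a single style vector.
    return np.mean([encode(s) for s in sentences], axis=0)

def shift_towards_target(encode, hidden_states, src_exemplars, tgt_exemplars, rate=1.0):
    # Add the direction from the source style to the target style to every encoder
    # hidden state; `rate` scales the strength of the shift.
    delta = style_vector(encode, tgt_exemplars) - style_vector(encode, src_exemplars)
    return hidden_states + rate * delta  # the decoder would then condition on these states
```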
### Summary: | this paper proposes a new method for labelfree text style transfer the method employs the pretrained language model t5 and makes an assumption that two adjacent sentences in a document have the same style experimental results show satisfying results compared with supervised methods pros the paper is generally clearly written the proposed method appears to be new experiments have been conducted cons the fundamental assumption of the method is not convincing enough issue 1 of r3 issue 4 of r4 issue 1 of r2 the proposed model is also not convincing enough issues 2 and 3 of r3 issue 3 of r2 there are problems with the experiments for example it would be better to use more datasets in the experiments issue 4 of r3 issue 2 of r4 discussions have been made among the reviewers the reviewers appreciate the efforts made by the authors in the rebuttal including the additional experiments however they are not fully convinced and still feel that the submission is not strong enough as an iclr paper | [
(input_ids token sequence omitted) ] | [
(attention_mask sequence, all 1s, omitted) ] | [
21942,
5971,
347,
2762,
285,
4016,
352,
812,
320,
625,
34593,
604,
253,
1566,
310,
5762,
327,
643,
15302,
342,
2709,
39236,
281,
12654,
253,
12510,
273,
253,
4081,
1551,
1190,
272,
5700,
50275,
22,
670,
253,
7103,
253,
2488,
760,
2361,
253,
3045,
273,
2600,
285,
3740,
23029,
756,
285,
1881,
934,
86,
253,
6197,
5978,
3290,
310,
3264,
281,
1304,
407,
5175,
253,
7387,
86,
4868,
273,
253,
4561,
14683,
50276,
23,
275,
4677,
577,
627,
310,
642,
2590,
3102,
875,
2762,
285,
4016,
6197,
21496,
253,
3064,
275,
21496,
2317,
310,
7194,
4269,
407,
1027,
12989,
534,
275,
619,
4685,
403,
253,
2600,
273,
14683,
436,
2097,
253,
3740,
11390,
2550,
13469,
253,
2600,
1491,
285,
671,
4242,
281,
4858,
14683,
342,
1027,
39236,
50275,
14005,
50276,
18,
1182,
12109,
1162,
355,
11138,
253,
17414,
5978,
15274,
3066,
1881,
35421,
4715,
6247,
50276,
19,
439,
8976,
1162,
355,
15966,
2460,
11743,
272,
3066,
13216,
4765,
7152,
33032,
2520,
2929,
39223,
253,
1895,
273,
34705,
285,
14053,
2505,
4028,
3740,
1293,
970,
13130,
941,
50276,
1206,
30718,
595,
14053,
2505,
3740,
4419,
2057,
18433,
14683,
22296,
390,
767,
24283,
273,
47223,
14683,
9267,
18859,
440,
35421,
436,
2929,
40725,
337,
3448,
1566,
3215,
26208,
285,
374,
1959,
20446,
6298,
275,
2505,
281,
5115,
14053,
2505,
14957,
1293,
13130,
941,
50276,
1542,
253,
337,
296,
1127,
253,
4477,
9113,
41661,
326,
247,
1781,
3215,
11273,
3448,
1566,
24088,
246,
22,
2168,
6057,
670,
3740,
1491,
285,
581,
476,
20843,
253,
3740,
1491,
970,
253,
987,
1442,
292,
328,
2625,
323,
253,
374,
2109,
1127,
253,
4477,
5467,
326,
2505,
3740,
24088,
21942,
310,
3468,
26621,
285,
5185,
323,
9701,
14683,
891,
5476,
697,
247,
2074,
2625,
310,
22059,
407,
1735,
6197,
10554,
275,
270,
797,
285,
260,
11939,
275,
3159,
19,
4642,
285,
436,
310,
908,
347,
253,
1959,
20446,
2625,
281,
616,
1566,
1442,
292,
2517,
50276,
249,
253,
4679,
2593,
253,
4477,
1071,
616,
1566,
327,
3700,
4715,
8892,
253,
4679,
3036,
374,
495,
1646,
281,
1804,
326,
387,
387,
1677,
1029,
2600,
23029,
4868,
50275,
1235,
253,
4081,
1566,
310,
417,
347,
7899,
347,
643,
22296,
3210,
533,
342,
1698,
2600,
23029,
253,
1566,
476,
25060,
3157,
7200,
407,
34301,
625,
3000,
3036,
374,
50275,
249,
4677,
374,
253,
2505,
1178,
1206,
7200,
556,
2761,
13737,
4872,
2380,
8772,
281,
253,
2600,
23029,
4868,
533,
275,
4677,
495,
253,
7484,
323,
2505,
1178,
1206,
6331,
387,
1884,
1235,
752,
651,
5108,
604,
253,
11237,
7155,
310,
2169,
651,
2505,
1178,
1206,
755,
8003,
281,
2233,
7200,
50276,
23955,
1355,
2523,
342,
4677,
495,
891,
2868,
253,
4836,
310,
8985,
803,
4632,
2297,
352,
1537,
320,
625,
4217,
281,
7484,
253,
7200,
432,
28416,
361,
3185,
273,
432,
470,
2233,
1580,
2456,
310,
253,
8542,
2406,
3033,
3045,
5474,
33032,
4606,
323,
4868,
50274,
2520,
2929,
29328,
247,
4460,
2746,
281,
253,
5188,
813,
658,
3740,
3700,
4836,
835,
271,
3280,
310,
40634,
3066,
1027,
8130,
285,
10208,
715,
271,
6753,
36465,
534,
310,
23000,
27039,
327,
697,
2720,
9701,
3634,
6197,
3066,
247,
3740,
32049,
534,
11323,
697,
1599,
24462,
8763,
1375,
281,
253,
3438,
1078,
28490,
50276,
15617,
2349,
351,
398,
403,
31260,
281,
285,
25057,
253,
4757,
273,
3215,
11273,
246,
22,
1566,
50276,
29483,
595,
253,
2408,
273,
1635,
615,
37713,
273,
21761,
310,
10839,
494,
387,
1097,
3733,
285,
17032,
673,
50274,
783,
4583,
2934,
310,
3240,
18511,
533,
253,
9380,
4154,
812,
320,
5520,
10260,
342,
38549,
281,
697,
5368,
5661,
9978,
285,
625,
7103,
4583,
281,
1805,
285,
625,
16575,
896,
697,
3916,
50272,
856,
84,
50276,
856,
84,
337,
253,
4477,
12661,
247,
4460,
2746,
281,
253,
5188,
813,
658,
3740,
3700,
4836,
326,
310,
1754,
327,
16344,
849,
3733,
762,
1027,
13553,
273,
495,
642,
2182,
8130,
50275,
24946,
896,
10234,
285,
27620,
896,
10234,
50276,
251,
3280,
17438,
476,
320,
908,
275,
17385,
342,
271,
6753,
36465,
285,
3740,
32049,
689,
253,
2720,
6197,
3634,
281,
840,
513,
17032,
1677,
271,
3280,
2505,
285,
247,
1355,
1180,
273,
1174,
446,
1032,
273,
253,
2603,
285,
2303,
14957,
50276,
783,
2934,
310,
10090,
562,
9648,
4518,
1097,
323,
3733,
285,
17032,
2167,
2176,
1798,
84,
627,
285,
275,
253,
4679,
2593,
497,
247,
1652,
12744,
285,
812,
452,
37081,
432,
690,
7473,
14951,
50276,
2887,
1735,
2593,
50274,
19,
253,
11745,
1543,
327,
253,
7001,
251,
10895,
323,
616,
1682,
1566,
327,
1097,
253,
2120,
941,
285,
1643,
5103,
27005,
403,
3240,
13943,
2429,
342,
253,
643,
5188,
813,
658,
3740,
3700,
2929,
597,
7277,
1411,
50276,
46036,
9169,
50274,
20,
253,
1643,
18276,
6667,
2011,
403,
13943,
50276,
35456,
253,
41290,
266,
50276,
31005,
763,
4394,
50274,
21,
253,
25184,
4373,
4764,
310,
247,
4217,
1635,
50276,
2004,
352,
69,
320,
4722,
281,
923,
849,
10895,
7976,
352,
310,
50273,
5040,
50276,
18,
4583,
253,
4028,
369,
247,
1652,
12744,
387,
2176,
13977,
285,
812,
452,
2750,
2166,
10260,
432,
690,
7424,
11120,
14851,
253,
9978,
50276,
1542,
4227,
891,
369,
12744,
604,
253,
3634,
6779,
369,
2879,
50276,
4609,
253,
2505,
5936,
50276,
263,
7036,
456,
281,
253,
27620,
9706,
1078,
1146,
45775,
50276,
783,
1996,
310,
5125,
407,
4677,
337,
3340,
1580,
253,
577,
8253,
2193,
323,
253,
25184,
2281,
13794,
403,
753,
281,
320,
3765,
1834,
50275,
3549,
6241,
253,
10491,
5700,
908,
50276,
284,
10066,
281,
38754,
28490,
50275,
19,
2509,
11745,
7103,
327,
760,
581,
10895,
50276,
32687,
50276,
395,
840,
760,
4645,
6667,
273,
849,
253,
1566,
1057,
36143,
689,
1529,
10895,
50276,
1205,
3742,
1846,
37431,
260,
21,
50276,
14920,
2509,
667,
1966,
2777,
310,
247,
1652,
31623,
50276,
783,
2934,
310,
4460,
2217,
835,
1014,
816,
2509,
690,
625,
16644,
612,
932,
651,
320,
36433,
323,
479,
50276,
1542,
4227,
2139,
359,
624,
16644,
17082,
1677,
323,
253,
48087,
1846,
37431,
10895,
50276,
21808,
1543,
285,
1907,
1491,
327,
3733,
873,
1979,
285,
253,
3388,
10669,
1979,
273,
1016,
1650,
323,
260,
21,
943,
320,
1677,
50276,
12563,
253,
4477,
7277,
1411,
298,
4636,
655,
323,
253,
803,
993,
285,
2297,
8265,
9978,
323,
253,
7001,
251,
941,
2139,
417,
921,
253,
1543,
323,
253,
726,
47705,
941,
347,
973,
] |
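Besides the readable prompt, review, and summary, each record in this dump carries the same text in machine form: a sequence of token ids, an attention mask that is all ones (nothing is padded), and a label sequence that simply repeats the token ids. The dump does not name the tokenizer that produced those ids, so the snippet below is only a rough sketch of how such fields are typically built; the checkpoint name, the function name `build_record`, and the exact prompt layout are placeholder assumptions.

```python
from transformers import AutoTokenizer

# Placeholder checkpoint: the dump does not say which tokenizer produced the ids shown here.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

def build_record(prompt: str, review: str, summary: str) -> dict:
    """Tokenize one review/summary pair into the three numeric fields of this dump."""
    text = f"{prompt}\n### Review:\n{review}\n### Summary: {summary}"
    input_ids = tokenizer(text)["input_ids"]
    return {
        "input_ids": input_ids,
        "attention_mask": [1] * len(input_ids),  # nothing is padded, so every position is attended to
        "labels": list(input_ids),               # in this dump the labels just repeat the input ids
    }
```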
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
1 the proposed solution makes sense and is a reliable extension considering the baseline picks the max of the posterior p(d|x) its a nice idea to extend the algorithm to include more randomness and increase the exploration to improve robustness of the results 2 the paper is mostly well written notably the introduction section 1 is extensive easy to read and presents excellently the state of the art and the motivation for the work 3 the results corroborate the improvement of the algorithm displaying enough significance to warrant using their solution rather than the original baseline 1 main issue relates to section 32 this is the main novelty of the paper and i feel that it is badly presented judging from the text there is not sufficient information that would allow a user to reproduce the algorithm when using the tracking algorithm is an image subvolume with the seed at the center used the output of the original algorithm is a vector that maximises p(d|x) this is just a direction given a subvolume and a previous centre point how is the next voxel selected is the next subvolume in the image within the direction of the vector used what do the authors mean when they state the median location of all living agents is computed after each step is the location of all living agents the majority voted voxel in the direction of the previous vector 2 unless i misunderstood the stochastic tracking strategy is just the centerline tracking algorithm of wolterink 2019 initiated at different seeds in a sphere with radius r and using some stopping criterion based on the surviving agents how is this different than just running the algorithm various times with different initialisations and computing f(c1, ..., cn) where f(·) is a function that returns the desired object in my view the proposal is too incremental and lacks novelty docsep the design of multiple stochastic agents reaching consensus makes the method less sensitive to local errors made by the cnn orientation classifier which is a major advantage over the previous nonstochastic approach the writing of the paper is very clear the clinical problem the challenges the method the experiments and results etc the structure is well organized making reading the paper a pleasant experience the major weakness of the proposed method is that it contains many hyperparameters that may require careful tuning for example in the stochastic tracker the number of agents the radius around the seed point the thresholding values for determining the stopping criteria and also the distance threshold that was used for computing the evaluation metrics how did the author tune these parameters how sensitive are they to influence the tracking results have the authors done any analysis on these parameters the proposed method was evaluated on a rather small dataset considering the variability in different subjects in this type of data would the method with the same hyperparameter setting still work if not how easy would it be to tune those parameters docsepthe paper is well written both in terms of structure and in terms of language the method is analysed in an adequate and easy to understand manner the method looks both interesting and powerful there is an interesting and in depth discussion about limitations and abilities of the proposed methods the method that the authors propose by their admittance is inspired by wolterink et al to what extent is this paper novel compared to the proposed solution of wolterink et al there is no comparison against other methods that perform
centreline extraction making the experimentation section of the paper lacking significantly docsepthe paper is wellwritten and nicely illustrated follows the standard structure adequately summarizes relevant prior work and also discusses its own contributions well i found the method section lacking in particular the introduction of the deep neural centerline tracking which this paper is centrally based on but that was probably for lack of space maybe it would help to explicitly point the reader to the prior work which describes this in much more detail the sentence inspired by a method does not sound like one would find all necessary algorithmic detail in the cited paper the method is definitely wellsuited for this task and likely superior to previous works on similar images i am not sure how uncommon 4d mri is for the small intestines the imaging itself may be quite a challenge here in particular the motivation for directly tracking the centerline instead of starting with a segmentation mask is sound fig 4 was particularly helpful in understanding how the stochastic tracking works in practice one weakness is that the dataset is relatively small the authors have only 14 mri datasets from healthy volunteers therefore the evaluation also does not perform a proper training validation testing split but a leaveoneout cross validation is used given this small dataset this is certainly a good idea although i wonder if the one split used for algorithmic development should have been excluded from the evaluation the authors state that they retrained this fold with a different random seed i also think that the surface dsc is a strange choice given the fact that the reference annotations do not even have radius information since the whole evaluation is based on surfacebased precision and recall measures a centerlinebased variant would seem more adequate and even simpler in the end this is not a serious weakness though i believe the results would not change much
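The last review's point about the metric is easier to see written out: with centreline-only reference annotations, the natural centerline-based variant of the surface measures is a distance-thresholded precision/recall between predicted and reference centreline points. The sketch below is only an illustration of that idea; the function name and the tolerance value are assumptions, not the thresholds used in the paper.

```python
import numpy as np

def centerline_precision_recall(pred_pts, ref_pts, tol=2.0):
    """Distance-thresholded precision/recall between two point sets (illustrative sketch).

    A predicted point counts as correct if it lies within `tol` (same units as the
    coordinates, e.g. mm) of any reference point, and symmetrically for recall.
    The tolerance value is an assumption, not the threshold used in the paper.
    """
    pred_pts = np.asarray(pred_pts, dtype=float)
    ref_pts = np.asarray(ref_pts, dtype=float)
    d = np.linalg.norm(pred_pts[:, None, :] - ref_pts[None, :, :], axis=-1)  # pairwise distances
    precision = float(np.mean(d.min(axis=1) <= tol))  # predicted points near the reference
    recall = float(np.mean(d.min(axis=0) <= tol))     # reference points recovered by the prediction
    return precision, recall
```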
### Summary: the paper presents a method to extract intestine centerline in 3d cinemri the methodological contribution lies in adding stochasticity to an existing centerline tracking method by establishing a consensus among the multiple stochastic agents r1 r2 and r3 rated the methodological contribution as small or incremental although strongly building on an existing prior work there is some novelty in the methodological aspects linked to the stochasticity as recognized by r3 after the rebuttal and in the methodology as mentioned by r1 r2 and r1 raised some issues regarding the clarity of the methodology and requested further analysis of the results most of these questions were addressed satisfactorily in the rebuttal and revised version r2 and r4 both raised some concerns regarding the hyperparameter tuning and setting although the rebuttal claims to keep experimental optimization of the hyperparameters to a minimum relies instead on physiological prior knowledge authors might want to look at a more indepth analysis of their sensitivity and influence in the future the same goes for considering other baseline methods a larger database and different evaluation metrics after the rebuttal there was however a consensus among the reviewers on the value of publishing this work in midl in its current state
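The reviews and meta-review above describe the tracker only informally: agents start in a small sphere of radius r around a seed, each repeatedly takes a step in a direction scored by the cnn orientation classifier on a local image subvolume, the median location of the living agents gives the consensus centerline point at every step, and tracking stops once too few agents survive. The following is a rough, non-authoritative reconstruction of that loop; `orientation_cnn`, the step size, the survival rule, and all default values are assumptions made for illustration, not details taken from the paper under review.

```python
import numpy as np

def stochastic_track(volume, seed, orientation_cnn, n_agents=32, radius=2.0,
                     step=1.0, max_steps=500, min_alive=4):
    """Multi-agent stochastic centerline tracking by consensus (illustrative sketch).

    orientation_cnn(volume, position) is a stand-in: it is assumed to return a set of
    candidate directions and a probability over them for the subvolume centred at
    `position`; its real interface is not specified in the reviews.
    """
    rng = np.random.default_rng(0)
    # Initialise agents uniformly inside a sphere of radius `radius` around the seed.
    dirs_to_seed = rng.normal(size=(n_agents, 3))
    dirs_to_seed /= np.linalg.norm(dirs_to_seed, axis=1, keepdims=True)
    radii = radius * rng.random(n_agents) ** (1.0 / 3.0)
    agents = np.asarray(seed, dtype=float) + radii[:, None] * dirs_to_seed

    alive = np.ones(n_agents, dtype=bool)
    centerline = [np.median(agents, axis=0)]
    for _ in range(max_steps):
        for i in np.where(alive)[0]:
            directions, probs = orientation_cnn(volume, agents[i])   # assumed interface
            choice = rng.choice(len(directions), p=probs)            # sample instead of arg-max
            agents[i] = agents[i] + step * np.asarray(directions[choice], dtype=float)
            # Assumed survival rule: an agent dies when it leaves the image volume.
            alive[i] = bool(np.all(agents[i] >= 0) and np.all(agents[i] < np.array(volume.shape)))
        if alive.sum() < min_alive:                                   # stopping criterion from the reviews
            break
        centerline.append(np.median(agents[alive], axis=0))           # consensus: median of living agents
    return np.array(centerline)
```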
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces egotaskqa a new benchmark with questions and answers for diagnostic analyses on spatial temporal and causal relationships between entities in goaloriented task videos the dataset is extended from the egocentric multiagent lemma dataset with four different types of qa annotations descriptive predictive explanatory and counterfactual the paper also provides experiments with stateoftheart models on the benchmark along with human performances for diagnoses on the reasoning tasks in goaloriented task videos building upon an existing dataset considering whats missing in the existing works the data annotation process is systematic and the authors provide the reasoning and procedures behind the annotation in the paper provides extensive evaluation of stateoftheart models along with naive baseline and human performances provides detailed information including procedures of annotation statistics model setup and datasheet in the supplementary materials has a website providing data exploration along with code and model checkpoints the tables are hard to follow without bolding for example the highest numbers in table 2 and the increase of performance in hcrn in table 3 should be easier to read if bolded terms like most likely in table 2 are vague and confusing the authors should consider explaining more about the term to clarify there are four columns but only two category columns in table 3 the authors should consider adding an appropriate name for each column docsepaugmented two datasets based on lemma dataset a normal and indirect dataset both datasets had contributions of four question types descriptive predictive explanatory and counterfactual across different question scopes world intents and goals and multiagent from an egocentric view the main difference between the indirect dataset from the normal dataset was the usage of words like something instead of object names to limit textual exploitation they compared results between these two datasets and the indirect dataset results were worse than the normal dataset results shedding light on the flaws of general sota video text alignment models as exploiting textual relationships rather than using sufficiently knowledgeable spatial temporal reasoning modules good contributions of augmenting datasets novel qa for a video qa dataset as described in table 1 good results and discussion limitations described are relatively small good figures in general aside from grammatical issues described in weaknesses normalindirect dataset idea was a great idea and showed good results poor grammar in benchmark vocabularyspatial relations certain figures grammar is distracting from the paper fig 1 and 2 certain figures can be very difficult to understand due to complexity doesnt explore model architecturereasoning modules at all lack of forward guidance for future papersworks weak explanation for broader impact docsepthis paper created the egotaskqa benchmark for a direct evaluation of questionanswering on realworld egocentric videos the designed questions are aiming for video understanding of the human tasks from different perspectives action dependencies and effects intents and goals and agents beliefs about others this benchmark will help to evaluate agents capability in a comprehensive way with descriptive predictive explanatory ad counterfactual questions and thus help to develop more intelligent agents this paper introduces a new benchmark egotaskqa that contains 40k balanced questionanswer pairs those questions are targeted to understand 
the video from multiple perspectives to evaluate the agents intelligence the questions are very broad ranging from descriptive predictive counterfactual and explanatory to evaluate the agent over the spatial temporal and causal understanding of the tasks the generated questions in the proposed benchmark take care of the diversity and balance of each kind the splits of normal and indirect of the dataset will help understand whether the model is using the correlation between the questions in the training and evaluation set without understanding the task instead using language shortcuts in relations among questions evaluate model performance on question scopes types targeting semantics and overall answer categories will show its overall capacity in understanding the ablation of object information and languageonly shows the objects are very important visual clues in qa tasks the data scope is indoor goaloriented tasks so this might cause limitations for a broader community docsepthis dataset uses videos from the lemma dataset egocentric videos of humanhuman humanobject interactions for studying video qa the authors extend lemma with annotations of objects agents and their relationships from amazon mechanical turk then the authors build causal dependency relationships between agents and objects in the videos the questions in the dataset used in qa is then automatically generated with 4 types of questions descriptive predictive counterfactual explanatory the dataset is evaluated on a set of 6 existing video models showing a gap from the model to human performance based on table 1 and the authors review of related works there is a richer set of questions in this dataset compared to baseline although i have some concerns about the questions themselves see first point in weaknesses the proposed question types descriptive predictive counterfactual explanatory and the generated causal dependency relationships are interesting for understanding the performance of video qa models the authors benchmark a set of stateoftheart video models on their dataset with both normal split as well as a split for studying indirect references these benchmarking efforts helps the community understand the strengths weaknesses of existing models the full dataset is not available at this stage even to the reviewers please correct me if im wrong also based on samples im seeing from the data the automatically generated questions in this work seems a lot less clear than prior work such as agqa a from the authors website some samples q what does the person want watermelon to be for doing the action cut something using something in the video q what will the status of the last object that has status change change to if the actor pour from something into something in the future could the authors clarify why the generated questions here are less clear due to the above i have concerns about the ability of this dataset to evaluate qa of the different question types proposed by the authors furthermore comparing table 2 here with table 2 in agqa many of the question categories are fairly similar since this dataset 40k questions is also a lot smaller than agqa 36m the significance of this dataset is not clear to me would appreciate clarifications from the authors there was just 1 trial run for all the benchmarking while i understand that running video models are expensive this does bring some concerns on the variability of results across runs and reproducibility the ml reproducibility framework was not used here a agqa a benchmark for 
compositional spatiotemporal reasoning docsepthis work introduces a new video question answering benchmark that consists of egocentric videos with finegrained annotations of object states objectobject humanobject and multiagent relations and causal action relations the dataset also contains four types of questionanswer pairs including counterfactual and explanatory questions in addition to questions that aim to capture intents goals and object states and changes finally the work compares 6 video question answering models on two data splits the work significantly expands an existing egocentric dataset with 40k questions covering 4 different types this dataset will be very useful to the embodied ai and vqavqc research communities the paper comes with a comprehensive related work with a summary table that contrasts the contributions of this work wrt relevant literature evaluation compares 6 vqa models several ablation studies are also present to demonstrate the usefulness of object information and language supervision albeit masking with a common term vs using the noun terms does not necessarily imply better action understanding and further analysis is needed unfortunately reviewers are not able to access the data without exposing their identities the website mentions filling in a formlicense data agreement for downloading the data with a note during review process we refer to the website for data examples and temporarily forbids full data download it is unclear how the dataset will be distributed afterward ie i on which platform the datasheet in the supplementary material mentions dataset could be accessed on our website but would that be restricted access and ii whether all code will be released and scripts for ease of reproducibility of the reported experiments will become available it is unclear how the correctness of answers generated by functional programs is verified similarly how questions are machinegenerated remains fairly unexplained to my understanding neither the code for the qa construction is not opensourced nor the paper contains sufficient details on qa quality is the evaluation of the quality of the generated answers limited to the one described in lines 239241 randomly sampling 50 questions from each category it seems that some categories have very low accuracy the observed performance increase of textonly models on object state change questions is worthy of further analysis perhaps the associations captured are not actionaware or contextaware but rather simplistic linguistic cooccurrence patterns as the experiments in section 43 suggest however the performance drop between normal and indirect splits for textonly models is marginal would be nice to open source models both for reproducibility and to allow future research to perform more exploration on what exactly languageguided models are able to learn
### Summary: the reviewers are positive regarding the high level of the contribution of the work for the neurips 2022 track datasets and benchmarks the authors properly addressed all reviewers comments and concerns during the rebuttal period
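Several of the reviews single out the normal versus indirect splits: in the indirect split, explicit object names in a question are replaced with a generic word such as "something", so a model cannot rely on textual correlations with specific nouns. A minimal sketch of that kind of masking is given below; the regex approach, the noun list, and the placeholder word are assumptions for illustration, not the benchmark's actual generation code.

```python
import re

def make_indirect(question: str, object_nouns: set, placeholder: str = "something") -> str:
    """Replace explicit object mentions with a generic placeholder (illustrative sketch only)."""
    nouns = sorted(object_nouns, key=len, reverse=True)        # match multi-word nouns first
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, nouns)) + r")\b")
    return pattern.sub(placeholder, question)

# Hypothetical usage with made-up nouns, loosely following the sample questions quoted above:
nouns = {"watermelon", "knife", "cutting board"}
print(make_indirect("what does the person want watermelon to be after cutting watermelon with knife", nouns))
# -> what does the person want something to be after cutting something with something
```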
4720,
253,
789,
26662,
721,
3492,
1953,
22291,
3210,
327,
767,
941,
36509,
50276,
783,
789,
3012,
35205,
271,
5368,
24088,
406,
19458,
10895,
342,
3387,
76,
3533,
10985,
577,
1027,
3510,
436,
10895,
588,
320,
1077,
4217,
281,
253,
36080,
23105,
285,
362,
82,
580,
38697,
2561,
7888,
50276,
783,
2929,
3249,
342,
247,
11088,
2905,
789,
342,
247,
6010,
2829,
326,
39165,
253,
9021,
273,
436,
789,
8772,
4623,
6239,
50276,
15419,
2368,
26662,
721,
362,
31569,
3210,
2067,
28913,
2175,
403,
671,
1246,
281,
7568,
253,
31471,
273,
1789,
1491,
285,
3448,
20446,
23447,
44790,
342,
247,
1846,
1307,
4632,
970,
253,
28407,
2426,
1057,
417,
7933,
16084,
1805,
2250,
4685,
285,
2007,
1783,
310,
3058,
50276,
328,
9520,
30628,
403,
417,
2104,
281,
2289,
253,
941,
1293,
28248,
616,
22925,
253,
4422,
25957,
12868,
275,
247,
830,
21997,
941,
4345,
323,
33676,
253,
941,
342,
247,
3877,
50276,
32674,
2278,
1232,
359,
3730,
281,
253,
4422,
323,
941,
6667,
285,
20220,
17163,
2352,
2120,
941,
6184,
352,
310,
12744,
849,
253,
10895,
588,
320,
5939,
28279,
26332,
891,
327,
534,
5147,
253,
7621,
14934,
275,
253,
24864,
2144,
25957,
10895,
812,
320,
19197,
327,
776,
4422,
533,
651,
326,
320,
11096,
2289,
285,
21255,
1880,
512,
2127,
588,
320,
4439,
285,
20477,
323,
11990,
273,
38041,
273,
253,
2361,
4679,
588,
2489,
2130,
50275,
262,
310,
12744,
849,
253,
36594,
273,
9172,
4561,
407,
5164,
5659,
310,
16058,
12014,
849,
3533,
403,
5145,
20419,
4558,
9648,
49374,
281,
619,
4685,
6747,
253,
2127,
323,
253,
2805,
66,
5140,
310,
417,
13279,
47549,
4543,
253,
2929,
4428,
4209,
4278,
327,
2805,
66,
3290,
310,
253,
7103,
273,
253,
3290,
273,
253,
4561,
9172,
3710,
281,
253,
581,
2529,
275,
3104,
27862,
24552,
12421,
10491,
2456,
3533,
432,
1016,
7140,
352,
3133,
326,
690,
9050,
452,
1077,
1698,
7200,
50275,
783,
2540,
3045,
2572,
273,
2505,
7483,
3210,
327,
1789,
1375,
1818,
3533,
310,
18338,
273,
2007,
1783,
4931,
253,
12485,
10848,
403,
417,
2250,
13823,
390,
3634,
13823,
533,
2581,
8077,
2531,
32019,
820,
30714,
1196,
6127,
347,
253,
4679,
275,
2593,
7652,
1804,
2299,
253,
3045,
5926,
875,
2622,
285,
11686,
36509,
323,
2505,
7483,
3210,
310,
16888,
651,
320,
5322,
281,
1527,
2603,
3210,
1097,
323,
38041,
285,
281,
1581,
2852,
2561,
281,
1347,
625,
17947,
327,
752,
4555,
3448,
26960,
3210,
403,
2104,
281,
3037,
2490,
187,
4118,
18435,
27,
783,
30628,
403,
2762,
5001,
253,
1029,
1268,
273,
253,
7680,
273,
253,
789,
323,
253,
5723,
2824,
1384,
1423,
3540,
15302,
285,
49602,
253,
4477,
6283,
9713,
512,
30628,
5701,
285,
7350,
1309,
253,
30080,
22559,
2180
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper presents a modelbased contact servoing framework to perform control the contact between a compliant tool and the robots environmet contact is parameterized as a binary contact flag line of contact and an endeffector wrench dynamics are learned in a latent space with an encoderdecoder framework that is used to predict the contact parameters at every step using a pointcloud observation and the input wrench measured at the robots wrist strengths data collection procedure is selfsupervised without needing human intervention or labelling indepth analysis of different ablations of the model to study and evaluate the effectiveness of different components of the system weaknesses generalization experiments are restricted to a single spatula furthermore the surface in contact with the spatula also remains the same would be interesting to see generalization to new spatulas or even just traintest on more tools as well as analyzing variations in other parameters such as the contact surface docsepthe paper presents a robot control framework for controlling contact forces at the tip of a pregrasped tool remarkably the proposed approach considers a situation where the compliance of the grasped tool can be used to change the geometry of the contact the authors presents a method to learn a dynamic model which allows to predict the effects to the robot actions on the contact geometry and on the robot endeffector location this dynamic model is used in an mpc framework to control contact forces in a variety of scenarios including controlling the contact forces at the tip of a flexible spatula to scrape a target object in presence of obstacles the paper presents a novel solution to a novel problem controlling tool contact forces is a very interesting problem which to my knowledge has been addressed only in the rigid case the submitted paper considers the nonrigid case which is a significant novelty the major limitation of the paper is the need of a complicated datagathering phase which requires precision equipment photoneo and human supervision docsepthe authors proposed a learning approach for modeling toolenvironment interaction which learn the dynamics from real world data key of proposed method is embedding the latent spaces from the robots sensor data and decodes the contact feature representation which consists of binary contact value the line of contact and an endeffector wrench the authors verified the scraping operation of a claylike object using a spatula attached to the tip of an actual robot as a task involving contact between the environment and a compliance tool experimental results show that the proposed method is superior in all evaluation items contact force error binary accuracy and contact line error furthermore by adding the data extension and wrench offset action proposed by the authors the robot is able to scrape objects from the table while avoiding contact with obstacles the overall writing of the paper is easy to understandable and the attached video also helped the reader understand most methods using supervised learning to learn robot policies from realworld data do not consider the adequacy of the generated trajectory or contact because they give the model predictions directly as motor commands in contrast the authors use mmpi to account for losses to trajectory and contact predictions and they clearly describe their specific method furthermore the authors also demonstrate the effectiveness of the proposed method by conducting multiple quantitative evaluations one concern 
is that there are few comparisons with other methods since the authors evaluated the accuracy of the task only based on differences in the structure of their proposed model with or without vision and data augmentation etc it is better to include the results of comparisons with other studies if the authors want to claim the effectiveness of the proposed method many previous studies of tasks involving contact exist 1 showed that by using imitation learning to learn the contact force with the floor surface the robot can properly mop the floor even if the grasping position and length of the mop are different the reviewer has some major comments the reviewer agrees with the issue on line 21 however as the solution shown by the proposed method is limited it would be better to show the consideration and results of the toolenvironment interaction when the grasping position of the spatula is changed line 49 states that a variety of trajectories are realized but basically the robot is just moving the spatula straight therefore the diversity of trajectories is not shown in order to demonstrate the potential of the proposed method it would be helpful to have results for varying the positions of objects and obstacles the reviewer understood that by collecting training data with a random action policy contact information between the tool and the spatula could be collected as shown in fig 2 the reviewers also understood that the robot predicts its behavior every step based on the observed data on the other hand obstacles and objects are not included in the training data how do they recognize the objects and rub them with the spatula it is unclear how the approach trajectory to the object is learned from the random trajectory explaining the data flow during learning and during execution will help the reader understand finally how are the desired contact trajectories in figure 4 calculated are they arbitrarily determined by human 1 sakaino sho bilateral controlbased imitation learning for velocitycontrolled robot 2021 ieee 30th international symposium on industrial electronics isie ieee 2021 docsepthe paper propose a method to learn the dynamics of these contact features from real world data with unknow tool geometry and propose a controller that uses learned dynamics model for visuotactile contact servoing and show that it is effective at performing scraping tasks with a spatula strengths 1 propose a contact configuration a binary contact mode 2 present a framework for modeling compliant toolenvironment contact interactions by learning contact feature dynamics 3 propose a learned model architecture to capture the dynamics of contact features trained in a supervised fashion using real world selfsupervised data 4 design and demonstrate a controller using the contact feature dynamics to realize diverse goal trajectories weaknesses 1 authors believe the line contacts model can be straightforward to extend to patch contacts by using a richer contact descriptor but do not proof it 2 the model is used for compliant toolenvironment contact interactions but only test on a spatula
### Summary: | this paper presents an architecture for modeling the interaction of a compliant tool and the environment the contact dynamics is represented by the contact indicator contact line and endeffector wrench the method is demonstrated on hardware using a compliant spatula to scrape an obstacle the authors successfully addressed many of the reviewers concerns most notably by adding more examples with multiple and unseen tools to demonstrate the generalization capability the paper makes a valuable contribution to the conference by providing a novel solution to an interesting problem | [
... (input_ids: token-ID sequence omitted) ] |
... (attention_mask: all 1s; omitted) ] |
... (labels: token-ID sequence omitted) ] |
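The reviews in the row above describe a learned contact-feature dynamics model: a point-cloud observation and the wrench measured at the robot's wrist are encoded into a latent state, a transition model conditioned on the commanded action predicts the next latent state, and decoder heads recover the contact features (binary contact flag, line of contact, end-effector wrench) that a sampling-based controller then tracks. The sketch below is a minimal illustration of that interface only; the layer sizes, the flattened point-cloud features, and the random-shooting planner standing in for the MPPI controller mentioned in the reviews are assumptions made for the example, not the authors' implementation.

```python
# Minimal sketch of a contact-feature dynamics model in the spirit of the
# reviews above (encoder-decoder latent dynamics + sampling-based control).
# All dimensions and the random-shooting planner are illustrative assumptions.
import torch
import torch.nn as nn

class ContactFeatureDynamics(nn.Module):
    def __init__(self, obs_dim=256, wrench_dim=6, action_dim=6, latent_dim=64):
        super().__init__()
        # Encoder: flattened point-cloud features + measured wrist wrench -> latent state
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim + wrench_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Latent transition conditioned on the commanded action
        self.transition = nn.Sequential(
            nn.Linear(latent_dim + action_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decoder heads for the contact features named in the reviews
        self.contact_flag = nn.Linear(latent_dim, 1)   # binary contact (logit)
        self.contact_line = nn.Linear(latent_dim, 4)   # line of contact (two endpoints in the tool plane)
        self.wrench_head = nn.Linear(latent_dim, 6)    # predicted end-effector wrench

    def forward(self, obs, wrench, action):
        z = self.encoder(torch.cat([obs, wrench], dim=-1))
        z_next = self.transition(torch.cat([z, action], dim=-1))
        return self.contact_flag(z_next), self.contact_line(z_next), self.wrench_head(z_next)

def random_shooting_action(model, obs, wrench, desired_wrench, n_samples=256):
    """Pick the candidate action whose predicted wrench is closest to the desired one."""
    actions = torch.randn(n_samples, 6) * 0.05        # small candidate end-effector motions
    obs_b = obs.expand(n_samples, -1)
    wrench_b = wrench.expand(n_samples, -1)
    _, _, pred_wrench = model(obs_b, wrench_b, actions)
    cost = ((pred_wrench - desired_wrench) ** 2).sum(dim=-1)
    return actions[cost.argmin()]

if __name__ == "__main__":
    model = ContactFeatureDynamics()
    obs = torch.randn(1, 256)      # stand-in for encoded point-cloud features
    wrench = torch.randn(1, 6)     # stand-in for the measured wrist wrench
    desired = torch.zeros(1, 6)
    a = random_shooting_action(model, obs, wrench, desired)
    print(a.shape)  # torch.Size([6])
```

In the setting the reviews describe, such a model would be trained on self-supervised rollouts and queried at every control step; here random tensors stand in for real sensor data.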
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper tackles the trustworthiness of concept bottleneck models cbm by improving the accuracyinterpretability tradeoff using a conceptbased architecture concept embedding model or cem which represents each concept as a supervised vector furthermore the authors propose two metrics for evaluating concept representations strengths 1 the paper tackles an important problem of trustworthiness and accuracyinterpretability tradeoff 2 the paper is wellorganized and easy to follow 3 the authors performed multiple experiments demonstrating that the proposed approach would work reasonably well in different scenarios and improves the accuracyinterpretability 4 scalable to realworld cases with lacking complete concept supervision weaknesses 1 lack of necessary statistical analysis some improvements do not seem to be significant it would be better to provide task and concept accuracy as well as statistical significance eg error bars in a table 2 does randint regularizer increase the training time and cost i think providing and comparison of the training cost or model sizes can also be helpful 3 have you tried applying randint regularizer to boolcbm fuzzycbm or other baselines based on the provided results in figure 6 it seems that a lot of improvements are due to adding the randint regularizer for a fair comparison i would like to see if adding that to the existing methods can improve their performance as well the authors discussed the limitations of the proposed method in section 6 docsepthe paper extends the concept bottleneck method by generating two embedding vectors for each concept one representing the embedding when the concept is active while the other representing when the concept is inactive these two embedding are linearly combined through a scoring function similar to the gating mechanism the produces a probability score of which embedding is to be used the concept embedding vectors are then used in the downstream task for prediction this extension increase the model capacity to encode more information about concepts strength easy to read simple extension weakness constraining the architecture of all other baselines as similar to the architecture of the proposed method seems to be unfair i still can not see how this model can be used in scenarios where incomplete concepts exist in particular in eq 1 how can the model be trained without a full supervision of concepts the information bottleneck metric in section 4 is a bit unclear more detailed explanation would be preferred no docsepconcept bottleneck models implicitly learn to explain the downstream tasks in addition to learning how to perform them however these models forgo predictive performance on the downstream tasks the authors propose concept embedding models cems a novel family of concept bottleneck models that address this issue strengths the proposed techniques perform better than cbms and their variants on downstream tasks concept representation learned by cems is more aligned to the ground truth concepts and successfully captures the semantics of the images weaknesses cems assume that the datasets contain annotations of concepts which is not valid in practice and are often quite expensive to obtain i would encourage the authors to list the limitations of the proposed approach docsepthis paper introduces a model named concept embedding model cem based on concept bottleneck model cbm architecture compared to cbm cem contains an embedding generator layer that considers two embedding representations one for activate and one for 
inactivate and then produces an embedding representation for one concept results show that the model produces high task accuracy and interpretability at the same time compared to cbmfamily models strength this paper tries to solve a challenging research question to design xai models which are good at task performance and interpretability the authors conduct experiments on multiple datasets and evaluate different models using different metrics weakness 1 the proposed model is not thoroughly studied for instance the relationship between chat and chat this is the novelty of the proposed model compared to cbm for instance for one concept do these two embeddings represent opposite concepts 2 there is some unclearness about the baseline models in the paper for example the cem uses m16 to represent one concept in chat bottleneck what is the dimension of chat for cbms in appendix a5 it says k m 1 for hybridcbm does it mean that the dimension of chat is k m 1 dimension why not km as in cem 3 the lack of justification for the proposed cas in fig3 the baseline model no concepts has an even better score than booleancbm and fuzzycbm 4 the qualitative results are not very convincing fig5 c and appendix fig 5 show the samples and their nearest neighbors for one concept however it does not reflex information about the concept and for a welltrained classifier it should find out the visually similar samples based on euclidean distance in the embedding space this paper does not emphasize the advantage of using two separate concept representations chat and chat which is the novelty of the cem and does not evaluate the interpretability and user trust thoroughly
### Summary: | this paper proposes concept embedding models which learn interpretable highdimensional concept representations to exploit the tradeoff between accuracy interpretability and interventions on concepts reviewers vote for accepting this paper the authors are encouraged to further improve this work based on reviewers comments in the camera ready and put the new experiments and discussions during the authorreviewer discussion phrase into the final revision in particular the following add statistical significance test of experimental results compare training costs and model sizes better justify the proposed cas mechanism investigate the robustness of learned concepts address the fairness concerns raised by reviewers in comparison with baselines | [
... (input_ids: token-ID sequence omitted) ] |
... (attention_mask: all 1s; omitted) ] |
... (labels: token-ID sequence omitted) ] |
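The concept-embedding reviews above describe the model's core mechanism: for every concept the network emits one embedding for the concept being active and one for it being inactive, a scoring function turns the pair into a concept probability, and the final representation is the probability-weighted mixture of the two embeddings, concatenated across concepts and fed to the task head. The PyTorch sketch below illustrates only that mechanism; the shared feature dimension, the per-concept linear generators, and the single linear task head are assumptions, and the 16-dimensional embeddings simply mirror the m = 16 mentioned in one review.

```python
# Minimal sketch of the concept-embedding mechanism described in the reviews:
# two candidate embeddings per concept, a learned score selecting between them,
# and a probability-weighted mixture fed to the downstream task head.
# Dimensions and the backbone are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn

class ConceptEmbeddingLayer(nn.Module):
    def __init__(self, in_dim, n_concepts, emb_dim=16, n_classes=10):
        super().__init__()
        self.n_concepts = n_concepts
        # Per-concept generators for the "active" (+) and "inactive" (-) embeddings
        self.pos = nn.ModuleList([nn.Linear(in_dim, emb_dim) for _ in range(n_concepts)])
        self.neg = nn.ModuleList([nn.Linear(in_dim, emb_dim) for _ in range(n_concepts)])
        # Scoring function: maps the embedding pair to a concept probability
        self.score = nn.Linear(2 * emb_dim, 1)
        # Downstream task head consumes the concatenated mixed embeddings
        self.task_head = nn.Linear(n_concepts * emb_dim, n_classes)

    def forward(self, h):
        mixed, probs = [], []
        for i in range(self.n_concepts):
            c_pos, c_neg = self.pos[i](h), self.neg[i](h)
            p = torch.sigmoid(self.score(torch.cat([c_pos, c_neg], dim=-1)))
            mixed.append(p * c_pos + (1.0 - p) * c_neg)   # probability-weighted mixture
            probs.append(p)
        bottleneck = torch.cat(mixed, dim=-1)
        return self.task_head(bottleneck), torch.cat(probs, dim=-1)

if __name__ == "__main__":
    layer = ConceptEmbeddingLayer(in_dim=128, n_concepts=5)
    h = torch.randn(4, 128)   # features from some pretrained backbone (assumed)
    logits, concept_probs = layer(h)
    print(logits.shape, concept_probs.shape)  # torch.Size([4, 10]) torch.Size([4, 5])
```

With concept annotations available, a binary cross-entropy loss on concept_probs combined with a task loss on logits would correspond to the kind of joint concept/task supervision the reviews discuss.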
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper investigates how the sarah stochastic recursive gradient algorithm can be applied to trust region policy optimization the authors analyze the sarah algorithm using its approximating ordinary and stochastic differential equations the empirical performance of sarapo is then compared with svrpo and trpo on several benchmark problems although the idea of applying sarah to reduce the variance of gradient estimates in policy gradient algorithms is interesting and potentially quite significant variance of gradient estimates is a major problem in policy gradient algorithms i recommend rejecting this paper at the present time due to issues with clarity and quality particularly of the experiments not enough of the possible values for experimental settings were tested to say anything conclusive about the performance of the algorithms being compared for the values that were tested no measures of the variability of performance or statistical significance of the results were given this is important because the performance of the algorithms is similar on many of the environments and it is important to know if the improved performance of sarapo observed on some of the environments is statistically significant or simply due to the small sample size the paper also needs improvements in clarity grammatical errors and sentence fragments make it challenging to understand at times section 23 seemed very brief and did not include enough discussion of design decisions made in the algorithm for example the authors say the fisher information matrix can be approximated by hessian matrix of the kl divergence when the current distribution exactly matches that of the base distribution but then suggest using the hessian of the kl of the old parameters and the new parameters which are not the same what are the consequences of this approximation are there alternative approaches the analysis in section 3 is interesting but the technique has been applied to sgd before and the results only seem to confirm findings from the original sarah paper to improve the paper i would suggest moving section 3 to an appendix and using the extra space to further explain details and conduct additional simpler experiments additional experiments on simpler environments and policy gradient algorithms reinforce reinforce with baseline would allow the authors to try more possible values for experimental settings and do enough runs to obtain more conclusive results about performance then the authors can present their results applying sarah to trpo with some measure of statistical significancedocsepthe paper extends sarah to policy optimization with theoretical analysis and experimental study 1 the theoretical analysis under certain assumption seems novel but the significance is unknown compared to similar analysis 2 the analysis demonstrates the advantage of sarah over svrg as noted in remark 1 it would be better to give explicit equations for svrg in order for comparison 3 experimental results seem to show empirically that the sarah is only comparable to svrg 4 presentation needs to be improved docsepthis paper proposes a new policy gradient method for reinforcement learning the method essentially combines sarah and trust region method using fisher information matrix the effectiveness of the proposed method is verified in experiments sarah is a variance reduction method developed in stochastic optimization literature which significantly accelerates convergence speed of stochastic gradient descent since the policy gradient often 
suffers from high variance during the training a combination with variance reduction methods is quite reasonable however this work seems to be rather incremental compared to a previous method adopting another variance reduction method svrg xu2017 papini2018 moreover the advantage of the proposed method over svrpg svrg policy gradient is unclear both theoretically and experimentally papini2018 provided a convergence guarantee with its convergence rate while this paper does not give such a result it would be nice if the authors could clarify theoretical advantages over svrpg minor comment the description of svrg updates in page 2 is wrong the notation of h in section 31 ode analysis is not defined at this time
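for reference a minimal sketch of the two gradient estimators as they are usually written in the variance reduction literature which is the comparison the reviewers ask for here f_i denotes the loss on the i-th sample i_t the index drawn at step t and the exact minibatch and snapshot choices in the reviewed paper may differ svrg corrects a stochastic gradient with a full gradient computed at a fixed snapshot point \tilde{w}

v_t = \nabla f_{i_t}(w_t) - \nabla f_{i_t}(\tilde{w}) + \nabla f(\tilde{w})    (svrg)

while sarah builds its estimate recursively from the previous iterate starting from a full gradient

v_0 = \nabla f(w_0), \qquad v_t = \nabla f_{i_t}(w_t) - \nabla f_{i_t}(w_{t-1}) + v_{t-1}    (sarah)

the recursive correction in place of a fixed snapshot is the main structural difference between the two estimators and is what the comparison in remark 1 of the reviewed paper would need to spell out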
### Summary: the use of sarah for policy optimization in rl is novel with some theoretical analysis to demonstrate convergence of this approach however concerns were raised about the clarity of the paper the empirical results and the placement of this theory relative to a previous variance reduction algorithm called svrpg the author response similarly did not explain the novelty of the theory beyond the convergence results already given by the paper on svrpg by incorporating some of the reviewer comments this paper could be a meaningful and useful contribution
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a metalearning mechanism to address current generalization limitations in dynamics forecasting works in which the majority of works learn deep learning models capable of capturing one dynamics at a time to this end they propose dyad a twostaged approach in which an encoder learns the timeinvariant taskspecific hidden features of a given observation sequence and influences a forecasting network that aims to learn the shared dynamics of the heterogeneous domain a timeinvariant loss function for the encoder is proposed to encourage timetask disentanglement between encoder and forecaster leveraging weaksupervision from taskspecific parameters they leverage styletransfer adaptive instance normalization for forecaster adaptation and propose a novel encoderinfluenced padding layer called adapad to address unknown boundary errors theoretical proofs are provided to quantify loss term tradeoffs in the generalization error and show that there is a relationship between error and task relatedness for the source and target domains ablations are provided for each proposed model component and are evaluated on three physical dynamics tasks including synthetic flows and realworld sea surface temperature and ocean current tasks strengths this work tackles a novel task for dynamic systems through the application of metalearning to enable learning a set of dynamics functions with shared underlying mechanics under one model it provides ample discussion on common failings of learning dynamical systems and proposes fixes to address them it is placed well within relevant literature and indeed tackles an insofar uncommon setting the architecture choice of a taskfeature encoder that adapts a forecasting network per task via interlayer controlled padding is intuitive and the rigorous ablations to support the addition of each component strengthens the work ample baselines are provided from classical video prediction models to applied metalearning methods used as comparisons the provided code in the supplementary material is clean to run and has reproducible results regarding baselines and proposed model the writing and presentation is clear with little confusion on details concerning architecture or the training setting the network components are explained well and effectively leverages visualizations in its presentation minor code cleanup readmemd a note regarding the base size of the dataset 150gb from datagenerationpy and how to generate smaller testing versions would help for more approachable evaluation of the provided code there are a variety of unused imports throughout all of the provided scripts which should be cleaned up to reduce unneeded dependencies datagenerationpy has mkdir errors given no path checking for output folders on subsequent runs requires a local module import for phigeom sphere not already included variable resolution is undefined and needs removing minor writing cleanup line 623 backpropogation the authors properly address limitations in the claims of their proofs and what information should be gleaned from them additionally limitations in the metrics used to support evaluation are discussed in the appendix potential negative societal impact are not discussed in the work and is denoted as such in the authors checklist docsepthe paper proposes a method to predict dynamical systems when system parameters can differ between training and evaluation first a time invariant network is trained to predict the dynamical invariants which could be the number of vortices in a flow or 
system parameters then a prediction network is trained to forecast the system dynamics taking the dynamic invariants as an input strengths interesting and innovative idea i may not be aware of prior work using this approach good experimental results although maybe a bit limited datasets good and clear presentation weaknesses one potential weakness of the method is that one needs access to the invariants for training docsepa physics informed metalearning architecture is introduced to model different kinds of dynamics an encoder generates a timeinvariant latent state hatz representing physical properties of the observed dynamics in combination with the dynamics field a decoder takes this latent state to condition its prediction of the next dynamic steps originality strengths the adapad layer looks intriguing as it allows the boundary condition bc to depend on x in contrast to traditional padding methods zeroconstant or mirror adapad thus seems able to account for dynamic bcs such as neumann and cauchy however it might help if it additionally receives x as input have the authors experimented with this fairly simple but highly efficient and well designed architecture to model different kinds of dynamics with one and the same forecaster quality strengths overall very detailed and well motivated model architecture data description and evaluation clear demonstration of successful encoding of physically relevant factors figure 6 exhaustive ablations showing the relevance of the different introduced network components weaknesses apart from the ablations in table 2 and in the appendix it would be informative to see the effect of different choices of m clarity and significance strengths superiority of the model shown on both synthetic and realworld datasets weaknesses im not fully clear about the conclusion of the proof and would appreciate an intuition about the result and what it actually means adapad operator not explicitly evaluated on changing boundary conditions limitations not addressed by the authors id be curious to learn about situations where this model might have difficulties
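to make the adaptation mechanism concrete here is a minimal pytorch-style sketch of adaptive instance normalization conditioned on a time-invariant task code z the class name tensor shapes and layer sizes are illustrative assumptions and not the authors code and adapad the encoder-conditioned padding would be a separate layer that predicts boundary values from z in a similar spirit

```python
import torch
import torch.nn as nn

class AdaINConditioning(nn.Module):
    """rescale forecaster feature maps with a per-task scale and shift
    predicted from the encoder's time-invariant code z (adain-style)"""
    def __init__(self, num_channels: int, z_dim: int):
        super().__init__()
        self.norm = nn.InstanceNorm2d(num_channels, affine=False)
        self.to_scale_shift = nn.Linear(z_dim, 2 * num_channels)

    def forward(self, h: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        # h: (batch, channels, height, width) features inside the forecaster
        # z: (batch, z_dim) time-invariant task code produced by the encoder
        gamma, beta = self.to_scale_shift(z).chunk(2, dim=-1)
        gamma = gamma[..., None, None]  # broadcast over the spatial dimensions
        beta = beta[..., None, None]
        return gamma * self.norm(h) + beta
```

one design consequence of this kind of conditioning is that only the small scale-and-shift head varies across tasks which matches the papers framing that the forecaster itself captures the dynamics shared across heterogeneous domains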
### Summary: this work proposes a modelbased metalearning method to forecast physical dynamics the proposed approach is able to generalize across heterogeneous domains as demonstrated in convincing sets of experiments the reviewers found the work to be well motivated clear and selfcontained the authors justified the proposed model architecture and the ablation studies conducted showed the importance of the network components the authors also provided an adequate description of the data and the evaluation strategy as well as theoretical guarantees on the generalization error in several settings
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a new dataset for the chemistry domain as a realworld qa task the authors collected chemistry questions from a web page and used crowdsourcing to annotate the questions with labels and conditions required for solving them as baseline models the authors propose an endtoend neural network model and a symbolic solver that uses predefined rules they demonstrate that the neural model struggles to solve the task but the symbolic solver outperforms it with the predefined rules that cover only a part of predicates in the dataset arguing that their dataset is challenging and can foster the research of realworld problems in the chemical domain pros the paper proposes a new dataset chemistryqa that consists of 4500 questions with labels of target variables and conditions provided in the questions the paper proposes a graphsearch based solver with an extraction mechanism which is carefully implemented for this task appendix has thorough lists of the predicates and units in the dataset and functions used in the baseline solver cons overall i think that the writing of the paper can be improved more there are some typos and formatting issues which reduce the paper strength besides in sections 21 22 and 321 the paragraphs refer to figures and tables in appendix this seems to violate the submission instructions my primary concern is on the quality of collected questions in section 22 the authors say that they performed some checkandverify mechanism in the first batch which should be described in detail some related questions did the author compute the interannotator agreement on sampled questions what kind of rules did you define for the verification how many workers did work on the annotation in total did the same pool of workers work on the annotation for the first batch and the subsequent batches is there actual human performance on the collected questions seemingly it is not guaranteed that posted questions on the web page are reasonably answerable the purpose of employing two baseline models is not explained well in the introduction the authors say that to verify the dataset is consistent with the purpose of evaluating ai comprehensive capability however their hypotheses for the experiments are not clearly stated it seems that the authors stopped implementing the predefined functions for the symbolic solver at the point where the solver outperforms the neural network model the authors could have implemented more but it is not clearly explained why they only implemented the predefined functions for the 355 predicate coverage what would happen if there are a larger number of functions implemented comparison with existing datasets can be elaborated more for example because the option answer type should include different types of entities and the option is just a form directly comparing it with value formula and equation does not make sense to me in the error analysis 18 of cases are classified into other reason which does not look like a small number to me can the authors break this down into more detailed categories typos section 21 it this website in this website cite devlin et al 2019 for using bert there is an illformatted cell in table 4 there is inconsistent use of decimal commas 4500 vs 4418 add whitespace before citations eg in section 24 docsepstrengths a new qa dataset for chemistry qa consisting of 4500 questions and covering 200 topics crowdsourced annotation of variable and conditions weaknesses strong baseline results are missing intermediate steps is missing in the 
annotations which are really helpful in training an endtoend model the topic distribution is missing from the paper experimental results and analysis is not enough overall the idea of curating and annotating a new dataset for chemistry qa dataset is good i feel a stronger baseline would have helped much in understanding and analysing the quality dataset and annotations also the question complexity analysistopic distribution is missing overall the paper writing could be improved a lot in the current version it is difficult to follow question if the whole problem can be converted into a set of triples and conditions they why not use graph based qa techniques it will be interesting to see how neural symbolic machinesneural module network perform on this dataset topic distribution question type distribution etc are missing any specific reasons for using 12 layers of transformer encoders and 6 layers of transformer decoders in extractor plus graph search based solver which graph search algorithm is used in section 322 and table 5 typos it this website on this website docsepsummary this paper proposes a new dataset based on textbook classroom chemistry questions for complex knowledge retrieval and aggregation the authors scrape several thousands questions from online repositories and add additional natural language annotations signifying the quantities to be solved for in each question as well as the declarative knowledge two baselines one endtoend neural and another symbolic both fail at this dataset strengths the dataset targets the important question of how to build models that can retrieve knowledge while performing complex reasoning weaknesses asis the dataset fails to target the knowledge retrieval componentmodels are either expected to magically know how to calculate the answer or use hardcoded functions that complete a graph of values the neural baseline also seems a bit nonstandard raising questions of how well modern systems can actually do on the task furthermore the endtoend neural system is disadvantaged in that it likely has not seen much chemistryrelated content during finetuning whereas the symbolic baseline has access to a host of humandefined functions furthermore dataset quality is a bit difficult to assess without more samples recommendation 3 this benchmark is motivated by the lofty goals of encouraging the development of models that can combine knowledge retrieval complex reasoning and language understanding however its unclear to this reviewer whether it will prove useful in making progress towards such goalstheyre too conflated to be meaningfully evaluated within this context to improve the benchmark and make it more amenable toward advancing those research goals versus just being a difficult datasets that current models cannot handle id recommend explicitly targeting and evaluating this knowledge retrieval component as well for instance given a specific knowledgebase thats guaranteed to span the facts necessary to answer the questions how well can a model 1 retrieve relevant information and 2 use such relevant information to answer questions questions chemical calculation problems cannot be solved by endtoend neural networks since complex symbolic calculations are required this is a hypothesisthere are many tasks where complex symbolic calculations are required but endtoend networks excel what extent of knowledge is required to solve this task for instance many old semantic parsing datasets came with databases and it was guaranteed that within the database an answer 
would occur what would a corresponding knowledge graph for this case look like and how complex would it be unlike similar tasks annotation we cannot collect all the atomic operations needed before starting annotation since the set of chemical operators is not closed the set of mathematical operators is also not closed eg in math word problems why is this approach better than collecting all the operations represented in the dataset even if it doesnt cover all of the operations that one could conceivably see the annotation interface process looks quite regularyou arent expecting too much variation from the templates given given that you can help crowdworkers with these templates why not just use these templates as the baseline for a formal meaning representation that would encompass the knowledge needed for the task can you give more details about the annotation process beyond the short paragraph near the end of section 22 we employed crowdsourcing for this annotation workaround 336 hours id be surprised if any crowdworker could label this sort of data well what quality control filters did you put in place can we see more random samples of the dataset so we can better assess its quality endtoend solver where did you get this model architecture from such that this method represents a class of powerful neural networks which achieved stateoftheart performance on many qa tasks ive never seen bert used in a seq2seq setting like this instead people tend to use models trained on inputoutput pairs like bart or t5 id like to see how this compares to using bart or t5 since its not clear that the bert initialization would be good for generation graphsearch based solver the need to implement specific functions 78 in this case is significant and undermines the point of this dataset in my opinion theres no inherent value in learning to solve chemical equations wellthe hope is that in the process of doing so well get modeling insights into what methods work well and can be generally applied to other knowledgeintensive tasks this graphsearch based solver seems narrowly scoped to chemistryqa and difficult to adapt to other tasks and its not entirely clear why we should value its results tokenlevel accuracy is it guaranteed that the output of the graphsearch based solver will be the same length as the gold output how else how is tokenlevel accuracy computeddocseppaper summary this paper presents a question answering dataset called chemistryqa it is different from existing datasets in that chemistryqa requires open knowledge and complex solving processes it provides triplet like extraction annotation which isolates language understanding and domain knowledge experimental results show that a neural encoderdecoder model and an extractorplussolver do not work well strengths the dataset contains realworld qa that requires the ability to perform complex chemistry calculation and reasoning it is difficult for crowdsourcing workers to generate such complex questions the authors proposed a novel annotation method that target variables and conditions are labeled in a triplelike format weaknesses the dataset seems small to acquire the ability to perform complex calculation and reasoning the training validation and testing datasets consist of 3433 485 and 500 questions respectively the paper does not show statistics of the dataset such as the average length of questions and answers and the unique number of answers the paper does not show the performances broken down by question types although the endtoend solver achieves 
an answer accuracy of 0164 i think it is important to show more detail on what it can and cannot do the authors uses a pretrained bert as the encoder of the endtoend solver and trained the decoder from scratch i think pretrained encoderdecoder models such as t5 and bart are better as the baselines of the endtoend solver than the model used in this paper review summary the paper is well motivated chemistryqa can be a useful dataset to evaluate the ability of chemistry calculation and reasoning while the dataset seems small to acquire the ability i think it can benefit a lot with a more comprehensive analysis of evaluation results of baselines
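as a concrete illustration of the triple-style annotation and the graph-search solver that the reviews discuss here is a small sketch the annotation schema the operator interface and the worked numbers are invented for illustration and are not taken from the paper

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional, Tuple

Key = Tuple[str, str]  # (entity, property), e.g. ("nacl", "mass")

# hypothetical annotation for one question: a target variable plus given conditions
example = {
    "question": "what mass of nacl is produced from 2 mol of na with excess cl2",
    "target": ("nacl", "mass"),
    "conditions": {("na", "moles"): 2.0, ("nacl", "molar_mass"): 58.44},
}

@dataclass
class ChemOp:
    needs: Tuple[Key, ...]                      # quantities the operation consumes
    produces: Key                               # quantity it derives
    apply: Callable[[Dict[Key, float]], float]  # how to compute it

# two toy operations: a 1:1 stoichiometric mole ratio, then moles times molar mass
ops = [
    ChemOp(needs=(("na", "moles"),), produces=("nacl", "moles"),
           apply=lambda known: known[("na", "moles")]),
    ChemOp(needs=(("nacl", "moles"), ("nacl", "molar_mass")), produces=("nacl", "mass"),
           apply=lambda known: known[("nacl", "moles")] * known[("nacl", "molar_mass")]),
]

def graph_search_solve(target: Key, conditions: Dict[Key, float], operations) -> Optional[float]:
    """forward-chaining search over a value graph: apply any operation whose
    inputs are already known until the target appears or nothing new is derived"""
    known = dict(conditions)
    progress = True
    while progress and target not in known:
        progress = False
        for op in operations:
            if op.produces not in known and all(k in known for k in op.needs):
                known[op.produces] = op.apply(known)
                progress = True
    return known.get(target)

print(graph_search_solve(example["target"], example["conditions"], ops))  # 116.88
```

the papers actual solver pairs hand-written operations of this kind 78 of them according to the reviews with a learned extractor that fills in the target and condition triples from the question text which is exactly the part the reviewers would like to see isolated and evaluated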
### Summary: the authors propose a new dataset chemistryqa which has complex questions requiring scientific and mathematical reasoning they show that existing sota models do not perform well on this dataset thereby establishing the complexity of the dataset the reviewers raised several concerns as summarised below 1 writing is not very clear 2 the quality of the dataset is hard to judge as some crucial information about the dataset creation process is missing 3 the size of the dataset is small 4 some stronger qa baselines need to be included unfortunately the authors did not provide a rebuttal hence in its current form this paper cannot be accepted
10186,
3237,
275,
253,
5793,
5028,
50276,
856,
84,
50275,
783,
2929,
29328,
247,
747,
10895,
18090,
31569,
326,
8414,
273,
5329,
361,
3533,
342,
13301,
273,
2303,
4903,
285,
2515,
2530,
275,
253,
3533,
50276,
783,
2929,
29328,
247,
4216,
8716,
1754,
47037,
342,
271,
11998,
5122,
534,
310,
9257,
9009,
323,
436,
4836,
50276,
50237,
556,
11080,
10894,
273,
253,
2063,
31290,
285,
5085,
275,
253,
10895,
285,
3470,
908,
275,
253,
8245,
47037,
50276,
5040,
50275,
1189,
455,
891,
1158,
326,
253,
4028,
273,
253,
2929,
476,
320,
5520,
625,
627,
403,
690,
963,
993,
285,
33907,
3374,
534,
4796,
253,
2929,
4757,
16280,
275,
7118,
3127,
3307,
285,
33251,
253,
33295,
3730,
281,
8442,
285,
7180,
275,
30762,
436,
3133,
281,
20835,
253,
19529,
7997,
50276,
2577,
3625,
4468,
310,
327,
253,
3290,
273,
5728,
3533,
275,
2593,
3307,
253,
4477,
1333,
326,
597,
2684,
690,
2451,
395,
36302,
5122,
275,
253,
806,
14604,
534,
943,
320,
2529,
275,
2508,
690,
2905,
3533,
50274,
14958,
253,
2488,
11897,
253,
734,
11423,
1080,
4345,
327,
19958,
3533,
50274,
5371,
2238,
273,
4803,
858,
368,
4853,
323,
253,
21999,
50274,
5430,
1142,
5820,
858,
789,
327,
253,
22581,
275,
2264,
50274,
14958,
253,
1072,
6363,
273,
5820,
789,
327,
253,
22581,
323,
253,
806,
14604,
285,
253,
6774,
39657,
50274,
261,
627,
4588,
1966,
3045,
327,
253,
5728,
3533,
16907,
352,
310,
417,
16293,
326,
9269,
3533,
327,
253,
4384,
3239,
403,
12054,
3662,
494,
50276,
783,
4096,
273,
19693,
767,
8245,
3210,
310,
417,
5544,
973,
275,
253,
10199,
253,
4477,
1333,
326,
281,
12654,
253,
10895,
310,
5185,
342,
253,
4096,
273,
16344,
23105,
11088,
14603,
2299,
616,
24316,
323,
253,
4679,
403,
417,
4518,
4767,
50276,
262,
3133,
326,
253,
4477,
6331,
16994,
253,
41364,
3470,
323,
253,
24762,
47037,
387,
253,
1127,
835,
253,
47037,
41731,
13015,
253,
11454,
2990,
1566,
253,
4477,
812,
452,
9009,
625,
533,
352,
310,
417,
4518,
5544,
2139,
597,
760,
9009,
253,
41364,
3470,
323,
253,
26033,
29524,
7031,
752,
651,
5108,
604,
627,
403,
247,
4067,
1180,
273,
3470,
9009,
50276,
47109,
342,
5368,
15302,
476,
320,
50221,
625,
323,
1650,
984,
253,
4500,
3662,
1511,
943,
2486,
1027,
3510,
273,
14429,
285,
253,
4500,
310,
816,
247,
830,
3587,
10941,
352,
342,
1318,
7212,
285,
5150,
1057,
417,
1056,
3282,
281,
479,
50276,
249,
253,
2228,
1783,
1283,
273,
2219,
403,
10509,
715,
643,
1921,
534,
1057,
417,
1007,
751,
247,
1355,
1180,
281,
479,
476,
253,
4477,
2740,
436,
1066,
715,
625,
7000,
9050,
50276,
555,
993,
50275,
4674,
3127,
352,
436,
4422,
50276,
249,
436,
4422,
50276,
41766,
1474,
3642,
1162,
355,
6247,
323,
970,
270,
797,
50276,
9088,
310,
271,
2853,
630,
19822,
894,
275,
2829,
577,
50276,
9088,
310,
16706,
897,
273,
14492,
764,
284,
5329,
361,
4632,
7127,
1093,
50276,
1911,
19991,
4511,
1078,
30404,
24088,
275,
2593,
2164,
5474,
33032,
296,
3755,
20556,
50276,
66,
747,
2805,
66,
10895,
323,
18090,
2805,
66,
11253,
273,
5329,
361,
3533,
285,
10985,
1052,
12989,
24597,
47549,
22581,
273,
4778,
285,
2515,
50275,
20881,
1255,
265,
50275,
9072,
8245,
1543,
403,
5816,
10444,
5018,
310,
5816,
275,
253,
31825,
534,
403,
1663,
9371,
275,
3733,
271,
990,
936,
423,
1566,
50276,
783,
9400,
3268,
310,
5816,
432,
253,
2929,
5661,
1543,
285,
1783,
310,
417,
2217,
50275,
1189,
455,
253,
2934,
273,
1095,
839,
285,
12182,
839,
247,
747,
10895,
323,
18090,
2805,
66,
10895,
310,
1175,
891,
1928,
247,
10046,
8245,
651,
452,
6518,
1199,
275,
4685,
285,
5127,
272,
253,
3290,
10895,
285,
31825,
671,
253,
1953,
10454,
5127,
382,
6361,
3268,
310,
5816,
4583,
253,
2929,
4028,
812,
320,
5520,
247,
2257,
275,
253,
1655,
2715,
352,
310,
2834,
281,
956,
50276,
19751,
604,
253,
2644,
1895,
476,
320,
11516,
715,
247,
873,
273,
1195,
1868,
285,
2515,
597,
2139,
417,
897,
4216,
1754,
2805,
66,
5609,
50276,
262,
588,
320,
4722,
281,
923,
849,
11454,
24762,
10679,
570,
1546,
6333,
2990,
1347,
327,
436,
10895,
50276,
24841,
3268,
1953,
1511,
3268,
3966,
403,
5816,
50276,
1279,
2173,
4606,
323,
970,
1249,
8090,
273,
39707,
2349,
351,
398,
285,
721,
8090,
273,
39707,
1086,
351,
398,
275,
4908,
263,
5043,
4216,
3186,
1754,
47037,
50276,
4609,
4216,
3186,
5933,
310,
908,
275,
2593,
31619,
285,
2829,
608,
50275,
555,
993,
50276,
262,
436,
4422,
50275,
251,
436,
4422,
5474,
339,
793,
360,
3454,
436,
2929,
29328,
247,
747,
10895,
1754,
327,
40554,
50276,
2437,
4461,
18090,
3533,
323,
2570,
3640,
25064,
285,
20828,
253,
4477,
49939,
2067,
6763,
3533,
432,
3909,
43445,
285,
823,
3081,
3626,
3448,
31825,
861,
5411,
253,
13483,
281,
320,
14042,
323,
275,
1016,
1953,
347,
973,
347,
253,
18600,
800,
3640,
767,
1666,
25379,
581,
990,
936,
423,
11454,
285,
1529,
24762,
1097,
1891,
387,
436,
10895,
50276,
296,
3755,
20556,
253,
10895,
8571,
253,
1774,
1953,
273,
849,
281,
1973,
3210,
326,
476,
19553,
3640,
1223,
9591,
2570,
14720,
50276,
20881,
1255,
265,
347,
261,
253,
10895,
10224,
281,
2303,
253,
3640,
25064,
4445,
19286,
403,
2057,
3264,
281,
4231,
1037,
871,
849,
281,
10173,
253,
3662,
390,
897,
1892,
38059,
3470,
326,
3426,
247,
4216,
273,
2193,
253,
11454,
8245,
671,
3133,
247,
2372,
1327,
15291,
12976,
3533,
273,
849,
973,
4980,
2718, 476, 2686, … (tail of the row's input_ids column: token-id array, remaining values omitted)
] | [
1, 1, 1, … (attention_mask column: 2,048 entries, all ones)
] | [
10186, 3237, 275, … (labels column: 2,048-entry token-id array, identical to the input_ids column; values omitted)
] |
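The three numeric columns above (input_ids, attention_mask, labels) are just the tokenized form of a row's Input and Output text; the same three arrays follow every row in this dump. A minimal sketch of how such a row could be built is shown below. The tokenizer checkpoint and the causal-LM convention of copying input_ids into labels are assumptions on my part, inferred from the 2,048-entry arrays and the all-ones masks, not something stated in this dump.

```python
from transformers import AutoTokenizer

# Assumed tokenizer; the dump does not say which model produced these ids.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-1b")

def build_row(input_text: str, output_text: str, max_len: int = 2048) -> dict:
    """Tokenize a review/summary pair into the columns shown in this dump."""
    enc = tokenizer(input_text + output_text, truncation=True, max_length=max_len)
    return {
        "Input": input_text,
        "Output": output_text,
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
        "labels": list(enc["input_ids"]),         # causal LM: labels mirror input_ids
    }
```

Rows built this way could then be assembled into a dataset (for example with datasets.Dataset.from_list) for fine-tuning; again, this is an illustrative reconstruction, not the pipeline that produced the dump.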
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a method to learn neural radiance fields that represent the underlying scene free of reflective components in the scene ie explicitly represented the transmitted regions of the scene prior work in representing transmitted radiance field relies on reflection removal from the input image sequence however this is a challenging problem and typically results in photometric inconsistencies the proposed method uses a novel formulation leveraged on the observation that reflective components in the radiance field are sparser than the transmitted components a patchbased rendering scheme is used to handle the local characteristics of reflectivetransmissive components strengths the paper is well written and the exposition is clear the paper provides a through introduction and a motivation for the solution before properly explaining the proposed solution as such i find the paper to be a useful contribution to the community and beneficial for the reader the use of the transmission encoder with pyramidscale features is interesting and the choice of wg and wl is properly motivated the recurring edge constraints are the core strength of the paper and the description provided in section 42 is succinct the qualitative and quantitative results in the paper and supplemental material clearly demonstrates that the transmitted radiance field is captured free form noise due to reflection weaknesses the authors rightly point out that weighting coefficients are dependent on several factors the viewing direction wrt lights in the scene and the camera position are correlated and more discussion is warranted on whether an mlp that encodes the weighting coefficients is sufficient in general yes the authors discuss the limitations of the work docsepthis paper targets to solve the novelview synthesis problem with reflection removal that is novelview synthesis of a transmitted object from images corrupted by reflections a naive baseline that applies reflection removal techniques to each input image before nerf does not solve the problem as the resultant image would not be multiview consistent this is because most reflectionremoval techniques cannot take advantage of multiple viewpoints this paper solves this problem by introducing 1 transmission feature integration and 2 recurring edge constraints first transmission feature integration is based on the idea of pixelnerf that the feature from other viewpoints can assist the training and the paper used transmission feature instead of the vanilla pixel feature in pixelnerf second recurring edge constraints are based on the assumption that a reflected component is sparse in its presence in the aligned image the paper also collected a new dataset for real multiview images corrupted by reflections and the proposed method shows promising results strengths promising results the proposed method shows promising results on real multiview images corrupted by reflections the comparison with other methods such as nerf nerfw and rr nerf also shows that the proposed method performs superior both qualitatively and quantitatively especially when the number of input images is limited new dataset of multiview images with and without reflections the paper shows the newly collected multiview images which can facilitate further research on multiview reconstruction and reflection removal weaknesses while the paper proposes an interesting method with promising results there are some weaknesses that can be improved the presentation of the manuscript can be improved there are some 
ambiguous definitions or explanations line 123 what is transmission and reflection entanglement if it means transmission and reflection have an inherent ambiguity then the proposed method cannot disambiguate either due to the absorption reflection and refractive effect should be further clarified motion inconsistency the terminology motion inconsistency used frequently all around the paper including the abstract used for recurring edge constraints is somewhat misleading the key idea used for recurring edge constraints is that the reflected component may not exist in some viewpoints and thereby have a sparse presence the reason for this phenomenon is the size of the reflector is limited which causes the reflected object to be outside the reflector and disappear in some viewpoints it has nothing to do with motion and thus the term motion inconsistency is not the appropriate term to describe the method maybe the reflected object is at a different depth from the transmitted object and moves differently in the image eg larger disparity when it is further but it is not the information that the proposed method directly uses the description in the main paragraph line 187 is already clear so just choosing a better terminology would improve the clarity of the proposed method line 210 what is psi the notation seems to be not defined some important details about the transmission feature are missing what network is used for feature w from line 155 i assume the network is based on errnet but it is difficult to see which part of the errnet is used as there are many components in the errnet line 162 is not enough for understanding the exact structure also line 232 explains the pretraining of the transmission encoder briefly and it is somewhat confusing if the method is different from the original errnet the network structure and the training detail needs to be added to the supplemental material missing baseline a baseline that might be interesting is missing that is rr pixelnerf without transmission feature one of the main contributions of this paper is using the transmission feature which is the combination of 1 reflection removal and 2 pixelnerf assist the training of nerf if these two parts are divided into the reflection removal part and the pixelnerf part it can be another baseline of rr pixelnerf which will be a more fair and interesting baseline rec has a limited performance at least quantitatively the second main contribution of this paper is using recurring edge constraints rec but the effect of rec seems to be marginal quantitatively as shown in the ablation study table 2 the psnr without rec is 2248 which is almost the same as that of the complete model 2275 it would be interesting to see how rec works in more challenging data the questions in the above section questions include some limitations that are not handled in the paper nonplanar reflector and large reflector the proposed method may not work for those cases of reflectors docsepthis paper proposes a novel neural radiance field rendering method that is dealing with specular reflection on the objects surface the proposed method aims at recovering only the transmission radiance behind the reflection to that end this paper proposes to prepare two dedicated networks ie tmlp and rmlp to learn the transmission features and reflection features this is achieved by applying a single image reflection removal method to the training data to separate the background and the reflection the learned transmission and reflection color radiance are then 
combined in a convex combination in addition in order to guide the learning of background highfrequency details this method also applies recurring edge constraints which utilize the observation that background edges appear consistently in multiple different views strengths 1 this paper is generally wellwritten with clear motivation in the introduction section it clearly defines the current problem and challenge left by existing nerfbased methods which is the reconstruction of scenes behind the transparent surfaces with specular reflection 2 the comprehensive experiments show that the proposed method consistently outperforms the stateoftheart methods by a considerable margin in both qualitative and quantitative evaluations 3 this paper proposes a new nerf purpose dataset which is particularly focusing on the scenes behind the specular reflection the proposed dataset may impose a strong impact on future research in this area weakness 1 i found the performance comparison with respect to baseline method mvsnerf is a bit unfair because the selected baseline methods are not designed to deal with reflection and hence it tends to predict the reflected scene as is therefore the quantitative psnr results are much worse than the proposed method as expected especially in figure 3 mvsnerf almost reconstructs the exact appearance of the target view 2 for a nerf method it is also important to know the performance of the proposed method applied to normal nonreflective scenes otherwise the usage of the proposed method is just limited to reflective scenes in the submitted paper and supplementary material all the examples and benchmark data are performed on the scenes with reflection the authors are suggested to provide more comparison quantitative and real normal scene examples in the rebuttal period 3 what is the processing speed and the network complexity of the proposed method compared to baseline methods in order to prove the effectiveness of the proposed method it is crucial to verify that the performance gain is not coming from the extra number of parameters in the network as well as the preprocessed edge map and reflection purged features 4 from the ablation study the recurring edge constraints rec only bring in very little improvement but it is considered as one of the two contributions in the method section it seems that the proposed method is not very effective 5 it is true that the proposed method outperforms other baselines on the reflective nerf dataset by a large number however the method itself is quite straightforward with limited novelty it is critical to understand the effectiveness of the proposed method by providing the performance comparison on normal datasets and hence prove the validity of the proposed method the limitation of the proposed method is to apply it to any normal scenes or nerf datasets if it cannot perform well on nonreflective scenes the generalizability of the method will be the biggest limitation docsepthis paper proposes a novel view synthesis network specially designed for seethrough scenarios this paper introduces a transmission encoder which separately estimates the transmission amount against the specular highlights reflection in addition this paper introduces a recurring edge constraint to account for the frequency of edges strengths the application and approach of the transmissive scenario sound interesting to me the specular reflection on glass in the seethrough scenario has been rarely discussed in the neural rendering field yet i found that this new research 
problem is interesting existing solutions such as vanilla nerf seem to fail when there is a specular reflection in input images while the proposed method works properly weaknesses even though the motivation of the proposed method sounds interesting im not fully sure if this paper is completely developed and evaluated to solve the technical challenges specular reflection works very differently from transmission for instance when the camera motion occurs the specular reflection and transmitted image move in opposite directions about the depth position of glass surfaces the proposed model doesnt seem to account for the physical phenomenon instead it just tries to separate the transmission and reflection along the given view vector which is not physically plausible this observation should be valid from a specific view angle if the method accumulates multiple observations in a voxel grid the accurate separation cannot be achievable by increasing the number of observations i would like to hear more in the rebuttal the evaluation of this paper is one of the weakest points except for the main results shown in the teaser most results do not include strong specular reflection according to the proposed formulation of the recurring edge constraint the proposed method may work properly when there are strong contrast edges in the transmitted image the main result of the picture frame is the case in other cases the results do not include any strong specular reflection i think the results look very cherrypicking with a very small number of examples i would like to see more results to validate the performance of the proposed method limitations are clearly mentioned in the main paper
### Summary: | this paper proposes a novel neural radiance field rendering method that is dealing with specular reflection on the objects surface the authors present a novel method to solve the limitation of the existing nerfbased methods for the scenes behind the transparent surfaces with specular reflection the review results are two a7 and two ba5 after carefully checking out the rebuttals and discussions i recommend the paper to be accepted for this neurips | [
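The reviews in this row describe the underlying method as predicting a transmitted and a reflected radiance separately (the tmlp and rmlp branches) and mixing them with a learned convex weight. A small illustrative sketch of that blending step follows; the function and tensor names are my own assumptions for illustration and are not the authors' code.

```python
import torch

def blend_radiance(c_trans: torch.Tensor, c_refl: torch.Tensor, beta: torch.Tensor) -> torch.Tensor:
    """Convex combination of transmitted and reflected colour, as described in the reviews.

    c_trans, c_refl: (..., 3) colour predictions from the transmission / reflection branches.
    beta:            (..., 1) learned mixing weight, squashed into [0, 1].
    """
    beta = torch.sigmoid(beta)  # keep the combination convex
    return beta * c_trans + (1.0 - beta) * c_refl
```

The recurring edge constraint discussed in the reviews would then act as an extra loss that rewards edges appearing consistently across aligned views, on the assumption that reflected content is sparse.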
285, 253, 4327, … (input_ids column: 2,048-entry token-id array for this row; values omitted)
] | [
1, 1, 1, … (attention_mask column: 2,048 entries, all ones)
] | [
285, 253, 4327, … (labels column: token-id array identical to the input_ids column; truncated here)
273,
253,
4081,
1332,
310,
281,
4647,
352,
281,
667,
2622,
13451,
390,
38998,
71,
15302,
604,
352,
2550,
1347,
973,
327,
1327,
22697,
422,
13451,
253,
2087,
50228,
273,
253,
1332,
588,
320,
253,
5962,
12291,
50276,
7152,
33032,
2520,
2929,
29328,
247,
4460,
1859,
9066,
2990,
24443,
4158,
323,
396,
678,
903,
15216,
436,
2929,
23970,
247,
6322,
32049,
534,
11794,
8197,
253,
6322,
2408,
1411,
253,
946,
792,
16681,
12906,
275,
1635,
436,
2929,
23970,
247,
36108,
5024,
7658,
281,
2395,
323,
253,
4294,
273,
9297,
50276,
296,
3755,
20556,
50276,
783,
2898,
285,
2746,
273,
253,
811,
35407,
10076,
3590,
4722,
281,
479,
253,
946,
792,
12906,
327,
5253,
275,
253,
396,
678,
903,
10076,
556,
644,
11766,
5469,
275,
253,
11454,
18164,
1673,
2568,
891,
1119,
326,
436,
747,
2561,
1895,
310,
4722,
5368,
5482,
824,
347,
26724,
38998,
71,
1646,
281,
1891,
672,
627,
310,
247,
946,
792,
12906,
275,
3280,
3888,
1223,
253,
4081,
1332,
2987,
6283,
50275,
20881,
1255,
265,
50276,
9154,
2167,
253,
16038,
273,
253,
4081,
1332,
7835,
4722,
516,
417,
4751,
2119,
604,
436,
2929,
310,
4336,
3715,
285,
6760,
281,
8415,
253,
7681,
7881,
946,
792,
12906,
2987,
1077,
13359,
432,
6322,
323,
4227,
672,
253,
6568,
3200,
6634,
253,
946,
792,
12906,
285,
12573,
2460,
2118,
275,
7285,
10746,
670,
253,
6864,
1899,
273,
5253,
9421,
253,
4081,
1566,
36908,
1646,
281,
2395,
323,
253,
3520,
11562,
3185,
352,
816,
14177,
281,
4858,
253,
6322,
285,
12906,
2112,
253,
1677,
1859,
4972,
534,
310,
417,
13318,
21541,
436,
8310,
943,
320,
3588,
432,
247,
2173,
1859,
6907,
604,
253,
1332,
7358,
17815,
2709,
7313,
275,
247,
46092,
9860,
253,
7899,
9712,
2550,
320,
39941,
407,
3629,
253,
1180,
273,
7313,
891,
651,
751,
281,
4089,
625,
275,
253,
30080,
22559,
50275,
783,
7103,
273,
436,
2929,
310,
581,
273,
253,
5075,
383,
2792,
3707,
323,
253,
2022,
1543,
2011,
275,
253,
716,
12290,
954,
1543,
513,
417,
2486,
2266,
946,
792,
12906,
2556,
281,
253,
4081,
15895,
273,
253,
36108,
5024,
7658,
253,
4081,
1332,
778,
789,
6283,
672,
627,
403,
2266,
4499,
9297,
275,
253,
12573,
2460,
253,
2022,
906,
273,
253,
5406,
3665,
310,
253,
1083,
275,
643,
2219,
253,
1543,
513,
417,
2486,
667,
2266,
946,
792,
12906,
891,
1158,
253,
1543,
1007,
1077,
33804,
81,
12427,
342,
247,
1077,
1355,
1180,
273,
6667,
891,
651,
751,
281,
923,
625,
1543,
281,
17813,
253,
3045,
273,
253,
4081,
1332,
50276,
17465,
569,
403,
4518,
5393,
275,
253,
2022,
2929,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
4460,
11454,
1985,
5155,
1673,
18164,
1332,
326,
310,
10620,
342,
946,
792,
12906,
327,
253,
5113,
2553,
253,
4477,
1246,
247,
4460,
1332,
281,
8415,
253,
12291,
273,
253,
5368,
38998,
71,
3169,
3082,
323,
253,
13451,
3212,
253,
13955,
9421,
342,
946,
792,
12906,
253,
2278,
1543,
403,
767,
247,
24,
285,
767,
18927,
22,
846,
9257,
12669,
562,
253,
30080,
85,
932,
285,
11985,
891,
5583,
253,
2929,
281,
320,
7607,
323,
436,
5723,
2824
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper empirically studies the reason for the phenomenon that deep neural networks can memorize the data labels even the labels are randomly generated new geometric measures by replica meanfield theory are applied in the analysis the findings of the paper are interesting it shows the heterogeneity in layers and training stage of the neural net i memorization occurs in deeper layers rewinding the final layer to the early weights mitigates memorization ii when memorization happens the early layer still learn representations that can generalize iii in the training early activations stabilize first and deeper layers weights stabilize first iv near initialization the gradient is dominated by unpermuted examples i have the following questionscomments it is better to further explain the intuition of the manifold geometry metrics the current figure 1b is not very clear in manifold capacity what do p and n exactly mean is this p the number of classes as used elsewhere the paper explains that by training on permuted examples the network can learn generalizable representations at the initial training stage because the gradient ignores permuted examples but why in the later training stage the early layers and later layers show different generalization properties in general this paper carries wellorganized experiments one shortcoming is that the paper does not provide a methodology to solve the generalization problem or further theoretical analysis of the observations but the empirical discoveries are novel and can be beneficial to the deep learning community updates thanks for the authors response the modified version improves clarity i think this paper provides nice observations and initial analysis to the community and can be beneficial to future work so i recommend this paper to be accepteddocsepthe authors apply mftma to dnns trained on cifar with label noise to analyze their behaviors between generalization and memorization based on experimental results they claim that what is involved in memorization are not lower layers but higher layers this claim is convincing another claim that this is not caused by a vanishing gradient effect is plausible too im sure these results give some insights into understanding generalization and memorization by dnns questions why do the authors consider only convolutional layers not fullyconnected layers for the analyses in the experiment of rewinding individual layers the three fc layers are left untouched why is mftma the only method that can examineverify the above finding comments at the first reading i didnt understand what restored examples means and it took me a while to understand it the caption for fig a7 has an error cifar100 should be tiny imagenet docsep summary this paper investigates memorization in deep neural networks dnns authors leverage mean field theoretic geometric analysis method mftma to analyze when and where memorization occurs in a dnn through empirical analysis they show that i generalizing feature are learned initially and that memorization happen later in training mostly in the top layers ii we can mitigate memorization by rewinding the top layers parameters to earlier values they also show that their mftma metrics can highlight the phenomena of double decent finally they demonstrate that gradient descent initially ignores noisy example and focus on correctly labeled examples reasons for score i lean toward acceptance this paper makes interesting observation regarding memorization of deep network it performs a good empirical study 
which provide enough evidences for the different claims although mftma could be a better explained in the main paper pros as stated above the paper makes interesting observation regarding memorization of deep network it performs a thorough empirical study cons i found it hard to understand mftma without referring to the appendix a it would be nice to expand the explanation of mftma in the main paper in addition it would be good to further explain fig 1 b which contains a lot of information does the observation scale to larger dataset such as imagenet experiments are run for only one seed docsepthis paper analyses memorization in dnns from the lens of memorization fitting random labels and finds that it seems to happen in later layers these results are obtained using the mftma framework a manifold analysis tool testing geometric properties of individual layers the analysis also attempts to explain why such a phenomenon exists and makes a few interesting observations this paper does not propose any new algorithm but instead settles some important questions by infirming or affirming past speculation on layer behaviour found in the literature i find three particularly interesting results in this paper later layers seem to be responsible for memorization while early layers seem to converge last but consistently learn generalizing features although this may not be true for other architectures increasing the dimensionality of the network to induce double descent decreases the manifold dimensionality of the last layer this is consistent with overparameterization making everything smootherflatter and more easily disentangleable in the last layer for examples with the wrong class gradients initially vanish due to destructive interference which seems to be a driving force for the initial good generalization performance downsides of the paper the setting explored here is somewhat artificial 1 the requirement on a high enough epsilon random label proportion may not represent real use of dnns i write this having seen fig a8 this is also a common criticism of doubledescent results 2 the models trained here dont seem to exceed 40 testing accuracy again not necessarily representing real use of dnns this is a bit surprising considering even models from back in 2013 had above 60 accuracy on cifar100 although the results of the paper do not hinge entirely on it the reliance on mftma limits the interpretation somewhat while an interesting tool its not clear to me that it allows us to make strong statements about the geometry of neural networks in particular for the early layers mftma may not be able to capture the geometry of features which might still be somewhat entangled yet possess a lot of richness i have some issues with the presentation of the paper this paper does not really introduce a novel lens on generalization or significantly new ideas although id argue it formalizes existing ideas and properly tests them empirically on the value of the contribution i think having empirical evidence of the studied phenomena is valuable more so than previous speculation on them the empirical results presented here do open the door for new questions to be answered and may help focus the ongoing investigation of memorization and generalization in dnns additional comments something seems wrong with figure 2bmiddle two columns arent permuted and restored examples the same inputs x but with the corresponding y changed if this is the case then their umap should be the same the only difference between the second column 
and the third column should be the coloring of the points i presume that the figure shows a different minibatch of xs for these two columns i would highly recommend not doing so and using the exact same inputs it would be consistent with the text and the presentation eg fig 1a all figures the label fonts should be bigger from the iclr formatting guidelines use 10 point type for text and all artwork must be neat clean and legible having to zoom in and out to be able to read figures properly hurts accessibility and legibility which detracts from the quality of the paper packing text results and figures in an 8page document can be hard but synthesizing information including visual information contained in figures is an essential skill in conveying knowledge here are a few suggestions for this particular paper figure 1a seems unnecessary the text conveys these 3 concepts clearly figure 1b is important and should take the entire width of the page with legible fonts figure 2as subplots all share the same x and y axis making their naming redundant and taking up space figure 2bs column labels are also repeated needlessly taking up vertical space figure 3s x axis doesnt need individual layer name labels and could be replaced with a single layer depth label 3a and 3b also share this axis leading to wasted vertical space space that could be used to make fonts larger idem for figure 4a individual layers do not need to be named but rather the concept of layer depth can be conveyed with a properly labelled colorbar gradient 4cde could be less wide and leave more horizontal space to make fonts larger in figure 5a its not immediately clear that the x axis are individual layers the lognabla label should be on the colorbar rather than on top of the figure id also suggest flipping the x and y axis as the x axis is typically used for time this would allow there to have the three subplots side by side with a shared labelled colorbar on the right matplotlib seems to be used here see matplotlibpyplotsubplotss sharexsharey arguments for examples
### Summary: | the paper offers novel insights about memorization the process by which deep neural networks are able to learn examples with incorrect labels the core insight is that late layers are responsible for memorization the paper presents a thorough examination of this claim from different angles the experiments involving rewinding late layers are especially innovative the reviewers found the insights valuable and voted unanimously for accepting the paper the sentiment is well summarized by r2 the findings of the paper are interesting it shows the heterogeneity in layers and training stage of the neural net i would like to bring to your attention the coherent gradients paper see also r1 comment this and other related papers already discusses the effect of label permutation on the gradient norm please make sure you discuss this related work as a minor comment please improve the resolution of all figures in the paper in summary it is my pleasure to recommend the acceptance of the paper thank you for submitting your work to iclr and please make sure you address all remarks of the reviewers in the cameraready version | [
] | [
] | [
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses the problem of learning generalizable context in rl in particular it suggests learning disentangled context representation of each confounding in the environment using the proposed model domino which optimizes decomposed mi objectives it adopts the contrastive learning method when learning the disentangled context representation regarding trajectories sampled from the setting of the same confounding as positive pair and of different confounding as negative pair the authors also provide a theoretical basis for how optimizing their decomposed mi objective can make ince a tighter lower bound by alleviating the underestimation of mi by learning policy conditioning on the learned context vector domino can achieve higher generalization performance compared to both modelbased and modelfree baselines strengths the paper is well written and clear to understand using contrastive loss when learning disentangled representation of each confounding is novel and intuitive and it is intriguing to get an idea of sampling negative pairs from different episodes the experiments are comprehensive and the results are impressive weaknesses however the proof of lemma 1 and theorem 1 lacks mathematical rigor also there is some missing specific information about notations in the proof thereby undermining the clarity and soundness of the paper eg wy and e visualization of the learned context embeddings does not show how effectively each confounding is encoded yes the authors adequately addressed the limitations and potential negative social impact of their work docsepthis paper studies a contextual reinforcement learning rl setting where the environment dynamics are parameterized by independent factors which the authors refer to as confounders in each episode the underlying factors can vary they present a method for contextual metareinforcement learning rl called domino which learns to encode the rl agents current trajectory into a set of independent context vectors these independent context vectors can then be used as inputs to the transition model in modelbased rl mbrl and as an input to the policy in modelfree rl thereby providing the agent with an inferred context for the underlying environment factors in any given episode importantly their method assumes the underlying environment factors are similarly independent the main contributions of the paper are the method domino for learning independent context vectors from the trajectory and their analysis and experimental results demonstrating the favorable properties of this method including improved empirical performance against baselines learning entangled context vectors when the underlying independence assumptions are valid strengths the paper provides a simple method for improving contextaware meta rl in an environment with multiple independent factors of variation that impact the transition dynamics the method itself is clearly described this seems to be the first method to directly exploit an explicit assumption of independence among the underlying environment factors of variation the method performs well against sensible baselines importantly the method performs well against an ablation that does not learn disentangled context vectors weaknesses the reported results in the table 1 and 2 have high overlap between the authors domino and mino methods and the baselines the signficance of these results could be made clearer by reporting the results of a welch ttest between the proposed method and the baselines similarly the performance 
comparison plot in figure 1b should have error bars it should also state what method of averaging was used for the plotted values the paper can benefit from a full pass to improve the clarity of the writing there are numerous missing details about basic figures such as what measure of uncertainty is represented by the error bars for each plot and table there are also several ambiguous phrasings and sentences with confusing wording for example a key aspect of this paper is the analysis of infonce as a loose bound of the mutual information however the authors never define whether this bound is an upper or lower bound while this detail can be inferred from context i think it is important to make this point clearer to the reader relatedly the definition of mi underestimation in l45 is unclear given that the independence assumption is core to this work it is unclear how significant this setting will be in practice and for future work moreover it seems important for the experiments to assess how valid such an independence assumption is in practice and crucially what is the price in performance one might expect to pay for making this assumption an experiment assessing the performance of domino and mino on a more complex environment whose underlying factors of variation are not mutually independent would improve this paper by providing a more complete picture of the effectiveness of this method there seems to be an underlying assumption that the n independent context vectors aim to encode information about the underlying factors of variation in the environment however this connection is actually never explicitly made in the writing making the jump from discussing mi in terms of environment factors to context vectors 41 to 42 unclear it seems that domino requires setting the number of context vectors n equal to the number of environment factors of variation in general we may not know this value exactly adding a sensitivity analysis to how dependent the performance is on setting n to this exact value would provide important information on how applicable this method is in practice minor comments l22 mythologies should be morphologies l4748 first the context encoder embeds the past stateaction pairs into disentangled context vectors is an inaccurate description as it must first be optimized to do so as next described in l4849 this paper could consider citing related work in unsupervised environment design 1234 and more generally rl work in procedurallygenerated environments 56 these works are deeply related as they effectively perform metarl over a space of environment variations with an implicitly learned context ignoring this line of work seems like a significant oversight references 1 dennis et al 2020 emergent complexity and zeroshot transfer via unsupervised environment design 2 jiang et al 2021 prioritized level replay 3 jiang et al 2021 replayguided adversarial environment design 4 parkerholder et al 2022 evolving curricula with regretbased environment design 5 raileanu et al 2021 decoupling value and policy for generalization in reinforcement learning 6 cobbe et al 2019 leveraging procedural generation to benchmark reinforcement learning the core assumption of this work also acts as its primary limitation the environment factors of variation are assumed to be independent and their number known a priori the authors should make an effort to emphasize this limitation and to what extent they believe such an assumption of independence may be applicable in practice docsepthis paper tackles the 
problem of generalization in mdps where the dynamics changes are assumed to be caused by multiple independent factors denoted as context the proposed framework domino learns a context encoder that maps trajectories to a latent context via decomposed mutual information using noisecontrastive estimation infonce the authors combine domino with modelfree and modelbased rl algorithms and perform experiments in classic environments as well as in the mujoco benchmark in settings where multiple confounders change simultaneously additionally qualitative visualizations of the latent context vectors are presented using tsne strengths the idea of capturing the different confounders that may affect the dynamics of the mdp into different latent contexts is novel and interesting the experimental results show that the proposed method can in general achieve better performance than the stateoftheart weaknesses the paper needs improvement regarding the clarity of the mathematical definitions such as the objective functions it is not clear whether the improvements are because of the decomposed mutual information framework or because of other algorithmic improvements see below the paper could benefit from a discussion regarding the assumption of independent confounders for instance how difficult it would be to adapt the algorithm to the case where we have corelated confounders docsepthis paper proposes a decomposed mutual information method to learn disentangled context information which can generalize reinforcement learning algorithms into unseen environments the experimental experiments demonstrate that the proposed method can achieve better performance than the previous methods strengths 1 the writing of this paper is pretty well and the idea of it is easy to follow 2 the figures in this paper are very clear and very well 3 the extensive experiments show the effectiveness of the proposed method weakness 1 based on the title i assume that this study focuses on the metareinforcement learning problem the conventional metareinforcement learning methods include an adaptation process but this paper makes no mention of this process additionally the paper states that it intends to train a general contextencoder to solve the adaptation problem indicating that the papers context is the dynamics generalization in reinforcement learning this paper also mentions it in line 84 which is in contrast to the title of the paper which refers to metareinforcement learning 2 the second problem of this paper is the novelty the paper aims to maximize the mutual information between contexts extracted from historical information and the historical trajectories however this paper does not make clear the relationship with 123 which also attempt to maximize the mi between context vector and historical trajectories furthermore this work does not compare the performance with 3 and even does not acknowledge it despite the fact that 3 focuses on a similar problem to this paper as a result of the missing contribution and experimental comparisons with 123 i believe this papers uniqueness is somewhat limited 3 the number of learned context vectors c is set as the number of environments in the study which is the primary hyperparameter of the suggested technique however in a realworld setting the number of environments is not available making it unfair to compare it to the baseline tmcl which doesnt rely on such prior information this increases my concerns about the technical soundness of this paper in conclusion while the writing and 
experimental results are excellent this paper suffers from the aforementioned clarity and novelty issues if the authors address my concerns in their response i will consider raising my score after rebuttal i think that the additional experimental results and discussion in the revision resolve my concerns about the clarity problem of the submission so i increase my score from 4 to 6 accordingly minors i believe that ria considers context information and constructs confounder sets with multiple confounders so i believe that ria should be discussed in the introductions confounder discussion line 42 1 haotian fu hongyao tang jianye hao chen chen xidong feng dong li and wulong liu towards effective context for metareinforcement learning an approach based on contrastive learning 2 li l huang y chen m luo s luo d huang j 2021 provably improved contextbased offline metarl with attention and contrastive learning arxiv preprint arxiv210210774 3 guo j gong m tao d a relational intervention approach for unsupervised dynamics generalization in modelbased reinforcement learningcinternational conference on learning representations 2022 please refer to the weakness listed above
### Summary: | this paper proposes domino an optimization framework for contextual meta reinforcement learning the reviewers generally agree that the paper is well written the idea is novel and interesting the evaluation is comprehensive and the results are impressive reviewers also raised a few concerns in the initial reviews such as the proof of lemma 1 and theorem 1 and the mathematical definitions throughout the discussion phase most of these concerns were sufficiently addressed and the review scores were increased accordingly overall the quality of the revised paper has improved significantly during the rebuttal thus i recommend accepting this paper please incorporate the remaining reviewers suggestions in the future version of this paper | [
... input_ids: token-id sequence for this row (2,048 ids, elided) ... ] | [ ... attention_mask: 2,048 ones (elided) ... ] | [ ... labels: token-id sequence (2,048 ids, elided) ...
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes the use of playbacks for uda the author uses the trained model and an offline 3d object tracker to generate highquality pseudolabels of the target domain after that the original model is finetuned on the generated pseudolabels to improve performance on the target domain the paper can be understood in general and the writing is easy to follow the results of the paper are practical its reasonable because it can generate more accurate 3d boxes for the target domain especially for those longdistance objects the authors have done experiments on 5 data sets to show the generalization capability of the method and compare the two decent baselines with the proposed method with the video of the same scene over time i am also curious if the effect of using the point cloud of the previousnext frames to enhance the point cloud data of the current frame which may further enhance the effect and generalization ability of pseudolabels the major novelty of the paper is the combination of offlinetracking and selftraining techniques which is practical for realworld engineering problems however in general i think the novelty is still limited to the iclr community in my view the only difference between the proposed method and st is the introduction of video information an assumption and the offline tracker to make the pseudolabel more accuratedocsep this paper proposed an unsupervised domain adaptation method for 3d lidarbased object detection the idea is simple and straightforward using crossdomain detector offline tracking to provide pseudolabels inspired by similar uda efforts for 2d detection experiments are conducted over multiple selfdriving perception datasets and results validated the effectiveness of the proposed method pros the idea is simple and straightforward the approach is technically sound the presentation is clear the introduction is easy to follow and enjoyable to read related work is thorough properly reflects the current states technical details are clearly described so that reproducing should not be very difficult the experiment showcases solid performance improvement over baselines selftraining and statistical normalization the paper also conducted very detailed and convincing ablation studies consistent improvements have been seen in several datasets and two different detectors cons i have some concerns regarding claimed contributionsnovelty offline tracking is not adequately benchmarked to justify the choice of extrapolation the online tracker is not on par with the current state of the art there is not enough pseudo gt quality analysis against manually labeled groundtruth the usage of video to produce confident pseudo labels for unsupervised domain adaptation has been stressed in the introduction however as the related work described this has been explored before with a similar technical approach for 2d detection offline tracking to produce labels see roychowdhury et al its hard to say if extrapolation is a significant contribution unless adequately benchmarked showcasing the offline tracker has improved using this trick such benchmarking could be done on the kitti tracking benchmark to compare withwithout extrapolation procedure the current ablation on uda tells little information as improvement is not significant there is no comparison against other trackers based on the reported numbers the online tracker adapted from diazruiz et al 2019 is subpar from the current stateoftheart kalmanbased online tracker its hard to justify why this one is chosen why not weng et al 
2020 or chiu et al 2020 as mentioned in the paper in particular the tracker in weng et al 2020 is opensourced please provide an map evaluation of the pseudo gt quality over some sequences with gt labels although not required it would be great to see whether the author plan to release the code postrebuttal comments i carefully read the rebuttal and other reviewer comments the author addressed my concerns on pseudolabel quality assessment and comparison against sota trackers from the experimental perspective i am very convinced the paper did a great job now please incorporate these additional experiments into the paper making it more complete that being said similar to other reviewers i am not very convinced about the authors reply on noveltycontribution its true it has not been applied in 3d which is new however i am not convinced by the claims in rebuttal such as using physicsbased dynamics models i think you are referring to kinematicsbased instead of physicsbased 3d extrapolations which could induce potential problems due to the multimodal future uncertainty and selftraining which is not new thus if the paper gets accepted i strongly encourage the author to rewrite the introduction and properly reflect the core contributions overall i am still on the positive side but i am fine with both decisions docsepthe topic of adapting 3d object detectors to new domains is important the paper clearly motivates the problem clearly presents the methods and shows detailed experiments i really enjoyed reading the paper my main concern is that the two components of the method selftraining with pseudo labels and generating more pseudo labels with an object tracker for object detection have been developed and widely used in the computer vision domain for 2d object detectors the main novelty of this paper lies in using the counterparts of the two components in 3d for the new 3d object detection task the use of selftraining is almost the same as all previous methods there are a few interesting engineering parts in using 3d object tracker to expand the pseudo labels such as label extrapolation and interpolation another question i have is that when the object detector gets stronger do we need a stronger object tracking algorithm in order to provide additional useful information if the tracking algorithm is too weak relative to object detection methods the augmented pseudo labels will be too noisy to provide any help discussions or experiments on this point would be very helpful in understanding the application domain of the proposed method although the novelty of the method is rather small the authors have made good efforts in supporting the work with extensive experiments the authors have evaluated their method on five datasets all the 2 out of 5 combinations the results are good across all the scenarios the paper is clearly written and the method is well motivated i am not sure whether a paper with extensive experiments and relatively small technical contribution should be considered as a good paper for iclr after reading other reviews and the rebuttal i opt for acceptance docseppros the proposed method is simple yet effective and has wide uses in realworld applications solid experiments across 5 benchmarks this method does not rely on the source domain data and learned trackers cons the object detector will detect objects accurately only when they are close to the selfdriving car the claim is not supported when there is a large domain gap eg different lidars or significantly different scenarios the 
proposed model will fail to handle this situation for the static cars why dont use egomotion to model the temporal relationships it should have a better performance than ekf the generation of the pseudolabels depends heavily on the confidence scores obtained from the object detector confidence scores 08 how is the threshold of 080 chosen would other thresholds be more effective why does the author only post results in 5080m in tab 3 the accurate detection in 050m is more important although the relative improvement may be less the method is somewhat similar to the existing trackerbased uda methods thus the novelty is limited however the application of 3d detection and the extensive experiments are great and may benefit further research significantly
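Concretely, the pipeline discussed in these reviews amounts to a pseudo-label self-training loop: run the source-trained detector over each target-domain video, link the per-frame boxes with an offline tracker (which can interpolate or extrapolate boxes for frames where an object was missed), keep only boxes above a confidence threshold (0.8 is the value quoted above), and fine-tune the detector on the result. The sketch below is an illustrative rendering of that loop; the detector/tracker interfaces, the box-dictionary layout, and the function names are assumptions rather than the authors' code.

```python
# Illustrative sketch (assumed interfaces) of offline-tracking pseudo-labels + self-training.
from typing import Callable, Dict, List, Sequence

CONF_THRESHOLD = 0.8  # threshold quoted in the review; in practice a tunable choice

def generate_pseudo_labels(detector, track_offline: Callable, video: Sequence) -> List[Dict]:
    """Pseudo-label one target-domain video: detect per frame, link the boxes over time
    with an offline tracker (which may interpolate/extrapolate missed boxes), and keep
    only boxes whose confidence clears the threshold."""
    per_frame_boxes = [detector.predict(frame) for frame in video]   # assumed detector API
    labels: List[Dict] = []
    for track in track_offline(per_frame_boxes):                     # assumed: {frame_idx: box}
        for frame_idx, box in track.items():
            if box["score"] >= CONF_THRESHOLD:
                labels.append({"frame": frame_idx, "box": box})
    return labels

def self_train(detector, track_offline: Callable, finetune: Callable,
               videos: Sequence, rounds: int = 1):
    """Pseudo-label every target video with the current detector, fine-tune on the
    result, and optionally repeat with the improved detector."""
    for _ in range(rounds):
        pseudo = [l for v in videos for l in generate_pseudo_labels(detector, track_offline, v)]
        detector = finetune(detector, pseudo)
    return detector
```

The offline tracker is where the contested design choices enter, for example Kalman filtering versus ego-motion compensation for static objects and how far boxes are extrapolated for distant objects, which are exactly the points the reviews raise.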
### Summary: | this paper proposed an unsupervised domain adaptation method for 3d lidarbased object detection four reviewers provided detailed reviews 3 rated marginally above acceptance threshold and 1 rated ok but not good enough rejection the reviewers appreciated simple yet effective idea the well motivated method the comprehensiveness of the experiments and well written paper however major concerns are also raised regarding the core technical contributions on the proposed approach the acs look at the paper the review the rebuttal and the discussion given the concerns on the core technical contributions the high competitiveness of the iclr field and the lack of enthusiastic endorsements from reviewers the acs believe this work is not ready to be accepted to iclr yet and hence a rejection decision is recommended | [
... input_ids: token-id sequence for this row (1,609 ids, elided) ... ] | [ ... attention_mask: 1,609 ones (elided) ... ] | [ ... labels: token-id sequence (elided) ...
783,
1332,
310,
8489,
2074,
281,
253,
5368,
40143,
3169,
209,
14776,
3082,
3021,
253,
38135,
310,
3710,
2299,
253,
2898,
273,
495,
69,
5481,
285,
253,
9470,
4679,
403,
1270,
285,
778,
5649,
2007,
2561,
3012,
187,
187,
4118,
18435,
27,
2520,
2929,
4081,
271,
440,
35421,
5028,
15644,
1332,
323,
495,
69,
16486,
274,
3169,
1789,
5481,
1740,
30628,
2530,
7000,
10123,
495,
20139,
42876,
1840,
14924,
7887,
285,
337,
20139,
8718,
533,
417,
1175,
2217,
50276,
250,
5342,
253,
30628,
14109,
2969,
2568,
3576,
2934,
253,
973,
17194,
1332,
253,
9483,
6460,
273,
253,
4679,
285,
973,
3542,
2929,
2299,
2201,
7350,
403,
671,
5439,
5001,
253,
5161,
7681,
9021,
327,
253,
4081,
2746,
253,
913,
84,
1007,
387,
253,
2929,
253,
2278,
253,
30080,
22559,
285,
253,
5955,
1677,
253,
7350,
327,
253,
5161,
7681,
9021,
253,
1029,
3947,
48826,
273,
253,
17857,
32888,
1673,
285,
253,
3480,
273,
31905,
18883,
942,
432,
30628,
253,
913,
84,
2868,
436,
789,
310,
417,
4704,
281,
320,
7607,
281,
17857,
32888,
2568,
285,
7613,
247,
18235,
3061,
310,
8521,
209
] |
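Each row below pairs a review prompt (Input) and its target summary (Output) with three aligned sequences: input_ids, attention_mask, and labels. In the rows shown here the labels sequence mirrors input_ids and the attention_mask is all ones, which is consistent with an unpadded language-modeling setup. The sketch below shows one common way such columns are produced; the tokenizer name, maximum length, and concatenation scheme are assumptions for illustration only, not details confirmed by this dump.

```python
# Illustrative sketch (assumed details): building input_ids / attention_mask / labels
# for one (Input, Output) row. The tokenizer actually used for this dataset is not
# stated here, so "gpt2" below is only a placeholder.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder tokenizer choice

def encode_row(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    """Concatenate prompt and target, then tokenize into three aligned columns."""
    enc = tokenizer(input_text + output_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when no padding is added
        "labels": list(enc["input_ids"]),         # mirrors input_ids for a plain LM loss
    }

example = encode_row(
    "Below is a review of a research paper ... ### Review: ... ### Summary:",
    " the reviewers found the method well motivated ...",
)
print(len(example["input_ids"]), example["attention_mask"][:5])
```

A common variant masks the prompt portion of labels (setting those positions to -100) so that only summary tokens contribute to the loss; the rows listed here do not appear to use that convention, since labels and input_ids start with identical token IDs.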
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studied optimization of minimax games and proposed a recursive optimization algorithm called level k gradient play lvk gp as an update based on predictive updates lvk gp does not require secondorder information which is computationally more efficient than existing algorithms theorem 41 showed that as k increases the level k reasoning of parameter vectors is approaching a limit point and lvinfty gp is equivalent to an ideal algorithm semiproximal point method sppm table 1 then summarized different algorithms and viewed them as approximations of sppm and lvk gp approximation accuracy improves as k increases theorem 51 showed local convergence of sppm toward stationary points and theorems 52 and 53 showed that for a specific bilinear game and quadratic game the sppm converges in terms of squared norm of parameter distances experiments on training gans using 8gaussians cifar10 and stl10 are conducted to show that lvk gp and lvk adam can be used to help existing gan optimizers eg adam provides noticeable gains in performance and stability strengths 1 minimax game optimization is an important problem which has lots of applications such as gans while being difficult to resolve and optimize the research is well motivated 2 the proposed lvk gp algorithm seems novel to me and the authors provided its approximation to sppm and showed the convergence properties of sppm 3 experimental results are supporting the proposed methods weakness 1 the proposed algorithms are lvk gps while the convergence properties are studied for sppm method which creates a gap 2 it is not clear that whether the claim of lvk gp is faster than first order approximations of second order methods is true or not the authors discussed the potential negative impact of using gans to generate images which is reasonable to me docsepthis work introduces a novel algorithm closely related to existing lookahead approaches for solving minmax games in level k gradient play each agent first arrives at a prediction of the opponents move through k steps of recursive reasoning the counter move to the counter move to gradient descent they then make a step of gradient descent in the direction of the gradient computed using their own present position and their opponents position according to the recursive reasoning the authors show convergence results for the theoretical infinite recursion version with k infty finally the authors show numerical experiments on gans demonstrating improved is and fid scores strengths the algorithm seems natural and wellmotivated to me the algorithm appears to provide substantial improvements in gan training weaknesses the emphasis on sppm is confusing and makes it hard to understand how the proposed algorithm relates to other methods in the literature overall there is a lot of handwavey language in the paper that hints at the advantages of the proposed method or results of the paper that are never made concrete overall my present view of the paper is that the results are suitable to be published in neurips however the paper needs to undergo a major reorganizationrewrite to clearly convey its key points i am not sure such a revision is within the scope of a single conference review cycle which is why i tend towards rejection and encouraging the authors to resubmit to a future cycle the authors discuss the additional computational cost of lvk in the very end of section 6 however i was surprised that they did not provide further investigation of this concern by comparing for instance and adam model with k times 
as many epochs to an lvk adam docsepthis paper proposes a level k gradient play algorithm to stabilize the learning dynamics in minimax games gans by combining the proposed lv k algorithm with adam optimizer this paper could achieve similar results with sota gan model with 30 times fewer parameters strengths this paper proposes the level k gradient play algorithm for stabilizing the training of gans the proposed method has a theoretical guarantee with the assumption that the gradient of the loss function is lipschitz continuous moreover the paper proves and analyses the convergence properties of the lv k algorithm weaknesses 1 on the definition of sppm line 157 the author claims that sppm players arrive at a consensus by knowing precisely what their opponents future strategies will be in line 159 however the stationary point omegat thetat phit obtained with the reasoning step in line 139 should not equal to the future strategies omegat1 thetat1 phit1 in another word the term phit1 used to updating thetat1 does not equal to the term phit1 updated by thetat1 the author should consider changing the notation here line 156 or it may result in a misunderstanding that the levelinfty gp algorithm could use the opponents future strategy to update its current gradient 2 efficiency of the level k algorithm as a level k algorithm would have to compute the gradient for k times for theta and phi this algorithm would take more time for a single step than a regular algorithm i would recommend the author compare the method with baseline models regarding time efficiency similar to appendix figure6 but with x axis as time 3 difference with other gan optimizers the proposed methods could be seen as given current generator we use k step updates to find a better discriminator and use that discriminator to update the generator and vice versa however some theory shows that if we use a good eg optimal discriminator in the beginning then we could obtain no gradient for the generator wassersteingan the omega in this paper could be seen as the optimal phi and theta with the other kept fixed then what is the theoretical foundation between this paper and wgan that makes both methods work 4 limited experiment this paper only conduct experiment on a small 8gaussians experiment and cifar10 as the author claims an improvement against biggan which is good at highresolution images would the algorithm in this paper also be applicable to biggan or other big models 5 the proposed method is to stabilize the training of gan however the author also claims that this algorithm uses 30 times fewer parameters whats the correlation between stabilizing gradients and small models are there any theoretical results on this issue as stated in the weaknesses section docsepthis work propose level k gradient play a new dynamical system for nonconvex nonconcave minmax optimization the key feature of the dynamics is that each player tries to anticipate what the opponent will do in the following round and adapt to it instead of the opponents current iterate under mild assumptions the level infty dynamics are well defined and enjoy local convergence for quadratic games and global convergence for bilinear ones in terms of practical algorithms level k algorithms are shown to converge to a level infty solution as k to infty for sufficiently small learning rates based on a contraction property this means that they can heuristically be used as replacements for level infty level k adam variants are shown to have good empirical performance regarding 
strengths the presentation of the algorithm intuition and the key technical results is very clear the proposed algorithm and analysis are to the best of my knowledge both novel when compared to other approaches in the nonconvex nonconcave optimization literature while the theoretical guarantees are not particularly strong many approaches can solve bilinear problems or have local convergence guarantees the empirical results are promising the only weakness i detect is that this work is similar to 1 which is not referenced just like in this work the agents try to predict the strategies of the opponents in the next turn and adapt to them this is once again computationally feasible for small learning rates via a contraction argument once again the dynamics globally converge for bilinear games and higher learning rates lead to faster convergence but may be computationally intractable just like this work while the overall technique may be similar to 1 the individual arguments are sufficiently different and the analysis of section 41 theorem 51 and 53 and the experimental analysis are unique to this work overall i propose to accept this work accept 7 i have read the response of the authors which addressed my concern i have thus increased my score to strong accept 8 1 optimal noregret learning in general games bounded regret with unbounded stepsizes via clairvoyant mwu georgios piliouras ryann sim stratis skoulakis arxiv211114737 2021 na
### Summary: | this paper proposes a novel recursive reasoning algorithm for minimax games in which players try to anticipate their opponents next round move instead of reacting to the current round importantly this is achieved without requiring expensive second order information reviewers found the paper clearly written and well motivated addressing an important problem the work appears novel and there is good experimental evidence that the algorithm delivers on its promises | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
5421,
13757,
273,
7221,
991,
3958,
285,
4081,
247,
33037,
13757,
5933,
1925,
1268,
465,
11786,
1132,
298,
46065,
31025,
347,
271,
5731,
1754,
327,
15970,
11269,
298,
46065,
31025,
1057,
417,
2430,
1273,
2621,
1491,
534,
310,
43245,
625,
5919,
685,
5368,
11333,
50276,
33921,
7609,
2692,
326,
347,
465,
5459,
253,
1268,
465,
14720,
273,
4764,
11390,
310,
17682,
247,
2701,
1127,
285,
298,
87,
3259,
31025,
310,
6425,
281,
271,
7445,
5933,
3300,
39552,
89,
1983,
1127,
1332,
26105,
78,
50276,
2420,
337,
840,
17903,
1027,
11333,
285,
11575,
731,
347,
34754,
273,
26105,
78,
285,
298,
46065,
31025,
11193,
7200,
19132,
347,
465,
5459,
50276,
33921,
8319,
2692,
1980,
14940,
273,
26105,
78,
2584,
17429,
2792,
285,
39383,
8073,
285,
8676,
2692,
326,
323,
247,
2173,
10370,
48971,
2165,
285,
21396,
2165,
253,
26105,
78,
26414,
275,
2426,
273,
30044,
5222,
273,
4764,
13849,
50276,
16217,
3825,
327,
3733,
305,
507,
970,
854,
72,
10064,
2458,
260,
338,
274,
740,
285,
331,
77,
740,
403,
5196,
281,
921,
326,
298,
46065,
31025,
285,
298,
46065,
38622,
476,
320,
908,
281,
1361,
5368,
36827,
5556,
14460,
24088,
38622,
3400,
28629,
15988,
275,
3045,
285,
7882,
20544,
50276,
18,
7221,
991,
2165,
13757,
310,
271,
1774,
1895,
534,
556,
8783,
273,
4893,
824,
347,
305,
507,
1223,
1146,
2834,
281,
11322,
285,
22318,
253,
2561,
310,
973,
17194,
50276,
19,
253,
4081,
298,
46065,
31025,
5933,
3133,
4460,
281,
479,
285,
253,
4477,
2530,
697,
11193,
281,
26105,
78,
285,
2692,
253,
14940,
3607,
273,
26105,
78,
50276,
20,
5661,
1543,
403,
8109,
253,
4081,
3082,
50276,
20881,
1255,
50276,
18,
253,
4081,
11333,
403,
298,
46065,
305,
793,
1223,
253,
14940,
3607,
403,
5421,
323,
26105,
78,
1332,
534,
10513,
247,
8037,
50276,
19,
352,
310,
417,
2590,
326,
1880,
253,
1750,
273,
298,
46065,
31025,
310,
7938,
685,
806,
1340,
34754,
273,
1273,
1340,
3082,
310,
2032,
390,
417,
50276,
783,
4477,
5469,
253,
2442,
4016,
3486,
273,
970,
305,
507,
281,
6635,
3888,
534,
310,
5272,
281,
479,
5474,
33032,
2520,
789,
23970,
247,
4460,
5933,
8244,
2905,
281,
5368,
1007,
42338,
7274,
323,
16161,
1054,
4090,
3958,
50276,
249,
1268,
465,
11786,
1132,
1016,
5570,
806,
23981,
387,
247,
10554,
273,
253,
18062,
2118,
949,
465,
5018,
273,
33037,
14720,
50276,
783,
4828,
2118,
281,
253,
4828,
2118,
281,
50276,
29844,
18499,
597,
840,
1056,
247,
3213,
273,
11786,
18499,
275,
253,
3884,
273,
253,
11786,
10302,
970,
616,
1211,
1246,
1899,
285,
616,
18062,
1899,
2556,
281,
253,
33037,
14720,
253,
4477,
921,
14940,
1543,
323,
253,
10527,
11968,
43489,
2715,
342,
465,
50276,
3259,
4720,
253,
4477,
921,
10704,
4679,
327,
305,
507,
17227,
5520,
310,
285,
269,
301,
7363,
50275,
296,
3755,
20556,
50275,
783,
5933,
3133,
3626,
285,
973,
24013,
8550,
281,
479,
50275,
783,
5933,
4620,
281,
2085,
6832,
11701,
275,
36827,
3733,
50275,
20881,
1255,
265,
50275,
783,
15075,
327,
26105,
78,
310,
21643,
285,
2789,
352,
1892,
281,
2096,
849,
253,
4081,
5933,
7033,
281,
643,
3082,
275,
253,
6239,
50276,
1189,
455,
627,
310,
247,
2257,
273,
1133,
15007,
90,
3448,
275,
253,
2929,
326,
28145,
387,
253,
11361,
273,
253,
4081,
1332,
390,
1543,
273,
253,
2929,
326,
403,
1620,
1160,
11859,
50275,
1189,
455,
619,
1246,
1859,
273,
253,
2929,
310,
326,
253,
1543,
403,
7470,
281,
320,
3863,
275,
5723,
2824,
2299,
253,
2929,
3198,
281,
15080,
247,
2201,
40386,
2663,
3852,
281,
4518,
12709,
697,
2234,
2792,
891,
717,
417,
2119,
824,
247,
18520,
310,
1561,
253,
7990,
273,
247,
2014,
8059,
2278,
5880,
534,
310,
2139,
891,
5257,
4404,
18235,
285,
18462,
253,
4477,
281,
501,
538,
2225,
281,
247,
2852,
5880,
50276,
783,
4477,
2319,
253,
3081,
15180,
2105,
273,
298,
46065,
275,
253,
1077,
990,
273,
2593,
721,
2299,
891,
369,
9861,
326,
597,
858,
417,
2085,
2007,
5839,
273,
436,
4468,
407,
10941,
323,
4227,
285,
38622,
1566,
342,
465,
2069,
347,
1142,
44540,
281,
271,
298,
46065,
38622,
5474,
33032,
2520,
2929,
29328,
247,
1268,
465,
11786,
1132,
5933,
281,
33292,
253,
4715,
8062,
275,
7221,
991,
3958,
305,
507,
50276,
1615,
16248,
253,
4081,
298,
87,
465,
5933,
342,
38622,
5556,
6081,
436,
50276,
20790,
812,
5115,
2074,
1543,
342,
256,
5503,
36827,
1566,
342,
1884,
2069,
11184,
3602,
50276,
296,
3755,
20556,
50276,
2520,
2929,
29328,
253,
1268,
465,
11786,
1132,
5933,
323,
41427,
253,
3733,
273,
305,
507,
253,
4081,
1332,
556,
247,
10527,
12215,
342,
253,
9376,
326,
253,
11786,
273,
253,
2957,
1159,
310,
11233,
37913,
5415,
25761,
253,
2929,
19539,
285,
6260,
253,
14940,
3607,
273,
253,
298,
87,
465,
5933,
50276,
20881,
1255,
265,
337,
327,
253,
5426,
273,
26105,
78,
1386,
21043,
253,
2488,
3916,
326,
26105,
78,
3773,
12666,
387,
247,
13969,
407,
8958,
10534,
752,
616,
18062,
2852,
8130,
588,
320,
275,
1386,
22769,
2299,
50276,
783,
17429,
1127,
7005,
909,
255,
50276,
783,
41506,
815,
262,
2797,
342,
253,
14720,
3213,
275,
1386,
15470,
943,
417,
4503,
281,
253,
2852,
8130,
7005,
909,
255,
18,
50276,
783,
41506,
18,
815,
262,
18,
275,
1529,
3159,
253,
1307,
815,
262,
18,
908,
281,
22753,
253,
41506,
18,
1057,
417,
4503,
281,
253,
1307,
815,
262,
18,
9300,
407,
253,
41506,
18,
253,
2488,
943,
1908,
6890,
253,
14951,
1060,
1386,
21807,
390,
352,
778,
906,
275,
247,
40663,
326,
253,
1268,
3259,
31025,
5933,
812,
897,
253,
18062,
2852,
5700,
281,
5731,
697,
1655,
11786,
50276,
19,
6733,
273,
253,
1268,
465,
5933,
347,
247,
1268,
465,
5933,
651,
452,
281,
11897,
253,
11786,
323,
465,
2069,
323,
39116,
285,
815,
74,
436,
5933,
651,
1379,
625,
673,
323,
247,
2014,
3213,
685,
247,
3963,
5933,
891,
651,
5583,
253,
2488,
7277,
253,
1332,
342,
8245,
3210,
5001,
673,
6733,
2074,
281,
30762,
4677,
23,
533,
342,
1269,
7844,
347,
673,
50276,
20,
3064,
342,
643,
36827,
5556,
14460,
253,
4081,
3082,
812,
320,
2326,
347,
1677,
1655,
14156,
359,
897,
465,
3213,
11269,
281,
1089,
247,
1805,
7134,
12915,
285,
897,
326,
7134,
12915,
281,
5731,
253,
14156,
285,
12008,
26620,
2299,
690,
3762,
2722,
326,
604,
359,
897,
247,
1175,
24088,
8654,
7134,
12915,
275,
253,
5068,
840,
359,
812,
4044,
642,
11786,
323,
253,
14156,
369,
2152,
3241,
272,
266,
253,
40639,
275,
436,
2929,
812,
320,
2326,
347,
253,
8654,
815,
74,
285,
39116,
342,
253,
643,
4934,
4229,
840,
752,
310,
253,
10527,
12153,
875,
436,
2929,
285,
259,
1247,
326,
2789,
1097,
3082,
789,
50276,
21,
3710,
3368,
436,
2929,
760,
2589,
3368,
327,
247,
1355,
854,
72,
10064,
2458,
3368,
285,
260,
338,
274,
740,
347,
253,
2488,
3916,
271,
7756,
1411,
1943,
1247,
534,
310,
1175,
387,
1029,
21061,
3888,
651,
253,
5933,
275,
436,
2929,
671,
320,
7763,
281,
1943,
1247,
390,
643,
1943,
3210,
50276,
22,
253,
4081,
1332,
310,
281,
33292,
253,
3733,
273,
36827,
2299,
253,
2488,
671,
3916,
326,
436,
5933,
4648,
1884,
2069,
11184,
3602,
47515,
253,
5921,
875,
41427,
27935,
285,
1355,
3210,
403,
627,
667,
10527,
1543,
327,
436,
2523,
347,
4767,
275,
253,
32213,
2593,
5474,
33032,
2520,
789,
12661,
1268,
465,
11786,
1132,
247,
747,
18525,
985,
323,
1327,
44181,
1327,
45542,
1123,
1054,
4090,
13757,
253,
2234,
4735,
273,
253,
8062,
310,
326,
1016,
4760,
14177,
281,
30258,
752,
253,
16871,
588,
513,
275,
253,
1563,
3790,
285,
5223,
281,
352,
3185,
273,
253,
18062,
1655,
35388,
762,
11134,
13260,
253,
1268,
2192,
555,
8062,
403,
973,
2931,
285,
4264,
1980,
14940,
323,
21396,
3958,
285,
4156,
14940,
323,
10370,
48971,
4394,
275,
2426,
273,
8542,
11333,
1268,
465,
11333,
403,
2011,
281,
29623,
281,
247,
1268,
2192,
555,
2900,
347,
465,
281,
2192,
555,
323,
10481,
1355,
4715,
4142,
1754,
327,
247,
22170,
2867,
436,
2097,
326,
597,
476,
344,
321,
18260,
320,
908,
347,
47105,
323,
1268,
2192,
555,
1268,
465,
38622,
11640,
403,
2011,
281,
452,
1175,
16774,
3045,
50276,
1747,
13218,
20544,
253,
9759,
273,
253,
5933,
30328,
285,
253,
2234,
7681,
1543,
310,
1077,
2590,
253,
4081,
5933,
285,
1783,
403,
281,
253,
1682,
273,
619,
3640,
1097,
4460,
672,
2429,
281,
643,
7274,
275,
253,
1327,
44181,
1327,
45542,
1123,
13757,
6239,
1223,
253,
10527,
23632,
403,
417,
3782,
2266,
1142,
7274,
476,
8415,
10370,
48971,
3237,
390,
452,
1980,
14940,
23632,
253,
16774,
1543,
403,
12532,
50276,
783,
760,
14855,
891,
2736,
310,
326,
436,
789,
310,
2074,
281,
337,
534,
310,
417,
23378,
816,
751,
275,
436,
789,
253,
6083,
1611,
281,
3283,
253,
8130,
273,
253,
18062,
275,
253,
1735,
1614,
285,
5223,
281,
731,
436,
310,
2378,
969,
43245,
17887,
323,
1355,
4715,
4142,
3066,
247,
22170,
4154,
2378,
969,
253,
8062,
21349,
29623,
323,
10370,
48971,
3958,
285,
2169,
4715,
4142,
1421,
281,
7938,
14940,
533,
778,
320,
43245,
540,
44374,
816,
751,
436,
789,
50274,
6050,
253,
4583,
5853,
778,
320,
2074,
281,
337,
253,
2060,
7125,
403,
10481,
1027,
285,
253,
1783,
273,
2593,
7609,
10012,
8319,
285,
8676,
285,
253,
5661,
1783,
403,
4451,
281,
436,
789,
4583,
891,
12661,
281,
2997,
436,
789,
2997,
818,
50276,
74,
452,
1239,
253,
2380,
273,
253,
4477,
534,
9713,
619,
4468,
891,
452,
3021,
2559,
619,
4868,
281,
2266,
2997,
854,
50275,
18,
8654,
295,
410,
72,
1221,
4715,
275,
2087,
3958,
11542,
14938,
342,
45515,
5018,
4219,
3066,
502,
1094,
87,
899,
386,
278,
44217,
3471,
2061,
3783,
268,
3093,
454,
284,
43938,
1136,
948,
15252,
261,
1629,
3941,
30441,
549,
32693,
19,
13721,
2504,
1787,
43425,
50273,
2072,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
4460,
33037,
14720,
5933,
323,
7221,
991,
3958,
275,
534,
3773,
1611,
281,
30258,
616,
18062,
1735,
3790,
2118,
3185,
273,
37986,
281,
253,
1655,
3790,
15538,
436,
310,
6786,
1293,
10568,
8214,
1273,
1340,
1491,
30628,
1119,
253,
2929,
4518,
3542,
285,
973,
17194,
15974,
271,
1774,
1895,
253,
789,
4620,
4460,
285,
627,
310,
1175,
5661,
1941,
326,
253,
5933,
26361,
327,
697,
16966
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
5421,
13757,
273,
7221,
991,
3958,
285,
4081,
247,
33037,
13757,
5933,
1925,
1268,
465,
11786,
1132,
298,
46065,
31025,
347,
271,
5731,
1754,
327,
15970,
11269,
298,
46065,
31025,
1057,
417,
2430,
1273,
2621,
1491,
534,
310,
43245,
625,
5919,
685,
5368,
11333,
50276,
33921,
7609,
2692,
326,
347,
465,
5459,
253,
1268,
465,
14720,
273,
4764,
11390,
310,
17682,
247,
2701,
1127,
285,
298,
87,
3259,
31025,
310,
6425,
281,
271,
7445,
5933,
3300,
39552,
89,
1983,
1127,
1332,
26105,
78,
50276,
2420,
337,
840,
17903,
1027,
11333,
285,
11575,
731,
347,
34754,
273,
26105,
78,
285,
298,
46065,
31025,
11193,
7200,
19132,
347,
465,
5459,
50276,
33921,
8319,
2692,
1980,
14940,
273,
26105,
78,
2584,
17429,
2792,
285,
39383,
8073,
285,
8676,
2692,
326,
323,
247,
2173,
10370,
48971,
2165,
285,
21396,
2165,
253,
26105,
78,
26414,
275,
2426,
273,
30044,
5222,
273,
4764,
13849,
50276,
16217,
3825,
327,
3733,
305,
507,
970,
854,
72,
10064,
2458,
260,
338,
274,
740,
285,
331,
77,
740,
403,
5196,
281,
921,
326,
298,
46065,
31025,
285,
298,
46065,
38622,
476,
320,
908,
281,
1361,
5368,
36827,
5556,
14460,
24088,
38622,
3400,
28629,
15988,
275,
3045,
285,
7882,
20544,
50276,
18,
7221,
991,
2165,
13757,
310,
271,
1774,
1895,
534,
556,
8783,
273,
4893,
824,
347,
305,
507,
1223,
1146,
2834,
281,
11322,
285,
22318,
253,
2561,
310,
973,
17194,
50276,
19,
253,
4081,
298,
46065,
31025,
5933,
3133,
4460,
281,
479,
285,
253,
4477,
2530,
697,
11193,
281,
26105,
78,
285,
2692,
253,
14940,
3607,
273,
26105,
78,
50276,
20,
5661,
1543,
403,
8109,
253,
4081,
3082,
50276,
20881,
1255,
50276,
18,
253,
4081,
11333,
403,
298,
46065,
305,
793,
1223,
253,
14940,
3607,
403,
5421,
323,
26105,
78,
1332,
534,
10513,
247,
8037,
50276,
19,
352,
310,
417,
2590,
326,
1880,
253,
1750,
273,
298,
46065,
31025,
310,
7938,
685,
806,
1340,
34754,
273,
1273,
1340,
3082,
310,
2032,
390,
417,
50276,
783,
4477,
5469,
253,
2442,
4016,
3486,
273,
970,
305,
507,
281,
6635,
3888,
534,
310,
5272,
281,
479,
5474,
33032,
2520,
789,
23970,
247,
4460,
5933,
8244,
2905,
281,
5368,
1007,
42338,
7274,
323,
16161,
1054,
4090,
3958,
50276,
249,
1268,
465,
11786,
1132,
1016,
5570,
806,
23981,
387,
247,
10554,
273,
253,
18062,
2118,
949,
465,
5018,
273,
33037,
14720,
50276,
783,
4828,
2118,
281,
253,
4828,
2118,
281,
50276,
29844,
18499,
597,
840,
1056,
247,
3213,
273,
11786,
18499,
275,
253,
3884,
273,
253,
11786,
10302,
970,
616,
1211,
1246,
1899,
285,
616,
18062,
1899,
2556,
281,
253,
33037,
14720,
253,
4477,
921,
14940,
1543,
323,
253,
10527,
11968,
43489,
2715,
342,
465,
50276,
3259,
4720,
253,
4477,
921,
10704,
4679,
327,
305,
507,
17227,
5520,
310,
285,
269,
301,
7363,
50275,
296,
3755,
20556,
50275,
783,
5933,
3133,
3626,
285,
973,
24013,
8550,
281,
479,
50275,
783,
5933,
4620,
281,
2085,
6832,
11701,
275,
36827,
3733,
50275,
20881,
1255,
265,
50275,
783,
15075,
327,
26105,
78,
310,
21643,
285,
2789,
352,
1892,
281,
2096,
849,
253,
4081,
5933,
7033,
281,
643,
3082,
275,
253,
6239,
50276,
1189,
455,
627,
310,
247,
2257,
273,
1133,
15007,
90,
3448,
275,
253,
2929,
326,
28145,
387,
253,
11361,
273,
253,
4081,
1332,
390,
1543,
273,
253,
2929,
326,
403,
1620,
1160,
11859,
50275,
1189,
455,
619,
1246,
1859,
273,
253,
2929,
310,
326,
253,
1543,
403,
7470,
281,
320,
3863,
275,
5723,
2824,
2299,
253,
2929,
3198,
281,
15080,
247,
2201,
40386,
2663,
3852,
281,
4518,
12709,
697,
2234,
2792,
891,
717,
417,
2119,
824,
247,
18520,
310,
1561,
253,
7990,
273,
247,
2014,
8059,
2278,
5880,
534,
310,
2139,
891,
5257,
4404,
18235,
285,
18462,
253,
4477,
281,
501,
538,
2225,
281,
247,
2852,
5880,
50276,
783,
4477,
2319,
253,
3081,
15180,
2105,
273,
298,
46065,
275,
253,
1077,
990,
273,
2593,
721,
2299,
891,
369,
9861,
326,
597,
858,
417,
2085,
2007,
5839,
273,
436,
4468,
407,
10941,
323,
4227,
285,
38622,
1566,
342,
465,
2069,
347,
1142,
44540,
281,
271,
298,
46065,
38622,
5474,
33032,
2520,
2929,
29328,
247,
1268,
465,
11786,
1132,
5933,
281,
33292,
253,
4715,
8062,
275,
7221,
991,
3958,
305,
507,
50276,
1615,
16248,
253,
4081,
298,
87,
465,
5933,
342,
38622,
5556,
6081,
436,
50276,
20790,
812,
5115,
2074,
1543,
342,
256,
5503,
36827,
1566,
342,
1884,
2069,
11184,
3602,
50276,
296,
3755,
20556,
50276,
2520,
2929,
29328,
253,
1268,
465,
11786,
1132,
5933,
323,
41427,
253,
3733,
273,
305,
507,
253,
4081,
1332,
556,
247,
10527,
12215,
342,
253,
9376,
326,
253,
11786,
273,
253,
2957,
1159,
310,
11233,
37913,
5415,
25761,
253,
2929,
19539,
285,
6260,
253,
14940,
3607,
273,
253,
298,
87,
465,
5933,
50276,
20881,
1255,
265,
337,
327,
253,
5426,
273,
26105,
78,
1386,
21043,
253,
2488,
3916,
326,
26105,
78,
3773,
12666,
387,
247,
13969,
407,
8958,
10534,
752,
616,
18062,
2852,
8130,
588,
320,
275,
1386,
22769,
2299,
50276,
783,
17429,
1127,
7005,
909,
255,
50276,
783,
41506,
815,
262,
2797,
342,
253,
14720,
3213,
275,
1386,
15470,
943,
417,
4503,
281,
253,
2852,
8130,
7005,
909,
255,
18,
50276,
783,
41506,
18,
815,
262,
18,
275,
1529,
3159,
253,
1307,
815,
262,
18,
908,
281,
22753,
253,
41506,
18,
1057,
417,
4503,
281,
253,
1307,
815,
262,
18,
9300,
407,
253,
41506,
18,
253,
2488,
943,
1908,
6890,
253,
14951,
1060,
1386,
21807,
390,
352,
778,
906,
275,
247,
40663,
326,
253,
1268,
3259,
31025,
5933,
812,
897,
253,
18062,
2852,
5700,
281,
5731,
697,
1655,
11786,
50276,
19,
6733,
273,
253,
1268,
465,
5933,
347,
247,
1268,
465,
5933,
651,
452,
281,
11897,
253,
11786,
323,
465,
2069,
323,
39116,
285,
815,
74,
436,
5933,
651,
1379,
625,
673,
323,
247,
2014,
3213,
685,
247,
3963,
5933,
891,
651,
5583,
253,
2488,
7277,
253,
1332,
342,
8245,
3210,
5001,
673,
6733,
2074,
281,
30762,
4677,
23,
533,
342,
1269,
7844,
347,
673,
50276,
20,
3064,
342,
643,
36827,
5556,
14460,
253,
4081,
3082,
812,
320,
2326,
347,
1677,
1655,
14156,
359,
897,
465,
3213,
11269,
281,
1089,
247,
1805,
7134,
12915,
285,
897,
326,
7134,
12915,
281,
5731,
253,
14156,
285,
12008,
26620,
2299,
690,
3762,
2722,
326,
604,
359,
897,
247,
1175,
24088,
8654,
7134,
12915,
275,
253,
5068,
840,
359,
812,
4044,
642,
11786,
323,
253,
14156,
369,
2152,
3241,
272,
266,
253,
40639,
275,
436,
2929,
812,
320,
2326,
347,
253,
8654,
815,
74,
285,
39116,
342,
253,
643,
4934,
4229,
840,
752,
310,
253,
10527,
12153,
875,
436,
2929,
285,
259,
1247,
326,
2789,
1097,
3082,
789,
50276,
21,
3710,
3368,
436,
2929,
760,
2589,
3368,
327,
247,
1355,
854,
72,
10064,
2458,
3368,
285,
260,
338,
274,
740,
347,
253,
2488,
3916,
271,
7756,
1411,
1943,
1247,
534,
310,
1175,
387,
1029,
21061,
3888,
651,
253,
5933,
275,
436,
2929,
671,
320,
7763,
281,
1943,
1247,
390,
643,
1943,
3210,
50276,
22,
253,
4081,
1332,
310,
281,
33292,
253,
3733,
273,
36827,
2299,
253,
2488,
671,
3916,
326,
436,
5933,
4648,
1884,
2069,
11184,
3602,
47515,
253,
5921,
875,
41427,
27935,
285,
1355,
3210,
403,
627,
667,
10527,
1543,
327,
436,
2523,
347,
4767,
275,
253,
32213,
2593,
5474,
33032,
2520,
789,
12661,
1268,
465,
11786,
1132,
247,
747,
18525,
985,
323,
1327,
44181,
1327,
45542,
1123,
1054,
4090,
13757,
253,
2234,
4735,
273,
253,
8062,
310,
326,
1016,
4760,
14177,
281,
30258,
752,
253,
16871,
588,
513,
275,
253,
1563,
3790,
285,
5223,
281,
352,
3185,
273,
253,
18062,
1655,
35388,
762,
11134,
13260,
253,
1268,
2192,
555,
8062,
403,
973,
2931,
285,
4264,
1980,
14940,
323,
21396,
3958,
285,
4156,
14940,
323,
10370,
48971,
4394,
275,
2426,
273,
8542,
11333,
1268,
465,
11333,
403,
2011,
281,
29623,
281,
247,
1268,
2192,
555,
2900,
347,
465,
281,
2192,
555,
323,
10481,
1355,
4715,
4142,
1754,
327,
247,
22170,
2867,
436,
2097,
326,
597,
476,
344,
321,
18260,
320,
908,
347,
47105,
323,
1268,
2192,
555,
1268,
465,
38622,
11640,
403,
2011,
281,
452,
1175,
16774,
3045,
50276,
1747,
13218,
20544,
253,
9759,
273,
253,
5933,
30328,
285,
253,
2234,
7681,
1543,
310,
1077,
2590,
253,
4081,
5933,
285,
1783,
403,
281,
253,
1682,
273,
619,
3640,
1097,
4460,
672,
2429,
281,
643,
7274,
275,
253,
1327,
44181,
1327,
45542,
1123,
13757,
6239,
1223,
253,
10527,
23632,
403,
417,
3782,
2266,
1142,
7274,
476,
8415,
10370,
48971,
3237,
390,
452,
1980,
14940,
23632,
253,
16774,
1543,
403,
12532,
50276,
783,
760,
14855,
891,
2736,
310,
326,
436,
789,
310,
2074,
281,
337,
534,
310,
417,
23378,
816,
751,
275,
436,
789,
253,
6083,
1611,
281,
3283,
253,
8130,
273,
253,
18062,
275,
253,
1735,
1614,
285,
5223,
281,
731,
436,
310,
2378,
969,
43245,
17887,
323,
1355,
4715,
4142,
3066,
247,
22170,
4154,
2378,
969,
253,
8062,
21349,
29623,
323,
10370,
48971,
3958,
285,
2169,
4715,
4142,
1421,
281,
7938,
14940,
533,
778,
320,
43245,
540,
44374,
816,
751,
436,
789,
50274,
6050,
253,
4583,
5853,
778,
320,
2074,
281,
337,
253,
2060,
7125,
403,
10481,
1027,
285,
253,
1783,
273,
2593,
7609,
10012,
8319,
285,
8676,
285,
253,
5661,
1783,
403,
4451,
281,
436,
789,
4583,
891,
12661,
281,
2997,
436,
789,
2997,
818,
50276,
74,
452,
1239,
253,
2380,
273,
253,
4477,
534,
9713,
619,
4468,
891,
452,
3021,
2559,
619,
4868,
281,
2266,
2997,
854,
50275,
18,
8654,
295,
410,
72,
1221,
4715,
275,
2087,
3958,
11542,
14938,
342,
45515,
5018,
4219,
3066,
502,
1094,
87,
899,
386,
278,
44217,
3471,
2061,
3783,
268,
3093,
454,
284,
43938,
1136,
948,
15252,
261,
1629,
3941,
30441,
549,
32693,
19,
13721,
2504,
1787,
43425,
50273,
2072,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
4460,
33037,
14720,
5933,
323,
7221,
991,
3958,
275,
534,
3773,
1611,
281,
30258,
616,
18062,
1735,
3790,
2118,
3185,
273,
37986,
281,
253,
1655,
3790,
15538,
436,
310,
6786,
1293,
10568,
8214,
1273,
1340,
1491,
30628,
1119,
253,
2929,
4518,
3542,
285,
973,
17194,
15974,
271,
1774,
1895,
253,
789,
4620,
4460,
285,
627,
310,
1175,
5661,
1941,
326,
253,
5933,
26361,
327,
697,
16966
] |
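The level-k gradient play procedure summarized in the review above — each player anticipates the opponent through several rounds of recursive reasoning and then takes a gradient step against that anticipated opponent — can be illustrated on a toy bilinear game. The sketch below is a reconstruction from the review's description, not the authors' code; the exact update ordering and indexing conventions in the paper may differ.

```python
# Toy illustration of level-k gradient play on f(x, y) = x^T A y (min over x, max over y).
# Reconstructed from the review's description; all conventions here are assumptions.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x, y = rng.standard_normal(3), rng.standard_normal(3)
eta, k, steps = 0.1, 2, 1000

grad_x = lambda x_, y_: A @ y_      # gradient of f with respect to x
grad_y = lambda x_, y_: A.T @ x_    # gradient of f with respect to y

for _ in range(steps):
    # Recursive reasoning: simulate k-1 rounds of the opponent's naive response,
    # starting from the current joint iterate (level 0 = stay at the current point).
    x_hat, y_hat = x.copy(), y.copy()
    for _ in range(k - 1):
        x_hat, y_hat = x - eta * grad_x(x, y_hat), y + eta * grad_y(x_hat, y)
    # Respond to the anticipated opponent rather than the current one.
    x, y = x - eta * grad_x(x, y_hat), y + eta * grad_y(x_hat, y)

print(np.linalg.norm(x), np.linalg.norm(y))  # norms shrink for small eta and k >= 2 here
```

For k = 1 the inner loop is skipped and this reduces to simultaneous gradient descent-ascent, which cycles or diverges on bilinear games, while k = 2 already behaves like an extragradient-style update — in line with the review's remark that the level-infinity limit approaches a (semi-)proximal point method.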
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary the augmentation of nlp samples is an important task with no clear applicable to all mechanism this is in sharp contrast to computer vision where techniques like rotation modification of hue saturation as well as umpteen other techniques exist this work tries to address the issue by proposing a technique that carefully amalgamates multiple previously known approaches to generate diverse label preserving examples the experimental results on roberta highlight the applicability and importance of this data augmentation approach on the downstream task of text classification glue strengths 1 empirical results performance better than previous approaches although minor 2 paper clarity 3 each formulation is backed by a strong intuitive understanding 4 contrastive training negative sampling is one of the crucial contributions of this work it seems to be making every previously known augmentation approach better please feel free to highlight other major contributions weaknesses minor 1 adhoc regularization parameter selection is necessary for getting performance gains this makes it difficult to conclusively prove that this is an applicable to all data augmentation scheme 2 it would have been better to see the performance gains on more difficult textclassification tasks nonglue or underperforming models nonbert based since the gains are not much it becomes difficult to fathom if the gains are actually due to good objective function or a case of chance for choosing better examples commentsquestions 1 what is the augmentation size being used in the setup i suspect the size plays an important role in such setups and this hasnt been discussed much in the paper also please show the performance trends based on different augmentation sizes 2 how do you measure the diversity as mentioned in the paper title in the generated samples 3 rather than using the adhoc approach for selecting which augmentation stacking scheme is helpful it would have been better to compareuse an approach highlighted in learning to compose domainspecific transformations for data augmentation neurips 2017 correction 1 related work contrastive learning under an unsupervised setting ontrastive contrastive overall this work highlights the importance of incorporating contrastive training for data augmentation please let me know if i have misunderstood somethingsdocseppaper proposes a contrastive learningbased approach to combine different data augmentation techniques for nlp tasks while the widely used consistency loss focuses on a single example the proposed contrastive objective allows capturing the relationships among all data samples which helps in producing diverse and informative examples for experiments the paper explores 5 data augmentation approaches with robertalarge as the classification model empirical results on the standard glue benchmark leads to an impressive 22 average improvement authors also found that backtranslation and adversarial training combination leads to better performance than other da combinations strengths 1 the proposed framework can be applied with any text data augmentation methods its a solid work that will help the nlp community in developing better da techniques for example kumar et al 2020 shows that any pretrained model can be used for data augmentation i believe seq2seq model like t5 bart based augmentation combined with coda will further push the state of the art for text da 2 paper provides clear motivations and describes their methods experiments in detail authors study da in both 
lowresource and richresource setting ablation studies have been conducted to investigate gains from different components 3 authors plan to release their code which is good for reproducibility weakness my understanding is that all numbers reported in the paper are from a single experiment as a reader i would like to see some variance with the results apart from this i dont see any major issues with the paper questions 1 since one of your goals is to improve the diversity of the augmented data have you tried replacing more words in the cbert model by nature cbert is bound to replace max 15 of the tokens while maintaining the sentence length methods such as backtranslation or seq2seq models do not have such restrictions also have you considered using a pretrained seq2seq model for da as in kumar et al 2020 2 fig 5 backtranslation and adversarial training have similar performance this result is intriguing do you have some further insights into it typos sec22 the first term correspond corresponds sec 4 contrastive learning para ontrastive learning contrastive learning references additional da citations 1 kumar v choudhary a cho e 2020 data augmentation using pretrained transformer models arxiv abs200302245docsepthe paper proposes a novel data augmentation framework which explores different combinations of isolated labelpreserving transformations to improve the diversity of augmented samples the authors find that stacking distinct labelpreserving transformations produces particularly informative samples the paper also introduces a contrastive learning objective to capture the global relationship among the data points in representation space in my opinion the exploration of different combinations of isolated labelpreserving transformations is the major contribution of this paper which may inspire future works for data augmentation however the contrastive regularization object is a bit incremental and i cannot see a big difference compared with moco or supcon strength the idea of stacking distinct labelpreserving transformations is intuitive the integration of the consistency training objective and the contrastive regularization objective is interesting weakness lack of novelty the contrastive regularization object is a bit incremental and this object is very similar to moco or supcon the model has first to obtain the augmented samples which is computation expensive for largescale datasets and may hinder the practical application of the model moreover the overall improvements are relatively small compared with r3f and there is a lack of variance analysis questions what is the computational complexity of coda why using mmd distance in section 31 is stacking distinct labelpreserving transformations the default setting for coda in your glue experiments what if other strategies mix random work better in datasets like qnli rte mrpc and so on why not report results on those datasets what is the major difference between your contrastive regularization and moco or supcon as the improvements are relatively small could you please provide the test of statistical significance what if you stack cut first and then back does the order affect the performance
### Summary: | this paper concerns data augmentation techniques for nlp in particular the authors introduce a general augmentation framework they call coda and demonstrate its utility on a few benchmark nlp tasks reporting promising empirical results the authors addressed some key concerns eg regarding hyperparameters reporting of variances during the discussion period the consensus then is that this work provides a useful and relatively general method for augmentation in nlp and the iclr audience is likely to find this useful | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
8774,
50276,
783,
42072,
273,
295,
24343,
3530,
310,
271,
1774,
4836,
342,
642,
2590,
7763,
281,
512,
5122,
436,
310,
275,
9479,
4499,
281,
4382,
8113,
835,
5609,
751,
9381,
11237,
273,
43192,
22185,
347,
973,
347,
5111,
431,
9673,
643,
5609,
2226,
436,
789,
14177,
281,
2953,
253,
2523,
407,
36636,
247,
5853,
326,
9257,
47780,
312,
684,
2709,
3786,
1929,
7274,
281,
6635,
11117,
5203,
24279,
6667,
253,
5661,
1543,
327,
687,
589,
893,
6780,
253,
30437,
285,
6349,
273,
436,
941,
42072,
2746,
327,
253,
15450,
4836,
273,
2505,
9162,
28400,
50276,
296,
3755,
20556,
50276,
18,
16774,
1543,
3045,
1805,
685,
2045,
7274,
3738,
5884,
374,
2929,
19843,
495,
1016,
15895,
310,
17245,
407,
247,
2266,
27350,
4685,
577,
4499,
422,
3733,
4016,
10491,
310,
581,
273,
253,
9560,
9021,
273,
436,
789,
352,
3133,
281,
320,
2403,
1046,
3786,
1929,
42072,
2746,
1805,
50276,
32897,
1928,
1959,
281,
6780,
643,
2201,
9021,
50276,
20881,
1255,
265,
5884,
50276,
18,
519,
37806,
37820,
4764,
5438,
310,
3309,
323,
2970,
3045,
15988,
436,
2789,
352,
2834,
281,
345,
12817,
5276,
326,
436,
310,
271,
7763,
281,
512,
941,
42072,
6974,
374,
352,
651,
452,
644,
1805,
281,
923,
253,
3045,
15988,
327,
625,
2834,
2505,
42070,
8892,
295,
543,
77,
489,
390,
762,
468,
14692,
3210,
1327,
6291,
1754,
1580,
253,
15988,
403,
417,
1199,
352,
4916,
2834,
281,
269,
50092,
604,
253,
15988,
403,
2686,
1955,
281,
1175,
8103,
1159,
390,
247,
1083,
273,
4839,
323,
13887,
1805,
6667,
50275,
26122,
34974,
50276,
18,
752,
310,
253,
42072,
1979,
1146,
908,
275,
253,
9978,
891,
9101,
253,
1979,
7120,
271,
1774,
2554,
275,
824,
873,
8777,
285,
436,
556,
2649,
644,
5469,
1199,
275,
253,
2929,
671,
4496,
921,
253,
3045,
13554,
1754,
327,
1027,
42072,
9552,
50276,
19,
849,
513,
368,
2557,
253,
9991,
347,
5393,
275,
253,
2929,
4060,
275,
253,
4561,
3530,
50275,
20,
2581,
685,
970,
253,
519,
37806,
2746,
323,
17221,
534,
42072,
37444,
6974,
310,
9371,
352,
651,
452,
644,
1805,
281,
7277,
2327,
271,
2746,
16318,
275,
4715,
281,
38530,
10625,
29765,
21257,
323,
941,
42072,
5723,
2824,
4240,
50275,
5528,
15831,
50276,
18,
2905,
789,
4499,
422,
4715,
50276,
4524,
271,
440,
35421,
4758,
327,
1206,
505,
422,
50276,
45842,
422,
50276,
1189,
455,
50276,
2520,
789,
16681,
253,
6349,
273,
24049,
4499,
422,
3733,
323,
941,
42072,
50276,
32897,
1339,
479,
871,
604,
891,
452,
46485,
1260,
678,
723,
7152,
339,
377,
6653,
29328,
247,
4499,
422,
4715,
3169,
2746,
281,
13398,
1027,
941,
42072,
5609,
323,
295,
24343,
8892,
1223,
253,
7561,
908,
15274,
2957,
16633,
327,
247,
2014,
1650,
253,
4081,
4499,
422,
8103,
4483,
26475,
253,
7688,
2190,
512,
941,
3530,
534,
7729,
275,
9603,
11117,
285,
27096,
6667,
323,
4679,
253,
2929,
33826,
608,
941,
42072,
7274,
342,
687,
6291,
45825,
463,
347,
253,
9162,
1566,
16774,
1543,
327,
253,
2629,
28400,
22791,
5644,
281,
271,
13943,
3307,
3388,
7756,
4477,
671,
1119,
326,
896,
20099,
285,
48960,
3733,
5019,
5644,
281,
1805,
3045,
685,
643,
4204,
13553,
50273,
296,
3755,
20556,
337,
253,
4081,
7792,
476,
320,
3732,
342,
667,
2505,
941,
42072,
3082,
697,
247,
4891,
789,
326,
588,
1361,
253,
295,
24343,
3114,
275,
6684,
1805,
4204,
5609,
323,
1650,
465,
22711,
1162,
355,
9169,
2722,
326,
667,
3215,
11273,
1566,
476,
320,
908,
323,
941,
42072,
891,
2868,
22510,
19,
14571,
1566,
751,
246,
22,
44693,
1754,
42072,
5678,
342,
12738,
66,
588,
2007,
7450,
253,
1375,
273,
253,
1445,
323,
2505,
4204,
50276,
19,
2929,
3400,
2590,
42852,
285,
8631,
616,
3082,
4679,
275,
2508,
4477,
1263,
4204,
275,
1097,
1698,
15024,
285,
6793,
15024,
4758,
28913,
2175,
452,
644,
5196,
281,
7409,
15988,
432,
1027,
4295,
50275,
20,
4477,
2098,
281,
3727,
616,
2127,
534,
310,
1175,
323,
38041,
50274,
20881,
1255,
50276,
2577,
4685,
310,
326,
512,
3904,
2361,
275,
253,
2929,
403,
432,
247,
2014,
3368,
347,
247,
9414,
891,
651,
751,
281,
923,
690,
11041,
342,
253,
1543,
7419,
432,
436,
891,
13414,
923,
667,
2201,
3374,
342,
253,
2929,
50273,
34974,
337,
1580,
581,
273,
634,
7342,
310,
281,
3157,
253,
9991,
273,
253,
31612,
941,
452,
368,
3597,
15706,
625,
3000,
275,
253,
260,
6291,
1566,
407,
3753,
260,
6291,
310,
3033,
281,
8171,
2781,
1458,
273,
253,
21761,
1223,
11850,
253,
6197,
2978,
3082,
824,
347,
896,
20099,
390,
22510,
19,
14571,
3210,
513,
417,
452,
824,
13133,
50276,
12563,
452,
368,
2783,
970,
247,
3215,
11273,
22510,
19,
14571,
1566,
323,
4204,
347,
275,
465,
22711,
1162,
355,
9169,
374,
3036,
608,
896,
20099,
285,
48960,
3733,
452,
2074,
3045,
436,
906,
310,
27807,
50276,
3088,
368,
452,
690,
2007,
16039,
715,
352,
50275,
555,
993,
50276,
1704,
1423,
253,
806,
1307,
2723,
50276,
5528,
2541,
84,
50275,
1704,
577,
4499,
422,
4715,
5586,
327,
1206,
505,
422,
4715,
50276,
45842,
422,
4715,
50276,
250,
3065,
3081,
4204,
30404,
337,
465,
22711,
362,
448,
2995,
73,
552,
247,
50276,
4039,
299,
9169,
941,
42072,
970,
3215,
11273,
39707,
3210,
549,
32693,
2117,
1518,
1229,
17537,
22,
7152,
339,
431,
248,
2929,
29328,
247,
4460,
941,
42072,
7792,
534,
33826,
1027,
13553,
273,
7011,
5203,
10192,
26368,
21257,
281,
3157,
253,
9991,
273,
31612,
3530,
50276,
783,
4477,
1089,
326,
37444,
5799,
5203,
10192,
26368,
21257,
11330,
3782,
27096,
3530,
50276,
783,
2929,
671,
23970,
247,
4499,
422,
4715,
8103,
281,
9232,
253,
4156,
2954,
2190,
253,
941,
2792,
275,
6779,
2317,
50274,
249,
619,
4743,
253,
17947,
273,
1027,
13553,
273,
7011,
5203,
10192,
26368,
21257,
310,
253,
2201,
7680,
273,
436,
2929,
534,
778,
26761,
2852,
2987,
323,
941,
42072,
2299,
253,
4499,
422,
37820,
1789,
310,
247,
2372,
32809,
285,
891,
2550,
923,
247,
1943,
3064,
2429,
342,
278,
16856,
390,
7018,
585,
50273,
45563,
50275,
783,
2934,
273,
37444,
5799,
5203,
10192,
26368,
21257,
310,
27350,
50276,
783,
9554,
273,
253,
15274,
3733,
8103,
285,
253,
4499,
422,
37820,
8103,
310,
4722,
50276,
20881,
1255,
50275,
77,
471,
273,
38135,
253,
4499,
422,
37820,
1789,
310,
247,
2372,
32809,
285,
436,
1789,
310,
1077,
2074,
281,
278,
16856,
390,
7018,
585,
50276,
783,
1566,
556,
806,
281,
4044,
253,
31612,
3530,
534,
310,
13782,
8214,
323,
1236,
2510,
25912,
15302,
285,
778,
35007,
253,
8542,
2898,
273,
253,
1566,
25761,
253,
4583,
11701,
403,
4942,
1355,
2429,
342,
391,
20,
71,
285,
627,
310,
247,
3480,
273,
11041,
1783,
50276,
34974,
50276,
5371,
310,
253,
15180,
10454,
273,
12738,
66,
50276,
22309,
970,
5823,
69,
4181,
275,
2593,
4562,
50276,
261,
37444,
5799,
5203,
10192,
26368,
21257,
253,
4284,
4758,
323,
12738,
66,
275,
634,
28400,
4679,
752,
604,
643,
8130,
5878,
3632,
789,
1805,
275,
15302,
751,
2805,
79,
965,
391,
442,
278,
44683,
285,
594,
327,
2139,
417,
1304,
1543,
327,
1110,
15302,
50276,
5371,
310,
253,
2201,
3064,
875,
634,
4499,
422,
37820,
285,
278,
16856,
390,
7018,
585,
50276,
284,
253,
11701,
403,
4942,
1355,
812,
368,
4496,
2085,
253,
1071,
273,
7605,
8453,
50276,
5371,
604,
368,
8031,
2624,
806,
285,
840,
896,
1057,
253,
1340,
2818,
253,
3045,
187,
187,
4118,
18435,
27,
2520,
2929,
7350,
941,
42072,
5609,
323,
295,
24343,
275,
1798,
253,
4477,
9569,
247,
2087,
42072,
7792,
597,
1067,
12738,
66,
285,
7568,
697,
11839,
327,
247,
1643,
22791,
295,
24343,
8892,
9610,
12532,
16774,
1543,
253,
4477,
9713,
690,
2234,
7350,
24088,
5001,
4373,
22041,
9610,
273,
48894,
1309,
253,
5955,
2180,
253,
13969,
840,
310,
326,
436,
789,
3400,
247,
4217,
285,
4942,
2087,
1332,
323,
42072,
275,
295,
24343,
285,
253,
17857,
32888,
8446,
310,
2779,
281,
1089,
436,
4217
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1, 1, 1, …, 1  (attention_mask: a run of 1,292 ones)
] | [
30003, 310, 1677, …, 4217  (labels: 1,313 token ids)
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents a zeroshot action recognition framework by learning an universal mapping from video to semantic space the unseen action embeddings are repositioned through leveraging the distribution of unlabelled test set the universal mappings from unseen action to test videos are first defined and the target embeddings are treated as weighted frechet means the unseen action embeddings are repositioned as a semantic regularization the results on ucf101 and hmdb51 ucf sports and jhmdb validates the proposed method strength 1 the zeroshot learning for action recognition is good direction however the applicability is limited in real world scenario 2 figure 1 is very clear to give the illustration of the concept and the highlevel idea of the whole paper 3 the performance are very stronger when compared with other baselines weakness 1 the technical details is a little hard to follow since i am not very familiar with the zeroshot learning in my opinion the proposed method is a general approach across video and image domains why not implement the experiments of such method on the image zeroshot learning 2 is there any visualization to intuitively show the learning results by the transductive universal transport the paper gives a new method for zeroshot action recognition by learning the transduction from unseen action to test videos in hyperspherical space the performances are good when compared with other stateoftheart methods however the details of the technique is a little hard to follow for me in my opinion the paper forms well and the writing is very good i give accept currently because of my unfamiliarity of such domain by the way i will follow up the work in the following reviewing sections docsepthe paper targets transductive zeroshot action recognition to alleviate models biased to seen categories the authors propose to reposition unseen action embedding through transduction there are three steps in the proposed method first finding an optimal mapping from unseen actions to the mapped video in the shared hyperspherical space second defining target embeddings as weighted frechet means with the weight given by the transport couplings third repositioning unseen action embeddings along the geodesic between the original and target the zeroshot classification performance of the proposed method is tested on the ucf101 and hmdb datasets the zeroshot spatiotemporal localization performance is tested on the ucf sport and jhmdb datasets while the paper demonstrates stateoftheart results one important concern is about fairer comparisons especially the zeroshot spatiotemporal localization experiments table 3 the setting of the proposed method is different from some compared papers for example the authors focus on transductive zsl while mettes et al 2021 kim et al 2021 and brattoli et al 2020 focus on inductive zsl the proposed method uses both action and object information while brattoli et al 2020 use action information only and mettes et al 2021 use object information only without fairer comparisons it is hard to assess the effectiveness of the proposed method another concern is that the importance of some critical components is not adequately evaluated this is also related to the comments above mentioned the proposed method uses both video features and object information is this critical to obtain a good performance the importance of video features and object information is not properly evaluated one way to show this is to evaluate the performance of the proposed method using only one type of 
modality partial information is given in figure 3 and the fusion paragraph on page 7 based on figure 3 and the discussion it seems that the proposed method does not outperform brattoli et al 2020 and mettes et al 2021 under the same experimental settings as the compared methods typo in 32 implementation details 21d r21d the paper can be further strengthened by demonstrating fairer comparisons and adequately evaluating the importance of the critical components docsepthis work tries to address the problem of zeroshot action recognition particularly the paper aims at preventing the case that many unseen action categories in the target domain are simply never being selected during inference using the distribution of the unlabelled test set the embeddings of unseen actions in the target domain are reweighted and repositioned along the geodesic such that they are better aligned with embeddings of training actions in the source domain in experiments empirically the proposed method has been evaluated on benchmark datasets for tasks zeroshot action classification and spatiotemporal action localization strengths work on the problem of reducing the biases between seen categories in the source domain and unseen categories in the target domain the semantic space evaluate the approach on benchmark datasets for two tasks zeroshot action classification and spatiotemporal action localization weaknesses novelty seems incremental the proposed transductive universal transport algorithm for embedding reposition seems like a simple weighting method guided by the distribution of the unlabelled test set the paper merely uses existing approaches to solve the transductive optimal transport problem but does not bring any new insights generalization seems a concern the proposed approach heavily depends on the distribution of the unlabelled test set it seems sensitive to the distribution and the number of clusters as shown in figure 2 using the target embeddings seems on par with repositioned embeddings also the proposed approach seems to only works in the case with a small number of clusters this paper proposes a sensible solution to reduce the bias between the source domain and the target domain for the task of action recognition but novelty seems incremental also some experiments seems a bit unconvincing and the approach seems not to scale to general settings docsepthis work introduces transductive universal transport for zeroshot action recognition where no training examples for unseen classes are available to address the biases of prior approaches towards seen classes during inference this paper repositions unseen action embeddings through transduction by using the distribution of the unlabelled test set experimental results on several action recognition datasets demonstrate the effectiveness of the proposed method strengths 1 the use of transductive universal transport for zeroshot action recognition is new 2 the experimental results show the effectiveness of this new method and also better performance than prior states of the art weaknesses 1 the approach needs access to the entire testing video set to obtain distribution information of testing videos this is an unrealistic setting when used in practice a machine learning model should expect to see one testing example a time 2 many symbols are not clearly defined making the math descriptions in this paper hard to read for example in eq 2 it is unclear what ws wu sumlu sumls are what weights for labels ie wlu and wls mean and why us and uu are sets of labels 3 
does the use of transductive universal transport bring any computational overhead to the zeroshot learning model it is interesting to see comparison of inference time and complexity with the baseline and some states of the art the idea of using transductive universal transport for zeroshot action recognition is new and the performance is good but the core setting that the entire testing set is available during training to get the distribution information is unrealistic the writing especially the math part needs improvement
### Summary: | this paper was reviewed by four experts in the field and received mixed scores 1 borderline accept 3 borderline reject the reviewers raised their concerns on lack of novelty unconvincing experiment and the presentation of this paper ac feels that this work has great potential but needs more work to better clarify the contribution and include additional ablated study the authors are encouraged to consider the reviewers comments when revising the paper for submission elsewhere | [
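The transductive method the reviews above describe only at a high level (an optimal-transport coupling from unseen-action embeddings to the unlabelled test videos, coupling-weighted target embeddings, and a repositioning step along the geodesic of the hypersphere) can be made concrete with a short sketch. Everything in the snippet is my own assumption rather than the paper's implementation: the entropic Sinkhorn solver, the cosine cost, the re-normalized weighted mean standing in for a proper spherical Fréchet mean, the interpolation rate tau, and all function names are illustrative only.

```python
import numpy as np

def sinkhorn(cost, eps=0.1, n_iter=200):
    """Entropic-OT coupling between uniform marginals (rows: unseen actions, cols: test videos)."""
    K = np.exp(-cost / eps)
    a = np.full(cost.shape[0], 1.0 / cost.shape[0])
    b = np.full(cost.shape[1], 1.0 / cost.shape[1])
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]            # transport plan

def slerp(x, y, tau):
    """Geodesic interpolation between unit vectors x and y (move a fraction tau from x toward y)."""
    omega = np.arccos(np.clip(x @ y, -1.0, 1.0))
    if omega < 1e-6:
        return x
    return (np.sin((1 - tau) * omega) * x + np.sin(tau * omega) * y) / np.sin(omega)

def reposition(action_emb, video_emb, tau=0.5, eps=0.1):
    """Pull each unseen-action embedding toward an OT-weighted mean of test-video embeddings."""
    A = action_emb / np.linalg.norm(action_emb, axis=1, keepdims=True)
    V = video_emb / np.linalg.norm(video_emb, axis=1, keepdims=True)
    cost = 1.0 - A @ V.T                          # cosine cost on the unit sphere
    plan = sinkhorn(cost, eps)
    targets = plan @ V                            # coupling-weighted means, re-projected below
    targets /= np.linalg.norm(targets, axis=1, keepdims=True)
    return np.stack([slerp(a, t, tau) for a, t in zip(A, targets)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    actions = rng.normal(size=(5, 16))            # 5 unseen-action embeddings
    videos = rng.normal(size=(40, 16))            # 40 unlabelled test-video embeddings
    print(reposition(actions, videos).shape)      # (5, 16), still unit-norm
```

Here tau controls how far each unseen-action embedding is pulled toward the test distribution; the sensitivity to the number of clusters that the reviews raise would show up in how video_emb is formed (raw clips versus cluster centroids).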
30003, 310, 1677, …, 11358  (input_ids: 1,496 token ids)
] | [
1, 1, 1, …, 1  (attention_mask: 1,496 ones)
] | [
30003, 310, 1677, …, 11358  (labels: 1,496 token ids)
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work starts by questioning the apparent robustness of quantized networks and demonstrates that such robustness is more so a failure of the attack algorithm in picking up the gradient signal the authors address this by tuning a scalar multiplier applied to the network logits which doesnt modify the models decision boundary through analyzing the jacobian two approaches are proposed to determine the scalar beta without tuning it by performing the attack this approach is quite effective on quantized networks and even provides significant improvement on floatingpoint networks combining with existing attacks like fgsm and pgd the proposed modification might seem trivial at first but it constitutes an important factor the community hasnt taken notice of to the best of my knowledge a few questions 1 i dont see any mentions of tuning the attack step size if we set the new eta to etabeta we can keep the jacobian intact and isolate the effect of temperature scaling for xent 2 how about sweeping beta and plotting against adversarial accuracy this is surely expensive but it would paint a clearer picture of the optimality of beta found using the proposed approaches this can be done in tandem with an attack step of etabeta i like the result overall though the effect of beta on the jacobian and the softmax can and should be separated if my understanding is correct the proposed approaches for determining beta largely depend on the jacobian therefore there should be more investigation on if scaling the jacobian correctly is more important than getting good error signals from softmaxdocsepupdate thanks to the authors for addressing my comments as it was pointed out by the authors temperature rescaling is mostly applicable to nonlinear loss functions for linear loss functions temperature scaling only linear rescales the gradients the difference between the proposed pgd attack and pgd with linear dlr loss is small see the authors response to ar4 the improvements are most significant for fgsm but fgsm is not recommended for the robustness evaluation given the limited technical novelty and small improvements for linear loss functions my score remains unchanged summary the paper studies the robustness of binary neural networks bnns which at first look have higher robustness than fullprecious neural networks the authors highlight the problem of poor signal propagation in bnns which makes gradientbased attacks difficult to address this issue the authors proposed a 1 single scalar rescaling of the jacobian to improve the signal propagation 2 parameterfree hessian norm scaling technique in the experiments the authors demonstrated that the modified attacks reduce the accuracy of bnns to zero and outperform existing gradientbased attacks against floatingpoint networks reasons for score i vote for a weak acceptance of this paper the paper shows that bnns are not robust and introduce an interesting gradient rescaling technique which can also be used to attack fullprecision networks the rescaling technique is well explained easy to apply for any existing attacks and has low computational overhead however as i will discuss below i see some problems comparing the proposed attack against a welltuned pgd attack pros 1 the paper studies the robustness of bnns the robustness of bnns is not well studied and understanding the robustness of bnns is an important research direction 2 the authors highlight the issue of signal propagation in bnns to address this issue they devise a novel lowcomputational complexity technique 3 
experimental results for bnns and fullprecious models demonstrate that the modified attack is effective concerns and questions in the experiments the authors use a single parameter for the step size for both pgd and fgsm attacks on the other hand the proposed method computes an optimal rescaling to achieve good signal propagation even though the proposed technique has a low computational budget i believe the authors should do a grid search for the optimal step size for pgd and fgsm attacks for a fair comparison in the experiments pgd l2 attack was unable to reduce the naturally trained models accuracy to 0 this seems strange and unlikely as gradient shattering should not happen for naturally trained models can the authors explain these results is it possible that there might be an implementation issue comments and suggestions the proposed technique amplifies the error signal in a nonlinear way for nonlinear losses such as crossentropy however for other losses such as multiclasshinge loss cw loss the proposed method will simply linearly rescale the error signal attacking cw loss might be useful as it avoids the issues of saturated softmax gradients will this technique be useful for attacking with cw loss the authors claim that this method improves the efficiency of whitebox attacks against full precision models is it possible for the authors to get the results for mnistchallenge and cifar10challenge to see if the method outperforms an optimally tuned pgd attack docsepthis paper studies the robustness of quantized neural networks against adversarial attacks the authors use some slight modification of existing methods to successfully increase the attack success rate in general i think the idea is interesting but i have some concerns that need to be addressed 1 i am not fully convinced by the arguments made at the beginning of section 4 the authors claim that poor signal propagation such as gradient vanishing or gradient exploding should be a problem for adversarial attacks however i do not think the reasonings provided here is specifically for binarized neural networks equations 3 and 4 also works in regular full precision networks i do not think there are any problems which only present in bnns so the arguments here are not strong enough if poor signal propagation is a problem for attacks why we dont see that in full precision networks more discussions on this are welcomed 2 resnet and densenet ref models in table 3 seem to be surprisingly robust under pgd l2 attacks column 3 this adversarial accuracy seems to be comparable to models with adversarial training i think the authors need to provide some explanation on this 3 minor please refrain from only using color to distinguish curvesbars in figures as it may not be friendly to readers with color blindness 4 minor the authors may need to reorganize some sections to make the paper easier to follow for example the related works section before the experiments sectiondocsepthe paper identifies the gradient vanishing issue in the robustness of binary quantized networks therefore it proposes to use temperature scaling approach in the attack generation it has two methods for the temperature scale 1 singular values of the inputoutput jacobian and 2 maximizing the norm of the hessian of the loss updates after rebuttal thanks the authors for answering my questions however i dont think my comments are well addressed even though the paper d may not provide public available code the authors could either use results from the paper d or implement the 
proposed attack on the models used by d to see the difference strengths the proposed method work well on adv trained models and floatingpoint models practical approach by a simple modification to existing gradient based attacks weaknesses binary quantization is not a well accepted method since it can in general introduce 5 accuracy loss there are a lot more valuable quantization schemes to investigate such as lowbitwith fixed point power of 2 and additive power of 2 y li x dong and w wang additive powersoftwo quantization an efficient nonuniform discretization for neural networks in international conference on learning representations 2020 the novelty is limited since it brings the temperature scaling approach an existing method to the problem of attacking binary quantized models the paper writing is not constructed for easy understanding comments and questions 1 i would like to see comparisons with other attacks that are particularly designed for quantized models 2 the third paragraph in introduction the paper tries to justify two techniques but they are still not well motivated 3 the fourth paraph in introduction mentions both full precision networks and floatingpoint networks whats the difference between these two 4 table 1 results are surprising is the same observation made by other ref works 5 the method is to replace the softmax with a monotonic function softmax with a single scalar during the attack generation then for testing the attack success rate i think the neural network should still use the original softmax without scalar then the attack success rate wont be degradeddocsepupdate since most of my issues have been addressed i have changed my rating from 4 to 6 summary this paper studies the robustness of quantized networks against gradientbased adversarial attacks for l2 and linf norms showing how quantized models suffer from gradient vanishing giving a false sense of security via gradient masking to circumvent this issue the authors propose temperature scaling approaches that can overcome this masking achieving nearperfect perfect success in crafting adversarial inputs for these models reasons for score the papers ultimate goal is to get better gradientbased attack performance on quantized binarized in this case networks however key steps that should have been tried first for benchmarking such as adaptive pgd attacks have not been performed moreover it is not clear what benefit the proposed method has in this scenario compared to gradientfree attacks like boundary the papers contributions although including some nice analyses on temperature scaling based solutions are too weak to be accepted in their current form pros improvement in attack success rates for fullprecision networks even for fgsm seems like an exciting result further analyses and methods on top of this could be used to further increase the strength of these firstorder gradient attacks jacobian and hessian based detailed analyses of temperature scaling and what different solutions correspond to in terms of robustness is quite insightful and interesting cons gradient masking is a relatively wellknown phenomenon in adversarial machine learning in cases when normal firstorder gradient attacks fail techniques like adaptive pgd attacks gradientfree attacks or even blackbox transfer attacks are some straightforward methods to overcome gradient masking thus it is not clear why the authors did not try nongradient attacks before jumping to a complicated algorithm at the very least those attacks like boundary should at 
least be part of benchmarks for comparison for starters please refer to reliable evaluation of adversarial robustness with an ensemble of diverse parameterfree attacks they have a publicly available implementationhttpsgithubcomfra31autoattack as well all of this is crucial especially since the paper claims section 4 that we would like to point out that the focus of this paper is to improve gradientbased attacks on already trained bnns in general investigating if a model exhibits gradient masking is not a contribution standard checks like comparing transfer rates multistep to singlestep performance attack rates for increasing attack budgets etc are often used to check for gradient masking figure 1 for which norm are these numbers reported without knowing the norm it is hard to say if figure b is a sign of gradient masking or not section 22 these attacks have been further strengthened by a random initial step this is partially true the real benefit comes from having multiple random restarts having just one random initialization by itself is not that useful please rerun evaluation experiments with random restarts 20 is a good number section 3 what does adversarial accuracy refer to is it accuracy on perturbed inputs fx y or success rate of the adversary when trying to change predictions aka fx fx please clarify section 31 clearly indicate gradient masking issues please elaborate not every reader will be familiar with the set of checks used for gradient masking issues with crossentropy based loss and how they promote certain magnitudes of logit values are not new the authors might want to have a look at section 41 of reliable evaluation of adversarial robustness with an ensemble of diverse parameterfree attackshttpsarxivorgpdf200301690pdf to see if there are similaritiesdifferences in the proposed temperaturebased variant and how the proposed method is better than the one in the difference of logitsratio based loss this work seems to be a key and relevant part of related work and should be included in comparisonsbenchmarking implementation of our algorithm will be released upon publication please anonymize and attach the code in response the benefit of using the proposed fgsmpgd attacks on fullprecision models trained with adversarial robustness seems to be negligible table 4 and should not be overstated in results also since these attacks all have random seeds please perform experiments multiple times for statistical significance and report summary statistics minor edits section 22 perturbations to the images the definition here is for adversarial examples in general and should thus be perturbations to data section 22 gradientbased attacks can be written as projected gradient descent pgd this is true only for firstorder gradientbased attacks not all gradientbased attacks examples jsma please correct section 41 since most of the modern networks consist of relu nonlinearities this can and often is circumvented using fakerelu example implementation herehttpsgithubcommadrylabrobustnessblob89bdf8088a8f4bd4a8b86925a2801069ec281feerobustnesstoolscustommodulespyl5 section 5 and they hypothesize that linear networks would be robust to adversarial attacks this is not their conclusion and seems to be out of context section 6 should preferably be either towards the end or at the beginning not clear why it is in the middle of other sections please address and clarify the cons above
### Summary: | the paper studies the robustness of binary neural networks bnns showing how quantized models suffer from gradient vanishing to solve this issue the authors propose temperature scaling approaches that can overcome this masking achieving nearperfect perfect success in crafting adversarial inputs for these models the problem is interesting and important however the major concerns are that the technical novelty is limited raised by two reviewers small improvements for linear loss functions the most related work is not compared in the experiment | [
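The central trick the reviews above attribute to the paper (multiplying the logits by a single scalar beta before the softmax, which leaves the decision boundary untouched but de-saturates the cross-entropy gradient reaching the input) is easy to sketch. The snippet below is a minimal, hedged illustration rather than the paper's code: beta is passed in as a plain constant (the paper reportedly derives it from Jacobian singular values or a Hessian-norm criterion, which is not reproduced here), and model, x, y, eps and the [0, 1] input range are placeholder assumptions.

```python
import torch
import torch.nn.functional as F

def fgsm_with_logit_scale(model, x, y, eps, beta=1.0):
    """One FGSM step where the logits are multiplied by a scalar beta before the loss.

    Scaling logits by beta > 0 leaves the argmax prediction (the decision boundary) unchanged,
    but changes how saturated the softmax is and therefore the size of the error signal that
    reaches the input -- the effect the reviews describe for quantized/binary networks.
    """
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(beta * model(x_adv), y)
    loss.backward()
    with torch.no_grad():
        x_adv = (x_adv + eps * x_adv.grad.sign()).clamp(0.0, 1.0)  # assumes inputs in [0, 1]
    return x_adv.detach()

if __name__ == "__main__":
    # Tiny stand-in classifier, only to make the sketch runnable end to end.
    model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
    x = torch.rand(4, 1, 28, 28)
    y = torch.randint(0, 10, (4,))
    adv = fgsm_with_logit_scale(model, x, y, eps=8 / 255, beta=10.0)
    print((adv - x).abs().max())                  # bounded by eps
```

A PGD variant would repeat this step with a projection back into the eps-ball and random restarts; whether the step size should also be rescaled when beta rescales the gradient, a question one review raises, is left open here.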
(input_ids) 2523, 273, 2625, …, 1554,
382,
554,
281,
1625,
46701,
554,
3045,
2983,
4142,
323,
3629,
2983,
35905,
3966,
403,
2223,
908,
281,
2451,
323,
11786,
44790,
50274,
13206,
337,
323,
534,
5222,
403,
841,
3904,
2361,
1293,
8958,
253,
5222,
352,
310,
1892,
281,
1333,
604,
4677,
270,
310,
247,
861,
273,
11786,
44790,
390,
417,
50275,
4674,
3307,
841,
8104,
452,
644,
2007,
34615,
407,
247,
3632,
3302,
3213,
436,
310,
10571,
2032,
253,
1524,
5649,
3249,
432,
1907,
2709,
3632,
1551,
12863,
1907,
816,
581,
3632,
31850,
407,
3139,
310,
417,
326,
4217,
4496,
294,
6321,
7103,
4679,
342,
3632,
1551,
12863,
1384,
310,
247,
1175,
1180,
50275,
4674,
495,
752,
1057,
48960,
7200,
3730,
281,
310,
352,
7200,
327,
44711,
14800,
269,
89,
50276,
90,
390,
2323,
2281,
273,
253,
34014,
672,
2820,
281,
1818,
13650,
38857,
269,
89,
50276,
21448,
4496,
19148,
50275,
4674,
4562,
50276,
49346,
5224,
11786,
44790,
3374,
4496,
21184,
417,
1046,
9414,
588,
320,
7615,
342,
253,
873,
273,
12255,
908,
323,
11786,
44790,
50275,
22402,
342,
2831,
290,
10144,
1754,
2957,
285,
849,
597,
8591,
2176,
32800,
273,
2412,
262,
2193,
403,
417,
747,
253,
4477,
1537,
971,
281,
452,
247,
1007,
387,
2593,
7609,
273,
9630,
7103,
273,
48960,
31640,
342,
271,
19862,
273,
11117,
4764,
4924,
8104,
3614,
39962,
2061,
9275,
9755,
520,
31055,
9275,
281,
923,
604,
627,
403,
22620,
69,
26776,
275,
253,
4081,
3276,
3169,
12955,
285,
849,
253,
4081,
1332,
310,
1805,
685,
253,
581,
275,
253,
3064,
273,
2412,
953,
29603,
1754,
2957,
436,
789,
3133,
281,
320,
247,
2234,
285,
4623,
629,
273,
2905,
789,
285,
943,
320,
2908,
275,
14023,
31591,
4698,
272,
50274,
39595,
273,
776,
5933,
588,
320,
4439,
2220,
9311,
4496,
26314,
907,
285,
16152,
253,
2127,
275,
2380,
50274,
783,
5649,
273,
970,
253,
4081,
269,
5943,
2503,
35333,
8104,
327,
2120,
40540,
3210,
10166,
342,
48960,
31640,
3133,
281,
320,
22879,
2829,
577,
285,
943,
417,
320,
689,
33834,
275,
1543,
671,
1580,
841,
8104,
512,
452,
3632,
12922,
4496,
1347,
4679,
2709,
2069,
323,
7605,
8453,
285,
1304,
6010,
9990,
50273,
37585,
1407,
953,
50275,
4674,
3307,
26309,
281,
253,
3888,
253,
5426,
1060,
310,
323,
48960,
6667,
275,
2087,
285,
943,
3021,
320,
26309,
281,
941,
50275,
4674,
3307,
11786,
3169,
8104,
476,
320,
3542,
347,
16589,
11786,
18499,
23256,
69,
436,
310,
2032,
760,
323,
806,
2621,
11786,
3169,
8104,
417,
512,
11786,
3169,
8104,
6667,
23421,
785,
4496,
3451,
50275,
4674,
7609,
1580,
954,
273,
253,
4980,
6928,
2882,
273,
774,
86,
14561,
1005,
436,
476,
285,
2223,
310,
39256,
264,
970,
15223,
1661,
86,
1650,
7092,
1060,
3614,
7280,
2823,
324,
610,
13068,
18848,
461,
1255,
23723,
2511,
67,
4989,
1438,
2055,
66,
25,
71,
21,
14836,
21,
66,
25,
67,
25,
2090,
1099,
66,
1619,
9104,
2090,
886,
27571,
453,
254,
706,
461,
5210,
296,
1062,
1026,
2229,
14825,
81,
1190,
22,
50275,
4674,
608,
285,
597,
41661,
326,
4872,
6928,
651,
320,
10237,
281,
48960,
8104,
436,
310,
417,
616,
6452,
285,
3133,
281,
320,
562,
273,
3634,
50275,
4674,
721,
943,
13027,
320,
2057,
4404,
253,
990,
390,
387,
253,
5068,
417,
2590,
2139,
352,
310,
275,
253,
4766,
273,
643,
7118,
50275,
32897,
2953,
285,
19148,
253,
772,
1840,
2490,
187,
4118,
18435,
27,
783,
2929,
2175,
253,
31640,
273,
8985,
11454,
6928,
270,
79,
2224,
4645,
849,
2677,
1025,
3210,
11089,
432,
11786,
29199,
281,
8415,
436,
2523,
253,
4477,
12661,
3276,
13642,
7274,
326,
476,
11399,
436,
44790,
17170,
2822,
32060,
3962,
2323,
275,
49378,
48960,
14800,
323,
841,
3210,
253,
1895,
310,
4722,
285,
1774,
2299,
253,
2201,
7350,
403,
326,
253,
7681,
38135,
310,
3710,
5439,
407,
767,
30628,
1355,
11701,
323,
4872,
2957,
3470,
253,
954,
2905,
789,
310,
417,
2429,
275,
253,
3368,
50276
] | [
1, 1, 1, … (attention-mask entries omitted: 2,048 ones in total, one per token, i.e. no padded positions)
] | [
2523, 273, 2625, 18634, … (2,048 label token IDs omitted)
] |
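The bracketed arrays attached to each row (the token IDs, the all-ones mask, and the label IDs) carry no information beyond the row's text, and they can be turned back into that text with a tokenizer. A minimal sketch follows; the specific tokenizer is an assumption (the IDs run up to roughly 50,276, which matches a GPT-NeoX-style vocabulary, so `EleutherAI/gpt-neox-20b` is used as a stand-in), and the example ID list is just the first few values from a row, not a complete sequence.

```python
# Sketch only: decode a row's token-ID arrays back to text.
# Assumption: a ~50k GPT-NeoX-style vocabulary; swap in whichever tokenizer
# actually produced this dataset if it is known.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

row = {
    "input_ids": [1543, 310, 352, 1896, 326],  # first few IDs copied from the dump, truncated
    "attention_mask": [1, 1, 1, 1, 1],         # all ones here, i.e. no padded positions
    "labels": [1543, 310, 352, 1896, 326],     # causal-LM style: typically a copy of input_ids
}

text = tokenizer.decode(row["input_ids"], skip_special_tokens=True)
print(text)  # a short text fragment, if the tokenizer guess is right
```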
Below is a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
the authors tackle the issue of outofdistribution detection for deep learning classifiers by proposing two regularization terms that can be added to a lossfunction to finetune a pretrained network in order to ensure a calibrated probability output of px that can be used to detect ood samples existing approaches rely on either using a function of the logits pyx or directly estimating px based on highlevel features of the deep learning classifier however the logits of a deep network can be overconfident miscalibrated 1 and are often not linearly correlated with px for ood samples 2 furthermore density estimation on a biased lowdimensional projection of x might not produce an unbiased estimate of px instead the authors propose to compute px based on the softmax logits pyx after first recalibrating the joint distribution pxy gxy in eq 3
towards this goal they derive a densityconsistency regularization dcr term based on the asymptotic testing of the consistency of py based on logits and its empirical density function in a batchwise fashion this regularization term effectively although asymptotically recalibrates vy and thus according to the authors derivations recalibrates the joint distribution hxy since a discriminative model already optimizes for the accuracy of pyx calibrating pxy pyx px should ensure a calibrated px which can be used as a reliable ood score as a second contribution this paper introduces a contrastive distribution regularization cdr term that incentivizes a high likelihood ratio between augmented samples assumed to be distributiondeviating and indistribution samples 1 matthias hein maksym andriushchenko and julian bitterwolf why relu networks yield highconfidence predictions far away from the training data and how to mitigate the problem in 2019 ieeecvf conference on computer vision and pattern recognition cvpr pages 4150 2019 2 weitang liu xiaoyun wang john owens and yixuan li energybased outofdistribution detection in advances in neural information processing systems pages 2146421475 2020 strengths the proposed approach is simple and the derivation is straightforward the idea of converting the asymptotic consistency test into a regularization term for finetuning seems novel to me the idea of calibrating py to ensure the calibration of px also seems new the authors present an extensive empirical study in which their approach outperforms most state of the art benchmarks weaknesses the writing could be clearer for example the authors claim to propose a new approach for estimating px when they are actually recalibrating the logitsbased estimation of px calibration was not once mentioned in the paper even though it is equivalent to the assumption of eq 3 that the learned model is faithful to the joint distribution the authors emphasize on the novelty of the densityconsistency regularization term however from the empirical study as well as previous literature it is highly likely that the contrastiveloss term is the main driver of the improved performance ablation studies do not disentangle the contribution of this term from that of the densityconsistency one table 2 is lacking a row for enabling cdr alone it is difficult to assess the significance of these contributions due to the lack of a detailed rather than aggregated ablation study and the use of smaller networks than commonly deployed for the imagenet results which i believe are more interesting than those of cifar10100 3 minderer matthias josip djolonga rob romijnders frances hubis xiaohua zhai neil houlsby dustin tran and mario lucic revisiting the calibration of modern neural networks advances in neural information processing systems 34 2021 1568215694 1 the performance on imagenet is not as impressive as that on cifar10100 which brings up the question of whether this approaches scales with the number of classes or size of dataset furthermore larger networks that are becoming more commonplace for imagenetderived tasks might be better calibrated 3 it would be interesting to verify that this approach can be as effective on these modern architectures 2 while the authors claim robustness to hyper parameter choices the fpr metric does seem to change significantly on the average benchmark with different choices of r and batch size it would be interesting to see if thats on a subset of the dataset or a common theme docsepa method is presented to finetune 
pretrained networks by introducing regularization terms such that the marginal density py estimated from logits matches the empirical density nyn the claim is that this method allows the sum of logitexponentials logsumy exphyx to be used as a score for ood detection effectiveness on various datasets is examined with good results the exposition is heavy on mathematical proofs at the cost of conceptual clarity and readability a lot of emphasis is placed on various proofs but the intuition and concept do not come across as clearly for instance in a paper on ood detection what is used as the ood score should be clearly emphasized however here the fact that logsumy exphyx is used as the eventual ood score is mentioned only fleetingly and is easy to miss on the first reading some recommendations for improvement 1 i feel that the training algorithm should be included in the main manuscript instead of the appendix since it clarifies how the various regularization terms are calculated and gives a clearer picture of the overall flow some of the details of the lemmas and proofs can be moved to the appendix 2 mathematical notation should be simplified and made consistent to improve readability for instance y is used both as the dependent variable in eq 2 and also as the index of summation in the denominator the term upsilonyn in eq 9 seems to be a scalar however in eqs 12 and 13 the matrix transposition operator is applied to it can the authors please clarify how to interpret upsilonyn also notation should be explained where it is introduced for instance mathbbs is introduced in eq 16 but is explained almost a page later on line 274 the paper is well motivated though and the results seem good but there is a question about benchmarks explained in the next question before no concerns with potential negative societal impact docsepthis paper highlights the drawbacks of current ood detection methods that most of them try to model id px using the features that are intially trained to fit pyx to solve the distribution misalignment a novel densityconsistency regularizaiton is proposed a contrastive criterion is also used to flatten px so that ood samples can locate in lowdensity area to be identified easily the proposed method is evaluated upon both oe and nonoe settings and the performance exceeds the previous methods in a large margin pro the paper is wellmotivated and the methods can address the problem intuitively the method rely on proven assumptions with theoretical proof the paper obtains significant great results across various experiments with fixed hyperparameters con the results in table 1 are super charming however for cifar100 experiments i am more interested in the results when cifar10 is ood as cifar10 and cifar100 are generally looks alike but they are from different label distribution this result usually clearly show how the method capture clues to distinguish idood in cv ood samples are usually considered as semantic anomalies to id datasets however a common shortcoming of current ood detection evaluation protocols use different datasets as ood so that some methods can use lowlevel superficial shortcut to make good results eg a simple lowhighresolution image classifier might do extra well on some ood datasets but actually we hope the model uses semantic difference to distinguish idood i hope the proposed method gets the better result not by superficial shortcut the easiest way to proof is to show id cifar10 ood cifar100 and id cifar100 ood cifar10 cases the paper points out the semantically 
overlapping objects between imagenet and inaturalist in some other works such as vim outofdistribution with virtuallogit matching httpsgithubcomhendrycksnaturaladvexamples scaling outofdistribution detection for realworld settings some cleaner ood datasets for imagenet are provided authors are suggested to do experiments on these clean ood data or to support the reason that the proposed method does not do well on texture inaturalist due to the label noise the author shall show the appropriated sampled semanticallyoverlapping objects get high confidence it would be great if the authors provide more visualization or deeper analysis on what dcr and cdr impact maybe on the feature space to help readers better understand the method the authors do not include the discussion on the limitation of the method we hope the authors can provide more analysis on its weakness in discusssion docsepthe paper proposes to improve ood detection with energy score 1 by incorporating a densitybased regularizer during training that promotes better estimate of py in particular the regularizer is derived based on the rejection criteria of a hypothesis test to ensure the consistency of monte carlo mean of py estimated per batch and the empirical mean in the training set nyn 1 liu et al energybased outofdistribution detection nips 2020 strengths the motivation that existing ood detection scores suffer from poor density estimation is clear and the task is important the proposed method based on law of large numbers and central limit theorem is simple weaknesses the rationale behind the proposed method is unclear to me currently the works promotes monte carlo estimate of pyy to be closer to nyn in the training set however during test time this assumption may not hold ood detection concerns with label shift and it is unclear why one wants to overfit the density of label distribution in the training set while both lemma 1 and lemma 2 hold when n goes to infinity the batch size used in practice is 256 where monte carlo estimate can result in large variance and unreliable estimation therefore there exists significant gaps between the proposed theoretical insights and practical implementation while y is in low dimensional space obtaining py requires integrating over x which are high dimensional vectors and therefore the effects of dimensionality on the accuracy of estimation is nonnegligible the paper included discussions on limitations
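The density-consistency idea the last reviewer describes, matching the batch-wise Monte Carlo estimate of p(y) against the empirical class frequencies n_y/n, is easy to state in code. The sketch below is only a reading of that description, not the paper's actual statistic (the paper derives the term from a hypothesis test, whereas this uses a plain squared-error gap); the loss weight `lam` and the precomputed `class_freqs` tensor are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def density_consistency_penalty(logits, class_freqs):
    """Squared-error stand-in for the reviewed DCR term: pull the batch
    Monte Carlo estimate of p(y) toward the empirical frequencies n_y / n."""
    probs = F.softmax(logits, dim=-1)   # (batch, num_classes)
    p_y_batch = probs.mean(dim=0)       # Monte Carlo estimate of p(y) from this batch
    return torch.sum((p_y_batch - class_freqs) ** 2)

# Usage sketch during fine-tuning (names are illustrative):
# logits = model(x)                                           # (batch, num_classes)
# loss = F.cross_entropy(logits, y) + lam * density_consistency_penalty(logits, class_freqs)
```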
### Summary: | this paper develops a method for improving outofdistribution detection in deep learning based on a novel regularization term there was significant variance in review scores with two championing the paper for acceptance and two borderline scores 7 7 5 4 resulting in an aggregated score just above borderline accept the reviewers arguing for acceptance found the method novel the simplicity of the algorithm compelling and the experiments extensive and convincing one reviewer was concerned that baseline comparisons provided in the paper seem less strong than reported in other work two reviewers questioned the mathematical derivations and some of the underlying assumptions of the paper that two reviewers are arguing for acceptance is a signal that the paper could be a useful contribution and interesting to the community since the experiments seem extensive and seem to demonstrate that the method consistently works well and given that it is simple to implement that seems to validate the underlying assumptions and it could provide a useful baseline therefore the recommendation is to accept the paper please make sure to address the remaining reviewer concerns in the final manuscript | [
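Two of the reviews in this row refer to the same detection quantity: the log-sum-exp of the classifier logits (the negative free energy), optionally with a temperature. A minimal sketch of that score follows; the thresholding line and variable names are illustrative, not taken from the paper.

```python
import torch

def energy_ood_score(logits, T=1.0):
    """T * log sum_y exp(h_y(x) / T): higher means more in-distribution-like."""
    return T * torch.logsumexp(logits / T, dim=-1)

# Usage sketch (illustrative names):
# scores = energy_ood_score(classifier(x_test))
# is_ood = scores < threshold   # threshold picked on held-out in-distribution data
```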
7239, 1754, 327, 2412, … (2,048 input token IDs omitted; they are the tokenized form of this row's text)
] | [
1, 1, 1, … (attention-mask entries omitted: 2,048 ones, one per token)
] | [
[ labels column: token-id sequence, 2,048 entries, individual values omitted ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary the paper proves that the generalization curve of a linear regression problem can be designed the paper discusses both the underparameterized and overparameterized case and shows that the generalization curve can be designed in either case the paper presents only theoretical results reason for score my vote is for accepting the paper the subject it addresses is of importance and i believe the results that are presented are of sufficient interest pros 1 the generalization error is an important aspect for ml algorithms the paper addresses the case of linear regression one of the simplest ml algorithms however showing that the generalization error can be controlled even for a simple model as this is nonetheless important 2 the paper is well written the problem it addresses is clearly discussed and the development of the proposed method is well detailed cons 1 i would have liked to have some numerical examples to illustrate the design of the generalization curve for a simple case 2 in the setting in the paper you draw the new elements either from a normal distribution or from a mixture distribution when you increase the dimension in a practical settings where i already have the data do such hypothesis still remain true miscellaneous 1 could you please elaborate on the statement that the true linear model is beta 0 in rd for me it is not clear what is the purpose of the statement do you mean that the model parameters are all zero 2 there are some typos present for example the quantiry in the paragraph after lemma 3 they should all be spotted by a spell checker docsepprevious work has shown peaks in generalization error as the capacity of the model increases called the doubledescent phenomenon the submitted paper proposes methods for generating data that would arbitrarily change the number and positions of peaks in a generalizationvscapacity curve for linear regression where the number of features controls the capacity and shows that properties of data can play an important role in this phenomenon this paper tackles an important problem in a quite active area of research with clear presentation and coherent organization existence of this data serves as an impossibility result that shows that relating the double descent phenomenon to the properties of model and interpolation without further assumptions on the data is futile however there is a critical discrepancy between the generalization curves studied in this paper and previous work that i describe below and therefore im leaning towards rejection i will raise my score if the authors can show that the effects on the number and positions of the peaks hold in the original setting as i believe this is an important paper otherwise the generalization error in this paper is normalized by the square of the number of features and this can have major effects on the shape of generalizationvscapacity curve the number of features is what controls the capacity so for example if the regular unnormalized error is flat across different capacities the normalized curve will be a decreasing sequence neither the generalization error in a classical biasvariance curve nor the error that matters to a practitioner is normalized i skimmed through the double descent paper by belkin et al and they also seem to be using the typical generalization error which is not normalized the motivation for normalization in the paper is that the closed form error atopx2 sums over d dimensions and so the generalization error has to be normalized by d2 this does not seem right 
atop itself has factors that sum over d dimensions and are then inverted so the effect of d will cancel out minor remarks beta and a are not clearly defined in the problem setup update the issue with normalization is fixed in the new version and i am increasing my scoredocsepshort summary the paper claims that the double descent phenomenon arises from specific interactions between properties of typical data and the inductive biases of algorithms and constructs a distribution that generates an arbitrary generalization curve for linear regression specifically building a multiple descent generalization curve both in the under and overparametrized regimes the model that is used in the paper is a linear regression model over an increasing in a revealing manner set of coordinates or features the authors construct the distribution that gives the peaks at custom coordinates by having features being independent standard normal when they want the test error to decrease and to be a independent mixture of gaussians when they want the test to increase main points while the math to my understanding is clean and the exposition is clear my main concern is how the authors relate their findings to double descent this worries me in two related ways first from the perspective of the complexity of the model while adding a dimension to the linear regression adds a parameter im skeptical how this relates to the complexity of the model in how we view complexity in machine learning and in the research area of double descent in particular i would be much more convinced if the authors could show a case where adding a feature in the random features sense where the features are of the whole vector say apply a random rotation and then do the inverse transform sampling or adding a neuron in a two layer network and still being able to decreaseincrease performance arbitrarily or close to it in some sense even doing the same as in httpsarxivorgpdf190307571pdf where they choose a random set of indices of increasing cardinality would convince me much more the second related issue is the distribution of the features i would not mind it if the classifier would use the features uniformly but increasingdecreasing the hardness of the distribution at each coordinate feels very artificial in the following sense assume that the first coordinate is the label or something close to it but the next coordinates are pure noise then both our train and test will increase when we increase the number of features in my intuition this is very far from what is studied and claimed in the double descent literature for example in the sense of belkins interpolation point or nakkirans model complexity we expect the train error to decrease when model capacity increases i do believe that the question of whether we can construct an arbitrary generalization curve is very important and that it should be studied and explored more deeply but im not convinced by the setup in this paper i would be willing to change my opinion in the case the authors will address the above points in a satisfactory manner minor comments 1 the related work in the body of the paper is lacking i one notable paper that should be present is advani saxe 17 httpsarxivorgabs171003667 ii while nayshabur 15 observe the double decent without realizing it and neal 18 study the biasvariance tradeoff nakkiran 19 httpsarxivorgabs191202292 is the first to demonstrate it in a convincing fashion and should be cited as such 2 i would appreciate an explanation for why the loss is scaled by 1d2 
this feels rather arbitrary docsepthis paper studies the doublemultiple descents of the prediction error curve for linear regression in both the underparameterized and overparameterized regime the strength while many papers have studied the double descent for linear regression estimate or the minimum ell2 norm solution this paper shows multiple descents when dosqrtn a setting is barely studied by others further while multiple descents have been numerically discovered by other concurrent works they have theoretically proved that such multiple descents exist the weakness the major weakness of the paper is the model settings specifically 1 it is unclear why the prediction error is normalized by the number of features and 2 the bias term is left out in the prediction error due to the true coefficients being zero and only the variance term is considered first for normalization the authors claim that this normalization is necessary for comparison indeed the entire results are hinged on this normalization ie without the normalization the proof can not show the existence of the multiple descents neither in underparameterized regime nor overparameterized regime the reasons i found this normalization is weird are the following i normal linear regression problem does not have such normalization on the prediction error it is unclear why we want to divide a onedimensional error by the feature size ii other double descent works mainly deliver two messages a given a fixed sample size what is the best model gives the best estimate of the response the answer is a larger model ie adding more features may help eg xus pcr paper b given a fixed feature size what is the best sample size that gives the best estimate of the response the answer is using a smaller sample size may help eg hasties double descent paper for both cases i do not see any reason to normalize the prediction error of response by feature size if this normalization is for the purpose of the model selection penalty it is unclear why we should encourage a larger model instead of penalizing it a reasonable quantity for such normalization is the mse of the coefficient ie hatbetabeta2 there are many applications where people are more interested in the coefficients rather than the response maybe the authors should consider this quantity instead of the prediction error for the second weakness of the model settings the bias term has been left out of the prediction error when the true coefficients are assumed to be all zero because of this setting all features are just pure noise irrelevant to the response then we can check that 0 is the best estimate when all features are just pure noise and it seems that there is no motivation for us to learn anything from the random noise if the main purpose of this paper is to deliver a message that using only irrelevant features and adding more of them can help to improve the prediction error this effect is known already in those double descent paper in the overparameterized regime showing multiple descents does not add much value because it never beats the trivial estimate 0 in this setting because of these major weakness i recommend rejection for this paper but i will possibly change my evaluation if the authors can provide a very convincing explanation of the model settings and motivation besides these another suggestion for the paper is that the proof of the theorems and the statement of lemmas takes a lot of places i think they can be replaced by more detailed discussions of the model settings and messages or 
conclusions from the main theorems for example is there any intuition about what kind of multiple descents curve is more favorable also despite the attractive title i think it is still hard to design the generalization curve without taking the bias term into consideration the room can be left for the analysis of the bias term after response thanks for addressing the concern about normalization it appears that other reviewers have a concern about such normalization as well i suggest the authors remove the results with normalization entirely from the main paper and only have it in the appendix for anyone that is interested in such normalization on the other hand without normalization the results have changed for the underparameterized regime which makes more sense to me and the proof looks quite different in the overparameterized regime as well i did not have time to check the proof and i believe it is better to resubmit the paper as new because of the major changes finally i still have concerns about the fact that only variance is discussed i suggest the authors state their results in a setting where both bias and variance exists and the features added to the model are related to the response otherwise it is a weird message that it is good to add pure noise as features it feels like although we can design multiple descents in the overparameterized regime when noise is large it is very likely that the 0 estimate achieves the best prediction risk so there is no point to go into overparameterization and multiple descents at all in summary i have raised the score to 5 i believe it can be 6 or 7 if all issues are addressed but i am afraid that the paper looks basically new after these changes and thus i am not sure whether it should be still considered for this conference
### Summary: | while there was some interest in the analysis the consensus view was that the original treatment was not sufficiently wellmotivated and the revision was too dissimilar from the original submission for it to be evaluated for publication in this years iclr | [
[ input_ids column: token-id sequence, 2,048 entries, individual values omitted ] |
[ attention_mask column: 2,048 entries, all 1, individual values omitted ] |
[ labels column: token-id sequence, 2,048 entries, identical to the input_ids column above, individual values omitted ]
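The record above centres on how the test risk of ordinary / minimum-l2-norm least squares behaves as the number of features d grows, and on whether dividing that risk by d^2 is meaningful. The short numpy sketch below only illustrates the quantity being argued about, under the simplest setting the reviewers describe (isotropic Gaussian features, true coefficients all zero, so only the variance term remains); it is not the construction from the reviewed paper, which uses per-coordinate mixture distributions, and the function name and constants are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def min_norm_risk(n, d, sigma=1.0, trials=200):
    """Monte-Carlo estimate of the excess test risk of the minimum-l2-norm
    least-squares fit when the true coefficient vector is zero, so the whole
    risk is the variance term the reviews discuss."""
    risks = []
    for _ in range(trials):
        X = rng.standard_normal((n, d))      # isotropic Gaussian features
        y = sigma * rng.standard_normal(n)   # pure-noise targets (true beta = 0)
        beta_hat = np.linalg.pinv(X) @ y     # OLS when d < n, min-norm interpolator when d > n
        # with E[x x^T] = I and true beta = 0, the excess prediction risk is ||beta_hat||^2
        risks.append(beta_hat @ beta_hat)
    return float(np.mean(risks))

n = 40
for d in (5, 20, 35, 39, 41, 45, 80, 200):
    raw = min_norm_risk(n, d)
    print(f"d={d:4d}   raw risk={raw:10.3f}   risk / d^2={raw / d**2:.5f}")
```

Near d = n the raw risk spikes (the classical double-descent peak), while dividing by d^2 reshapes the curve in d; that change of shape is what the reviewers' objection to the normalization is about.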
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary the paper proposes a multistage directed exploration algorithm xtx it first imitates previous high score trajectories and then switches to an exploration policy with novelty bonuses conceptually xtx is a method that extends goexplore which only acts randomly after reaching the frontier of familiar states the paper argues that with novelty bonuses the agent will be encouraged to explore more promising actions this can especially be helpful when the action space is large like textbased games empirically xtx shows strong performance on a large set of textbased games pros the paper is generally wellwritten and easy to follow the novelty of xtx is clearly elaborated the method surpasses the existing method with a large margin on textbased games the ablation studies show the individual components introduced by xtx can bring improvements cons one weakness of the paper is these experiments did not clarify why the novel part of xtx ie exploration with novelty bonus on the frontier is helpful over random actions the paper hypothesizes that novelty bonuses can encourage the agent to select promising actions in large action spaces however the ablation study figure 2 casts doubts on this hypothesis xtx brings significant improvements over goexplore in zork1 but not other games the difference doesnt seem to be correlated to the size of action spaces questions i dont fully get why the method is motivated to solve the problems with large action space how can an agent receive a novelty bonus if it did not enter that novel state by trying random actions do the authors assume the generalization of the neural network plays a key role here other suggestions the author might want to try other hardexploration tasks for example minigrid or maze can be tested if not atari games like montezuma revenge since these are environments where existing exploration methods are developed we can have a better understanding of how exactly xtx compares to other exploration algorithms rather than the existing textbase game agent without directed exploration reason for the score the writeup and experiments in this paper are of good quality the method itself is novel and the empirical finding in this paper might be particularly interesting for the audience of textbased rl i have minor concerns authors claim about why this method works better than existing exploration algorithms while im happy to increase the score if they are addressed docsepthis paper presents a new exploration algorithm exploitthenexplore xtx for textbased games which require extensive exploration the authors propose an algorithm that explicitly disentangles exploitation and exploration strategies within each episode xtx first learns the exploitation policy that imitates the promising trajectories from past experiences then uses exploration policy to discover the novel stateaction spaces finally the authors demonstrated the outperforming results in the jericho environment this paper is well motivated and most parts are well written but the main method section is written to be difficult to follow the results demonstrate empirical gains in the jericho environment however the baselines consist only of simple algorithms without an exploration strategy the detailed comments and questions are as follows 1 in the experiment the performance is compared with drrn and mprcdqn which lack exploration strategy xtx seems to be an exploration method very similar to goexplore moreover in the paper goexplore and pcpg are mentioned as the most closely related approaches 
but they are excluded from the baseline algorithms it would be better to demonstrate the results of them together 2 page 5 section 312 sampling trajectories it is hard to follow the explanation can it be understood as a kind of weighted behavior cloning moreover i understand the motivation of biased sampling towards high scores but dont understand the motivation for the length i think that a shorter trajectory length is not necessarily better can you give an intuitive explanation 3 in the paper policy pitextil is modeled as gpt2 and policy pitextinvdy is modeled as drrn is there any reason why each policy is modeled differently especially the policy pitextil is renormalized over the valid action set is there any reason or advantage to learn the policy with gpt2 4 in the experiments the results demonstrate xtx underperforms drrn on enchanter is there any intuitive explanation for this result it would be better if a discussion about what characteristics in the enchanter made the xtx not work would be added this paper is wellmotivated written overall and demonstrates stateoftheart performance in the jericho environment however there are relevant but missing baseline algorithms goexplore pcpg for the main table of experiments i think the results of these algorithms should also be included in the main table and i think this can further support the main arguments of the paper docsepthis paper introduces an agent with a builtin exploration strategy that is aimed at text adventure games or more generally environments with large action spaces and sparse rewards the exploration strategy is constructed from two independent policies one trained with selfimitation learning on successful trajectories and one trained on an inverse dynamics intrinsic reward the agent plays episodes by starting with the exploitation policy for a number of steps that depends on the experience collected up to that point and then switching to the exploration policy the paper is wellwritten describes the contributions clearly and places itself in the context of the existing literature on exploration it includes results on a number of text exploration games from a recent benchmark where it shows by and large a significant improvement relative to the baselines included the main contribution is an exploration strategy with an inepisode switch from an exploitation policy to one aimed at exploration this approach to combining exploration and exploitation is different from much of the existing literature where typically a single policy is used throughout the episode and often throughout training that merges two reward signals since the switching policy in this paper is the element that looks most hardcoded and therefore potentially brittle it would be valuable to investigate a bit more whether a more flexible solution is also possible here while different agent57 whose predecessor ngu is cited might offer inspiration here it also uses multiple policies and manages the switching with a learned bandit mechanism a significant difference is that there the switching only happens between episodes but a similar switching mechanism might be considered here within episodes nonetheless the inepisode switch is there to ensure that exploration happens at the edge of the known region of state space where it is needed and meaningful that is a very sensible thing for the agent to do but there are other exploration strategies that effectively also do that such as random network distillation burda et al 2018 and inverse dynamics pathak et al 2017 
which the authors use to train their exploration policy while the exploration region is less explicitly located at the edge of the known state space region in those algorithms than in this paper the prediction errors that they rely on for intrinsic reward generation are more likely to occur at that edge one question i have for the authors here is whether the inverse dynamics reward signal itself can be used to indicate when to switch from explore to exploit in that case the twopolicy solution can be simplified again to a single policy that merges the two behaviours i did not see this ablation in the paper but i believe it would be a good thing to include conversely it would be valuable to see the performance of the strategy proposed here on other exploration benchmarks such as the hard exploration games from the atari suite bellemare et al 2016 while i appreciate that text adventure games are in some ways different from their video counterparts since they have a different observation space language not pixels and action set again language not moves they are still both rl environments and general agents should be able to play both furthermore a game like montezumas revenge has a bottleneck aspect similar to the one that many text based games have as well as the need for exploration on the frontier of the known region of state space all in all it seems that the proposed strategy here could work on a wider range of environments than addressed in the paper if that is not the case it is still a valuable contribution but if it is it would be good to know a last comment the agent proposed in the paper has another unusual feature in that its exploitation policy is trained only by selfimitation while it is important to find the edge of the explored region of state space and the selfimitation training regime can help with this the xtx strategy can also be implemented with an exploitation policy that is trained in a more traditional way with one of the many rl approaches available can the authors comment on why they chose the selfimitation approach instead the paper is well written presents a marked improvement over the baselines provided im not sufficiently familiar with the text adventure game literature to be certain those represent state of the art but i will assume they do unless corrected and provides an interesting approach to the exploration problem through the twopolicy architecture i recommend acceptance but i also feel the paper could be strengthened by addressing the questions raised in the main review section docsepin this paper the authors propose exploitthenexplore xtx a training strategy for agent solving humangenerated textbased games xtx consists of two training phases 1 in the exploitation phase the agent samples high quality experiences in terms of score and trajectory length from its replay buffer using the sampled trajectories an action generation module is trained at a certain game step t the action generation module takes the observation ot as well as the two most recent past actions at1 and at2 as input and generates the new action at in a wordbyword autoregressive manner this process is referred as selfimitation by the authors 2 in the exploration phase in addition to the qlearning loss as used in drrn the authors use two auxiliary losses to encourage the model to capture useful representations first the inverse dynamics loss linv optimizes a module that predicts an action at given two consecutive observations ot and ot1 where ot1 is resulted by at given ot the second loss 
ldec is a regularizer that optimizes a module reconstructs an action at from its encoding faat during training the two phases take control in an almost alternate manner however there is a coefficient lambda controls the interpolation between the phases the authors show that it is beneficial to not having the exploitation take control solely on a subset of games from the jericho suite the authors show their agent outperform prior works strengths 1 the disentanglement of exploration and exploitation makes sense the phasealternating pipeline is nicely designed 2 the paper is clearly written it is relatively easy to understand how the model look like although the intuition of each component isnt too clear 3 the set of ablation experiments in section 42 are well designed questions and concerns 1 whats the reason of choosing this subset of 12 games while the list seems to cover a wide range of game difficulties but why not using the entire jericho suite 2 the authors cited the invdy agent yao et al 2021 in their section 313 and actually if i understand correctly the entire section 313 is describing yao et als model without any new contribution why do not the authors compare their agent with invdy in result tables 3 in section 31phase 1 the authors describe two criteria that switch the agent to exploration phase can the authors elaborate on the second criterion what does it mean if the number of steps in an episode equal to the longest of the k sampled trajectories if an agent moves back and forth between two locations which may result a super long steps but this behavior is not necessarily desired 4 in section 312 sampling trajectories the authors describe the way they use to sample trajectories however to my understanding the loss shown in eqn 5 is a gamestepwise loss does the authors also sample game steps from the sampled trajectories if so how or they compute this loss on all game steps within the sampled trajectories 5 in the paragraph under eqn 2 the authors mentioned that note that the action distribution over actions a induced by piinvdy is conditioned only on the current observation o however according to eqn 6 it is also conditioned on o which is the next observation ie ot1 references 1 keep calm and explore language models for action generation in textbased games shunyu yao rohan rao matthew hausknecht and karthik narasimhan emnlp 2020 2 reading and acting while blindfolded the need for semantics in text game agents shunyu yao karthik narasimhan and matthew hausknecht naacl 2021 nov 29 2021 we had a good discussion among reviewers let me give the authors some update 1 increased my score to 6 this is because the authors have somewhat addressed my comments im relatively satisfied there are a few concerns remaining as listed below a on modelling novelty the novel components are only a the sampling strategy in exploitation phase and b the twophase pipeline it was a bit weird to almost copy and paste a subsection from a prior work into the main body of this submission which may confuse readers by giving a false message about the contribution however if other coreviewers are fine with it im fine too b after a few paper updates the main results in table 1 is only marginally higher than prior work the authors can add more discussion addressing this in their cameraready 2 we recommend the authors to remove the dragon row from the result tables or rerun when the jericho team fixes the bug as reviewer pskh find out the proposed agents scores exceed the max score on that game i happen to know 
some core jericho contributors and we tested the dragon gamehttpsifdborgviewgameidsjiyffz8n5patu8l usually when reaching the goal this will pop up dragons treasure store the dragons secret hoard is open before you by the flickering light of your little candle you can make out a heaps of treasure stacked untidily around the floor you can see piles of gold and heaps of jewels many rising higher than the top of your head the dragon has told you it has no use for the treasure and it is now yours you are rich beyond your wildest dreams you have won in that game you scored 25 out of a possible 25 in 101 turns would you like to restart restore a saved game undo your last move give the full score for that game or quit in that game the scoring function works like this 1 for buying the box 1 for finding the screwdriver 2 for finding the candle 1 for finding the matches 1 for for opening the castle door 1 for building the handglider in the right place 2 for getting the swordbooklet 1 for escaping from the tower using the hangglider 2 for killing the troll to get the horn 5 for talking to the troll to get the horn 2 for killing the dragon 5 for charming the dragon instead of killing him 5 for finding the treasure 25 points maximum total eg multiple ways to get the horn minus 2 points for each rescue or dead recovery given this 2 points for each rescue action an agent can get negative total points because the original game did some short to unsigned char converting this caused underflow 128 vs 128 this may because the author of the dragon game in 2003 didnt expect machines to play his game because most humans will give up playing before reaching this underflow point so the weird numbers are not the authors problem as i mentioned they can either remove that row or to rerun whenever jericho fixes that while i like this paper in general my main concern is its novelty and contribution as mentioned in my questions and concerns q2 above the entire section 313 is describing prior work yao et al 2021 the learning from trajectories part of section 312 is describing another prior work yao et al 2020 actually neither yao et al 2021 nor yao et al 2020 is compared in result table as a consequence to my understanding the contribution of this paper is the twophase pipeline and the sampling strategies in section 312 i am not sure if this paper contains enough contributions to publish at iclr please correct me if i understood wrong
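as a side note on the dragon score overflow discussed above: the snippet below is only an illustrative reconstruction (not the game's or jericho's actual code) of how a running total that goes negative through the -2 rescue penalty shows up as a large positive number once it is squeezed into an unsigned 8-bit char, consistent with the signed/unsigned (-128 vs 128) confusion the jericho contributors describe

```python
# hypothetical reconstruction of the wraparound, not code from the game or from jericho
score = 0
score += 2          # e.g. killing the troll to get the horn
score -= 2 * 3      # three rescues / dead recoveries at -2 points each -> total is now -4
stored = score & 0xFF   # what a single unsigned byte ends up holding
print(score, stored)    # -4 is reported as 252, so displayed scores can exceed the 25-point maximum
```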
### Summary: 
i thank the authors for their submission and active participation in the discussions all reviewers are unanimously leaning towards acceptance of this paper reviewers in particular liked that the paper is wellwritten and easy to follow (186e, tadh, exgo), well motivated (tadh), interesting (pskh), novel (186e), and provides gains over baselines (186e, tadh, pskh) with interesting ablations (186e, exgo) i thus recommend accepting the paper and i encourage the authors to further improve their paper based on the reviewer feedback
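for readers who want the exploit-then-explore idea from the reviews above in concrete form, here is a minimal sketch of the in-episode switch; the policy objects, the environment interface, and the switching rule based on the length of the top sampled trajectories are placeholder assumptions of mine rather than the paper's exact implementation

```python
# sketch only: act with the self-imitation (exploitation) policy up to a switch point,
# then hand control to the exploration policy for the rest of the episode
def play_episode(env, exploit_policy, explore_policy, top_trajectories):
    # assumed switching rule: imitate for at most as many steps as the longest promising trajectory
    switch_step = max((len(t) for t in top_trajectories), default=0)
    obs = env.reset()
    done, step, trajectory = False, 0, []
    while not done:
        policy = exploit_policy if step < switch_step else explore_policy
        action = policy.act(obs)
        obs, reward, done = env.step(action)   # assumed 3-tuple environment interface
        trajectory.append((action, reward))
        step += 1
    return trajectory
```

between episodes the exploitation policy would be refreshed by self-imitation on high-score trajectories and the exploration policy trained with the inverse-dynamics bonus, as the reviews describe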
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents a novel adaptive second order method oasis for large scale optimization problems convex and nonconvex the search direction is obtained by preconditioning the gradient information with a matrix obtained by approximating the hessian diagonal matrix via hutchinsons method with a momentum term the learning rate is updated adaptively by approximating the lipschitz smoothness parameter on the theoretical front convergence analysis is provided for the adaptive learning rate case for the convex and strongly convex setups similar analysis is also provided for the fixed learning rate lr case for the strongly convex and nonconvex settings finally extensive empirical results are provided which show that the proposed method achieves comparable and sometimes better results than other existing methods pros the paper has the following strengths 1 very well written paper providing a clear motivation for the problem considered 2 the theoretical results involving the convergence analysis of the method are rigorous and cover both convex and nonconvex settings 3 the empirical evaluation is extensive and provides a good indication of how the method performs in practice cons the paper has the following weaknesses 1 there is currently not much discussion on the interpretation of the bounds appearing in the convergence analysis for instance how do these results compare with those for existing secondorder methods eg adahessian 2 it would have been helpful to have provided some kind of proof sketch of the theoretical results or atleast an overview of the key steps which i believe might be common to more than one theorem at the moment no such explanation is provided for any of the theoretical results further remarks 1 the setup in fig 1 is really not clear to me what is meant by number of samples x axis of left plot is there an underlying optimization problem considered here such as a quadratic function with matrix a some more detailed explanation on the experimental setup considered for the figure will be very helpful 2 in fig 2 the parameter lambda is not clearly defined i believe this occurs much later in the experiments section 3 in eq 6 is the matrix dk formed by just generating a rademacher z and forming dk z circ nabla2 fwk z because in the adahessian paper they also consider a spatial averaging step for better estimation of the hessian diagonal also should z be zk as in eq 8 later 4 as mentioned on pg 4 in the discussion on literature for adaptive lr the present paper draws upon ideas from the literature on first order methods for adaptive lr so couldnt one do the same analysis for adahessian for adaptive lr 5 it wasnt clear to me why adahessian eq 6 and 7 doesnt approximate the diagonal well while oasis eq 8 and 9 does a better job because we see a temporal average in eq 7 as well which means adahessian should also smooth out the hessian noise over iterations is there any conceptual reason behind this 6 in section 32 shouldnt the distribution from which the sets mathcalik mathcaljk are sampled be specified eg uniformly at random or is it the case that the conditions on the distribution are subsumed by assumptions 414416 also in assumption 416 the sentence where the samples mathcali are drawn independently should be removed since there is a single random variable mathcali 7 since zk is random one would imagine that this randomness is accounted for in the convergence analysis which doesnt seem to be the case for instance the theorems 46 49 seem to be worst case bounds moreover the bound in 
theorem 49 depends on hatdk which is a random variable this point needs further clarification 8 in theorem 417 its better to write etak eta for consistency of notation 9 both theorems 417 418 show convergence to neighborhoods of stationary points and not to the stationary points themselves there is a discussion after theorem 418 regarding this aspect but it seems a bit strange why this eg decaying learning rate is not accounted for in the analysis to begin with the paper is written in a very clean manner and is easy to follow sufficient background is provided in the introduction which gives the reader a good context to understand the problem setting and contributions the preconditioning step is a modification of an existing method adahessian and the adaptive lr part builds on techniques used for deriving adaptive lr rules for first order methods mishchenko and malitsky 2020 so the novelty aspect is a bit limited in that respect the theoretical results are outlined rigorously although it is not clear what is the novelty of the theoretical results compared to those for other second order methods the empirical evaluation is quite extensive and satisfactory in my view i am giving it a 6 at the moment since i have other comments in further remarks which i hope can be addressed during the rebuttal phase post rebuttal as mentioned in the comments i am satisfied with the authors response to my concerns and i am happy to increase my score to 8 docsepthe paper designs and analyses an algorithm for minimizing a separable function it provides deterministic as well as stochastic versions which either fully compute the gradient or sample it the algorithm estimates the diagonal elements of the hessian via hessianvector products the algorithm makes use of this information for finding a better search direction and the step length eliminating the need for a line search it provides convergence guarantees and a number of experiments on classical ml problems as well as on deep nets strengths the paper considers a fundamental problem in ml ie minimizing a separable function the algorithm and its convergence are proven for many cases ie deterministic stochastic convex nonconvex empirical evidence is given that the algorithm outperforms comparable approaches like adahessian etc no need to tune a learning rate since the step lengths are determined by the curvature of the function this can be really a huge advantage in the stochastic setting ie in deep learning weakness the empirical evidenceexperiments are rather limited deterministic case only two experiments are provided logistic regression nonlinear leastsquares and only two data sets furthermore a comparison to other minimization methods would be very beneficial in this case and not only to adahessian and adgd yes it is stated in the paper that comparison to only diagonal preconditioners is made but in general there are many more methods to solve this case eg quasinewton methods or trust region newtoncg methods which are also used for computing the optimum in the provided code these methods make use of the same information as the presented method and hence a comparison to these methods would also be useful for a better global picture stochastic case again only a very limited number of experiments is provided here having not to tune the learning rate is an enormous plus here and it would be nice to verify the algorithms robustness on a number of different problemsnets the experiments suggest that oasis would be a viable replacement for sgd adam etc but for such 
a bold statement more experiments are needed i like the paper the algorithm and its versatility especially that one does not need to tune a learning rate can be very beneficial i did not fully read the convergence proofs though they seem sound according to theory and experiments one should always use this algorithm it would be nice to justify this claim by a more comprehensive study eg more problems datasets and other algorithms in the deterministic case and more nets and data sets in the stochastic setting only then one can tell if it is superior to stateoftheart approaches if such experiments were provided in the paper i would have given a higher score docsepthis work proposes oasis a secondorder method which approximates the diagonal of hessian matrix and uses the information to rescale the gradient vector the main difference between oasis and the existing method adahessian yao et al 2020 is on the ways they approximate the diagonal of hessian that adahessian uses a formula similar to adam and oasis uses an exponential average moreover oasis also incorporates the adaptive stepsize in mishchenko malitsky 2020 the authors established the convergence guarantees of oasis under various settings including convex strongly convex and nonconvex cases using various learning rate schedulers such as the adaptive learning rate fixed learning rate and line search empirical results on various machine learning tasks are provided to evaluate oasis the paper is nicely written and easytofollow the topic on how to effectively leverage the hessianvector oracle in large scale machine learning tasks is definitely important and interesting for the main ideas the authors show in figure 1 that oasis approximates the diagonal of hessian much more accurate than the hessian momentum in adahessian which is the main point made in the paper i have the feeling that the hessian momentum is not solely for approximation like the firstorder momentum vector may not be an accurate approximation for the gradient vector but is effective for acceleration another point is that oasis incorporates the adaptive stepsize in mishchenko malitsky 2020 which allows it to adapt to the local lipschitz constant wrt a weighted euclidean norm and thus reduces the tuning effort however it seems to me that these ideas are a bit straightforward and not particularly novel from my perspective adahessian is a diagonalhessianvariant of adam and oasis is the corresponding variant of rmsprop it seems that the adaptive learning rate can also be incorporated into adahessian by choosing a different weighted norm for the theory part i appreciate the thorough analysis of oasis under various settings however i was hoping for more insightful discussion on these results such as how the theorems would suggest a better parameter choice currently they are only convergence guarantees which could be far from the practical performance the theorems in section 41 generalize the results in mishchenko malitsky 2020 in the deterministic setting while there seems to be no theoretical advantage of such generalization btw is there any bound on the scale of qk in theorem 46 it seems that it can be of the order ok which kills the convergence for the empirical results the authors considered various machine learning tasks and the deviation is also plotted in the figures which are appreciated however the improvement in most of the results seems marginal to me and thus may not be appealing to practitioners especially since the hessianvector oracle is around twice as expensive 
as the gradient oracle for neural nets moreover oasis still requires a learning rate scheduler as shown in the cifar results which makes the statement our methodology does not require the tedious task of learning rate tuning not wellsupported minor comments equation 7 is not centered typo in the citation adaptive gradient descent without descent in 37th international conference on machine learning iclm 2020 2020 i think lemma a1 is covered by theorem 215 in nesterovs updated book nesterov y 2018 lectures on convex optimization vol 137 berlin germany springer international publishing i appreciate the authors efforts on the comprehensive analysis and empirical evaluations of the proposed oasis the paper is also very well written however both the theoretical and practical results seem incremental to me the construction in oasis also seems a bit straightforward moreover oasis still requires parameter tuning in some of the experiments and thus is not fully adaptive
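to make the diagonal preconditioner described in this review concrete, here is a minimal pytorch sketch of one hutchinson probe combined with the exponential average the reviewer refers to (the eq 8-9 style update); the function name, the beta2 and alpha values, and the clamping of small entries are my assumptions for illustration, not the paper's code

```python
import torch

def hutchinson_diag_step(loss, params, d_prev, beta2=0.999, alpha=1e-5):
    grads = torch.autograd.grad(loss, params, create_graph=True)
    z = [torch.randint_like(p, 2) * 2 - 1 for p in params]    # rademacher +/-1 probe vector
    hz = torch.autograd.grad(grads, params, grad_outputs=z)   # hessian-vector product H z
    d_k = [zi * hzi for zi, hzi in zip(z, hz)]                # z o (H z): unbiased estimate of diag(H)
    d_ema = [beta2 * dp + (1.0 - beta2) * dk for dp, dk in zip(d_prev, d_k)]
    precond = [d.abs().clamp(min=alpha) for d in d_ema]       # assumed truncation so the preconditioner stays invertible
    return d_ema, precond
```

the parameter update would then rescale the gradient elementwise by the inverse of precond, with the step size chosen adaptively from a local smoothness estimate in the spirit of mishchenko and malitsky 2020, as the review explains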
### Summary: 
the paper presents a novel approximate second order optimization method for convex and nonconvex optimization problems the search direction is obtained by preconditioning the gradient information with a diagonal approximation of the hessian via hutchinsons method and exponential averaging the learning rate is updated using an estimate of the smoothness parameter the merit of the paper has to be evaluated from the theoretical and empirical point of view from the internal discussion the reviewers agreed that the new algorithm is a mix of known methods mainly present in adahessian with a small tweak on the exponential average moreover the theoretical guarantees do not seem to capture the empirical performance of the algorithm nor do they provide any hint on how to set the algorithms hyperparameters for example in theorem 46 the optimal setting of beta2 is 1 that said the most important theoretical contribution seems to lie in the fact that adahessian did not have any formal guarantee hence this paper is the first one to show a formal guarantee for this type of algorithms from the empirical point of view the empirical evidence is very limited by todays standards in empirical machine learning papers the reviewers and i do not actually believe that the proposed algorithm dominates the stateoftheart optimization algorithms used in machine learning however in the internal discussion we agreed that the algorithm still has potential and it should be added to the pool of optimization algorithms people can try overall considering the paper in a holistic way there seems to be enough novelty and results to be accepted at this conference that said i would urge the authors to take into account reviewers comments and i also add some personal ones here in particular a frank discussion of the current theoretical analysis and empirical evaluation is needed some specific comments adagrad was proposed by two different groups at colt 2010 so both papers should be cited so please add a citation to mcmahan and streeter adaptive bound optimization for online convex optimization colt 2010 remark 47 second item neither reddi et al 2019 nor duchi et al 2011 assume bounded iterates that must be proved not assumed instead they explicitly project onto a domain that they assumed to be bounded the convergence of the gradient to zero does not imply convergence to a critical point to prove convergence to a critical point you should prove that the iterates converge which in general is false even for lower bounded functions indeed consider f(x) = log(1 + exp(-x)) where the iterates would actually diverge while the gradient still goes to zero
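as a quick numeric illustration of that last point (my own check, using only the function named above): plain gradient descent on f(x) = log(1 + exp(-x)) drives the gradient toward zero while the iterates keep growing without bound, so a vanishing gradient alone does not give convergence to a critical point

```python
import math

def grad(x):
    # derivative of log(1 + exp(-x)) is -1 / (1 + exp(x))
    return -1.0 / (1.0 + math.exp(x))

x, lr = 0.0, 1.0
for _ in range(100_000):
    x -= lr * grad(x)
print(x, abs(grad(x)))   # x keeps increasing (roughly like log t) while |grad| shrinks toward 0
```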
757,
15417,
273,
38622,
285,
258,
4914,
310,
253,
3969,
12955,
273,
391,
983,
8560,
352,
3133,
326,
253,
17825,
4715,
2281,
476,
671,
320,
11217,
715,
519,
66,
35659,
757,
407,
13887,
247,
1027,
17375,
5222,
50276,
1542,
253,
3762,
629,
891,
11435,
253,
11080,
1783,
273,
258,
4914,
762,
2710,
7533,
2299,
891,
369,
11525,
323,
625,
47860,
5955,
327,
841,
1543,
824,
347,
849,
253,
39383,
651,
1804,
247,
1805,
4764,
4327,
4390,
597,
403,
760,
14940,
23632,
534,
812,
320,
2080,
432,
253,
8542,
3045,
253,
39383,
275,
2593,
7609,
39970,
253,
1543,
275,
49285,
5756,
7381,
50276,
10367,
953,
4742,
9169,
275,
253,
30027,
4758,
1223,
627,
3133,
281,
320,
642,
10527,
5750,
273,
824,
26647,
270,
7553,
310,
627,
667,
3033,
327,
253,
4311,
273,
2805,
76,
275,
10012,
7904,
352,
3133,
326,
352,
476,
320,
273,
253,
1340,
8718,
534,
26280,
253,
14940,
50274,
1542,
253,
16774,
1543,
253,
4477,
2783,
2710,
5145,
4715,
8892,
285,
253,
11254,
310,
671,
17944,
275,
253,
8442,
534,
403,
14109,
2299,
253,
7756,
275,
954,
273,
253,
1543,
3133,
16888,
281,
479,
285,
3021,
778,
417,
320,
23176,
281,
24432,
3340,
1580,
253,
344,
859,
757,
11000,
42295,
310,
1475,
7019,
347,
8214,
347,
253,
11786,
42295,
323,
11454,
37507,
25761,
258,
4914,
1335,
4419,
247,
4715,
2281,
8194,
14398,
347,
2011,
275,
253,
260,
338,
274,
1543,
534,
2789,
253,
3908,
776,
16182,
1057,
417,
2430,
253,
38519,
4836,
273,
4715,
2281,
25184,
417,
973,
19391,
50276,
37585,
5701,
50276,
29813,
818,
310,
417,
18932,
50276,
555,
5367,
275,
253,
25577,
17825,
11786,
18499,
1293,
18499,
50276,
249,
5345,
394,
5213,
8059,
327,
5145,
4715,
17857,
20347,
9169,
9169,
50276,
74,
1158,
18057,
247,
18,
310,
6107,
407,
10012,
23917,
275,
295,
9358,
729,
84,
9300,
1984,
295,
9358,
729,
340,
4765,
29608,
327,
17133,
13757,
1936,
14509,
17099,
3642,
14638,
1279,
7203,
254,
5213,
18051,
891,
11435,
253,
4477,
6031,
327,
253,
11088,
1783,
285,
16774,
27163,
273,
253,
4081,
258,
4914,
253,
2929,
310,
671,
1077,
973,
3542,
2299,
1097,
253,
10527,
285,
8542,
1543,
1646,
32809,
281,
479,
253,
5140,
275,
258,
4914,
671,
3133,
247,
2372,
15246,
25761,
258,
4914,
1335,
4419,
4764,
25184,
275,
690,
273,
253,
4679,
285,
3021,
310,
417,
4751,
17825,
50276,
187,
187,
4118,
18435,
27,
783,
2929,
10262,
247,
4460,
16851,
1273,
1340,
13757,
1332,
323,
17133,
285,
1327,
44181,
13757,
3237,
253,
3186,
3884,
310,
2797,
407,
638,
42743,
253,
11786,
1491,
342,
247,
16421,
11193,
273,
253,
344,
859,
757,
3066,
288,
9248,
968,
790,
1332,
285,
17619,
25001,
253,
4715,
2281,
310,
9300,
970,
271,
6642,
273,
253,
6032,
1255,
4764,
50276,
783,
15785,
273,
253,
2929,
556,
281,
320,
6760,
432,
253,
10527,
285,
16774,
1127,
273,
1859,
50276,
4064,
253,
4812,
5955,
253,
30628,
5821,
326,
253,
747,
5933,
310,
247,
5878,
247,
1929,
3082,
7194,
1246,
275,
519,
66,
35659,
757,
342,
247,
1355,
48626,
327,
253,
17619,
3388,
25761,
253,
10527,
23632,
513,
417,
1646,
281,
9232,
253,
16774,
3045,
273,
253,
5933,
4543,
597,
2085,
667,
12662,
327,
849,
281,
873,
253,
11333,
4373,
22041,
323,
1650,
275,
10012,
7904,
253,
8654,
4758,
273,
9840,
19,
310,
337,
326,
753,
253,
954,
1774,
10527,
7680,
3133,
281,
7027,
275,
253,
958,
326,
519,
66,
35659,
757,
858,
417,
452,
667,
7473,
12215,
7613,
436,
2929,
310,
253,
806,
581,
281,
921,
247,
7473,
12215,
436,
1511,
273,
11333,
50276,
4064,
253,
16774,
1127,
273,
1859,
253,
16774,
1941,
310,
1077,
3710,
323,
253,
3063,
7465,
275,
16774,
5145,
4715,
9380,
253,
30628,
285,
479,
513,
417,
2686,
2868,
326,
253,
4081,
5933,
36807,
253,
1375,
23037,
14387,
13757,
11333,
908,
275,
5145,
4715,
2299,
275,
253,
4812,
5955,
359,
5821,
326,
253,
5933,
556,
1335,
2442,
285,
352,
943,
320,
2879,
281,
253,
6363,
273,
13757,
11333,
952,
476,
1611,
50276,
1189,
455,
7296,
253,
2929,
275,
247,
45290,
1039,
627,
3133,
281,
320,
2217,
38135,
285,
1543,
281,
320,
7607,
387,
436,
8059,
50276,
3529,
753,
891,
651,
21434,
253,
4477,
281,
1379,
715,
2395,
30628,
5701,
285,
891,
671,
823,
690,
3367,
4394,
1060,
275,
1798,
247,
21332,
5955,
273,
1655,
10527,
1783,
285,
16774,
7103,
310,
3058,
50276,
8826,
2173,
5701,
50276,
324,
356,
4614,
369,
4081,
407,
767,
1027,
2390,
387,
847,
85,
4267,
594,
1097,
9380,
943,
320,
11106,
594,
4496,
823,
247,
25577,
281,
50276,
17475,
785,
5582,
285,
6126,
1715,
17825,
3033,
13757,
323,
3909,
17133,
13757,
847,
85,
4267,
50276,
39808,
7543,
1273,
5382,
6747,
28159,
74,
1162,
355,
9638,
4543,
277,
26550,
1162,
355,
4332,
5467,
11542,
10040,
684,
326,
1364,
320,
8058,
417,
8025,
3185,
597,
11120,
2199,
4830,
247,
5028,
326,
597,
8025,
281,
320,
11542,
50276,
783,
14940,
273,
253,
11786,
281,
5058,
1057,
417,
16084,
14940,
281,
247,
4619,
1127,
281,
5276,
14940,
281,
247,
4619,
1127,
368,
943,
5276,
326,
253,
10040,
684,
29623,
326,
275,
2087,
310,
3221,
1014,
323,
2406,
11542,
3470,
6296,
1908,
269,
89,
2808,
18,
911,
3498,
253,
10040,
684,
651,
2686,
11711,
463,
1223,
253,
11786,
1335,
564,
281,
5058
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
2204,
264,
407,
13260,
577,
14231,
1036,
671,
275,
9376,
35272,
253,
6197,
835,
253,
3530,
14168,
1179,
74,
403,
8392,
10939,
943,
320,
5176,
1580,
627,
310,
247,
2014,
3632,
4778,
14168,
1179,
74,
50276,
186,
818,
1580,
1182,
76,
310,
3632,
581,
651,
8564,
326,
436,
3632,
1255,
310,
20184,
323,
275,
253,
14940,
1783,
534,
36908,
1646,
281,
320,
253,
1083,
323,
4227,
253,
39383,
7904,
7584,
1646,
281,
320,
9065,
1083,
14493,
25761,
253,
3033,
275,
10012,
7584,
7024,
327,
7856,
17261,
534,
310,
247,
3632,
4778,
436,
1127,
3198,
2007,
37699,
28910,
854,
275,
10012,
38362,
697,
1805,
281,
3630,
1162,
518,
50276,
1464,
323,
15274,
273,
14951,
50276,
26,
1097,
39383,
38362,
38627,
921,
14940,
281,
25237,
273,
17429,
2792,
285,
417,
281,
253,
17429,
2792,
3746,
627,
310,
247,
5955,
846,
10012,
38627,
5001,
436,
4809,
533,
352,
3133,
247,
2372,
8921,
2139,
436,
24088,
46957,
4715,
2281,
310,
417,
20184,
323,
275,
253,
1783,
281,
3135,
342,
186,
253,
2929,
310,
3542,
275,
247,
1077,
4076,
5133,
285,
310,
3477,
281,
956,
4209,
4114,
310,
2530,
275,
253,
10199,
534,
4245,
253,
9414,
247,
1175,
3634,
281,
2096,
253,
1895,
4758,
285,
9021,
253,
638,
42743,
3213,
310,
247,
11237,
273,
271,
5368,
1332,
519,
66,
35659,
757,
285,
253,
17825,
298,
83,
629,
21168,
327,
5609,
908,
323,
44190,
17825,
298,
83,
4803,
323,
806,
1340,
3082,
49285,
5756,
7381,
285,
4691,
953,
4742,
9169,
594,
253,
38135,
4809,
310,
247,
2372,
3710,
275,
326,
1675,
253,
10527,
1543,
403,
18627,
8132,
29689,
3738,
352,
310,
417,
2590,
752,
310,
253,
38135,
273,
253,
10527,
1543,
2429,
281,
1110,
323,
643,
1273,
1340,
3082,
253,
16774,
7103,
310,
3240,
9470,
285,
20297,
275,
619,
1859,
891,
717,
4933,
352,
247,
721,
387,
253,
2774,
1580,
891,
452,
643,
5701,
275,
2007,
16157,
534,
891,
3524,
476,
320,
9713,
1309,
253,
30080,
22559,
3408,
50275,
5996,
30080,
22559,
50275,
284,
5393,
275,
253,
5701,
891,
717,
10048,
342,
253,
4477,
2380,
281,
619,
7350,
285,
891,
717,
5211,
281,
2572,
619,
4868,
281,
854,
5474,
339,
431,
248,
2929,
11809,
285,
6260,
271,
5933,
323,
28699,
247,
39690,
1159,
352,
3400,
30027,
347,
973,
347,
19191,
9508,
534,
2057,
4751,
11897,
253,
11786,
390,
3410,
352,
253,
5933,
8197,
253,
16421,
3603,
273,
253,
344,
859,
757,
3066,
344,
859,
757,
11000,
3580,
253,
5933,
2789,
897,
273,
436,
1491,
323,
4560,
247,
1805,
3186,
3884,
285,
253,
3213,
2978,
23703,
253,
878,
323,
247,
1386,
3186,
352,
3400,
14940,
23632,
285,
247,
1180,
273,
4679,
327,
8946,
13361,
3237,
347,
973,
347,
327,
3676,
37507,
20544,
50275,
783,
2929,
19401,
247,
7936,
1895,
275,
13361,
26332,
28699,
247,
39690,
1159,
50276,
783,
5933,
285,
697,
14940,
403,
11464,
323,
1142,
2219,
26332,
30027,
19191,
17133,
1327,
44181,
50276,
358,
5378,
474,
1941,
310,
1677,
326,
253,
5933,
41731,
13015,
10870,
7274,
751,
519,
66,
35659,
757,
3966,
50276,
2369,
878,
281,
19928,
247,
4715,
2281,
1580,
253,
3213,
16095,
403,
3413,
407,
253,
16841,
273,
253,
1159,
436,
476,
320,
1663,
247,
5699,
5750,
275,
253,
19191,
4758,
26332,
275,
3676,
4715,
50276,
20881,
1255,
50276,
783,
16774,
1941,
16217,
3825,
403,
2581,
3710,
50274,
18916,
249,
2531,
1083,
760,
767,
4679,
403,
2530,
21535,
9077,
14561,
1878,
23600,
4420,
285,
760,
767,
941,
5239,
33810,
247,
5301,
281,
643,
41458,
3082,
651,
320,
1077,
12912,
275,
436,
1083,
285,
417,
760,
281,
519,
66,
35659,
757,
285,
519,
35333,
4754,
352,
310,
4767,
275,
253,
2929,
326,
5301,
281,
760,
16421,
638,
12380,
398,
310,
1160,
533,
275,
2087,
627,
403,
1142,
625,
3082,
281,
8415,
436,
1083,
24088,
21582,
460,
88,
1299,
3082,
390,
4517,
2919,
747,
1299,
29676,
3082,
534,
403,
671,
908,
323,
12672,
253,
24571,
275,
253,
2530,
2127,
841,
3082,
1056,
897,
273,
253,
1072,
1491,
347,
253,
3559,
1332,
285,
7613,
247,
5301,
281,
841,
3082,
651,
671,
320,
4217,
323,
247,
1805,
4156,
5406,
50273,
296,
17283,
1083,
969,
760,
247,
1077,
3710,
1180,
273,
4679,
310,
2530,
1060,
1907,
417,
281,
19928,
253,
4715,
2281,
310,
271,
14779,
5043,
1060,
285,
352,
651,
320,
5322,
281,
12654,
253,
11333,
31640,
327,
247,
1180,
273,
1027,
3237,
47301,
253,
4679,
1804,
326,
258,
4914,
651,
320,
247,
16571,
5407,
323,
256,
35333,
38622,
3966,
533,
323,
824,
247,
13433,
3908,
625,
4679,
403,
3058,
891,
751,
253,
2929,
253,
5933,
285,
697,
49607,
3340,
326,
581,
1057,
417,
878,
281,
19928,
247,
4715,
2281,
476,
320,
1077,
12912,
891,
858,
417,
4751,
1239,
253,
14940,
27947,
2167,
597,
1646,
3590,
2556,
281,
3762,
285,
4679,
581,
943,
1900,
897,
436,
5933,
352,
651,
320,
5322,
281,
15249,
436,
1750,
407,
247,
625,
11088,
1263,
24088,
625,
3237,
15302,
285,
643,
11333,
275,
253,
30027,
1083,
285,
625,
37507,
285,
941,
5239,
275,
253,
19191,
4758,
760,
840,
581,
476,
2028,
604,
352,
310,
8936,
281,
1375,
23037,
14387,
7274,
604,
824,
4679,
497,
2530,
275,
253,
2929,
891,
651,
452,
1677,
247,
2169,
4868,
5474,
33032,
2520,
789,
29328,
258,
4914,
247,
1273,
2621,
1332,
534,
4020,
684,
253,
16421,
273,
344,
859,
757,
4315,
285,
4648,
253,
1491,
281,
9708,
1079,
253,
11786,
4972,
253,
2022,
3064,
875,
258,
4914,
285,
253,
5368,
1332,
519,
66,
35659,
757,
340,
8500,
1162,
355,
9169,
310,
327,
253,
4088,
597,
16851,
253,
16421,
273,
344,
859,
757,
326,
519,
66,
35659,
757,
4648,
247,
7212,
2074,
281,
38622,
285,
258,
4914,
4648,
271,
17619,
3388,
25761,
258,
4914,
671,
31167,
253,
17825,
5018,
907,
275,
49285,
5756,
7381,
50276,
10367,
953,
4742,
9169,
253,
4477,
4232,
253,
14940,
23632,
273,
258,
4914,
762,
2710,
7533,
1690,
17133,
7052,
17133,
285,
1327,
44181,
2219,
970,
2710,
4715,
2281,
8194,
335,
398,
824,
347,
253,
17825,
4715,
2281,
4229,
4715,
2281,
285,
1386,
3186,
16774,
1543,
327,
2710,
5145,
4715,
8892,
403,
2530,
281,
7472,
258,
4914,
253,
2929,
310,
23395,
3542,
285,
3477,
936,
25739,
253,
9400,
327,
849,
281,
8069,
25057,
253,
344,
859,
757,
11000,
42295,
275,
1781,
4311,
5145,
4715,
8892,
310,
7964,
1774,
285,
4722,
50275,
1542,
253,
2022,
5697,
253,
4477,
921,
275,
4677,
337,
326,
258,
4914,
4020,
684,
253,
16421,
273,
344,
859,
757,
1199,
625,
7899,
685,
253,
344,
859,
757,
10254,
275,
519,
66,
35659,
757,
534,
310,
253,
2022,
1127,
1160,
275,
253,
2929,
891,
452,
253,
5471,
326,
253,
344,
859,
757,
10254,
310,
417,
12718,
323,
11193,
751,
253,
806,
2621,
10254,
4972,
778,
417,
320,
271,
7899,
11193,
323,
253,
11786,
4972,
533,
310,
3576,
323,
17680,
1529,
1127,
310,
326,
258,
4914,
31167,
253,
17825,
5018,
907,
275,
49285,
5756,
7381,
50276,
10367,
953,
4742,
9169,
534,
4483,
352,
281,
5223,
281,
253,
1980,
11233,
37913,
3638,
8772,
247,
17375,
299,
26365,
5222,
285,
3021,
11355,
253,
25184,
3434,
2299,
352,
3133,
281,
479,
326,
841,
5697,
403,
247,
2372,
15246,
285,
417,
3782,
4460,
432,
619,
8668,
519,
66,
35659,
757,
310,
247,
16421,
35659,
757,
15417,
273,
38622,
285,
258,
4914,
310,
253,
3969,
12955,
273,
391,
983,
8560,
352,
3133,
326,
253,
17825,
4715,
2281,
476,
671,
320,
11217,
715,
519,
66,
35659,
757,
407,
13887,
247,
1027,
17375,
5222,
50276,
1542,
253,
3762,
629,
891,
11435,
253,
11080,
1783,
273,
258,
4914,
762,
2710,
7533,
2299,
891,
369,
11525,
323,
625,
47860,
5955,
327,
841,
1543,
824,
347,
849,
253,
39383,
651,
1804,
247,
1805,
4764,
4327,
4390,
597,
403,
760,
14940,
23632,
534,
812,
320,
2080,
432,
253,
8542,
3045,
253,
39383,
275,
2593,
7609,
39970,
253,
1543,
275,
49285,
5756,
7381,
50276,
10367,
953,
4742,
9169,
275,
253,
30027,
4758,
1223,
627,
3133,
281,
320,
642,
10527,
5750,
273,
824,
26647,
270,
7553,
310,
627,
667,
3033,
327,
253,
4311,
273,
2805,
76,
275,
10012,
7904,
352,
3133,
326,
352,
476,
320,
273,
253,
1340,
8718,
534,
26280,
253,
14940,
50274,
1542,
253,
16774,
1543,
253,
4477,
2783,
2710,
5145,
4715,
8892,
285,
253,
11254,
310,
671,
17944,
275,
253,
8442,
534,
403,
14109,
2299,
253,
7756,
275,
954,
273,
253,
1543,
3133,
16888,
281,
479,
285,
3021,
778,
417,
320,
23176,
281,
24432,
3340,
1580,
253,
344,
859,
757,
11000,
42295,
310,
1475,
7019,
347,
8214,
347,
253,
11786,
42295,
323,
11454,
37507,
25761,
258,
4914,
1335,
4419,
247,
4715,
2281,
8194,
14398,
347,
2011,
275,
253,
260,
338,
274,
1543,
534,
2789,
253,
3908,
776,
16182,
1057,
417,
2430,
253,
38519,
4836,
273,
4715,
2281,
25184,
417,
973,
19391,
50276,
37585,
5701,
50276,
29813,
818,
310,
417,
18932,
50276,
555,
5367,
275,
253,
25577,
17825,
11786,
18499,
1293,
18499,
50276,
249,
5345,
394,
5213,
8059,
327,
5145,
4715,
17857,
20347,
9169,
9169,
50276,
74,
1158,
18057,
247,
18,
310,
6107,
407,
10012,
23917,
275,
295,
9358,
729,
84,
9300,
1984,
295,
9358,
729,
340,
4765,
29608,
327,
17133,
13757,
1936,
14509,
17099,
3642,
14638,
1279,
7203,
254,
5213,
18051,
891,
11435,
253,
4477,
6031,
327,
253,
11088,
1783,
285,
16774,
27163,
273,
253,
4081,
258,
4914,
253,
2929,
310,
671,
1077,
973,
3542,
2299,
1097,
253,
10527,
285,
8542,
1543,
1646,
32809,
281,
479,
253,
5140,
275,
258,
4914,
671,
3133,
247,
2372,
15246,
25761,
258,
4914,
1335,
4419,
4764,
25184,
275,
690,
273,
253,
4679,
285,
3021,
310,
417,
4751,
17825,
50276,
187,
187,
4118,
18435,
27,
783,
2929,
10262,
247,
4460,
16851,
1273,
1340,
13757,
1332,
323,
17133,
285,
1327,
44181,
13757,
3237,
253,
3186,
3884,
310,
2797,
407,
638,
42743,
253,
11786,
1491,
342,
247,
16421,
11193,
273,
253,
344,
859,
757,
3066,
288,
9248,
968,
790,
1332,
285,
17619,
25001,
253,
4715,
2281,
310,
9300,
970,
271,
6642,
273,
253,
6032,
1255,
4764,
50276,
783,
15785,
273,
253,
2929,
556,
281,
320,
6760,
432,
253,
10527,
285,
16774,
1127,
273,
1859,
50276,
4064,
253,
4812,
5955,
253,
30628,
5821,
326,
253,
747,
5933,
310,
247,
5878,
247,
1929,
3082,
7194,
1246,
275,
519,
66,
35659,
757,
342,
247,
1355,
48626,
327,
253,
17619,
3388,
25761,
253,
10527,
23632,
513,
417,
1646,
281,
9232,
253,
16774,
3045,
273,
253,
5933,
4543,
597,
2085,
667,
12662,
327,
849,
281,
873,
253,
11333,
4373,
22041,
323,
1650,
275,
10012,
7904,
253,
8654,
4758,
273,
9840,
19,
310,
337,
326,
753,
253,
954,
1774,
10527,
7680,
3133,
281,
7027,
275,
253,
958,
326,
519,
66,
35659,
757,
858,
417,
452,
667,
7473,
12215,
7613,
436,
2929,
310,
253,
806,
581,
281,
921,
247,
7473,
12215,
436,
1511,
273,
11333,
50276,
4064,
253,
16774,
1127,
273,
1859,
253,
16774,
1941,
310,
1077,
3710,
323,
253,
3063,
7465,
275,
16774,
5145,
4715,
9380,
253,
30628,
285,
479,
513,
417,
2686,
2868,
326,
253,
4081,
5933,
36807,
253,
1375,
23037,
14387,
13757,
11333,
908,
275,
5145,
4715,
2299,
275,
253,
4812,
5955,
359,
5821,
326,
253,
5933,
556,
1335,
2442,
285,
352,
943,
320,
2879,
281,
253,
6363,
273,
13757,
11333,
952,
476,
1611,
50276,
1189,
455,
7296,
253,
2929,
275,
247,
45290,
1039,
627,
3133,
281,
320,
2217,
38135,
285,
1543,
281,
320,
7607,
387,
436,
8059,
50276,
3529,
753,
891,
651,
21434,
253,
4477,
281,
1379,
715,
2395,
30628,
5701,
285,
891,
671,
823,
690,
3367,
4394,
1060,
275,
1798,
247,
21332,
5955,
273,
1655,
10527,
1783,
285,
16774,
7103,
310,
3058,
50276,
8826,
2173,
5701,
50276,
324,
356,
4614,
369,
4081,
407,
767,
1027,
2390,
387,
847,
85,
4267,
594,
1097,
9380,
943,
320,
11106,
594,
4496,
823,
247,
25577,
281,
50276,
17475,
785,
5582,
285,
6126,
1715,
17825,
3033,
13757,
323,
3909,
17133,
13757,
847,
85,
4267,
50276,
39808,
7543,
1273,
5382,
6747,
28159,
74,
1162,
355,
9638,
4543,
277,
26550,
1162,
355,
4332,
5467,
11542,
10040,
684,
326,
1364,
320,
8058,
417,
8025,
3185,
597,
11120,
2199,
4830,
247,
5028,
326,
597,
8025,
281,
320,
11542,
50276,
783,
14940,
273,
253,
11786,
281,
5058,
1057,
417,
16084,
14940,
281,
247,
4619,
1127,
281,
5276,
14940,
281,
247,
4619,
1127,
368,
943,
5276,
326,
253,
10040,
684,
29623,
326,
275,
2087,
310,
3221,
1014,
323,
2406,
11542,
3470,
6296,
1908,
269,
89,
2808,
18,
911,
3498,
253,
10040,
684,
651,
2686,
11711,
463,
1223,
253,
11786,
1335,
564,
281,
5058
] |
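the summary above describes the method only in words; the sketch below illustrates the core idea it names, a hutchinson-style diagonal hessian estimate smoothed by an exponential average and used to precondition the gradient. this is not the reviewed paper's implementation: the toy quadratic objective, the finite-difference hessian-vector product, the constants lr, beta2 and alpha, and all function names are assumptions made only for the example.

```python
# illustrative sketch only, not the reviewed paper's code
import numpy as np

rng = np.random.default_rng(0)

def grad(w, A, b):
    # gradient of the toy quadratic f(w) = 0.5 * w^T A w - b^T w
    return A @ w - b

def hess_vec(w, v, A, b, eps=1e-4):
    # finite-difference hessian-vector product; autodiff would normally be used
    return (grad(w + eps * v, A, b) - grad(w - eps * v, A, b)) / (2 * eps)

def hutchinson_diag(w, A, b, n_samples=8):
    # hutchinson estimator: E[z * (H z)] equals diag(H) for rademacher z
    d = np.zeros_like(w)
    for _ in range(n_samples):
        z = rng.choice([-1.0, 1.0], size=w.shape)
        d += z * hess_vec(w, z, A, b)
    return d / n_samples

def preconditioned_descent(A, b, steps=200, lr=1.0, beta2=0.99, alpha=1e-3):
    w = np.zeros_like(b)
    d_ema = np.zeros_like(b)
    for t in range(1, steps + 1):
        g = grad(w, A, b)
        d_ema = beta2 * d_ema + (1 - beta2) * hutchinson_diag(w, A, b)
        d_hat = d_ema / (1 - beta2 ** t)            # bias correction
        precond = np.maximum(np.abs(d_hat), alpha)  # truncate tiny curvature
        w = w - lr * g / precond
    return w

if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])                 # badly scaled quadratic
    b = np.ones(3)
    print("estimate:", preconditioned_descent(A, b))
    print("exact:   ", np.linalg.solve(A, b))
```

the same summary also notes that a vanishing gradient alone does not certify convergence to a critical point; on f(x) = log(1 + exp(x)) gradient descent drives the gradient sigmoid(x) toward zero while the iterates themselves drift toward minus infinity.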
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose to utilize the partlevel feature representation for the crossdomain generalization problem in point cloud applications given partlevel features grouped from pointwise representations the authors first align them to a learned feature dictionary via crossattention and then the aligned features are aggregated with a partweighted maxpooling strategy in addition contrastive learning is conducted at both the shape level and the part level empirical results on standard dg benchmark datasets are presented for validation strengths 1 the method is well motivated the authors find that partlevel features present smaller distribution divergence than shapelevel features in the crossdomain tasks therefore they propose to adopt part level features in the dg tasks 2 some interesting components are proposed and well justified the proposed parttemplate features implicitly achieve the domain alignment by aligning both domains to the learned feature dictionary the proposed part feature aggregation module outperforms the popular max pooling module 3 the proposed method achieves stateoftheart performance on dg benchmarks weaknesses 1 i am wondering about the relationship between the proposed part feature based dg method and the general point cloud models eg pointnet that utilize partlocal features for example in pointnet the part level feature is extracted and aggregated hierarchically which is quite similar to the strategy adopted in this paper could you clarify it 2 based on the first question could the proposed module be adopted in general point cloud models for example could we replace the last max pooling layer of pointnet with the part feature aggregation module proposed in this paper 3 as for the proposed techniques the contrastive loss is widely adopted as a learning regularization and utilizing part level features is also a common practice in my opinion the main contribution is the implicit domain alignment with the learned feature dictionary and the part feature aggregation module so i suggest that the authors include more related work on the application of dictionary learning in crossdomain problems such as 1 1 li shuai et al category dictionary guided unsupervised domain adaptation for object detection proceedings of the aaai conference on artificial intelligence vol 35 no 3 2021 not found docsepthe authors present a new method for generalizing point cloud classification from synthetic to real data the authors argue that the local geometric features are more generalizable than the whole shape they focus on partbased feature representation and design a partbased domain generalization network for the point cloud classification task the key idea is to build a common feature space using a part template and then align the partlevel features of the source and the target domain to this template the authors demonstrate that the proposed method achieves stateoftheart performance on the 3ddg benchmark i like the idea to align local geometric features to solve domain generalization on point clouds this idea is novel and significant the technical approach to implement this idea is sound and the experimental results demonstrate good performance i also like the idea to verify the hypothesis that local geometric features are more generalizable than global features in fig 2 however i would like to point out a few issues here 1 it is true that in general reducing the part size leads to better generalization but where is the limit at the very least each part can be reduced to a point but i do not believe that pointbased features are the most generalizable it could be more interesting to identify by how many points per part we would reach the limit of generalization here 2 512partlevel and 256partlevel mean 512 and 256 points per part respectively i guess this sounds confusing as i can also think of it as 512 parts and 256 parts it is better to revise this wording like 512pointsperpart and 256pointsperpart i also value the clarity of the writing which is very nice and easy to read despite its great value the paper suffers from the following issues 1 in terms of technical approach the contrastive learning part is less well connected to the partbased features for domain generalization for example if the authors wish to use contrastive learning at least a shapelevel contrastive loss should be used for the baseline methods as well or the comparisons should be separated with a table with no contrastive learning utilized in table 1 as i understand the baselines are without contrastive loss but the pdg is with contrastive loss please correct me if i misunderstood 2 my second concern is that the experiments conducted are somewhat simplistic i expect deeper analysis and more experimental settings to be done please see my comments in the question section i think the related work section in this paper is quite short and needs some revisions first while not exactly the same i found in the literature there are some 3d tasks that link different domains together such as scans and cad object retrieval i think this is worth some further discussion about the connections of these specific tasks with the domain generalization problem presented in this paper a shrec17 rgbd to cad retrieval with objectnn dataset 2017 2018 docsepthe presented method detects global features pointnet or dgcnn locally on sampled points then learns relations between those local representations as partlevel aggregation the performance is further improved by contrastive learning the authors evaluate the approach on several cross domain datasets where the method is learnt on one domain and tested on another one the target test domain is inaccessible during training domain adaptation is an important and longstanding problem for 3d point cloud processing pros clear motivation i like the motivation by feature distance between domains fig 2 while the idea of learning partbased models from local features is old and highly researched the presented method on 3d point clouds with neural networks focused on domain adaptation seems novel the problem of domain adaptation of 3d point cloud processing when the target domain is unavailable during training is a very important and unsolved problem most of the questions i had during reading were further answered cons the approach is motivated by many previous works that focus on domain adaptation for 3d point clouds that are not cited though the approach is novel by using it in neural networks the authors did not evaluate on some dataset pairs trainingtesting that would allow a much broader comparison which raises several questions why i believe it would also be good to report why the presented numbers of other methods differ from the original papers it looks like there is no potential for negative social impact of the work
### Summary: | the paper works on domain generalization of 3d point cloud classification and proposes a partbased domain generalization network for the purpose whose key idea is to build a common feature space of part templates and align the partlevel features to it three reviewers appreciate the contributions including the clear motivation the implicit domain alignment by parttemplate features and the proposed part feature aggregation module they also suggest improving the paper by clearer definitions of parts better organization of contrastive learning in the paper a more complete citation of closely related works etc after discussions between the authors and reviewers consensus is reached on accepting the paper congratulations |
[input_ids: 1361 token ids omitted] | [attention_mask: 1361 ones omitted] | [labels: 1361 token ids omitted] |
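the reviews above describe the method's two distinctive pieces, aligning part-level features to a learned feature dictionary with cross-attention and then aggregating them with a part-weighted pooling, only at a high level. the sketch below is one plausible reading of that description and not the authors' code: the dictionary size, feature dimension, number of attention heads, and the sigmoid part-weighting are all assumptions.

```python
# illustrative sketch only, not the reviewed paper's code
import torch
import torch.nn as nn

class PartAlignSketch(nn.Module):
    def __init__(self, dim=128, n_templates=64):
        super().__init__()
        # learned feature dictionary shared by all domains
        self.templates = nn.Parameter(torch.randn(n_templates, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.score = nn.Linear(dim, 1)

    def forward(self, part_feats):
        # part_feats: (batch, n_parts, dim), grouped from point-wise features
        b = part_feats.shape[0]
        templates = self.templates.unsqueeze(0).expand(b, -1, -1)
        # cross-attention: parts query the shared dictionary, so source and
        # target parts are both expressed in the same template space
        aligned, _ = self.attn(query=part_feats, key=templates, value=templates)
        # part-weighted pooling: weight each part, then take a max over parts
        w = torch.sigmoid(self.score(aligned))      # (batch, n_parts, 1)
        shape_feat, _ = (w * aligned).max(dim=1)    # (batch, dim)
        return shape_feat

if __name__ == "__main__":
    model = PartAlignSketch()
    parts = torch.randn(2, 27, 128)                 # 2 shapes, 27 parts each
    print(model(parts).shape)                       # torch.Size([2, 128])
```

a shape-level or part-level contrastive loss, as discussed in the reviews, would be applied on top of these features during training.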
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a new model architecture for 3d problem which leverages the powerful backbones pretrained from the 2d task the idea is straightforward the input point cloud is projected into 2d pixels using an encoder model then the 2d pixels is colored by a coloring module and the colored images are fed into a pretrained vit backbone and then predictions are made by the taskspecific heads the overall approach provides an elegant solution to leverage the representations of 2d models the experimental results demonstrate superior performance on public benchmarks including modelnet40 shapenetpart datasets 1 novel model architectures to utilize pretrained 2d models to my knowledge the idea of using projection into 2d and coloring module is new 2 the idea is simple yet effective the pretrained 2d models are easy to get and the results are promising the proposed approach has some limitations on the feasible tasks to apply on for example it may not work if we want to conduct a 3d segmentation task related to the question above the comparison in table 4 is not fair due to lack of model complexity analysis docsepin this paper the authors introduce pointtopixel prompting p2p a learning framework for leveraging pretrained image transformers for 3d tasks the method is mainly motivated by the data scarcity issue in 3d domains p2p learns a geometrypreserving transformation from point cloud to 2d grid and then a projection to prepare the 2d grid data to be processed by a pretrained image transformer expecting image tokens the main benefit of p2p to my understanding is the ability to achieve comparable accuracy to other 3d models with much fewer parameters that need to be trained with 3d data this is validated on two tasks 3d object classification and 3d part segmentation strengths the question of whether knowledge can be transferred from large pretrained image models for use with 3d domains is interesting the pointtopixel prompt pipeline which is nicely visualized in figure 2 appears to be novel and is simple and elegant this work is a nice demonstration of ideas from nlp transferring successfully to other domains in this case to 3d point cloud processing the paper is wellwritten and easy to read weaknesses texttextbfunclear motivationproblem and significance the current set of claims in the introduction are that a there is a data starvation problem in 3d domain l3435 and b pretraining point cloud transformers suffers from an imbalance between the number of trainable parameters and limited training data leading to insufficient optimization and overfitting l4041 however the data starvation problem seems to only exist for specific objectcentric datasets such as shapenet by contrast consider the large scannet and waymo datasets moreover recent advances in 3d rendering eg nerf suggests that highly lifelike synthetic 3d data may soon become available therefore scarcity of large datasets does not appear to be a fundamental concern moreover point b seems plainly false since recent methods like pointbert work just as well as p2p on eg modelnet40 as a result it is unclear what the actual problem is that is being addressed here and why this prompting method is needed at all the main benefit of p2p seems to be in the use of fewer model parameters but its unclear why this is important texttextbfmultiple unsubstantiated claims these can be addressed with careful editing l54 the endtoend optimization pipeline and the strategy of freezing the pretrained image model promote the bidirectional knowledge flow between 
points and pixels to my understanding the flow is unidirectional pretrained image features are being used to learn a better representation for points l271 firstly our p2p outperforms traditional 3d pretraining methods on modelnet40 p2ps largest model achieves the same performance as pointbert similarly claims of superiority of p2p l64 l370 are clearly not supported by the accuracy results in the experiments references dai angela angel x chang manolis savva maciej halber thomas funkhouser and matthias niener scannet richlyannotated 3d reconstructions of indoor scenes in proceedings of the ieee conference on computer vision and pattern recognition pp 58285839 2017 sun pei henrik kretzschmar xerxes dotiwalla aurelien chouard vijaysai patnaik paul tsui james guo et al scalability in perception for autonomous driving waymo open dataset in proceedings of the ieeecvf conference on computer vision and pattern recognition pp 24462454 no limitations of the p2p framework should be discussed in the main text eg in the conclusions section docsepthe paper proposes pointtopixel prompting to leverage 2d pretrained models to help 3d point cloud recognition tasks the main modules include a geometrypreserved projection and a geometryaware coloring which fill the gap between 3d point clouds and 2d images the experiments on modelnet40 and scanobjectnn show that p2p achieves comparable performance on classification tasks with only a few trainable parameters strengths 1 the paper first proposes a prompttuning method to adopt 2d pretrained parameters in 3d which is an interesting and novel exploration 2 with p2p prompting the model can achieve competitive results on the shape classification task with much fewer trainable parameters weaknesses 1 although the method leverages extra 2d image knowledge it does not show clear performance or speed advantages over previous 3d networks on both classification and part segmentation the parameters that need to be trained are fewer but the whole model is larger the 2d prior knowledge is not fully exploited in this method 2 the design of simply adding the point features in the same pixel seems trivial and even with the explanations in line190197 i dont really think it preserves geometry also no more experiments are conducted to analyze these design choices 3 more results on scenelevel point cloud understanding with datasets like scannet or s3dis are expected to illustrate the effectiveness of the prompttuning pipeline the limitation is discussed in sec 43 docsepin this paper the authors propose to leverage the pretrained image model for point cloud downstream tasks specifically they introduce a pointtopixel prompting to transform a point cloud as the corresponding image by geometrypreserved projection and geometryaware coloring strengths 1 the paper is well written with clear motivation and good organization 2 leveraging 2d pretraining for 3d tasks is an interesting topic 3 pointtopixel prompting is novel weakness 1 my main concern is the experiment result apparently the proposed design does not improve the performance 2 what are the computation cost and model size for the prompting procedure see strengths and weaknesses
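The point-to-pixel prompting pipeline these reviews describe — a geometry-preserved projection of the point cloud onto a 2D grid, a learned coloring module, a frozen pretrained ViT backbone, and a small task-specific head — can be sketched as follows. This is a minimal PyTorch-style illustration only: the orthographic projection, the additive per-pixel feature rule, and all module and parameter names are assumptions of this sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn


def project_points_to_grid(points, feats, grid_size):
    # Orthographic, geometry-preserving projection: drop z, bin x/y into
    # pixel cells, and sum the features of points that land in the same
    # cell (the simple additive rule one review questions).
    b, n, _ = points.shape
    c = feats.shape[-1]
    xy = ((points[..., :2] + 1.0) * 0.5 * (grid_size - 1)).long()
    xy = xy.clamp(0, grid_size - 1)
    flat = xy[..., 1] * grid_size + xy[..., 0]                        # (b, n)
    grid = feats.new_zeros(b, grid_size * grid_size, c)
    grid.scatter_add_(1, flat.unsqueeze(-1).expand(-1, -1, c), feats)
    return grid.view(b, grid_size, grid_size, c).permute(0, 3, 1, 2)  # (b, c, H, W)


class PointToPixelPrompt(nn.Module):
    def __init__(self, pretrained_vit, vit_dim, num_classes,
                 grid_size=224, feat_dim=64):
        super().__init__()
        self.grid_size = grid_size
        self.point_embed = nn.Linear(3, feat_dim)         # trainable per-point features
        self.coloring = nn.Sequential(                     # "coloring" module, trainable
            nn.Conv2d(feat_dim, 32, 3, padding=1), nn.GELU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )
        self.backbone = pretrained_vit                      # frozen 2D prior
        for p in self.backbone.parameters():
            p.requires_grad_(False)
        self.head = nn.Linear(vit_dim, num_classes)        # task-specific head, trainable

    def forward(self, points):
        # points: (b, n, 3), coordinates normalised to [-1, 1]
        feats = self.point_embed(points)
        grid = project_points_to_grid(points, feats, self.grid_size)
        image = self.coloring(grid)                         # pseudo colour image in [0, 1]
        pooled = self.backbone(image)                       # assumed to return (b, vit_dim) features
        return self.head(pooled)
```

Any pretrained 2D transformer that maps a 3-channel image to a pooled feature vector could be passed in as `pretrained_vit`; only `point_embed`, `coloring`, and `head` receive gradients, which is the small-trainable-parameter property the reviews weigh against the size of the full model.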
### Summary: | the paper presents a method of prompt tuning to transfer 2d pretrained weights to tackling 3d understanding problems all reviewers are positive about the novelty of the method with large 2d pretrained models higher performances are still expected from xwsj which is also a reasonable comment other 3d understanding tasks such as segmentation and detection of outdoor scenes are strongly encouraged as they are the true needs of the industry | [
...(input_ids: 1,432 token ids elided)...
] | [
...(attention_mask: 1,432 × 1 values elided)...
] | [
...(labels: 1,432 token ids elided)...
]
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
this paper proposes a novel approach to biological sequence optimization that models the task as a linear bandit problem and uses thompson sampling to guide a directed evolution algorithm towards optimizing the linear fitness function the approach tsde is a variation of de that is demonstrated to outperform classical randomblind de alone in simulation that achieves a bayesian regret that is optimal in population size and number of generations the de and tsde algorithms are structurally the same using crossover and point mutations to evolve a population of sequence variants over generations the difference is that tsde tracks a posterior of the fitness function between generations and in each generation draws a fitness function estimate from this posterior and roughly speaking limits the recombined variants and mutation positions to those which would see improvement under that fitness function estimate the result is more efficient convergence to optimal or nearoptimal sequences it is not surprising on its own that a method which takes advantage of side information in crossover and mutation steps should outperform a completely random mutation approach it is disappointing that the authors did not compare this approach to at least one baseline that also also uses this sideinformation such as training and scoring a linear model on proposed sequences to use as a filter between rounds nonetheless a key strength to the paper is its mathematical rigor and the main result of the paper that proves that tsde achieves a bayesian regret of tildeod2sqrtmt the proof summary in the body of the paper was helpful for intuition and great care was taken in the appendix to support each step of the proof the main limitationweakness of both this main result and the algorithm itself arises from the two necessarily assumptions 32 which requires the true fitness function to be a linear function with weights drawn independently from gaussian distributions and 35 which restricts measured values to having homoskedastic iid gaussian additive noise neither of these assumptions are likely to hold or even approximately hold in practice and there is no discussion of the sensitivity of the method and results to violations of these assumptions a secondary limitationweakness is in the applicability of this method and results to biological experiments for two main reasons the linear model and biological feasibility of intervening to filter between generations while the authors claim in appendix 2 that the linear model can be generalized somewhat beyond binary motifs it is not explained how this is done and the application seems to still be restricted to linear models the biological feasibility utility here may be a bigger issue and is discussed in the weaknesses section below strengths 1 the proposed algorithm 1 to the reviewers knowledge is novel though straightforward and sufficient explanation has been provided for others to implement these ideas easily in simulation 2 the papers structure and organization was thoughtful and easy to follow with key contributions highlighted and more technical steps relegated to the appendix 3 the mathematical rigor is high with great care taken in the appendix to explain both the intuition and logic behind all mathematical claims the approach to combining de and bandit theory overall was creative and carefully developed 4 the main result that tsde achieves optimal bayesian regret at least with respect to population size and number of generations is strong 5 simulations show tsde outperforms 
de handily though this result is not surprising it was worthwhile to confirm this through a demonstration weaknesses 1 several major assumptions would appear to not hold in practice primarily the assumptions that the fitness function is linear in a known relatively small basis of binary motifs that aside there is also the issue of accurately estimating the known variance sigma the assumption that that noise is uniform while in practice heteroskedasticity is common the paper does not address the sensitivity of the performance of their approach or its bounds under minor violations of these assumptionssupport on real data 2 an initial concern was that the linear motif assumption in the paper was not easily generalizable to realistic protein engineering settings in appendix b the authors explain that in real world applications these assumptions were loosened so that individual base pairs were considered for mutation and features were scalarvalued not binaryvalued while more details on these extensions would have been appreciated it is understandable that they may be saved for the other publication mentioned andor remain proprietary 3 one way to alleviate these concerns is to actually show the performance of the algorithm on some simple but more realistic tasks eg this package allows the authors to test their algorithms against some competing methods httpsgithubcomsamsinaiflexs 4 unclear how the mutation rate mu is chosen presumably the choice of mu has some impact on the convergence of the algorithm if not bounds yet this parameter does not appear in the analysis or experiments 5 novelty of the approach while this exact formulation is novel there are other recent examples in the literature that combine thompson sampling with bandit problems such as httpsarxivorgpdf220510113pdf the method in this paper seems to be very similar to the paper under review though it frames itself as a multiarmed bandit approach and therefore compares itself against multiarmed bandit algorithms including ucb methods moreover while the thompson sampling element of the algorithm is key to proving the main result the bound on regret in effect the algorithm alg 1 itself comes out as a fairly standard directed evolution approach 6 the paper claims that the resulting regret bound is optimal in m and t and this is remarked upon but not proven or cited 7 a key argument for the linear bandit model is made in the remark at the end of section 2 in short it says that this problem is not a multiarmed bandit problem primarily because we are not free to choose any action but are limited by biology to mutation and recombination this is the reason why this method is not compared against multiarm bandit methods in simulation and why its regret is not compared to the regret of multiarm bandits at first this seems entirely reasonable however algorithm 1 introduces crossover and mutation steps that involve sequence filtering based on calculating model scores and intervening at this step this adds hugely significant cost to de in a biological setting in classicrandom de the crossover and mutation steps are random not because of a lack of sideknowledge that could in theory be used to direct these steps but because the biological steps of mutation and crossover naturally occur randomly while it is trivial in silico to intervene and filter out undesirable mutants according to side information doing so in practice would require sequencing after evolutionary rounds to see what was produced and then somehow separating out desirable and 
undesirable variants in the population before continuing alliteratively the whole mutation process could be done in silico and then the populations st could be constructed by hand from the in silico list of variants but doing this completely negates the advantage of de as an inexpensive process and negatives the remark at the end of section 2 explaining why the model is limited to linear bandits if we are going to allow this level of filtering andor individual construction of variants we shouldnt have to limit ourselves to variants that could potentially arise from de mutation and crossover steps given the description in appendix 2 of applying this model in practice and the use of crispr technology in its implementation it seems likely that the sequences were synthesized from an in silico selection step that could have easily been expanded to include sequences unobtainable through crossover or limited point mutations again raising the question of whether the the assumption that this is not a multiarmed bandit problem is really a limitation of the biology or just a desirable limitation for the sake of the mathematical results limitations the main limitations of this paper is its reliance on strong and unlikely assumptions that the fitness landscape be linear and noise be iid gaussian and the questionable applicability in biological experiments identified in weakness 7 were not discussed in the paper neither of these negate the contributions of the paper but should be presented and if possible some discussion of the sensitivity of these results to violations in these assumptions should be added i also think benchmarking against other sequence design work on more conventional challenges would strengthen the paper significantly there is still a bit of a gap between the theoretical results and applicability that im not convinced this work would help close it would be helpful if authors contrast their work with algorithms discussed by sinai and kelsic 2020 on modelguided sequence design in particular discussion in section 5 both with better de modeling and contrasting with algorithms like cbas brookes et al 2019 minor some grammatical issues 1 in second paragraph de one of the top molecular technology breakthrough in the past century demonstrate humans ability to engineer proteins at will breakthrough breakthroughs demonstrate demonstrates humans humanitys 2 the following paragraph was especially confusing since the topic sentence claimed that de remains expensive and timeconsuming but the subsequent sentences all defend the opposite claim that de is generally easy and has been exponentially improved 3 theres also a small issue in figure 11 in the recombination example where a child with a dark gray motif 4 arises from parents without that motif 4 the comment after assumption 35 that our goal is to maximize the bayesian regret presumably was a mistake and the authors meant to minimize instead docsepthis paper focuses on a specific application local directed evolution it starts with a set of candidate sequences and proposes an algorithm that performs two operations on the batch of sequences recombination and common point mutation the response function is modelled as a linear function of d protein motifs where each motif can be or off authors analyze bayesian regret of their procedure supported with comparison to classical evolutionary strategy strengths i like the formalization attempt of the experimental pipeline authors are using and designing an algorithm for it its nice this found a realworld 
application and improved over classical evolutionary approaches weakness the authors do not do justice to the field of directed evolution there are evolutionary based methods which operate on a current batch of sequences and amend it as described here more aligned with local or evolutionary approaches however there are other approaches which can synthesise specific mutants and operate on the combinatorial space likewise with highthroughput methods perhaps an important citation from this line of work is the paper which for the first time uses bo with gaussian processes and has regret guarantee which appeared in pnas 2013 by authors romero krause and arnold this is a ripe field that falls under the directed evolution keyword as well focus on regret minimization is a big minus i see no reason why to focus on regret minimization a more reasonable benchmark is to report the best variant so far thompson sampling often samples very greedy steps in order to achieve low regret but in practice we do not want this at all we want to explore given the uncertainty of theta we want to be as diverse as possible to find meaningful new information and candidates there are two submethods introduced first the operators are introduced in one order explanation of them is in reversed order and they appear in the algorithm yet reversed i have significant doubts about the proof of theorem 52 namely the sudden emergence of modulus of concentration which i have no clue about the term h2 should be bounded by dsqrttm not the other exponent as authors propose also one hast to explicitly state that this is for linear objective supported on unit hypercube this proof does not work generally a reader might be tempted to generalize to general linear bandits in general but this is not true and its not sufficiently clearly spelled out however due to the simplicity of the objective hypercube linear i have no doubts that the algorithm would eventually converge since there is a random component due to mu one just needs to learn d components sufficiently fast the function in this case is the additive function of motifs this is an extremely simplistic assumption for many practical applications where complex multi mutation epistasis occurs and is of interest to be modelled by datadriven methods there isnt any meaningful baseline algorithm for example and algorithm that would try to estimate the effect of all pointwise mutations ie s s1 first mutation ss2 second mutation etc this would probably converge in odsqrttm steps as well and i believe even faster the paper tries to give a very general introduction to directed evolution however in 4 projects involving de evolution never was the goal to change motifs of the protein and work in this greedy fashion instead enzymatic proportions were to be improved using a few selected mutations in the vicinity of the active site i think the formulation authors have is fine and is probably motivated by their experimental setup but this is a form of directed evolution not the directed evolution i want to stress that this is an important point if the goal of this paper is to introduce this problem to the broader ml community we better do this carefully without over generalizing i am fine with the setup but one has to clearly say that this is a specific setup of de for example in my opinion this paper does not address the major challenges in de eg epistasis combinatorial problems etc lacking theoretical understanding i am not sure the proof is correct since there is a magic quantify appearing 
suddenly this is my justification for a lower score i think the theorem might be correct but the way there is not clearly stated also i see no reason why scaling in d2 is necessary for such a simple objective linear on a hypercube the paper lacks an algorithmically meaningful baseline which could be executed synthetically docsepthis paper study a cross domain problem for biological sequence optimization with bandit problem and tries to provide theoretical understanding of directed evolution under bandit theory with thompson sampling strengths 1 the problem is interesting 2 the paper gives a theoretical understanding of bandit problem with directed evolution weaknesses 1 the linear setting is always a problem in real world application i am not sure whether the linear bandit assumption suitable enough for real application 2 i do not go through the proof while it may seems that the proof may directly follow the standard proof sketch of bandit theory it will be much better if the authors provide a brief description about the proof difficulties during adopting the typical ts bandit proof technique empty docsepthe paper introduces a variant of a linear bandits model for directed evolution de de is concerned with iteratively optimizing a population of individuals by selecting a subset of promising individuals for mutation and recombination in each step the utility of individuals is modeled with a linear function parameterized by a parameter theta theta is refined via a new variant of thompson sampling the difference to classic thompson sampling is that the one cannot directly sample individuals due to the stochasticity of mutation and recombination the method has sublinear regret bounds that were confirmed in a simulation experiment it was successfully applied to the optimization of crispr sequences the paper is very well written even with no background on biotechnologies one can read it in one go the idea to include mutation and recombination into linear bandits is new as far as i know the paper only mentions protein design optimization and gene editing so i believe that its impact is mostly limited to biotechnology i like that the paper not only claims that the method has realworld applications but has already been able to show that the method has been successful in realworld applications the results in the simulation experiment look promising too however the baseline the basic de approach seems to be rather simple the theoretical claims are well supported the authors do not address the potential negative societal impact of their work their main application gene editing is an exemplary case for an ethically controversial topic i see that the entire debate on this issue cannot be addressed in a single ml paper but one could have at least pointed out that it is an issue and referred to more detailed discussions of it in particular because such cases are specifically mentioned in the submission guidelines
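The TS-DE loop discussed in these reviews — maintain a Gaussian posterior over the linear fitness parameter, draw a Thompson sample each generation, and use it to screen recombination and point mutations — can be sketched as below. This is a simplified NumPy illustration under the paper's stated assumptions (linear fitness over binary motif features with an N(0, I) prior, i.i.d. Gaussian noise of known variance); the crossover scheme, the mutation filter, and every name here are this sketch's simplifications, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)


def posterior(X, y, sigma2, d):
    # Bayesian linear regression with a N(0, I) prior on theta and
    # homoskedastic Gaussian noise of variance sigma2 (the assumption
    # the reviews criticise as unrealistic).
    precision = np.eye(d) + X.T @ X / sigma2
    cov = np.linalg.inv(precision)
    mean = cov @ X.T @ y / sigma2
    return mean, cov


def ts_de(fitness_fn, d=20, pop=32, generations=30, sigma=0.1, mut_rate=0.1):
    # Population of binary motif vectors, shape (pop, d).
    S = rng.integers(0, 2, size=(pop, d)).astype(float)
    X_hist, y_hist = [], []
    mean = np.zeros(d)
    for _ in range(generations):
        # Noisy fitness measurements of the current population.
        y = fitness_fn(S) + sigma * rng.standard_normal(pop)
        X_hist.append(S.copy())
        y_hist.append(y)
        mean, cov = posterior(np.vstack(X_hist), np.concatenate(y_hist),
                              sigma ** 2, d)
        theta = rng.multivariate_normal(mean, cov)           # Thompson sample

        # Recombination: uniform crossover with a shuffled set of parents.
        parents = S[rng.permutation(pop)]
        mask = rng.integers(0, 2, size=S.shape).astype(bool)
        children = np.where(mask, S, parents)

        # Directed point mutation: a bit is flipped only where the sampled
        # model predicts the flip will not decrease fitness.
        flip = rng.random(S.shape) < mut_rate
        gain = (1.0 - 2.0 * children) * theta                 # fitness change per flip
        children = np.where(flip & (gain >= 0.0), 1.0 - children, children)
        S = children
    return S, mean
```

Here `fitness_fn` is assumed to map a (pop, d) batch of motif vectors to a (pop,) vector of true fitness values; in this notation the bound quoted in the review, $\tilde{O}(d^2\sqrt{MT})$, is stated in terms of the population size M and the number of generations T.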
### Summary: | the initial round of reviews for the submitted manuscript was mostly positive in tone but this enthusiasm was tempered by a number of deep technical issues and some more philosophical issues regarding the presentation and framing of the results raised by the reviewers fortunately the author rebuttal and authorreviewer discussion phases went a long way toward clearing up some initial confusion and clarifying the contributions of the authors which swayed the prevailing opinion of the reviewers toward acceptance i want to commend the authors for their enlightening contributions to that discussion which assuaged most of the reviewers initial complaints however i would also like to stress that it is critical that the fruits of this discussion especially with reviewers x52n and fblu be incorporated into a revised version of this manuscript the reviewers are unanimous in this opinion | [
...(input_ids: 2,048 token ids elided)...
] | [
...(attention_mask: 2,048 × 1 values elided)...
] | [
...(labels: token ids elided; the sequence is truncated here in the source)...
4197,
285,
840,
10380,
23694,
562,
11408,
285,
26016,
11640,
275,
253,
3072,
1078,
11440,
50276,
455,
2562,
3146,
253,
2644,
10577,
1232,
812,
320,
2218,
275,
2830,
4173,
285,
840,
253,
7625,
331,
812,
320,
8818,
407,
1133,
432,
253,
275,
2830,
4173,
1618,
273,
11640,
533,
2509,
436,
4336,
2297,
684,
253,
5750,
273,
372,
347,
271,
27296,
1232,
285,
2297,
3993,
253,
7579,
387,
253,
990,
273,
2593,
374,
15571,
2139,
253,
1566,
310,
3710,
281,
4872,
3961,
953,
50276,
338,
359,
403,
1469,
281,
1581,
436,
1268,
273,
19690,
285,
263,
2060,
5140,
273,
11640,
359,
943,
2649,
452,
281,
2701,
9361,
281,
11640,
326,
812,
7826,
12893,
432,
372,
10577,
285,
31595,
5018,
50276,
28821,
253,
5740,
275,
30762,
374,
273,
9433,
436,
1566,
275,
3946,
285,
253,
897,
273,
7550,
1087,
4302,
275,
697,
7092,
352,
3133,
2779,
326,
253,
6430,
497,
17791,
432,
271,
275,
2830,
4173,
5438,
3213,
326,
812,
452,
4354,
644,
11848,
281,
2486,
6430,
440,
706,
14721,
494,
949,
31595,
390,
3710,
1127,
9197,
969,
12976,
253,
1953,
273,
1880,
253,
253,
9376,
326,
436,
310,
417,
247,
4471,
21201,
3961,
262,
1895,
310,
1663,
247,
12291,
273,
253,
16775,
390,
816,
247,
11408,
12291,
323,
253,
13232,
273,
253,
15965,
1543,
50276,
17465,
569,
50275,
783,
2022,
7364,
273,
436,
2929,
310,
697,
22095,
327,
2266,
285,
11543,
13260,
326,
253,
14601,
13016,
320,
4872,
285,
6046,
320,
891,
301,
305,
12064,
285,
253,
30455,
30437,
275,
7534,
4679,
3636,
275,
14855,
818,
50276,
12796,
417,
5469,
275,
253,
2929,
50276,
570,
1622,
273,
841,
2297,
366,
253,
9021,
273,
253,
2929,
533,
943,
320,
3559,
285,
604,
1896,
690,
5955,
273,
253,
7340,
273,
841,
1543,
281,
15927,
275,
841,
13260,
943,
320,
2879,
891,
671,
1158,
22791,
272,
1411,
643,
3425,
2216,
789,
327,
625,
6041,
7881,
651,
17084,
253,
2929,
3012,
627,
310,
1335,
247,
2372,
273,
247,
8037,
875,
253,
10527,
1543,
285,
30437,
326,
516,
417,
13762,
436,
789,
651,
1361,
2810,
50274,
262,
651,
320,
9371,
604,
4477,
4499,
616,
789,
342,
11333,
5469,
407,
256,
45194,
285,
465,
1241,
280,
9169,
327,
1566,
26960,
3425,
2216,
275,
1798,
5955,
275,
2593,
608,
1097,
342,
1805,
372,
14053,
285,
42455,
342,
11333,
751,
260,
10352,
1795,
7095,
1162,
355,
6247,
50275,
37585,
50275,
8826,
47412,
474,
3374,
50275,
18,
275,
1273,
12494,
372,
581,
273,
253,
1755,
5787,
4302,
29709,
275,
253,
2469,
5331,
7568,
7497,
3745,
281,
16518,
4324,
387,
588,
29709,
50276,
7054,
10489,
84,
7568,
50276,
48387,
684,
7497,
50276,
13961,
414,
84,
50274,
19,
253,
1563,
12494,
369,
3340,
21643,
1580,
253,
9400,
6197,
7558,
326,
372,
4558,
8214,
285,
673,
33136,
533,
253,
6774,
14683,
512,
2342,
253,
7285,
1750,
326,
372,
310,
3839,
3477,
285,
556,
644,
28596,
5520,
50274,
20,
253,
373,
671,
247,
1355,
2523,
275,
4677,
1903,
275,
253,
22989,
1650,
835,
247,
1429,
342,
247,
3644,
11978,
14443,
577,
15877,
432,
4651,
1293,
326,
14443,
50274,
21,
253,
4385,
846,
9376,
4791,
326,
776,
4736,
310,
281,
22950,
253,
17699,
16561,
14938,
18289,
369,
247,
10551,
285,
253,
4477,
5486,
281,
15338,
3185,
50275,
7152,
33032,
2520,
2929,
16633,
327,
247,
2173,
2898,
1980,
6828,
5606,
352,
7866,
342,
247,
873,
273,
7431,
6430,
285,
29328,
271,
5933,
326,
17923,
767,
5871,
327,
253,
14604,
273,
6430,
22989,
285,
1846,
1127,
10577,
253,
2380,
1159,
310,
41329,
347,
247,
4872,
1159,
273,
277,
2601,
25907,
835,
1016,
14443,
476,
320,
390,
745,
4477,
12106,
17699,
16561,
14938,
273,
616,
5199,
4516,
342,
5301,
281,
8946,
16483,
5700,
20544,
50275,
74,
751,
253,
7473,
1320,
3177,
273,
253,
5661,
15722,
4477,
403,
970,
285,
20462,
271,
5933,
323,
352,
50276,
953,
5322,
436,
1119,
247,
1524,
10186,
2898,
285,
5520,
689,
8946,
16483,
7274,
50275,
20881,
1255,
50275,
783,
4477,
513,
417,
513,
8426,
281,
253,
1673,
273,
6828,
5606,
627,
403,
16483,
1754,
3082,
534,
10196,
327,
247,
1655,
14604,
273,
6430,
285,
15026,
352,
347,
2529,
1060,
50276,
3062,
15616,
342,
1980,
390,
16483,
7274,
2299,
627,
403,
643,
7274,
534,
476,
35143,
885,
2173,
15396,
285,
10196,
327,
253,
38183,
2317,
21223,
342,
288,
429,
73,
903,
1065,
3082,
4931,
271,
1774,
25577,
432,
436,
1386,
273,
789,
310,
253,
2929,
534,
323,
50276,
783,
806,
673,
4648,
1766,
342,
305,
12064,
4870,
285,
556,
14938,
12215,
534,
5420,
275,
268,
27109,
4072,
407,
4477,
10102,
2771,
465,
376,
2327,
285,
549,
79,
744,
436,
310,
247,
33567,
1673,
326,
11521,
762,
253,
6828,
5606,
23473,
347,
973,
50274,
16651,
327,
14938,
41458,
310,
247,
1943,
19734,
891,
923,
642,
1921,
2139,
281,
2770,
327,
14938,
41458,
247,
625,
5272,
22791,
310,
281,
1304,
253,
1682,
12955,
594,
2080,
289,
297,
10836,
10491,
2223,
3530,
1077,
38754,
5018,
275,
1340,
281,
5115,
1698,
14938,
533,
275,
3946,
359,
513,
417,
971,
436,
387,
512,
359,
971,
281,
8338,
1677,
253,
11649,
273,
39116,
359,
971,
281,
320,
347,
11117,
347,
1896,
281,
1089,
14282,
747,
1491,
285,
9183,
50274,
9088,
403,
767,
749,
30172,
5611,
806,
253,
9158,
403,
5611,
275,
581,
1340,
8813,
273,
731,
310,
275,
13891,
1340,
285,
597,
3176,
275,
253,
5933,
2568,
13891,
50275,
74,
452,
1534,
24626,
670,
253,
4737,
273,
10012,
8073,
50274,
49592,
253,
5982,
21313,
273,
28380,
273,
4719,
534,
891,
452,
642,
22796,
670,
50274,
783,
1307,
288,
19,
943,
320,
11542,
407,
277,
2609,
20583,
417,
253,
643,
23653,
347,
4477,
12661,
50274,
12563,
581,
16579,
281,
11120,
1375,
326,
436,
310,
323,
4872,
8103,
4516,
327,
3943,
50276,
27049,
68,
4338,
436,
4737,
1057,
417,
789,
3839,
247,
9414,
1537,
320,
33400,
281,
39970,
281,
2087,
4872,
3961,
953,
275,
2087,
533,
436,
310,
417,
2032,
285,
697,
417,
10481,
4518,
43997,
562,
2299,
1955,
281,
253,
17647,
273,
253,
8103,
4373,
68,
4338,
50276,
8172,
891,
452,
642,
24626,
326,
253,
5933,
651,
6524,
29623,
1580,
627,
310,
247,
3632,
4445,
1955,
281,
12910,
581,
816,
3198,
281,
3037,
277,
4295,
10481,
3809,
50274,
783,
1159,
275,
436,
1083,
310,
253,
21842,
1159,
273,
25907,
436,
310,
271,
6685,
8077,
2531,
9376,
323,
1142,
8542,
4893,
835,
2570,
4471,
10577,
30009,
4914,
6634,
285,
310,
273,
1600,
281,
320,
41329,
407,
2856,
324,
1069,
257,
3082,
50276,
9088,
310,
2649,
667,
14282,
8245,
5933,
323,
1650,
285,
5933,
326,
651,
1611,
281,
6642,
253,
1055,
273,
512,
1127,
3020,
9197,
26332,
256,
256,
18,
806,
10577,
23524,
19,
1273,
10577,
3966,
436,
651,
3164,
29623,
275,
258,
1397,
2274,
20583,
5018,
347,
973,
285,
891,
2868,
1014,
7938,
50274,
783,
2929,
14177,
281,
1918,
247,
1077,
2087,
10199,
281,
6828,
5606,
2299,
275,
577,
6493,
7668,
372,
5606,
1620,
369,
253,
4736,
281,
1818,
25907,
273,
253,
2601,
285,
789,
275,
436,
38754,
8142,
3185,
27845,
22260,
497,
281,
320,
5520,
970,
247,
1643,
4236,
9197,
275,
253,
21520,
273,
253,
3939,
2670,
891,
1158,
253,
15895,
4477,
452,
310,
4030,
285,
310,
3164,
17194,
407,
616,
5661,
9978,
533,
436,
310,
247,
830,
273,
6828,
5606,
417,
253,
6828,
5606,
891,
971,
281,
4073,
326,
436,
310,
271,
1774,
1127,
604,
253,
4736,
273,
436,
2929,
310,
281,
9569,
436,
1895,
281,
253,
16055,
13361,
3114,
359,
1805,
513,
436,
9257,
1293,
689,
2087,
3006,
891,
717,
4030,
342,
253,
9978,
533,
581,
556,
281,
4518,
1333,
326,
436,
310,
247,
2173,
9978,
273,
372,
323,
1650,
275,
619,
4743,
436,
2929,
1057,
417,
2953,
50276,
783,
2201,
7881,
275,
372,
24088,
30009,
4914,
38183,
3237,
3966,
50275,
77,
10892,
10527,
4685,
891,
717,
417,
2119,
253,
4737,
310,
3451,
1580,
627,
310,
247,
10721,
22048,
15602,
8423,
436,
310,
619,
22861,
323,
247,
2406,
4868,
891,
1158,
253,
10012,
1537,
320,
3451,
533,
253,
1039,
627,
310,
417,
4518,
4767,
671,
891,
923,
642,
1921,
2139,
13642,
275,
277,
19,
310,
3309,
323,
824,
247,
2969,
8103,
4872,
327,
247,
4373,
68,
4338,
50275,
783,
2929,
19756,
271,
5933,
1037,
14282,
8245,
534,
812,
320,
11407,
5132,
85,
1037,
50274,
7152,
33032,
2520,
2929,
1263,
247,
2831,
5028,
1895,
323,
7534,
3425,
13757,
342,
3961,
262,
1895,
285,
14177,
281,
2085,
10527,
4685,
273,
6828,
5606,
762,
3961,
262,
3762,
342,
289,
297,
10836,
10491,
50276,
296,
3755,
20556,
337,
253,
1895,
310,
4722,
374,
253,
2929,
4245,
247,
10527,
4685,
273,
3961,
262,
1895,
342,
6828,
5606,
50275,
20881,
1255,
265,
337,
253,
4872,
4758,
310,
1900,
247,
1895,
275,
1524,
1533,
2898,
891,
717,
417,
2119,
1880,
253,
4872,
3961,
262,
9376,
7470,
2217,
323,
1524,
2898,
374,
891,
513,
417,
564,
949,
253,
4737,
1223,
352,
778,
3133,
326,
253,
4737,
778,
3587,
956,
253,
2629,
4737,
23211,
273,
3961,
262,
3762,
352,
588,
320,
1199,
1805,
604,
253,
4477,
2085,
247,
4864,
5740,
670,
253,
4737,
12748,
1309,
25987,
253,
50276,
6611,
474,
28669,
3961,
262,
4737,
5853,
6325,
5474,
339,
431,
248,
2929,
23970,
247,
12955,
273,
247,
4872,
3961,
953,
1566,
323,
6828,
5606,
372,
372,
310,
7514,
342,
10040,
3146,
39793,
247,
3072,
273,
4292,
407,
17221,
247,
8578,
273,
12532,
4292,
323,
10577,
285,
22989,
275,
1016,
3213,
253,
11839,
273,
4292,
310,
23115,
342,
247,
4872,
1159,
4764,
1025,
407,
247,
4764,
39116,
39116,
310,
22407,
3066,
247,
747,
12955,
273,
289,
297,
10836,
10491,
253,
3064,
281,
10610,
289,
297,
10836,
10491,
310,
326,
253,
581,
2550,
3587,
3410,
4292,
1955,
281,
253,
19191,
414,
273,
10577,
285,
22989,
253,
1332,
556,
749,
8172,
14938,
14493,
326,
497,
5783,
275,
247,
9864,
3368,
352,
369,
8379,
3732,
281,
253,
13757,
273,
7550,
1087,
6430,
50276,
783,
2929,
310,
1077,
973,
3542,
1014,
342,
642,
4114,
327,
1794,
1584,
1451,
5970,
581,
476,
1239,
352,
275,
581,
564,
253,
2934,
281,
2486,
10577,
285,
22989,
715,
4872,
3961,
953,
310,
747,
347,
2080,
347,
891,
871,
253,
2929,
760,
25957,
2601,
2216,
13757,
285,
3320,
14835,
594,
891,
2868,
326,
697,
3486,
310,
6571,
3710,
281,
1794,
24377,
891,
751,
326,
253,
2929,
417,
760,
3916,
326,
253,
1332,
556,
1524,
10186,
4893,
533,
556,
2168,
644,
2104,
281,
921,
326,
253,
1332,
556,
644,
5547,
275,
1524,
10186,
4893,
253,
1543,
275,
253,
9864,
3368,
1007,
12532,
1512,
2299,
253,
8245,
253,
5044,
372,
2746,
3133,
281,
320,
2581,
2969,
253,
10527,
3916,
403,
973,
4516,
253,
4477,
513,
417,
2953,
253,
2442,
4016,
38058,
3486,
273,
616,
789,
616,
2022,
2898,
50276,
20857,
14835,
50276,
261,
271,
34093,
1083,
323,
271,
5105,
1037,
15620,
9400,
891,
923,
326,
253,
2862,
8881,
327,
436,
2523,
2550,
320,
9713,
275,
247,
2014,
13361,
2929,
533,
581,
812,
452,
387,
1878,
8042,
562,
326,
352,
310,
271,
2523,
285,
6289,
281,
625,
7000,
11985,
273,
352,
275,
1798,
984,
824,
2219,
403,
5742,
5393,
275,
253,
19529,
9600,
50276,
187,
187,
4118,
18435,
27,
783,
3302,
3790,
273,
10123,
323,
253,
9262,
7714,
369,
6571,
2762,
275,
10541,
533,
436,
23027,
369,
1565,
11712,
407,
247,
1180,
273,
3676,
7681,
3374,
50276,
395,
690,
625,
22555,
3374,
5001,
253,
9759,
285,
39926,
273,
253,
1543,
50276,
42750,
407,
253,
30628,
22972,
1523,
253,
2488,
30080,
22559,
285,
2488,
15337,
254,
5955,
12475,
2427,
247,
1048,
1039,
2584,
22980,
598,
690,
3302,
13775,
285,
8254,
5411,
253,
9021,
273,
253,
4477,
534,
26740,
264,
253,
27274,
4743,
273,
253,
30628,
2584,
14924,
50276,
74,
971,
281,
49638,
253,
4477,
323,
616,
25441,
2980,
9021,
281,
326,
5955,
534,
718,
86,
2961,
954,
273,
253,
30628,
3302,
14672,
50276,
35529,
891,
651,
671,
751,
281,
4073,
326,
352,
310,
4619,
326,
253,
18098,
273,
436,
5955,
3340,
342,
30628,
1269,
3583,
79,
285,
269,
1559,
86,
320,
11217,
715,
247,
17265,
2715,
273,
436,
7714,
253,
30628,
403,
42293,
275,
436,
4743
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes convmae a hybrid convolutiontransformer architecture that is friendly to maelike pretraining mae was originally proposed with vit and due to omitted mask tokens in the backbone encoder mae is not trivially extensible to convolutional networks the work extends mae by resorting to the hybrid design of first using convolutions and then using transformers the masking is done blockwise at the resolution of the transformer and masked convolutions are used to avoid potential cheating extensive experiments are done on imagenet classification object detection semantic segmentation video classification various ablation analysis is also provided selfsupervised learning especially masked autoencoding for images is an emerging topic in computer vision a breakthrough in this direction can bear huge significance the work aims at fixing the limitation of mae by introducing an hybrid architecture of convolutions and transformers which is definitely important and relevant to the neurips audience the paper is well written and is clear enough for readers to follow through with good illustrations the experiments are extensive and conclusive the downstream transfers include image classification object detection semantic segmentation and even video understanding is involved which by itself could be an independent investigation the ablations and the conclusions are also covering most of the things i can think of a solid paper clearly with a lot of hard work behind the scene i think the conv part of convmae is an overemphasis the architecture only has 4 conv layers in the bottom of the network while it has 11 transformer blocks for the base model vitb has 12 blocks in total so my current understanding is that convmae has a similar architecture as in xiao tete et al early convolutions help transformers see better advances in neural information processing systems 34 2021 3039230400 this means the majority of the architecture is still transformers and in this regard the differencesignificance over original mae is not that salient this is the biggest concern about the paper it has a risk of oversell with the term conv in it one minor concern is about the scalability of convmae the paper is highly focused on the model size of the base model it is unclear the benefit of convmae can still hold when the model size further scales up as shown in pure vitbased mae some minor typos need to be fixed with proofreading eg should define what is fov at page 2 and the mask ratio should be 75 instead of 25 for mae if i recall correctly i do not see potential negative societal impact concern the paper also points this out at the end which is adequate to me docsepthis paper proposes a selfsupervised framework using a hybrid convolutiontransformer architecture to obtain multiscale hierarchical representations masked convolution is introduced to prevent information leakage in convolution blocks and blockwise masking strategy is applied to improve computational efficiency the resulting model achieves competitive performances in image classification and dense prediction tasks such as object detection strengths 1 this work effectively extends the selfsupervised mae framework to the hierarchical convolutiontransformer hybrid architecture 2 the resulting model outperforms existing selfsupervised models in classification and dense prediction tasks adequate docsepthis paper proposed a new selfsupervised learning framework by integrating hybrid convolutiontransformer architectures and masked convolution into the masked 
autoencoders the proposed method can achieve computational efficiency and low pretrainingfinetuning gap at the same extensive experiments on several computer vision tasks demonstrate the effectiveness of the proposed method strengths the paper is well written and easy to follow sufficient technique details are provided the proposed method is well motivated and simple several key components are proposed to address heavy computational cost and pretrainingfinetuning discrepancy the proposed method is flexible and can be applied in both image classification and object detection weaknesses it seems hybrid convolutiontransformer architectures have been explored in previous works but show how very similar performance to mae lines 4547 why the proposed method can make them work for mae the differences from previous work and the contribution of the paper remains vague some parts of the method are not clearly illustrated for example in blockwise masking with masked convolutions the authors state that uniformly masking stage1 input tokens would cause all tokens of stage3 to have partially visible information and requires keeping all stage3 tokens why the proposed method can address this issue what is the key idea of the proposed method the required training epochs vary from different methods i wonder whether the proposed method can still outperform others under the same training epochs post rebuttal i thank the authors for their response most of my concerns have been addressed i increased my rating and recommend acceptance for this paper yes docsepthis paper addresses the difficulty of applying mae training with convolutional layers the proposed convmae adopts masked convolutions in the early stage of convolutional layers by applying convolution on the masked featuremaps in this way the information leak is prevented with the proposed convmae training the vit with early convolutional layers can benefit from the mae training and achieved better transfer learning results comparing to standard vit it achieves superior performance on imagenet mscoco 1 novel training strategy to enable mae training for models with convolutional layers 2 strong performance on various transfer learning tasks the comparison with the mae training with standard vit backbones may be unfair due to introducing extra computation cost with the convolutional layers it would be helpful to further break down the improvements docsepthe paper starts with the hypothesis that a multiscale hybrid convolutiontransformer can learn better representations using masked inputs than vanilla vits the original masking scheme proposed in the mae paper can be computationally prohibitive when directly applied to hybrid models this paper presents a multiscale blockwise masking strategy with masked convolutions to efficiently train a hybrid transformerconvolutional model for representation learning the paper shared a broad range of empirical results on classification detection segmentation and video understanding tasks to show the effectiveness of the proposed technique originality the novelty of the paper lies in its proposed multiscale hybrid convolutiontransformer encoder which can generate hierarchical representations and possibly exceed the performance of vanilla vits the idea of hybrid models already exists in multiple pieces of literature coatnet early convolutions etc masked convolutions were introduced in the pixelrnn paper httpsarxivorgpdf160106759pdf the strength of this paper is in its novel combination of existing ideas to produce a very 
simple hybrid framework that effectively combines the strength of convolutions and transformers i also like the idea of performing masking at the late stage and then progressively upsampling the mask to larger resolutions to avoid the requirement of keeping all tokens in stage 3 the proposed setup naturally generates hierarchical representations and fits nicely with feature pyramid networks it is a nice way to generate a feature pyramid with local context via convolutions and global context using transformers quality the paper primarily describes experiments using vitb scale networks it covers a broad set of vision tasks but it does not cover scale it would be nice to see whether the proposed scheme continues to outperform existing masking techniques for larger models there is also limited runtime comparison with existing techniques the paper shares very informative results of ablation experiments comparing random masking regular convolutions multiscale decoders etc clarity the paper is very well written with a nice flow and explains the concepts with ease nit line 56 pretraing pretraining significance the paper proposes a simple and effective hybrid convolutiontransformer encoder which naturally generates hierarchical representations from an image and outperforms a number of existing techniques yes authors adequately addressed the limitations
### Summary: | the reviewers are positive about this submission initially after the authors rebuttal one reviewer pointed out that the name convmae is not proper to describe the current work the authors respond by claiming using an alternative name which is acknowledged by the reviewer overall all the reviewers stand positive for this work and ac stands with the reviewers the authors shall take the suggestions from the reviewers to further polish the current work in the cameraready submissions | [
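
Editorial note: the central mechanism the reviewers discuss for this example (preventing information leakage when convolution layers operate on partially masked inputs during MAE-style pretraining) can be illustrated with a short sketch. This is an assumed, minimal illustration and not the paper's code: the class name, tensor shapes, and keep ratio below are placeholders, and the actual model applies the idea per stage with the mask resized to each stage's resolution, as the reviews describe.

```python
import torch
import torch.nn as nn

class MaskedConv2d(nn.Module):
    """Convolution that zeroes masked positions so their content cannot leak into visible ones."""
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, padding=kernel_size // 2)

    def forward(self, x, keep_mask):
        # x: (B, C, H, W); keep_mask: (B, 1, H, W) with 1 = visible patch, 0 = masked patch
        x = x * keep_mask              # hide masked inputs from the convolution
        out = self.conv(x)
        return out * keep_mask         # re-mask so masked positions stay uninformative

# illustrative usage; the same keep_mask would be resized to each stage's resolution
x = torch.randn(2, 16, 32, 32)
keep_mask = (torch.rand(2, 1, 32, 32) > 0.75).float()   # keep roughly 25% (ratio is illustrative)
y = MaskedConv2d(16, 32)(x, keep_mask)                  # shape (2, 32, 32, 32)
```

The design choice this sketches is the one the ablations in the review credit: masking the feature map rather than dropping tokens lets a convolutional stage sit in front of the transformer blocks without giving it access to masked content.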
[input_ids] | [attention_mask] | [labels]: raw token-ID and all-ones mask arrays for the example above omitted; they are the tokenized form of its text and are not human-readable
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper presents an approach to uncertainty modeling in recurrent neural networks through a discrete hidden state the training of this discrete model is done using a reparameterizable approximation in particular using the gumbelsoftmax trick the authors show the utility of this method on a variety of problems including showing effective out of distribution detection and improved calibration in classification tasks comments this paper presents a relatively simple idea that builds relatively directly from previous work and uses the now common softmaxgumbel trick to enable differentiability the main strength of this paper is the thorough experimental evaluation on a wide variety of problems the main weakness of this paper is the very unclear presentation of the method in section 21 the authors do not define all quantities the mathematics of the method is interspersed with discussions of the approaches of others and the writing is unclear the authors must clarify the presentation of their method and have this presentation be distinct from discussion of previous work overall the experimental results seem compelling and interesting the authors should clarify their discussion of the partially observed rl task in the partially observed task is the agent only provided lagged measurements of the state the presentation if quite confusing and the authors should state what this task is as clearly as possible postrebuttal i thank the authors for their response both of the sections are now more clear although the authors should make an effort to polish the narrative of the paper and the clarity of exposition throughout the discussion of epistemic versus aleatoric uncertainty in the appendix is also interesting i have increased my score from 6 to 7 docsepthe paper proposes a novel approach for uncertainty estimation with rnns more precisely the task is to both fit a model on the data and to learn the uncertainty of the fitted model at the same time the proposed approach fits a random model with its randomness adjusted to the level of uncertainty the probability of the potential outputs on a given input is then estimated by sampling the model ie reevaluating it multiple times on the same input this in turn can also be used to estimate the uncertainty of the model one important detail that the paper does not discuss but would be important to understand is how st is trainedupdated actually the same question goes for tau in fact referring to st as states is quite confusing from the formulas it seems that they are used as weights the authors should discuss these questions in detail apart of these issues the paper is relatively well written and the considered problem is important to various applications the proposed model also makes sense on the high level although the missing details make it hard to claim the same in general finally empirical evaluations show the effectiveness of the method and also that its performance is comparable and in many cases superior to vanilla lstm bayesian rnn rnn with variational dropout and a deep ensemble of lstm based model remarks section 22 setting varphi to be a dotproduct does not seem right as its two attributes are thetat in rd and st in sd x k the dimensions do not match simple matrixvector product does work though in fact section 2 could be somewhat polished it is not always easy to understand what is part of the proposed method and what is explained in relation to other models only additionally it would be helpful to have a brief recap at the end of the 
section about how the uncertainty estimation is done for the model in 1 ti does not seem to be defined actually should it not be ti additionally alphai two lines below 2 should be alphati presumablydocsepsummary this paper proposes a method to quantify the uncertainty for rnn different from the traditional bayesian rnn the proposed method is more efficient at each time based on the current hidden state and memory it generates a probability distribution over the state transition paths on the transition probability by using the gumbel softmax function the next state is computed based on the weighted average of the sampled states and its uncertainty can be qualified by the sample variance the hyperparameter tau of the gumbel function is learnt from data to better capture the inherent uncertainty in the data to demonstrate their method they perform several experiments first they show that their model can capture the stochastics in language better than other methods second they demonstrate their method performs better in classification on benchmark datastes than baseline methods such as the ensemble and bbb methods in terms of both prediction accuracy and efficiency third they evaluated their method for outofdistribution detection and their experiments again show their method performs better than the baseline methods on benchmark datasets finally they show that when applied to reinforcement learning their method is better than existing methods in sample complexity strengths the proposed method for uncertainty quantification is efficient compared with other methods such as bayesian rnn the performances of their methods have been evaluated for different tasks on benchmark datasets and show competitive performance versus the baseline methods weaknesses first technical novelty is minor it is largely based on the exiting work on gumbel function more importantly is unclear why the gumbel softmax function even with the learnt tau parameter can capture the data uncertainty and better theoretical justification is needed second it is unclear how to compute the aleraeroic and epistemic uncertainties separately from their method as the latter is needed for ood detection third it is unclear how to quantify the accuracy with the estimated uncertainty and how the improved uncertainty quantification can translate into improved performance in classification regressions fifth the experimental comparisons are only done for baseline methods for each task the authors should also compare their methods to sota methods for each task finally they need do an ablation study on their method to figure out what contributes to their methods improved performance for certain tasks docsepthis work proposes a novel method to estimate uncertainties in recurrent neural networks the proposed model explicitly computes a probability distribution over a set of discrete hidden states given the current hidden state in an rnn leveraging the gumbel softmax trick the proposed method performs mc gradient estimation a temperature parameter is also learned to control the concentration of state transition distribution to estimate uncertainty of a given input the proposed model is run multiple times to draw samples for estimating the mean and variance experiments are conducted in a variety of sequential prediction problems including a reinforcement learning task demonstrating the effectiveness of the proposed uncertainty estimation method pros estimating uncertainty of predictions is important for datadriven machine learning models especially 
for detecting outofdistribution data the proposed method directly quantifies and calibrates uncertainty and therefore does not use much more parameters compared to bnns and requires less parameter tuning the paper selects a good range of task domains and strong baseline methods demonstrating comparable performance cons while the proposed method demonstrates good performance on both modeling stochastic processes and estimating outofdistribution data it is unclear whether the method itself can separate epistemic uncertainty from aleatoric uncertainty if both exists meanwhile most of the selected baseline methods focuses exclusively on estimating the epistemic uncertainty if possible it is desired to see a comparison of the proposed method with baseline methods that are designed to exclusively model aleatoric uncertainties for rnns it is mentioned that a large number of states improves performance in the experiments for predicting ood data a plot for the relationship between performance and the number of states used would be useful to understand how sensitive the performance is to the number of states used if possible the authors should also discuss the proposed works relationship with the samplingfree method of hwang et al 1 and how the choice of using discrete state distribution would outperform a parametric distribution 1 hwang s j mehta r r kim h j johnson s c singh v 2020 august samplingfree uncertainty estimation in gated recurrent units with applications to normative modeling in neuroimaging in uncertainty in artificial intelligence pp 809819 pmlr update the major concerns above have been addressed in the appendix of the updated manuscript im moving my initial rating of 6 to 7
### Summary: | this paper proposes a method to quantify the uncertainty for rnn which is an important problem in various applications it provides results in a variety of domains demonstrating that the proposed method outperforms baselines however these experiments would benefit greatly from a comparison with sota methods for the specific tasks in addition to the considered baselines eg covariance propagation prior network and orthonormal certificates the paper could also be improved by adding a theoretical justification to explain how the gumbel softmax function is able to capture the underlying data and model uncertainty | [
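
Editorial note: the mechanism these reviews describe (a distribution over K discrete candidate states sampled with the Gumbel-softmax trick at a temperature tau, then Monte Carlo re-evaluation to obtain a predictive mean and variance) can be sketched in a few lines. The names, shapes, readout head, and fixed tau below are illustrative assumptions, not the authors' implementation; in the paper tau is learned from data.

```python
import torch
import torch.nn.functional as F

def gumbel_state_step(logits, states, tau):
    # logits: (B, K) transition scores; states: (K, D) candidate hidden-state vectors
    w = F.gumbel_softmax(logits, tau=tau, dim=-1)   # relaxed one-hot sample over K states
    return w @ states                               # (B, D) next hidden state

def mc_uncertainty(logits, states, tau, readout, n_samples=20):
    # re-run the stochastic transition several times; the spread of predictions estimates uncertainty
    preds = torch.stack([readout(gumbel_state_step(logits, states, tau))
                         for _ in range(n_samples)])
    return preds.mean(dim=0), preds.var(dim=0)

B, K, D = 4, 8, 16
logits, states = torch.randn(B, K), torch.randn(K, D)
readout = torch.nn.Linear(D, 1)
mean, var = mc_uncertainty(logits, states, tau=0.5, readout=readout)
```

This also makes concrete the efficiency point raised in the reviews: uncertainty comes from repeated cheap forward samples of a single model rather than from maintaining an ensemble or a full Bayesian posterior over weights.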
input_ids / attention_mask / labels: per-token integer arrays (about 1,700 entries each) encoding the review and summary text of this example; the individual values are not reproduced here.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper investigates the impact of different calibration strategy precombination postcombination and its dynamic variant on the performance of a deep ensemble it presents both theoretical and empirical proof to show that wellcalibrated ensemble member does guarantee calibration in the final ensemble strength a coherent theoretical account for the issue of calibrating deep ensembles accompanied by empirical evidence from cifar datasets although not stated explicitly a new calibration approach dynamic calibration is introduced which empirically leads to better performance weakness novelty may be limited one central contribution of this paper is to provide a mathematical derivation to confirm the observation made in rahaman and thiery 2020 and wen et al 2020 although i appreciate authors work on providing mathematical explanation for recent empirical findings im not sure if the submission in its current form is contributing significant novel theoretical insight beyond the fact that ensemble prediction is less confidence since max of the mean probability is no greater than mean of the max probabilities on the other hand the empirical investigation is conducted on a single vision task cifar10100 this paper can be made stronger by investigating synthetic situation where the ground truth is known or extend experiment to also other data modalities like guo et al 2017 organization given the place of the new approach dynamic temperature scaling in the experiment it might be worthwhile to devote some paragraph to introduce the procedure in more detail recommendation based on reason stated in weakness i recommend rejection since the either theoretical or the empirical contribution of this paper does not seem to be substantive enough for iclrdocsepthe paper makes an analysis of calibration in ensembles of deep learning models through some theoretical developments the paper supports that a given ensemble cannot be more confident than the average individual members for regions where the ensemble is well calibrated empirical results on cifar100 and three different deep models report a comparison of ensemble calibration where calibration is done over all members in order to achieved a calibrated ensemble decision over individual calibration of members with no feedback from the ensemble decisions results show that individual member calibration does not lead to calibrated ensembles and as such calibrating directly on the ensemble output is required for obtained a proper evaluation of its uncertainty different ensemble calibration approaches are also compared pros overall wellwritten paper straightforward proposal simple yet meaningful on several aspects for better understanding of the link between calibration and ensembles rely on theory to support some claims which strengthen the proposal cons the proposal is somewhat trivial although i do not have knowledge that it has been investigated in detail elsewhere before reading the paper i expected the results ie calibration of individual members will not lead to calibrated ensemble decisions calibration at the ensemble level is required the paper is somewhat confirming this in a more explicit manner evaluation on only one dataset cifar100 in the main paper with another dataset for the appendix cifar10 results on cifar10 in the appendix are not very compelling it is hard to make sense of the results in table 1 and similar differences are small and difficult to interpret the explanations and organization of the paper are hard to following in some 
specific part although the paper is making a wellfounded analysis of a hot topic in the last few years ie ensembles are a way to evaluate uncertainty on decisions i found it having some relatively trivial developments and the conclusion is intuitive and expected however it is the first time i see this point well articulated and the authors have made a good effort to develop theoretically backed explanations to support this docsepupdate after the author response ive read the other reviews and agree with r2 and r3 i think the paper is useful emphasizes you need to calibrate the final ensemble not enough to calibrate members and has some nice conceptual contributions explaining that if ensemble accuracy average member accuracy which is usually the case and the ensemble is calibrated even in just a globalweak sense then the members must be uncalibrated this could spur more research into conceptually analyzing ensembles and seems interesting but i understand the other reviewers concerns that its not clear what practical impact this will have so im keeping my score at a 6 instead of raising to a 7 summary this paper tackles the problem of calibrating an ensemble they show experimentally that calibrating all members of an ensemble is often not enough to calibrate the combined ensemble so instead we need to calibrate the final predictions of the ensemble additionally they show that using a different temperature parameter for different regions of outputs can improve calibration they explain why if the ensemble members are toplabel calibrated even in a very weak sense they call global calibration and the ensemble is calibrated then the ensemble is less accurate than the average member of the ensemble reasons for score they make interesting observations about calibration of ensembles that could guide practitioners for example that its not enough to calibrate the members of the ensemble they also raise an intriguing connection between calibration of ensemble members and ensemble accuracy one would not expect a priori that if both are calibrated the ensemble would do worse than the average member i could see this result being interesting to people who study ensembles as well there are some weaknesses in writing and execution but overall this paper is probably worth publishing if edited pros i think its a nice observation that calibrating the members of an ensemble may not yield a calibrated ensemble its easy to come up with toy examples where this is the case but its interesting that this seems to be the case in practice they make an intriguing observation that if the ensemble members are in fact calibrated and the ensemble is calibrated then the ensemble accuracy is at most the average member accuracy cons i believe the writing can be substantially simplified the core ideas are simple and nice but it takes a lot of effort to get to them and i believe the authors should put in more work into making this understandable some of the results seem unrealistic and can be omitted for example in the start of section 41 the first couple of results require that the ensemble member regions and ensemble regions are the same this seems rather unrealistic the assumptions in prop 1 seem too strong to me id remove the mentions of regions and id instead mention the other results prop 2 3 4 in the main paper section 41 you could just move the propositions and give some intuition for why the results are true removing regions should also considerably simplify the notation and setup im not quite sure what you mean in the 
intro when you say eq 1 doesnt explicitly reflect the relation between and the underlying data distribution px y the definition in equation 1 uses px y im not sure why all the definitions in 31 and 33 are defined in a way different from the standard ways in the calibration literature eg in kull et al 2019 or kumar et al 2019 temperature scaling is performed on logits not on the actual probabilities from equations 24 25 and 26 it looks like you might be doing temperature scaling on the probability space in equation 24 25 the first argument to f is the probability not the logit which looks a bit odd prop 4 should also hold when k 2 2 ensemble members i believe happy to provide an example some symbols are undefined for example deltayi omegaj i dont believe delta is defined i think it should be 1 if they are equal and 0 otherwise questions and things to improve please answer the cons above ensembles are particularly useful because they tend to be more calibrated out of domain lakshminarayanan et al 2017 it could be useful to see which of these methods calibrating the members or the entire ensemble is better calibrated when we have domain shift eg training data cifar10 test data cifar10c hendrycks et al 2019 having confidence intervals for the calibration errors would be nice and also using more modern debiased estimators to estimate the calibration error eg in kumar et al 2019 all cites mentioned are already in the paper except benchmarking neural network robustness to common corruptions and perturbations dan hendrycks thomas dietterich iclr 2019 docsep in general my opinion is aligned with anonreviewer1 the theory and the empirical contribution do not feel sufficient i also agree with anonreviewer3 and anonreviewer4 but feel less excited about the prons and more worried about the cons at this point im not against the acceptance of the paper although im still staying on the rejection side im increasing my score because we are at least talking about a borderline summary the paper study calibration of ensembles of dnns and its relation to the calibration of individual members of ensembles the work demonstrates that i members of an ensemble should not be calibrated but the final ensemble may require calibration especially if members of an ensemble are calibrated ii provide theoretical results to support the statement iii propose an adaptive calibration scheme dynamic temperature scaling that uses different temperatures based on the confidence of a model concerns 1 the main question of the paper should ensemble members be calibrated feels trivial because the community is aware of the simple example that provides an answer the deep ensembles lakshminarayanan2017 have miscalibrated membersconventional dnns but the predictions of an ensemble are inmostcases calibrated thus the answer is no 2 the paper mostly is clearly written but section 41 theoretical analysis is extremely hard to follow even though i reread it many times im still not sure if i understood it correctly the most confusing part is the conclusion in practice there is no constraint that the ensemble prediction should be calibrated thus ensemble prediction calibration is required even for toplabel calibrated members it seems that no listed results were used to produce this statement 3 the calibration of ensemble has been proposed in ashukha2020 5 discussion conclusion the resulting ensemble predictions requiring calibration functions to be optimized for the ensemble prediction rather than ensemble members 4 the two main contributions 41 
theoretical analysis 42 temperature annealing for ensemble calibration feels not related they are basically two independent topics packed in the one paper 5 the empirical comparison exploits the calibration score eg ece ece is a biased estimate of true calibration with a different bias for each model so it is not a valid metric to compare different models see vaicenavicius2019 the fact is even mentioned in the current paper it should be noted that for finite number of samples but still is ignored in the empirical study what i suggest is to use the squared kernel calibration error skce proposed in widmann2019 along with the de facto standard but biased ece the skce is an unbiased estimate of calibration there might be some pitfalls of this metric that im not aware of but the paper looks solid and convincing also please put attention to figure 83 in the arxiv version yes ece is the standard in the field but it is the wrong standard that prevents us from meaningful scientific progress so we should stop using it 6 the results provided in table 1 seem to be close values 0.6119 vs 0.6129 etc so at least standard deviations need to be reported also there is no mentioning of several runs per result in the text the paper touches on nice topics but overall it feels like ok but not enough the theory is interesting but it does not give us a lot of insight maybe its very subjective the dynamic temperature scaling is not proven to outperform the baselines the contributions feel disconnected the writing quality needs to be improved comments 1 as far as i can tell the citation the weights assigned to the probabilities are either optimized using auc as in ashukha et al 2020 is incorrect as there is no mentioning of optimizing weights using auc in the paper 2 typo it should be noted that for a finite number of samples lakshminarayanan2017 lakshminarayanan b pritzel a blundell c simple and scalable predictive uncertainty estimation using deep ensembles in advances in neural information processing systems 2017 pp 6402-6413 ashukha2020 ashukha a lyzhov a molchanov d vetrov d pitfalls of indomain uncertainty estimation and ensembling in deep learning iclr 2020 vaicenavicius2019 juozas vaicenavicius david widmann carl andersson fredrik lindsten jacob roll and thomas b schon evaluating model calibration in classification aistats 2019 widmann2019 widmann d lindsten f zachariah d calibration tests in multiclass classification a unifying framework in advances in neural information processing systems 2019 pp 12257-12267 https://arxiv.org/pdf/1910.11385.pdf
### Summary: this paper studies ensemble calibration and the relationship between the calibration of individual ensemble member models and the calibration of the resulting ensemble prediction the main theoretical result is that individual ensemble members should not be individually calibrated in order to have a wellcalibrated ensemble prediction while other recent work has found this to be the case in empirical results this paper substantiates the empirical results through theoretical results pros theoretical study of ensemble calibration with meaningful insights cons contributions limited to theoretical study of known observation and dynamic temperature scaling dynamic temperature scaling is not shown to outperform baseline methods limited experimental validation cifar10cifar100 the authors engaged in an extensive discussion with reviewers and made changes to their paper including adding standard deviation results over multiple runs and the skce calibration measure overall this is solid work and could be accepted to the conference however reviewers agree that parts of the work are lacking in particular 1 limited experimental evaluation one type of task onetwo datasets only and 2 given known literature the benefit of the derived theoretical results to practitioners is not clear the discussions have been unable to resolve this disagreement
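Since several of the comments above turn on the difference between calibrating ensemble members and calibrating the combined prediction, here is a minimal numpy sketch of post-combination temperature scaling together with the standard binned ece estimate that the last review criticizes as biased. It is purely illustrative: the helper names, the 15-bin choice, and the use of averaged member probabilities are assumptions, not the paper's implementation.

```python
import numpy as np

def ensemble_probs(member_probs):
    # member_probs: (M, N, C) softmax outputs of M members -> averaged (N, C) prediction.
    return member_probs.mean(axis=0)

def temperature_scale(probs, temperature):
    # Post-combination scaling: applied once to the combined prediction, not per member.
    logits = np.log(np.clip(probs, 1e-12, 1.0)) / temperature
    logits -= logits.max(axis=1, keepdims=True)
    exp = np.exp(logits)
    return exp / exp.sum(axis=1, keepdims=True)

def expected_calibration_error(probs, labels, n_bins=15):
    # Standard binned ECE; note it is a biased estimator of true calibration error.
    confidence = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidence > lo) & (confidence <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - confidence[in_bin].mean())
    return ece
```

In this post-combination setup the single temperature would be fit on held-out data against the ensemble's combined outputs (for example by minimizing negative log likelihood) rather than separately for each member, which is exactly the pre- versus post-combination distinction the paper and the meta-review discuss; a dynamic variant would let the temperature depend on the confidence region of the prediction, as the reviews describe.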
input_ids / attention_mask / labels: per-token integer arrays (about 2,000 entries each) encoding the review and summary text of this example; the individual values are not reproduced here.
6239,
24088,
275,
465,
962,
1162,
355,
6247,
390,
465,
22711,
1162,
355,
6247,
50275,
26158,
13642,
310,
2684,
327,
2412,
953,
417,
327,
253,
4588,
20552,
432,
7424,
2164,
2030,
285,
3436,
352,
4453,
751,
368,
1537,
320,
2509,
3276,
13642,
327,
253,
5912,
2317,
275,
5150,
2164,
2030,
253,
806,
4154,
281,
269,
310,
253,
5912,
417,
253,
2412,
262,
534,
4453,
247,
2372,
8909,
50275,
8560,
577,
943,
671,
2186,
672,
465,
50276,
19,
374,
19862,
2758,
891,
2868,
5211,
281,
2085,
271,
1650,
50275,
8826,
14217,
403,
17011,
323,
1650,
1448,
85,
333,
74,
7005,
909,
1432,
891,
13414,
2868,
18687,
310,
2931,
891,
1158,
352,
943,
320,
337,
604,
597,
403,
4503,
285,
470,
5010,
50274,
34974,
285,
1841,
281,
3157,
50275,
32897,
3662,
253,
772,
1840,
50275,
1215,
1814,
868,
403,
3782,
4217,
984,
597,
5257,
281,
320,
625,
35890,
562,
273,
5028,
298,
518,
1200,
1222,
274,
26782,
266,
1162,
355,
4240,
352,
812,
320,
4217,
281,
923,
534,
273,
841,
3082,
24403,
839,
253,
2758,
390,
253,
2862,
19862,
310,
1805,
35890,
672,
359,
452,
5028,
5333,
24088,
3733,
941,
50276,
46277,
274,
740,
1071,
941,
50276,
46277,
274,
740,
68,
344,
2109,
610,
6163,
1162,
355,
6247,
50275,
30819,
7162,
11508,
323,
253,
18543,
6332,
651,
320,
5322,
285,
671,
970,
625,
4980,
372,
30344,
48489,
281,
6642,
253,
18543,
2228,
24088,
275,
465,
22711,
1162,
355,
6247,
50273,
455,
28070,
5393,
403,
2168,
275,
253,
2929,
3707,
50276,
31591,
4698,
272,
11454,
2990,
31640,
281,
1846,
17715,
621,
285,
26309,
16447,
344,
2109,
610,
6163,
289,
4921,
6196,
350,
469,
17857,
32888,
6247,
5474,
33032,
275,
2087,
619,
4743,
310,
15616,
342,
271,
251,
15337,
254,
18,
253,
3762,
285,
253,
16774,
7680,
513,
417,
1928,
4209,
50274,
74,
671,
5194,
342,
271,
251,
15337,
254,
20,
50276,
395,
271,
251,
15337,
254,
21,
533,
1928,
1679,
9049,
670,
253,
819,
790,
285,
625,
11926,
670,
253,
772,
50275,
255,
436,
1127,
516,
417,
1411,
253,
14924,
273,
253,
2929,
3738,
516,
1335,
14596,
327,
253,
18235,
1930,
516,
3629,
619,
4868,
984,
359,
403,
387,
1878,
5015,
670,
247,
45210,
50274,
8774,
50276,
783,
2929,
1263,
18543,
273,
49328,
273,
277,
79,
2224,
285,
697,
5886,
281,
253,
18543,
273,
2060,
2758,
273,
49328,
253,
789,
14371,
326,
891,
2758,
273,
271,
19862,
943,
417,
320,
35890,
533,
253,
2457,
19862,
778,
2430,
18543,
3340,
604,
2758,
273,
271,
19862,
403,
35890,
21255,
2085,
10527,
1543,
281,
1329,
253,
3908,
37685,
12661,
271,
17825,
18543,
6974,
7870,
3276,
13642,
326,
4648,
1027,
9208,
1754,
327,
253,
7162,
273,
247,
1566,
50274,
585,
1209,
2224,
50275,
18,
253,
2022,
1953,
273,
253,
2929,
943,
19862,
2758,
320,
35890,
9193,
14916,
984,
253,
3114,
310,
6600,
273,
253,
2969,
1650,
326,
3400,
271,
3662,
253,
3676,
49328,
298,
518,
1200,
1222,
274,
26782,
266,
7132,
452,
3731,
1179,
50250,
2758,
585,
26743,
277,
79,
2224,
533,
253,
13650,
273,
271,
19862,
403,
275,
2252,
12866,
35890,
3021,
253,
3662,
310,
642,
50276,
19,
253,
2929,
6571,
310,
4518,
3542,
533,
2593,
7609,
10527,
1783,
310,
6685,
1892,
281,
956,
1014,
2167,
891,
294,
1088,
352,
1142,
2069,
516,
1335,
417,
2119,
604,
891,
7192,
352,
9113,
253,
954,
21643,
629,
310,
253,
6452,
275,
3946,
627,
310,
642,
7658,
326,
253,
19862,
10554,
943,
320,
35890,
3021,
19862,
10554,
18543,
310,
2424,
1014,
323,
281,
446,
1492,
35890,
2758,
352,
3133,
326,
642,
7117,
1543,
497,
908,
281,
4711,
436,
3908,
50272,
20,
253,
18543,
273,
19862,
556,
644,
4081,
275,
15898,
2788,
3227,
14952,
608,
5955,
50276,
585,
3444,
253,
4795,
19862,
13650,
50276,
1844,
4261,
18543,
3470,
281,
320,
18325,
323,
253,
19862,
10554,
2581,
685,
19862,
2758,
50276,
21,
253,
767,
2022,
9021,
7609,
10527,
1783,
5976,
3276,
35375,
323,
19862,
18543,
9193,
417,
2905,
597,
403,
10323,
767,
3907,
12989,
14998,
275,
253,
581,
2929,
50276,
22,
253,
16774,
5301,
40725,
253,
24403,
569,
4868,
24088,
299,
336,
299,
336,
50276,
261,
247,
23539,
6642,
273,
2032,
18543,
342,
247,
1027,
8492,
323,
1016,
1566,
594,
352,
310,
417,
247,
3588,
7982,
281,
7277,
1027,
3210,
923,
13460,
280,
257,
580,
280,
3750,
9638,
253,
958,
310,
1014,
5393,
275,
253,
1655,
2929,
352,
943,
320,
4879,
326,
323,
6486,
1180,
273,
3530,
50275,
2858,
1335,
310,
12841,
275,
253,
16774,
1263,
50276,
5371,
891,
1804,
310,
281,
897,
253,
30044,
10295,
18543,
2228,
1629,
336,
4081,
275,
5261,
8420,
9638,
2112,
342,
372,
32924,
2629,
533,
23539,
299,
336,
253,
1629,
336,
310,
271,
38663,
6642,
273,
18543,
627,
1537,
320,
690,
8483,
27366,
273,
436,
7982,
326,
516,
417,
6600,
273,
533,
253,
2929,
4453,
4891,
285,
21414,
671,
4496,
1691,
4116,
281,
4677,
11439,
275,
253,
247,
1069,
2715,
4754,
299,
336,
310,
253,
2629,
275,
253,
1673,
533,
352,
310,
253,
3430,
2629,
326,
16897,
441,
432,
14282,
8249,
4780,
594,
359,
943,
3523,
970,
352,
50276,
23,
253,
1543,
2530,
275,
2829,
337,
1646,
281,
320,
2810,
2193,
470,
3832,
746,
4632,
17796,
13482,
3966,
594,
387,
1878,
2629,
21492,
878,
281,
320,
2361,
671,
627,
310,
642,
29570,
273,
2067,
6613,
591,
1543,
275,
253,
2505,
50276,
783,
2929,
281,
2706,
253,
5322,
12989,
533,
4583,
352,
9193,
751,
8718,
533,
417,
2217,
253,
3762,
310,
4722,
533,
352,
1057,
417,
1918,
441,
247,
2257,
273,
1210,
1487,
5046,
697,
1077,
17854,
253,
7870,
3276,
13642,
310,
417,
4737,
264,
281,
562,
32231,
253,
16819,
8737,
253,
9021,
1928,
33817,
253,
4028,
3290,
3198,
281,
320,
5520,
50274,
26122,
50275,
18,
347,
2080,
347,
891,
476,
2028,
253,
25577,
253,
13461,
7922,
281,
253,
20552,
403,
2057,
18325,
970,
247,
1028,
347,
275,
15898,
2788,
3227,
1162,
355,
9169,
50276,
261,
13583,
347,
627,
310,
642,
29570,
273,
39793,
13461,
970,
247,
1028,
275,
253,
2929,
50276,
19,
1745,
80,
352,
943,
320,
4879,
326,
323,
247,
6486,
1180,
273,
3530,
50276,
77,
518,
1200,
1222,
274,
26782,
266,
7132,
298,
518,
1200,
1222,
274,
26782,
266,
270,
819,
5432,
293,
247,
787,
1504,
437,
260,
2969,
285,
44755,
15970,
11649,
13418,
970,
3676,
49328,
275,
16424,
275,
11454,
1491,
5162,
2718,
4240,
7266,
37174,
17984,
1012,
50276,
1225,
2788,
3227,
14952,
15898,
2788,
3227,
247,
12865,
20122,
729,
247,
14008,
2291,
729,
277,
26925,
18540,
277,
8483,
27366,
273,
801,
297,
404,
11649,
13418,
285,
546,
35128,
275,
3676,
4715,
17857,
32888,
9169,
50275,
6156,
280,
257,
580,
280,
3750,
9638,
7166,
6002,
284,
13460,
280,
257,
580,
280,
3750,
34843,
301,
5261,
8420,
1113,
77,
285,
398,
1665,
269,
433,
16409,
298,
527,
16750,
480,
317,
706,
4533,
285,
289,
4921,
270,
5807,
251,
16344,
1566,
18543,
275,
9162,
247,
382,
1832,
6247,
50276,
5392,
8420,
9638,
5261,
8420,
277,
298,
527,
16750,
269,
1182,
607,
8125,
73,
277,
18543,
5216,
275,
23559,
14407,
9162,
247,
440,
5411,
7792,
275,
16424,
275,
11454,
1491,
5162,
2718,
6247,
7266,
1249,
21553,
805,
23546,
5987,
39962,
2061,
9275,
746,
6903,
1012,
2227,
9275,
187,
187,
4118,
18435,
27,
2520,
2929,
2175,
19862,
18543,
285,
253,
2954,
875,
253,
18543,
273,
2060,
19862,
3558,
3210,
342,
253,
18543,
273,
253,
4795,
19862,
10554,
50276,
783,
2022,
10527,
906,
310,
326,
2060,
19862,
2758,
943,
417,
320,
15978,
35890,
275,
1340,
281,
452,
247,
973,
1179,
50250,
19862,
10554,
50276,
6050,
643,
3332,
789,
556,
1119,
436,
281,
320,
253,
1083,
275,
16774,
1543,
436,
2929,
4326,
28032,
253,
16774,
1543,
949,
10527,
1543,
50275,
856,
84,
50276,
783,
33977,
1263,
273,
19862,
18543,
342,
14282,
16039,
50276,
5040,
50276,
1987,
8303,
3710,
281,
10527,
1263,
273,
1929,
8310,
285,
7870,
3276,
13642,
50276,
19681,
3276,
13642,
310,
417,
2011,
281,
562,
32231,
8245,
3082,
50276,
15870,
5661,
12820,
260,
338,
274,
740,
46277,
274,
2313,
50276,
783,
4477,
9583,
275,
247,
9470,
5955,
342,
30628,
285,
1160,
2544,
281,
616,
2929,
1690,
6240,
2629,
11254,
1543,
689,
2709,
6613,
285,
253,
1629,
336,
18543,
2557,
50276,
1189,
455,
436,
310,
4891,
789,
285,
812,
320,
7607,
281,
253,
8059,
2299,
30628,
5194,
326,
4243,
273,
253,
789,
403,
14999,
275,
1798,
337,
3710,
5661,
7103,
581,
1511,
273,
4836,
327,
292,
680,
15302,
760,
285,
374,
1677,
1929,
6239,
253,
5649,
273,
253,
6012,
10527,
1543,
281,
268,
3460,
398,
310,
417,
2590,
50276,
783,
11985,
452,
644,
7591,
281,
11322,
436,
30859,
209
] |
Below is a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
the paper presents an extension to the commonly used gaussian process based hpo by incorporating the learning curve dynamics to decide the next hp configuration to be tried out for this the authors propose the use of a kernel that encodes the previous hp iterates using a neural network the method is shown to reach lower regret values for the same computational budget compared to the baselines considered post rebuttal given the detailed rebuttals by the authors the updated baselines im confident about increasing my rating for this paper it might aid their arguments if the authors were to move some sections of the appendix to the main paper pros the paper is quite clear see the subsequent comments and easy to follow the method proposed is simple and is an intuitive extension to the methods in the literature the paper is also well placed in the context of previous methods the experiments section is quite strong in the experiments and baselines covered cons 1 the authors start with the motivation that the rank correlation of performance at various budgets is poor however this is seemingly contradicted in fig 7 left where the exclusion of the learning curve in hpo doesnt worsen the performance much over all the datasets the authors show experiments where the inclusion of lc leads to better results for some datasets the authors should either provide references for the poor correlation or show statistics of how intermediate performance is a poor predictor of final rank 2 on training the deep convolutional kernel the neural net used is a mlp with 128 and 256 hidden units this is quite a large network how do the authors reliably train this network with only a few xi j yi j1 tuples in the initial phases how reliable are the networks predictions to terminate a run how are the hyperparameters of this training chosen the authors should give details and report ablations on network size architecture training parameters lr batch size etc in the absence of these it is hard to judge the merits of the proposed phi as i do not understand how these specifics were arrived at also the authors should report how the additional time of training the deep kernel changes the wall clock time measurements 3 evaluations the use of epochs and steps to describe the xaxis in various plots is a little confusing are these the same also it is ideal that the authors include the true performance say test accuracy on cifarimagenet exps and the true wallclock times somewhere in the paper in addition to the regret plots presented also the proposed methods rank fluctuates quite a bit in the initial steps in fig 3 and 5 can the authors comment on this 4 minor reversing the training update steps in the intro para 1 makes it sound like undoing the update steps the authors might consider rewording para 3 of motivation to aid readability as it took me a few reads to grasp the point the authors say gradient descent and adam on page 5 last para and gradient ascent and adam in a4 5 additional comments these are general comments the authors might consider discussing do the authors find that the trained deep conv kernel is transferable across tasks this might have interesting implications if it can be the authors write at the end of section 6 the additional use of an explicit learning curve representation might not lead to a strong improvement in every scenario while experimental evidence has been provided can the authors describe what factors determine if incorporating lc dynamics leads to better hpo the presented work is interesting barring 
a few points commented on above the motivation for this work needs further clarification from the authors the experimental evidence of the efficacy of the method is strong however the paper in its current form misses some important details and ablations if the authors can address these points adequately in their rebuttal id be quite happy to raise my score docsepa graybox hyperparameter optimization framework has been developed based on a multifidelity acquisition function and a surrogate model incorporated with learning curve dynamics the proposed method was built on top of deep kernel learning wilson et al 2016 and multifidelity bayesian optimization kandasamy et al 2017 experimental results on three different settings were provided in optimizing hyperparameters for mlp rnn and cnn respectively pros incorporating learning curve dynamics into the surrogate model is well motivated and supported by the ablation study on the nasbench 201 dataset extensive experimental results have been provided in terms of tabular datasets nlp tasks and nas cons predicting learning curves is not new for hpo as it has been well explored by previous works like 1 while the proposed method tries to involve the budget information for modeling curve dynamics the technical novelty of this work is still somewhat limited since it seems like a direct combination between wilson et al 2016 and kandasamy et al 2017 the multifidelity acquisition function is not well supported by the ablation study what is the comparison result between dyhpo and dyhpo wo mf some necessary baselines are missing in the current experiment such as 1 and wilson et al 2016 1 learning curve prediction with bayesian neural networks iclr17 overall the paper is easy to follow and wellmotivated while some ablated models and baselines are missing the experimental results are comprehensive and seem to be solid the main concern of this work is the lack of technical novelty compared with existing works docsepthis paper present a new bayesian optimization algorithm that integrates a deep kernel over both the hyperparameters x and the fidelity budget j typically number of epochs it also present a slightly modified version of the expected improvement acquisition function to for the fidelity budget the paper shows on several benchmarks that the proposed algorithm called dyhpo is highly competitive better performing than bohb and dehb on most of the benchmarks otherwise performing similarly strengths multifidelity is very important to obtain practical hpo algorithms for deep learning the proposed deep kernel accounts for the correlation between learning curves to avoid naively stopping trials to early like hb does the modification proposed to account for the learning curve is fairly simple the experiments are quite convincing with dyhpo outperforming other hpo algorithms almost systematically the baselines are good with hyperband bohb and dehb being serious multifidelity contenders weaknesses it is very unclear to me how we can guarantee that the algorithm will often resample x to make it continue if most dimensions of the search space are real it seems to me x will most likely be different than previous ones leading the algorithm to never continue trials the empirical results clearly show that this is not the case however the explanations do not make clear why it would not be the case only figure 8 reports training time in seconds i assume including hpo time to suggest new trials as well with the frequent query on the algorithm every 1 epoch i assume there 
must be a significant amount of overhead starting and stopping trials very often also the algorithm itself must be fairly slow compared to hyperband with the deep kernel that requires training it would be best to report more results on the running time of dyhpo there are no clear experiments with learning curves that are best later on to synthetically show that dyhpo performs well in this case this would be extremely valuable to support strongly that the reason why dyhpo works so well is that it indeed let the best trials train even though they do not perform well at the beginning the algorithm presented in this paper is an appreciable improvement to multifidelity variants of bayesian optimization especially because it accounts for the correlation between the learning curves and avoid relying to strongly on low fidelity to make hard decisions on trials to stop or continue training the experiments are convincing they are broad and compare good baselines the paper lacks important details in my opinion with respect to the optimization of the expected improvement and how it ensures a good fraction of the trials continue training it also lacks analysis of the execution time of the algorithm and more explicit experiments showing it can avoid stopping good trials that progress slowly in the first epochs i consider the work good enough for publication but it could benefit from some clarifications and additional analysis docsepthis paper proposes a graybox optimization method for hyperparameter optimization of deep neural network models in order to deal with the different budgets available for training nns in the framework of multifidelity optimization the proposed method uses a multitask gaussian process modeling that simultaneously measures the similarity between not only the inputs x but also the outputs y trained with different budgets in particular the multitask gaussian process model is constructed using a deep kernel with a feature extractor instead of the existing kernel function the performance of the proposed method is evaluated by experiments on three types of neural nets mlp rnn and cnn mlp and rnn are treated as usual hyperparameter optimization in 7 and 8 dimensions respectively while cnn is treated as a rewrite of nas as hyperparameter optimization major concerns 1 there are several parts of this paper that are not clearly related to similar studies eg multifidelity bo with deep models li multifidelity bayesian optimization via deep neural networks neurisp2020 ei for multifidelity bo picheny quantilebased optimization of noisy computer experiments with tunable precision technometrics 551213 lam multifidelity optimization using statistical surrogate modeling for nonhierarchical information sources 56th aiaaasceahsasc structures structural dynamics and materials conference 2015 2 there is a lack of explanation about the architecture of the deep kernel why this architecture what makes this architecture effective for multifidelity optimization etc it looks to be just a presentation of a kernel architecture that happens to work well 3 it is not appropriate to plot only the average riglet in the experimental results so the variance should be plotted as well if it is difficult to plot it can be reported separately 4 although the effectiveness of the multitask kernel is evaluated in ablation study modeling using multitask kernels in multifidelity optimization has already become popular and has been evaluated in various studies eg httpsarxivorgabs14063896 httpsarxivorgabs160507079 
httpsarxivorgabs190304703 rather what we should consider is what parts of the deep kernel structure are effective and why i cannot support the acceptance of this paper due to insufficient evaluation of the novelty and the effectiveness of the proposed method docsepthis paper is concerned with multifidelity hpo the authors call this greybox which is a nonstandard term they propose a surrogate model for learning curve data eg metric values at each epoch based on a deep gp kernel different to most previous work on synchronous multifidelity hpo they decide for each running trial when it should be continued experiments are presented where the method is compared to a range of synchronous hpo baselines the experiment are fairly smallscale and do not use parallel evaluations a potential strength of this paper is the proposal of a novel surrogate model for learningcurve data which despite involving a neural network seems to be operational on just the data observed during a single hpo experiment there is a lot of prior work proposing learningcurve surrogates see below some cited here but most are either quite simple multitask gp or require warmstarting on data from previous hpo experiments having said that i could not find any mentioning of this point and i am really curious about the authors explaining how their deep gp model can be trained just on the very limited data observed during a single hpo experiment for an expensive tuning problem you probably have 2040 configurations most of which do not run for many epochs and even if some configurations run for many epochs learning curve data is exceedingly noisy the details really matter here with a standard bo surrogate i just need to refit the gp hyperparameters now and then which can easily be done even for little data with dyhpo you need to presumably update a deep kernel ie retrain a neural network the paper does not say how this is done in a fully automated fashion is training started from scratch or from the last recent weights the first is expensive while the latter is prone to get stuck at the previous solution and ignores the new data how long do you retrain do you retrain after getting each new observation while using complex deep neural surrogate models in bo is an obvious idea many previous trials have failed because complex nn models are just not easyfast to update as part of sequential decision making and in any case cannot be fit robustly to very small datasets this fact has been clearly spelled out for example in 7 for the closely related problem of bandit optimization id be personally really surprised if this work was different and solved these difficult issues but would be willing to give benefit of doubt if a lot more information was provided here how the authors pulled it off as it stands the authors do not even mention there could be issues here the most obvious weakness of this paper is that very relevant prior work is ignored namely on asynchronous multifidelity most prominently asha 1 is well known and implemented in ray tune 3 or autogluon 4 the baselines compared to against in this paper hyperband bohb are all synchronous and quite dated by now meaning that many trials need to run to a certain level until another decision is taken if you force methods to be synchronous this puts them at a disadvantage they need to delay decisions until some rung is completely filled which delays decisions and slows them down this is explained in the asha paper 1 it is well known that for large scale multifidelity hpo asynchronous scheduling 
works much better than synchronous see for example the comparisons in 2 algorithms like asha are behind commercial automated tuning services 5 it is quite astonishing that part of the research community is still considering synchronous methods like hyperband or bohb the state of the art for example the paper claims that it is a new idea that dyhpo never discards a configuration that is precisely what asha 1 does as well known as pauseandresume scheduling and what freezethaw bo suggested long ago id be surprised if a well configured asha method available in ray tune 3 would not be competitive or beat the approach suggested here despite not requiring a complex surrogate when doing such comparisons it is important to also take decision time into account because updating a surrogate model can be very expensive in dyhpo this likely means rerunning mlp fitting which is probably really expensive i also find the motivation as to why dyhpo works better than previous methods unconvincing the authors claim that rank correlations between early evaluations at few epochs and late ones are poor in my experience this is just not the case these correlations are in the majority pretty good which is exactly why multifidelity methods work very well for dnn tuning sure there are examples such as regularization but the question is whether that matters the authors should provide numerical evidence for such a claim now even if these correlations are poor it is not clear to me why dyhpo could do anything about that learning curve prediction is just hard because by far most of the data is from early evaluations but you are interested in late performance so you need to extrapolate why would some vanilla deep kernel be good at that the only way to really know about certain anticorrelations that can be exploited is to either fit models to data from past hpo experiments which dyhpo does not do or to built the knowledge into the model which they dont do either just because an nn is involved does not mean it will do magic for you the reason why dyhpo works better than competitors here is that it is asynchronous but the others are synchronous so at a disadvantage also the reason why modelbased hpo is better than random search based methods like hyperband is mostly because the latter cannot exploit they need to draw new configs always at random the experiments are pretty underwhelming apart from most relevant baselines missing dyhpo is asynchronous all competitors are synchronous the curves are also not very meaningful because the x axis is number of epochs instead of wallclock time dyhpo needs to update a complex surrogate model including retraining a neural network and the costs for doing that have to be taken into account all experiments are also sequential no parallel evaluations are used this could easily be done by using ray tune 3 again this falls far short of the current state of the art in automatic tuning of large neural models eg methods like asha or pbt finally there are quite some works on using complex surrogates to model learning curves in the context of hpo for example 5 it is not clear why this was not compared against as code is available the work of wistuba and grabocka is cited which proposed deep kernel surrogates before so this paper against their claim is not the first to do this in the context of multifidelity and in fact perrone etal 2018 cited here did this even earlier just not in the context of learning curve data the paper of wistuba is quite careful in explaining why a complex surrogate cannot 
be trained robustly on the data from a single experiment and proposes an algorithm to warmstart from past data it is dismissed here as competitor for doing so but as i said above i am not sure how dyhpo solves the apparent issue that complex nns cannot be trained on the small amount of data observed in hpo 1 asha httpsarxivorgabs181005934 2 mobster httpsarxivorgabs200310865 3 ray tune httpsdocsrayioenlatesttuneindexhtml 4 autogluon httpsautogluonaistableindexhtml 5 httpswwwdeterminedai 6 httpsopenreviewnetforumids11kbyclx 7 httpsarxivorgabs180209127 the paper may have some merits in suggesting a deep kernel surrogate model which although quite related to previous work is stated to work even if just fitted on the small amount of data from a single experiment in an online fashion however this has been tried several times before with little success and details explaining why the current approach should work are missing the proposed method uses asynchronous scheduling much like freezethaw but is compared against synchronous scheduling baselines which have a major disadvantage comparisons to sota methods like asha or pbt are missing these are not cited there is also quite a range of prior work on learning curve modeling for hpo which is not compared against open source code for doing a better comparison is publicly available for example ray tune experiments are smallish scale mostly on tabulated benchmarks and again are not close to what is possible today with parallel computation compared to missing alternatives like asha the proposed method is fairly complex and quite likely rather nonrobust to handle for example it requires retraining a neural network model each time a bit of new data is obtained which is very difficult to do
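To make the surrogate model debated in the reviews above more concrete, here is a minimal sketch of a deep-kernel GP over (hyperparameter configuration, budget) pairs: an MLP feature extractor maps each row [x, budget] to a feature vector, and a standard RBF-kernel GP posterior is computed on those features. This is not the paper's implementation. The layer sizes, the fixed random extractor weights, and the toy objective are assumptions made only for illustration; a real system would fit the extractor and kernel hyperparameters by marginal-likelihood maximization, which is exactly the retraining cost the last review questions.

```python
# Minimal sketch of a deep-kernel surrogate over (hyperparameter config, budget) pairs.
# Illustrative only: MLP widths, feature dimension, and fixed random extractor weights
# are assumptions, not values taken from the paper under review.
import numpy as np

rng = np.random.default_rng(0)

def mlp_features(X, W1, b1, W2, b2):
    """Two-layer MLP feature extractor phi([x, budget]) with tanh activations."""
    H = np.tanh(X @ W1 + b1)
    return np.tanh(H @ W2 + b2)

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel evaluated on extracted features."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior(X_train, y_train, X_test, feat, noise=1e-2):
    """Standard GP posterior mean/variance with the deep kernel k(phi(.), phi(.))."""
    F_tr, F_te = feat(X_train), feat(X_test)
    K = rbf_kernel(F_tr, F_tr) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(F_te, F_tr)
    Kss = rbf_kernel(F_te, F_te)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - (v ** 2).sum(0)
    return mean, np.maximum(var, 1e-12)

# Toy data: 3 hyperparameters plus 1 normalized budget column per row.
d_in, d_hidden, d_feat = 4, 16, 8          # sizes are illustrative assumptions
W1, b1 = rng.normal(size=(d_in, d_hidden)), np.zeros(d_hidden)
W2, b2 = rng.normal(size=(d_hidden, d_feat)) / np.sqrt(d_hidden), np.zeros(d_feat)
feat = lambda X: mlp_features(X, W1, b1, W2, b2)

X_obs = rng.uniform(size=(20, d_in))        # observed (config, budget) pairs
y_obs = np.sin(3 * X_obs[:, 0]) + 0.5 * X_obs[:, -1] + 0.05 * rng.normal(size=20)

# Candidates: the same configurations evaluated one budget step further.
X_cand = X_obs.copy()
X_cand[:, -1] = np.minimum(X_cand[:, -1] + 0.1, 1.0)
mean, var = gp_posterior(X_obs, y_obs, X_cand, feat)
print("posterior mean of first 3 candidates:", np.round(mean[:3], 3))
```

Scoring already-running configurations one budget step further, as in the last few lines, corresponds to the continue-or-stop decision the reviews describe; an acquisition function such as expected improvement would then be applied to these posterior means and variances.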
### Summary: | this paper presents a new method for performing bayesian optimization for hyperparameter tuning that uses learning curve trajectories to reason about how long to train a model for thus grey box optimization and whether to continue training a model the reviewers seem to find the paper clear wellmotivated and the presented methodology sensible however the reviews were quite mixed and leaning towards reject with 3 6 5 3 6 a challenge for the authors is that there is already significant related literature on the subject of multifidelity optimization and even specific formulations for hyperparameter optimization that reason about learning curves a common criticism raised by the reviewers is that while there are extensive experiments they dont seem to be the right choice of experiments to help understand the advantages of this method eg epochs instead of wallclock on the xaxis choice of baselines demonstration that early results are used to forecast later success etc unfortunately because there is significant related literature the bar is raised somewhat in terms of empirical evidence although theoretical evidence of the performance of this method would also help it seems clear that some of the reviewers are not convinced by the experiments that were presented thus the recommendation is to reject the paper but encourage the authors to submit to a future venue it looks like the authors have gone a long way to address these concerns in their author responses incorporating these new results and the reviewer feedback would go a long way to improving the paper for a future submission | [
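Several of the reviews point to asynchronous successive halving (ASHA) as the baseline family the comparison should include. As a reference, the sketch below shows the core asynchronous promotion rule they refer to: a trial reporting a result at a rung is promoted immediately if it currently sits in the top 1/eta of results seen at that rung, rather than waiting for the rung to fill as synchronous Hyperband or BOHB do. The rung levels, eta, and the higher-is-better metric are illustrative assumptions, and the bookkeeping that lets a paused trial be promoted later once new results arrive is omitted for brevity.

```python
# Sketch of an ASHA-style asynchronous promotion rule (illustrative, simplified).
from collections import defaultdict

class AsyncPromoter:
    def __init__(self, rungs=(1, 3, 9, 27), eta=3):
        self.rungs = rungs                 # budget levels (e.g. epochs); assumed values
        self.eta = eta                     # keep the top 1/eta at each rung
        self.results = defaultdict(list)   # rung -> metric values reported so far

    def report(self, rung, metric):
        """Record a trial's metric at `rung`; return True if it should be promoted now."""
        self.results[rung].append(metric)
        scores = sorted(self.results[rung], reverse=True)  # higher is better
        cutoff = max(1, len(scores) // self.eta)
        return metric >= scores[cutoff - 1]

promoter = AsyncPromoter()
for i, acc in enumerate([0.61, 0.72, 0.58, 0.80, 0.64]):
    promoted = promoter.report(rung=1, metric=acc)
    print(f"trial {i}: acc={acc:.2f} -> {'promote' if promoted else 'pause'}")
```

Because each decision is taken as soon as a result arrives, no worker ever idles waiting for a synchronization point, which is the scheduling advantage the reviews contrast with the synchronous baselines used in the paper.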
2834, 281, 7484, ... (token-id array omitted for readability) ] | [
1, 1, 1, ... (long run of 1s omitted for readability) ] | [
2834, 281, 7484, ... (token-id array, intermediate values omitted for readability)
323,
1650,
352,
4419,
851,
26208,
247,
11454,
2990,
1566,
1016,
673,
247,
2372,
273,
747,
941,
310,
2797,
534,
310,
1077,
2834,
281,
513,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
10262,
247,
747,
1332,
323,
9591,
17699,
16561,
13757,
323,
4373,
19484,
25184,
326,
4648,
4715,
6970,
24102,
281,
1921,
670,
849,
1048,
281,
6194,
247,
1566,
323,
3021,
14370,
3817,
13757,
285,
1880,
281,
4035,
3733,
247,
1566,
50276,
783,
30628,
1646,
281,
1089,
253,
2929,
2590,
973,
24013,
8550,
285,
253,
3559,
16182,
24600,
50276,
35529,
253,
10123,
497,
3240,
6804,
285,
25661,
4404,
12009,
342,
495,
721,
608,
495,
721,
50276,
66,
5691,
323,
253,
4477,
310,
326,
627,
310,
2168,
1534,
2905,
6239,
327,
253,
2256,
273,
25274,
21718,
13757,
285,
1014,
2173,
26850,
323,
4373,
19484,
13757,
326,
1921,
670,
4715,
9191,
50276,
66,
1846,
14226,
5439,
407,
253,
30628,
310,
326,
1223,
627,
403,
9470,
4679,
597,
13414,
1646,
281,
320,
253,
987,
4327,
273,
4679,
281,
1361,
2096,
253,
11361,
273,
436,
1332,
24088,
44540,
3185,
273,
3402,
13273,
327,
253,
1269,
10565,
4327,
273,
1666,
25379,
20028,
326,
2393,
1543,
403,
908,
281,
16923,
1996,
2323,
3966,
50276,
328,
9520,
984,
627,
310,
1534,
2905,
6239,
253,
2534,
310,
5439,
8489,
275,
2426,
273,
16774,
1941,
3738,
10527,
1941,
273,
253,
3045,
273,
436,
1332,
651,
671,
1361,
50276,
262,
3133,
2590,
326,
690,
273,
253,
30628,
403,
417,
13762,
407,
253,
4679,
326,
497,
3559,
50276,
40622,
253,
17401,
310,
281,
12009,
253,
2929,
533,
11907,
253,
4477,
281,
11929,
281,
247,
2852,
18767,
50276,
262,
4453,
751,
253,
4477,
452,
4783,
247,
1048,
1039,
281,
2953,
841,
7350,
275,
616,
2488,
6128,
50276,
1763,
24993,
839,
841,
747,
1543,
285,
253,
37317,
8680,
651,
564,
247,
1048,
1039,
281,
11138,
253,
2929,
323,
247,
2852,
19529
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
Summary: the paper proposes two benchmarks for continual language modeling, one evaluating character-level multilingual drift between languages which share similar characters, and a second evaluating word-level drift between English corpora of different domains. The setup is online in the sense of evaluation: they evaluate on the new sentences and then train over them, unlike image datasets, and catastrophic forgetting is hence characterised as having higher error than in the past when there is a switch between the domains/languages. Hence the loss functions measuring forgetting quantify the height and length of the rise in error. They compare mixture-of-experts baselines with gating by different gating methods on this setup.

Primary concerns:

1. There are a few sentences and terms that are hard to understand, and to me they seem imprecise. Examples would be:
1.1. Intro: "human children still manage to acquire multiple languages without being explicitly asked to keep them separated". Not sure if I buy this, as it is known that if children are exposed to a situation where there are many languages they sometimes get confused; many kids find it hard to learn any of them, and it becomes important to give them a guiding signal. Do you have any reference to support this hypothesis?
1.2. Section 3, second para, "preventing data leakage": what do you mean by data leakage?
1.3. Section 3, third para: hard to follow; the notation isn't clear, and it seems there is a typo in s_i = sum_j t_i.
1.4. Section 3, fourth para, "for a model to be resilient to forgetting it must adapt quickly": this statement is not correct, because if a model adapts quickly to a new distribution the parameter change would lead to forgetting, and that's primarily the reason why there are regularization-based approaches for continual learning enforcing models to stay in the vicinity of old parameters. Too much adaptivity does not ensure less forgetting.
1.5. Section 3, "loss after switch": what do you mean by a switch, and how do you know when a switch happens? The task label is not given, and in practice the loss curve is not smooth, so how do you identify the switch? Fig 1a is too smooth and does not represent the real loss curve.

2. Regarding experiments: is it not possible to design much simpler methods which work for this problem? If it is known that a character/word-sequence distribution shift is expected, I believe it is likely such shifts can be detected easily with traditional n-gram models and the style-distinguishing attributes typically used for author identification [1, 2]. Why isn't it possible to use a baseline which consists of one expert per domain/language, where the character sequence decides which expert to use, instead of these weaker gating-based methods? Also, English/Czech/German/French seem very distinguishable and share little in common in terms of character sequences [3], hence I am doubtful of the finding that combining these models will improve any single language's performance.

3. Why is it not possible to apply traditional continual learning methods like experience replay to this setting? You simply store intelligently selected past sentences in memory when, say, the error shoots up, and replay using them. There are many other continual learning approaches that potentially could be applied here; any particular reason for not using them?

[1] Koppel et al., Computational methods in authorship attribution.
[2] Sapkota et al., Not all character n-grams are created equal: a study in authorship attribution.
[3] Gerz et al., On the relation between linguistic typology and limitations of multilingual language modeling.

(edited)
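To make the experience-replay suggestion in concern 3 concrete, here is a minimal sketch of such a baseline. It is purely illustrative: the model interface (eval_loss / train_step), the buffer size, and the spike threshold are assumptions of mine, not anything taken from the paper under review.

```python
import random

def online_lm_with_replay(stream, model, buffer_size=1000, spike=1.5, replay_k=8):
    """Hypothetical evaluate-then-train loop with replay triggered by loss spikes."""
    buffer, avg = [], None
    for step, sentence in enumerate(stream):
        loss = model.eval_loss(sentence)        # online evaluation on the incoming sentence
        if avg is not None and loss > spike * avg and buffer:
            # the error "shoots up": rehearse a few stored past sentences
            for past in random.sample(buffer, min(replay_k, len(buffer))):
                model.train_step(past)
        model.train_step(sentence)              # then train on the new sentence
        avg = loss if avg is None else 0.99 * avg + 0.01 * loss
        # reservoir sampling keeps a roughly uniform sample of the whole past stream
        if len(buffer) < buffer_size:
            buffer.append(sentence)
        elif random.random() < buffer_size / (step + 1):
            buffer[random.randrange(buffer_size)] = sentence
    return model
```

The evaluate-then-train order matches the online protocol described above, and reservoir sampling keeps the memory cost fixed.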
docsep

Strengths: this paper proposes a new evaluation framework and gives two available evaluation datasets.

Weakness: the paper needs a major rewrite to improve fluency and to better state its motivation and contribution; the empirical validation is weak.

Reasons for accept (the advantages of this paper):
1. This paper proposes a new evaluation benchmark and dataset to promote the related research on online continual learning.
2. The proposed plastic gate allows it to distribute different distributions among different experts, which has certain effects according to the experimental results.

Reasons for reject (the shortcomings of this paper):
1. This paper is not novel enough and has not contributed enough to continual-learning-related research.
2. The core motivation of this paper is not clear enough. The abstract mentions that it is hard to demarcate task boundaries in actual tasks, and then says that a new benchmark, new metrics and a gating technique are proposed; stacked statements like this can hardly capture the main problem to be solved.
3. The advantages of the new metrics are not clear, because from the experimental results ppl and ppl-sw have a strong correlation. Therefore, please explain their advantages in detail, including the advantages of this evaluation framework compared with the evaluation frameworks of the related literature, and verify them.
4. The baseline uses an LSTM and does not use a CNN, transformer, etc., which shows that its generalization is limited.
5. Can you provide the experimental results when it is set to other values, and for other combinations of the number of modules?
6. Because what you are proposing is a continuous language modeling evaluation framework, is it possible to evaluate some of the latest online continual learning systems, for example (1) lifelong machine learning with deep streaming linear discriminant analysis, (2) learning a unified classifier incrementally via rebalancing, or other task-free continual learning related work? This will have a good evaluation effect on measuring the versatility of your evaluation framework.

docsep

Summary: this paper introduces a dataset and benchmark for language modeling in the online continual learning framework. The key characteristics of this benchmark are that the data are temporally correlated, there are no task identifiers presented to the model (task-free setting), and the evaluation is performed in an online fashion. The benchmark consists of a multilingual character-level dataset and a multi-domain word-level dataset. The authors introduce several metrics and evaluate using several simple baselines of mixtures-of-experts and products-of-experts.

Pros:
1. The paper is clear and well written.
2. The authors provide sufficient details on data collection and modeling.
3. The relevant work section is extensive.
4. The design choices in constructing the dataset are well thought out and make sense given the objective of the paper; in particular, the dataset along with the proposed evaluation metrics captures the three stated objectives of the benchmark.
5. The authors are upfront about materials left out of the main text. It is nice when potential questions are anticipated and answered, for example why weren't continual learning SOTA models evaluated, and why weren't transformers considered as baselines; the authors answer these questions candidly.

Cons:
1. The dataset seems incremental over existing work.
2. The introduced evaluation metrics are described intuitively but are not analyzed empirically or theoretically.
3. The necessity/value of the introduced dataset is not adequately justified in relation to existing challenges in the continual learning setting.
A component of this is showing where existing models fail and why this dataset will help improve them.

Recommendation and explanation: I recommend rejection for the previously outlined reasons. I also have some questions that I hope the authors can help address:
1. What is the key innovation over existing work such as d'Autume et al., who also study language models in the continual learning task-free setting?
2. What failure of current models does this benchmark address? Note that the answer to this question should also be empirically demonstrated.

Additional feedback:
1. This benchmark could very well be a valuable contribution that fills a hole in the existing body of work, but the paper in its current form does not adequately establish this. The rebuttal should better address how this benchmark fits into existing work by comparing it to existing datasets and more relevant baselines.
2. The paper as a whole is well written, but I question some of the choices in syntax: terms such as "demarcation" and "desideratum" are spirited but may be better replaced by plainer alternatives.

docsep

This paper's main contributions are (i) to propose two new benchmarks for online continual learning in the context of language modelling, and (ii) to evaluate the performance of a number of composition-of-experts-based models on the new datasets using a number of metrics. The multilingual benchmark, derived from an existing multilingual news corpus, consists of sequences of characters where the language is periodically switched, and the multi-domain benchmark consists of sequences of English words where the corpus is periodically switched. The comparative performances of the various baselines on the two datasets, as well as an analysis of the mixture weights in one of the models during training, are used to provide insights into the qualitative differences between the datasets.

Overall I am inclined to recommend acceptance for this paper, on the margin, because it makes a good contribution towards evaluating continual learning models in more real-world settings, more specifically in the context of online learning. The datasets proposed are well suited for the purpose, for reasons outlined below, and the evaluation using various composition-of-experts models is fairly conducted and followed up with an informative analysis. The key downside of the paper is that no standard continual learning baselines are trained on the proposed datasets; I would be inclined to increase my score if results were shown for 1 or 2 algorithms specifically designed for continual learning with neural networks, as discussed in more detail below.

Positives:
- There is a need to start evaluating continual learning in closer-to-real-life settings; in providing datasets that facilitate evaluation of continual learning models in an online setting without task boundaries, this paper makes a positive contribution in this direction.
- The datasets are simply composed but seem well suited for evaluating online continual learning, because (i) language data is sequential, (ii) by imposing a truncated exponential, and thus memoryless, distribution on the length of subsequences it is hard for models to cheat in predicting the next task switch, preserving task-agnosticity (see the sketch after this list), and (iii) in both datasets the subtasks share latent similarities, creating the possibility for forward/backward transfer between them.
- The analysis of the experiments provides interesting insights into the datasets and differences between the baselines, e.g. figure 1d effectively shows how the weights of one of the product-of-experts models switch after a task change, indicating a degree of specialisation of the modules, and 1e uses the correlations of the mixture weights used for different subtasks to highlight the latent similarity between pairs of subtasks.
- The paper is clearly written and easy to follow.
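A small illustration of why point (ii) in the list above matters (the mean segment length, truncation point and toy corpora below are made-up values, not the benchmark's actual settings): with a truncated exponential / geometric segment-length distribution, the chance that the next step is a switch is roughly the same however long the current segment has already lasted, so a model cannot time the switches.

```python
import random

def sample_segment_length(mean_len=200, max_len=2000):
    """Geometric (discretised exponential) segment length, truncated at max_len."""
    p = 1.0 / mean_len
    length = 1
    while length < max_len and random.random() > p:
        length += 1
    return length

def make_stream(corpora, n_segments=100):
    """Interleave several corpora into one stream with (approximately) memoryless switch points."""
    stream, switches, t = [], [], 0
    for _ in range(n_segments):
        corpus = random.choice(corpora)      # pick the next language / domain
        seg_len = sample_segment_length()
        stream.extend(corpus[:seg_len])      # toy: reuse the head of that corpus
        t += seg_len
        switches.append(t)                   # ground-truth switch positions, never shown to the model
    return stream, switches
```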
Main concern: limited set of baselines. While a range of composition-of-experts baselines are used for evaluation, it would have been much better to also include other methods specifically designed for online continual learning, such as those cited in the paper [1, 2], or, though not strictly online, a replay-based method such as CLEAR, which works in the task-agnostic setting. It is claimed in the paper that including state-of-the-art online continual learning methods would have involved nontrivial adaptations, significantly departing from the original models, which would limit any possible conclusions we could draw, as they are designed for image-based datasets. I don't fully understand the basis of this claim; perhaps the authors could elaborate. As far as I am aware, for example, [1] is not restricted for use on image-based datasets.
Since the subtasks do have discrete boundaries, even though these are not passed to the model during training, it would be possible to evaluate methods that use task boundaries for consolidation on the proposed datasets, either by providing knowledge of the boundaries (although this breaks the task-agnosticity) or by using methods that can detect task boundaries, e.g. EWC uses the forget-me-not process [3].
Overall, not evaluating the datasets with any standard continual learning baselines is an important weakness.
Other comments: the proposed method, plastic gates, which performs best amongst the baselines used when combined with product-of-experts models, seems simple and effective, but I am inclined to question how novel it is, since it just amounts to multi-step online gradient descent on the mixture weights.
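For concreteness, this is roughly what "multi-step online gradient descent on the mixture weights" looks like for a simple softmax-gated mixture of experts; it is my own reconstruction for illustration, not the authors' code, and the learning rate and number of inner steps are arbitrary choices.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def update_gate(theta, expert_probs, lr=0.5, n_steps=5):
    """A few online gradient steps on the gate parameters theta.

    expert_probs: length-K array, each expert's probability of the observed token.
    Loss is the negative log-likelihood of the gated mixture sum_k w_k * p_k.
    """
    for _ in range(n_steps):
        w = softmax(theta)
        mix = float(w @ expert_probs)
        grad_w = -expert_probs / mix                    # d(-log mix)/d w_k
        grad_theta = w * (grad_w - float(w @ grad_w))   # chain rule through the softmax
        theta = theta - lr * grad_theta
    return theta

# toy usage: expert 0 explains the observed token much better than expert 1
theta = np.zeros(2)
theta = update_gate(theta, np.array([0.20, 0.01]))
print(softmax(theta))   # gate mass shifts towards expert 0
```

The product-of-experts variants would need the corresponding product likelihood in place of the mixture, but the update loop is the same idea.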
The metrics used for evaluating continual learning (loss after switch and recovery time after switch), which are one of the main selling points of the paper, are suitable for the datasets provided, but would not be applicable in a setting where either the task boundaries are not known or there are no hard task boundaries to be identified.
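As a rough sketch of how such switch-based metrics can be computed when the ground-truth switch points are known; the definitions below are my guesses at their spirit (the window size and the recovery criterion are arbitrary), not the paper's exact formulas. The point is that both quantities need the switch indices, which is exactly the limitation raised above.

```python
import numpy as np

def switch_metrics(losses, switch_points, window=50):
    """Per-switch 'loss after switch' (height of the spike) and 'recovery time'.

    losses: per-step online loss; switch_points: indices where the task changes.
    Height = mean loss just after vs. just before the switch; recovery time =
    number of steps until the loss falls back to the pre-switch level.
    """
    heights, recoveries = [], []
    for s in switch_points:
        if s < window or s + window >= len(losses):
            continue
        before = float(np.mean(losses[s - window:s]))
        after = float(np.mean(losses[s:s + window]))
        heights.append(after - before)
        rec = next((t for t in range(s, len(losses)) if losses[t] <= before), len(losses))
        recoveries.append(rec - s)
    if not heights:
        return float("nan"), float("nan")
    return float(np.mean(heights)), float(np.mean(recoveries))
```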
Typo: section 2, paragraph 2, "mnnist" should be "mnist".
### Summary:
The initial reviews were mixed for this paper. On one hand, some of the reviewers highlighted that the proposed datasets could be useful to researchers; on the other, reviewers found a few important flaws with the current manuscript, including missing baselines, issues with the proposed tasks, and possibly inaccurate/imprecise statements. Our discussion after the authors' response focussed on whether the positive aspects of the current paper outweighed some of the perceived weaknesses of the paper. In particular, while some of the initial criticisms from the reviewers were successfully addressed by the authors, including possible imprecisions and, to a certain extent, motivation, all the reviewers remained convinced that standard continual learning baselines could be adapted to this setting. They also conjectured that these missing baselines might not allow readers to appreciate the strength of the proposed datasets. In their response the authors argued that adapting models would require research; the reviewers are under the impression that it would be useful to test baselines more or less as-is, even if the authors do not think these baselines will be competitive. For example, in the discussion a reviewer suggested that an experience replay baseline could have been implemented where the replay buffer includes the hidden states of an LSTM. It might also be useful to study baselines that do not strictly obey the proposed setting, again to get a better understanding of the proposed tasks, including how difficult they are. Overall, having some of these baselines would be one way to better connect the proposed work to the current continual-learning literature.
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The work presents a new parallel tempering scheme that adapts a variational reference distribution within a parametric family. They adopt a parameter to minimize the forward KL divergence between the parametric family and the target distribution. They combine a fixed and an adaptive reference, which leads to better restart-rate performance than the baseline.

Strengths:
- Interesting and witty idea of combining a fixed and an adaptive reference in the scheme.
- Extensive theoretical analysis of the proposed scheme; the authors provide theoretical guarantees for the performance and convergence of the method.
- Good presentation of the work.

Weaknesses:
- A lot of toy experiments but no real-world datasets. It would be interesting to see the method applied to a bigger model and a bigger dataset, like an image dataset (MNIST, CIFAR-10).
- The structure is a bit odd, since there are no conclusions and no discussion of limitations, future directions, or societal impact. But again, this is a theoretical work, so societal impact is not applicable in this case; I would like to see the other stuff more in a separate paragraph though.

This is a theoretical work, so negative societal impact is not discussed, and the limitations are briefly but not clearly discussed in the main text (subsection 3.5).

docsep

The authors proposed an improved version of the parallel tempering algorithm to solve the non-scalability issue with respect to the data size. In particular, the authors show that in the large-data limit the restart rate degrades arbitrarily to 0, which strongly affects the communications between the chains associated with the target distribution and the prior distribution. To tackle that issue, the authors proposed to adopt variational inference based on the exponential family; theories and experiments show much better restart rates.

Pros: I like the authors' insight on the weakness of parallel tempering with respect to the data size. Given a fixed schedule of parallel tempering, the communication efficiency does raise a concern in large-data limits. A major reason I am suspecting is that as the number of data points increases, the major mode becomes more dominant, which also inspires the authors to use a tunable prior based on variational inference.

Cons:
1. I think the proposed method is not the right solution to tackle that issue. As is known, parallel tempering not only cares about communication efficiency or restart rates but also focuses on the exploration/exploitation tradeoff. The current method seems to solve the issue of communication inefficiency, but the impact on exploration is not clear. If we don't know how much exploration is sacrificed, why not just adopt a prior that is close enough to the target distribution? In that way we can maintain a large enough restart rate via the most vanilla method.
2. The combination with a fixed reference further increases my concerns about this method in exploration, which has to resort to a different prior for exploration.
3. Regarding the theories, I feel this paper is more suitable for a journal review. I am familiar with Syed's JRSS-B (2021) paper, but the proof details of this work are not carefully checked.

n/a

docsep

This paper proposes to learn the prior distribution adaptively for parallel tempering. In particular, the prior distribution is tuned to optimize a proxy objective (forward KL divergence to the posterior) with a simple gradient-free moment matching procedure. In theory the variational prior reference proves to outperform a fixed reference, but in practice it may get stuck in a single mode,
which the authors resolve by mixing the adaptive and fixed reference distributions. Empirically, the proposed method achieves a big gain over existing methods on Bayesian inference tasks.

Strengths:
- The paper is very well written and easy to follow.
- The introduced algorithm is intuitive and theoretically sound; in the large-data limit the moment-matched reference could achieve the best possible restart rate of 1/2.
- The authors fixed the collapsed reference by adding the fixed reference back in practice, which seems to work well empirically.
- To be fair, I'm not familiar with the datasets the authors used in the paper, so I don't know how convincing the empirical results are.

Weaknesses:
- Lack of discussion about the assumptions in the theoretical analyses: for propositions 3.1-3.3 the conclusions only hold under some assumptions mentioned in the appendix. Adding some discussion or giving some intuitive explanations about the settings would be helpful for readers to understand the implications of all these propositions.
- All the experiments are done on traditional inference problems with relatively toy models; in this case I would expect sampling to be easy. For models like deep neural networks the posterior could be very complicated, and I don't think the combination of a fixed and an adaptive reference would be enough.

The authors discussed the limitations in the paper, and I don't see any negative societal impact of this work.
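To make the mechanism described in these reviews concrete, here is a toy sketch with a diagonal Gaussian reference (my own illustration, not the authors' implementation; the mixture weight gamma and the choice of a Gaussian family are assumptions). Within an exponential family, minimising the forward KL to the target reduces to matching expected sufficient statistics, i.e. plain gradient-free moment matching on samples the tempering chain has drawn from the target, and the adaptive reference can then be mixed with a fixed one.

```python
import numpy as np

def fit_gaussian_reference(target_samples):
    """Moment matching: the forward-KL-optimal diagonal Gaussian within the family."""
    mu = target_samples.mean(axis=0)
    var = target_samples.var(axis=0) + 1e-8
    return mu, var

def mixed_reference_logpdf(x, mu, var, mu0, var0, gamma=0.5):
    """Mixture of the adaptive (moment-matched) and a fixed reference; gamma is an assumed weight."""
    def logpdf(x, m, v):
        return float(-0.5 * np.sum((x - m) ** 2 / v + np.log(2 * np.pi * v)))
    adaptive = logpdf(x, mu, var)
    fixed = logpdf(x, mu0, var0)
    return np.logaddexp(np.log(gamma) + adaptive, np.log(1 - gamma) + fixed)

# toy usage
rng = np.random.default_rng(0)
samples = rng.normal(3.0, 0.7, size=(5000, 1))   # stand-in for samples from the target chain
mu, var = fit_gaussian_reference(samples)
print(mu, var)                                   # close to (3.0, 0.49)
print(mixed_reference_logpdf(np.array([3.0]), mu, var, np.zeros(1), np.ones(1)))
```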
### Summary:
The idea of this paper is to tune the reference distribution for parallel tempering to improve efficiency. The key idea is simple: assume the reference distribution is in the exponential family and use sufficient statistics. Experimental results show that this typically helps in terms of metrics like effective sample size per iteration, though not necessarily in terms of effective samples per second. There are theoretical guarantees, which each rely on a long list of assumptions that are deferred to the appendix. While I realize the limitations of space, I echo the reviewers that more discussion of the assumptions should be included in the paper, including which should be considered more minor or major. Still, this paper proposes a novel approach that is plausibly useful in at least some settings, so I recommend acceptance. A minor point: the font sizes are poorly chosen, to the point of being unreadable if the paper is printed; I had to resort to zooming into individual figures on the computer to reference them, which was quite tedious.
... input_ids: 1063 token ids omitted (tokenized copy of this record's text) ...
] | [
... attention_mask: 1063 ones omitted (one per token) ...
] | [
... labels: 1063 token ids omitted (mirroring the input_ids sequence above) ...
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper presents a method for improving taillabel performance in extreme multilabel learning setup where the number of target labels can be extremely large it is based on the finding that the distribution of the norms of the learnt weight vectors also follows a powerlaw as does the distribution of the samples among labels the main contribution of the paper is proposing methods for reranking which encourages precedence of taillabels and a data augmentation mechanism it achieves improvements when applied to sota methods on relevant psp metrics some of the concerns regarding the paper are the approach overall seems more like an adhoc postprocessing step rather than a learning algorithm it is possible that the impact of ranknet proposed in section 32 can be achieved in a more simple way of reranking scores in the code provided it was not clear where ranknet as described in section 32 was implemented the theorem 1 seems incorrect the probability model is not completely specified as it is not clear what exactly is meant by the test point being randomly sampled is it uniformly at random as seems to be from the proof or from the distribution that is same as the training distribution as the typical iid assumption in ml also it seems to compute expectation of some event yj in betak which is strange as expectations can be computed only of random variables overall the statement of the theorem seems quite vague and imprecise there are some notational issues also the w and w symbols in the theorem dont match the preceding text in terms of the experimental results it is not clear what happens with vanilla pk and ndcgk even though it is mentioned on page 6 para2 that the these metrics are computed but these are not given anywhere also the table 4 does not seem to be of much consequence as the reranking method can be potentially be applied to all the competing methods other minor comments the references are improperly given in some places abbreviations are used for conference names and in others full names are given in many places arxiv versions of the papers are mentioned even though the corresponding papers are published with conferencesjournalsdocsepsummary in prediction problems with millions of labels also known as extreme multilabel learning xml problems eg recommender systems the model predictions are not as good for the tail rarer labels this paper proposes two models for this problem the first model is rerankingbased that is it reranks the prediction scores of a standard xml model the second model tries to augment the rarer labels to reduce the skew in data results shown on several realworld datasets highlight the superior predictive ability of the proposed reranking model for tail labels compared to a host of competitive baselines comments the paper solves an important problem which has several industrial applications of extreme multilabel learning the proposed methods are novel perhaps less so to someone who is an expert in xml the experimental evaluation is highly impressive both the proposed methods outperform a host of highly competitive baselines on a variety of datasets by significant margins however i have a couple of concerns regarding the proposed methods 1 the ranknet method which reranks the xml models predictions needs to be compared against a baseline which also performs reranking for an applestoapples comparison in table 2 sure the improvements due to reranking vs noreranking are impressive but how would a simple reranking approach which is not populationaware perform how is the 
lambda chosen by cv since you can stack ranknet modules to make it deep how many were used for results in table 2 how sensitive are the results to the number of modules 2 the data augmentation for the tail labels seems arbitrary why only input dropout and input swap also it is unclear how one should split the data between head and tail labels more importantly how are the model scores for head and tail labels integrated to make a final prediction docsepthis paper considers the setting of extreme multilabel classification where labels typically follow a powerlaw distribution with many infrequentlyobserved labels socalled tail labels in this setting it often happens that multilabel classifiers more often predict frequent labels as positive than infrequent labels in practical applications this is not always wanted and the authors present a new algorithm that favors tail labels over frequent labels to this end a specific rankingbased loss function that consists of two parts is minimized the first part of the loss ranks positive tail labels higher than positive frequent labels the second part is more standard and ranks positive labels higher than negative labels improving predictions for tail labels is an interesting research goal that has not been thoroughly addressed in the literature but i am not convinced of the theoretical results and the introduced algorithm theorem 1 does not hold because an important condition is missing the theorem would only hold if wjt x 0 for all x however in practice such a condition cannot be guaranteed the formulation of the theorem is more difficult than needed but what the authors want to say is the following pyjx is a monotonically increasing function of the norm of wj the proof that is found in the appendix cannot be correct because one can easily construct a counterexample when wjt x 0 and the proof is also more complicated than needed in fact pyjx is just a transformation of wjt x via a monotone function g with 01 as codomain useful choices for g are the logit or probit link but not an exponential function as stated in the proof with this insight one can easily see that when wjt x 0 the probability pyjx will decrease when for example all coefficient in w are multiplied with a factor two in that case the norm of wj all increases and we have a counterexample for the theorem to my opinion the proof makes a few very strange constructions but i cannot immediately see where the mistake is i also do not understand why the link function is only introduced in the appendix because it is a key concept to link wjt x and pyjx to increase readability i would advise to discuss this early in section 2 i also do not understand what wj represents in the case of treebased models more discussion is needed for treebased models one doesnt have a weight vector per class isnt it i am also not convinced of the algorithm that is introduced in section 32 the method is very adhoc without any theoretical justification as a result of pairwise terms it might also be computationally challenging to optimize the proposed loss for extreme multilabel datasets isnt there a much simpler solution using the terminology of section 21 one could simply improve the performance for tail labels by adjusting the threshold t for such labels only has such a simple solution been considered in literature in that way one could fit standard probabilistic classifiers during training following by a reasoning on probabilities in a posttraining procedure similar to the approach of the authors one could take label 
frequencies into account during this posttraining procedure resulting in a threshold t that depends on label frequency in the experiments it is not clear to me why only four xml datasets are used why were the other datasets in the xml repository not analyzed please provide a good motivation or analyze all datasets
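As a rough illustration of the simpler alternative raised just above (a post-training decision threshold that depends on label frequency), the sketch below shows one possible implementation. It is a toy: the interpolation rule and the constants t_head and t_tail are invented for the example and are not taken from the paper under review or from any cited method.

```python
import numpy as np

def frequency_dependent_thresholds(label_freq, t_head=0.5, t_tail=0.2):
    """Per-label thresholds that shrink towards t_tail for rare (tail) labels."""
    rel = label_freq / label_freq.max()          # 1.0 for the most frequent label
    return t_tail + (t_head - t_tail) * rel

def predict(probs, thresholds):
    """Multilabel decision with one threshold per label."""
    return (probs >= thresholds).astype(int)

label_freq = np.array([10000, 500, 3])           # head, mid and tail label counts
probs = np.array([0.45, 0.35, 0.25])             # calibrated per-label probabilities
th = frequency_dependent_thresholds(label_freq)
print(th, predict(probs, th))                    # the tail label is now predicted positive
```

The same spirit carries over to score reranking: discounting head-label scores by some function of their frequency before taking the top-k gives the kind of simple reranking baseline the reviews suggest comparing against.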
### Summary: | the paper presents some interesting insights but all reviewers have agreed that it does not meet the bar of iclr the theoretical results require revision as several issues have been indicated in the reviews the authors have tried to correct them during the rebuttal but the reviewers remain unconvinced also the novelty is limited as reranking is a wellknown concept and decoupling of head and tail labels is an approach often used in practice across many applications the authors should also clarify the way the ranknet method is used and implemented to clarify the issue raised by reviewer 1 finally let me notice that adjusting thresholds for labels has been considered in the xmlc literature in the context of optimization of the macro fmeasure extreme fmeasure maximization using sparse probability estimates icml 2016 | [
... input_ids: 1558 token ids omitted (tokenized copy of this record's text) ...
] | [
... attention_mask: 1558 ones omitted (one per token) ...
] | [
... labels: 1558 token ids omitted (mirroring the input_ids sequence above) ...
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors propose a new method for integrating graphbased models with boosting this is done using the typical method involving residuals and weaklearners but adding a step where information is propagated in the graph the approach is also simple as no gnns or other auxiliary models are required it is also shown how the metaloss introduced by the authors provides convergence given some moderate assumptions according to the experiments reported the proposed model is better than the current state of the art in the considered domain the notation used is easy to understand as is the mathematical explanation in section 3 which is presented in a comprehensive but concise manner for ebbs we must fit the weaklearners to gradients from both the training and test nodes this is the sentence that i am most concerned about as the use of test data in the training phase may render the results obtained invalid although the authors give their own explanation of why the test nodes should also be used during training ie for the propagation of information in the graph if the test labels are used during the train there is no longer any separation between train and test i have this doubt because the labels are used to calculate functionspace gradients is this correct algorithm 1 could be described in a little more detail the analysis of the convergence of the method by theorem is very good the remarks connected to the theorem are interesting but could have been treated in more detail they are in part in the supplements if experiments are done with different random seeds as stated then results in table 1 should be reported with their corresponding standard deviation or confidence interval why are some results reported with different decimal places in the table i am talking specifically about the cs column but also for the slap dblp and phy columns one decimal place could be added if the datasets were taken from ivanov prokhorenkova 2021 why was wiki not taken the reason should be that being a homogeneous dataset then as explained by ivanov prokhorenkova 2021 neural network approaches are sufficient to achieve the best results it would still be interesting as a comparison also as a comparison with them the house and vk datasets could also be used for classification they also report the standard deviation of all results also the results in the table match those of ivanov prokhorenkova 2021 but i do not understand why their lightgbm results row has become the catboost row in this article for the slap dblp and ogbarxiv datasets is this perhaps an error although the difference between ogbarxiv and the other datasets is properly explained i think it still makes sense to put those results together with the others in table 1 although tabular graph data for node classification is widelyavailable in industry unfortunately there is currently little publiclyavailable realworld data that can be used for benchmarking this sentence is very vague and i am not fully convinced of its veracity the mention of the method called catboost is interesting but it is given too little space why is it not considered in ours if the idea is picked up by some other work let it be mentioned properly revealing that it may be more robust to nonideal use cases thats why it might be interesting to add homogenous datasets and see if it applies there too while in the main part it says this suggests that in new application domains it may conceivably be easier to adapt ebbs models in the supplements it says it shows ebbs can be run with mostly shared 
hyperparameters across all datasets i dont think there are enough experimentsresults to say that but id stick with suggest in the supplementary materials as well maybe add a sentence about the possibility of exploring this area more in future work after the rebuttal i have strengthen my opinion on the quality of the paper i believe it is a nice contribution to the field the work is well structured with a good theoretical basis to support the proposed methodology the empirical results are very promising although the small amount of datasets combined with the lack of confidence intervals does not allow for meaningful conclusions to be drawn the only major doubt concerns the use of test data in the training phase which may have compromised the whole experiment docsepfollowing in the success of boosting methods for tabular data this paper introduces a new boosting approach for data that graphbased with tabular features the proposed approach efficient bilevel boosted smoothing ebbs has convergence guarantees as well as empirical successes compared to competing methods summary this paper investigates tabular graphbased data for classification and regression tasks the proposed approach is an endtoend bilevel combination of label propagation and boosting the authors contribute not only an empirical analysis of the proposed approach on 8 datasets demonstrating the effectiveness of the proposed approach as well as a theoretical analysis merits i believe that this is a strong paper that clearly outlines a proposed approach for boosting in this noniid setting of graph data the proposed approach has a convergence guarantee and is shown to be very effective empirically overall this seems to be a strong result the supplemental material seems to give statistical significance of the improvements shown compared to the best previous method bgnn combining boosting and gnns ebbs achieves stronger empirical results the algorithms and theoretical results are discussed as well weaknesses here are a few concerns suggestions it could perhaps be made stronger by including some of the additional analysis that is in the supplemental material that investigates the tradeoffs and ablations of the approaches in the main body of the text i think that the paper could be made much stronger with a simple motivating perhaps synthetic example that illustrates where and when ebbs can be useful compared to competing methods while convergence guarantees and motivations are described a clear simple example which might further be useful in using ablations to identify contributions of different parts of the solution could strength the paper minor notes why is graphdata one word in the title figure 1 would be easier to read if yaxis was the same in both plots this paper provides both theoretical and empirical results for a boosting method for graph structured data the results appear to advance the state of the art and the submission seems to have valuable contributions docsepthis paper proposes a new way to integrate graphbased models with boosting based on principled meta loss named ebbs in experiments the proposed method outperforms tabular baselines gnn baselines and some hybrid strategies like bgnn over some node classificationregression datasets strengths this paper proposes ebbs efficient bilevel boosted smoothing a novel way to combine gnn and gradient boosting for learning tabular graph data the addressed problem of integrating boosting into gnns is very interesting to me also learning over tabular graph data should receive a 
wide audience given its importance in the industry empirical experiments show ebbs outperforms baseline methods on multiple node classification and node regression datasets weakness in my opinion the main weaknesses are in writingpresentation and reproducibility first i feel the writing of section 3 can be improved to avoid readers confusion for example we can be more clear about how eq 2 is rooted in zhou et al 2004 in fact i didnt get it when i checked the referenced paper in 2 are both z and theta learnable p is binded twice once in p eq 3 and once in pk eq 6 in eq 7 what is for inner level optimization and what is for outer level optimization is graphaware propagation layers terminology used in the literature second it seems that the proposed method ebbs will be incorporating test nodes during training will this cause test information to leak into the training process is there any specific preprocessing to avoid leaking is ebbs easy to implement minor issues typos formats on top model leaderboards on the top of model leaderboards the template seems a bit different from the normal one especially the font and the colour of the citation text whereby the endtoend training of a bilevel loss is such that values of the boosted base model f ebb and flow across the input graph producing a smoothed predictor f not sure if there is a grammar issue use mi and to reference delete and overall i like the problem the paper aims to address how to better combine gnn with boosting methods for learning on tabular data the paper proposes a novel way to address this problem which is based on a principled metaloss empirical results show the effectiveness of the results i feel the paper can be improved more by iterating on the formulations in sec 3 docsepin this paper the authors present a new approach to combine the boosted decision tree classifiers with a graph propagation model which is important in handling table input data the approach casts the graph propagation as an optimization problem where the input node features are generated by boosted decision trees the gradient can be taken in the functional space to learn the decision trees to minimize a unified loss the final algorithm is shown to minimize the unified loss in a principled manner the superior performance is demonstrated over the existing bgnn model strength the approach nicely defines a single objective that the model graph propagation decision trees optimizes empirically there is a nice improvement over the existing bgnn weakness the studied problem does not seem particularly novel to me especially given bgnn given bgnn the scope seems a bit narrow to me although i acknowledge that the authors solve the problem in a potentially better way than the bgnn paper commentsquestions i am curious to see the result of xgboost cs eg use xgboost as the base predictor in cs does the framework supports any propagation rules beyond 6 i would be curious to see how general the method is overall the approach seems sound and principled although the scope is a bit narrow hence i will give the weak accept i would also like the authors to address my commentsquestions
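To give a concrete feel for the interleaving the reviews describe (fit a weak learner to functional-space residuals, then smooth predictions over the graph), here is a small self-contained numpy sketch. It is only a squared-loss caricature with a toy decision stump: it is not the EBBS algorithm, its bilevel meta-loss, or the BGNN pipeline, and every name, constant and the tiny chain graph are invented for the example.

```python
import numpy as np

def propagate(z, adj, alpha=0.9, n_steps=10):
    """Label-propagation style smoothing: z <- (1 - alpha) * z0 + alpha * A_hat @ z."""
    deg = adj.sum(axis=1)
    a_hat = adj / np.sqrt(np.outer(deg, deg))    # symmetrically normalised adjacency
    out, z0 = z.copy(), z.copy()
    for _ in range(n_steps):
        out = (1.0 - alpha) * z0 + alpha * a_hat @ out
    return out

def fit_stump(x, r):
    """Toy weak learner: mean residual on each side of a split on the first feature."""
    left = r[x[:, 0] <= 0].mean() if np.any(x[:, 0] <= 0) else 0.0
    right = r[x[:, 0] > 0].mean() if np.any(x[:, 0] > 0) else 0.0
    return lambda xq: np.where(xq[:, 0] <= 0, left, right)

def boosting_with_smoothing(x, y, adj, n_rounds=10, lr=0.3):
    f = np.zeros_like(y)
    for _ in range(n_rounds):
        smoothed = propagate(f, adj)             # predictions seen through the graph
        residual = y - smoothed                  # functional gradient for squared loss
        f = f + lr * fit_stump(x, residual)(x)   # boost on the residual
    return propagate(f, adj)

# toy usage: 4 nodes in a chain (with self-loops), labels correlated with the feature
x = np.array([[-1.0], [-0.5], [0.5], [1.0]])
y = np.array([0.0, 0.1, 0.9, 1.0])
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]], dtype=float)
print(boosting_with_smoothing(x, y, adj))
```

In the actual method the propagation operator and the loss are more elaborate and the weak learners are gradient-boosted trees rather than a stump; the sketch only illustrates that the residual is computed on graph-smoothed predictions rather than on the raw boosted output.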
### Summary: | the paper addresses a problem encountered in many realworld applications ie the treatment of tabular data composed of heterogeneous feature types where samples are not iid in this case learning is more effective if the typically successful approach for iid data boosted decision trees committee techniques is combined with gnn to take into account the dependencies between samples the main contribution of the paper with respect to previous work in the field is the introduction of a principled approach to pursue such integration one important component of the proposed approach is played by the definition of a specific bilevel loss efficient bilevel boosted smoothing that allows for convergence guarantees under mild assumptions both theoretical and experimental contributions are sound and convincing justifying the claimed merits of the proposed approach another strong point is the fact that the proposed approach is general and amenable to support a broad family of propagation rules one weakness with the original submission was presentation mainly because some key information was confined into the supplementary material the revised version addressed this problem and added some more empirical results that confirmed the superiority of the proposed approach finally the fact that learning over tabular graph data is very important in industry the proposed approach may be of interest for a wide audience | [
310, 253, 15965, 8813, 275, …, 8446 ]  (input_ids: 2,048 integer token IDs, the tokenized form of this row's text; full array elided here) |
[ 1, 1, 1, …, 1 ]  (attention_mask: 2,048 ones, so every position is real input and nothing is padding; full array elided here) |
[ 310, 253, 15965, 8813, 275, …, 8446 ]  (labels: 2,048 token IDs mirroring input_ids, as expected for causal language-model targets; full array elided here) |
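The three bracketed columns above are just the machine-readable form of the same row: a token-ID sequence, an all-ones attention mask, and labels that repeat the token IDs. As a rough aid for inspecting rows like these, the sketch below shows one way the ID columns could be decoded back into text. It is illustrative only: the tokenizer checkpoint is an assumption, since the dump does not say which tokenizer produced these IDs, and example_row holds only a short slice of values (copied from the start of the next row) rather than a full sequence.

```python
# Illustrative sketch only: decode a row's token-ID columns back into text.
# The checkpoint below is an assumed stand-in; use whatever tokenizer the
# dataset card actually specifies.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumption

example_row = {
    # First few IDs of the next row's sequence, copied from the dump.
    "input_ids": [30003, 310, 1677, 2278, 273, 247, 2561, 2929],
    # All ones: every position is real input, nothing is padding.
    "attention_mask": [1, 1, 1, 1, 1, 1, 1, 1],
    # Mirrors input_ids, as is usual for causal language-model targets.
    "labels": [30003, 310, 1677, 2278, 273, 247, 2561, 2929],
}

# If the assumed tokenizer is the right one, this prints the opening words of
# the row's prompt text.
text = tokenizer.decode(example_row["input_ids"], skip_special_tokens=True)
print(text)
```

If the dataset is also available through the datasets library, the same five columns would come back per row, and only the decode step above would be needed to view the ID sequences as readable text.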
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper introduces segvit a semantic segmentation framework with plain vits as backbones one of the core technical contributions is the proposed attentiontomask atm block which generating masks from the intermediate attention maps between class embeddings and key maps in addition a shrunk structure is then proposed to save computational cost while maintaining the performance based on plain vit networks only segvit obtains stateoftheart results on three semantic segmentation datasets ade20k pascalcontext and cocostuff10k pros 1 the paper is well motivated recently many works eg 1 have realized even plain vits could have rich representation capacity which however requires special optimization eg masked image modeling or other architectural modifications for downstream tasks i am pleased that the paper demonstrates that plain vits can obtain as good results as the hierarchical counterparts eg 15 47 on segmentation tasks which may encourage simpler and unified network design principles 2 strong results are reported in the paper for example on ade20k val a model with vitl backbone achieves 552 miou which is very competitive even among more sophisticated networks such as swinl and mvit 3 the motivation of atm module sounds reasonable to some extent intuitively a good attention mask should cover the foreground of the given object or class therefore it is possible to generate mask directly from the attention matrix cons 1 my major concern is that the technical novelty is relatively limited the overall framework is very similar to maskformer 15 and mask2former 47 compared with 15 the major difference on the technical details is 15 generate masks from the product of the mask embedding and the perpixel embedding while in the paper the mask is directly derived from the attention weights however i do not think it differs much although 15 47 mainly evaluate on hierarchical backbones theoretically they can also be equipped with plain networks in addition the proposed quqd layers are not novel also sounds irrelevant to the main topic of the paper since many previous works eg pvt 17 also adopt similar blocks to reduce computational cost in conclusion i think the contributions claimed in the introduction seems not significant 2 according to table 4 and line 234239 in the proposed atm block the separated supervision of classification and mask prediction is the most important design principle however it is not originally proposed in the paper as 15 16 already introduces the paradigm it further weakens the significance of the proposed method 1 li et al exploring plain vision transformer backbones for object detection tech report limitations are mentioned in the conclusion although i think more discussion and comparisons with 15 are required in the paper docsepthe authors deal with vitbased semantic segmentation in particular they use a set of learnable tokens each corresponding to a semantic class which decode the outputs of the vitbased backbone into perclass semantic masks this is accomplished by multiple layers of cross attention between class token and vit tokens rather than use a dot product like mechanism to produce similarity between a class token and spatial features they directly supervise the cross attention maps using a sigmoidal output furthermore they introduce a downupsampling technique to mimic the general idea of an efficient multiscale prediction head their results are quite good even when compared to some of the best recent models and their qd module provides some computationperformance 
tradeoffs strengths 1 this is a well written paper and the approach is quite clean 2 the results presented are quite good as well achieving atnear sota against competitive models weaknesses 1 the idea is still related to the idea of dot product based segmentation from some class embedding i think a good deal of experiments might need to be performed to actually understand the technical contribution 2 while the results are good related work like segmenter is not far off from the performance presented here and shares some significant similarities with this method i believe so docsepthe paper proposed a plainvit based semantic segmentation framework which uses an attentiontomask decoder to aggregate image features and a shrunk structure to save computational cost strengths 1 the paper proposed segvit framework and achieved a sota performance based on a plain vit backbone 2 the paper is well written and clear to understand weaknesses 1 i doubt the novelty of the design of atm module since the maskformer framework has been proposed for over half a year the atm module is similar to the maskformer transformer decoder module the only difference is that the mask output of maskformer is generated by the multiplication of the final output query tokens and image features while the mask output of atm is from the multiplication of an intermediate variable k inside the transformer layer and the image features the difference is not obvious the segvit framework is just like maskformer vit multilayerfeature 2 table 4 shows that atm has a relatively low performance gain of about 05 to setr it shows that the performance of atm is even worse than segmenter since the result of segmenter is 08 better than setr in talbe 1 3 also in table 4 it shows that by using lmask loss the miou result increases about 26 than using ce loss only however table 1 shows that the result of segvit is about 23 better than segmenter baseline which only uses ce loss if it shows that the performance gain is all from the new loss design but not from the framework architecture the authors have addressed the limitations and potential negative societal impacts docsepthis paper present a semantic segmentation method based on the plain vision transformer vit specifically it proposes the attentiontomask atm module to generate the pixellevel mask in addition to reduce the computational cost it designs a querybased downsampling qd and upsampling module qu in the shrunk version experiments are conducted on three datasets and better results are obtained compared with previous methods strength 1 exploring plain architecture for semantic segmentation is an interesting and promising direction this paper make a forward step towards this direction 2 the performance of segvit seems to be better than previous stateoftheart methods weakness 1 about the attentiontomask module atm it is implemented in crossattention manner but in fact there is little difference with the standard classifier a fc layer to map features to probability for perpixel classification in the normal semantic segmentation framework each learned token could be viewed as a classifier layer to map the pixellevel features into a probability with a sigmoid function in this sense the atm is similar to a standard classification layer 2 about the shrunk structure i am confused about the querybased downsampling operation qd in line 172 it says to use the nearest sampling to reduce the token numbers in this sense it has nothing with the query based downsampling and is simply a standard 
downsampling operation i am also confused about the implementation details on querybased upsampling operation qu it says to use a standard transformer decoder structure to upsample features is there any special design on the transformer decoder by incorporating the spatial information more details are required on the decoder design 3 about the two qu operations in the version c of figure 3 the downsampling ratio of lower qu is 116 and it is natural to think its output have smaller downsampling ratio like 18 however from line 181183 its output size seems to be 116 which is confused for me 4 i think this paper should compare with the previous works perceiverio which employs a similar downsamplingupsampling architecture for dense prediction with transformers more discussion on the difference is required to better motivate the proposed method the authors have addressed the limitations of the proposed method in large memory consumption
### Summary: | this submission has received comments from 4 official reviewers the authors have made very detailed replies to the reviewers comments the authors and reviewers had quite rich discussions after these discussions 3 reviewers recommended weak acceptance and 1 recommended rejection for the novelty concerns the authors clarify them during the rebuttal the reviewers have also recommended comparing with recent semantic segmentation methods using vits missing comparisons should be included in the final version including comparisons with 1 ma xuezhe et al luna linear unified nested attention neurips 2021 2 ryoo michael et al tokenlearner adaptive spacetime tokenization for videos neurips 2021 3 wu yuhuan et al p2t pyramid pooling transformer for scene understanding ieee tpami 2022 only reviewer eyo8 recommends borderline rejection the authors have made quite a detailed rebuttal but we have not heard from the reviewer after the rebuttal thus the ac would like to recommend acceptance | [
30003, 310, 1677, 2278, 273, …, 50276 ]  (input_ids: 1,673 integer token IDs, the tokenized form of this row's text; full array elided here) |
[ 1, 1, 1, …, 1 ]  (attention_mask: 1,673 ones; no padding positions; full array elided here) |
[ 30003, 310, 1677, 2278, 273, …  (labels: token IDs mirroring input_ids; the remaining values of this array are elided here)
275,
253,
2715,
260,
273,
4677,
495,
50276,
783,
1066,
48027,
4313,
273,
2406,
572,
310,
12472,
285,
352,
310,
3626,
281,
1158,
697,
3453,
452,
4577,
1066,
48027,
4313,
751,
1283,
2299,
432,
1386,
1283,
883,
3245,
697,
3453,
1979,
3133,
281,
320,
12472,
534,
310,
13477,
323,
479,
50276,
21,
891,
1158,
436,
2929,
943,
7277,
342,
253,
2045,
2987,
591,
22070,
900,
534,
27532,
247,
2074,
1066,
48027,
8777,
312,
4906,
10336,
323,
14086,
10554,
342,
4979,
398,
625,
5955,
327,
253,
3064,
310,
2424,
281,
1805,
41509,
253,
4081,
1332,
50276,
783,
4477,
452,
9713,
253,
7364,
273,
253,
4081,
1332,
275,
1781,
3541,
8353,
2490,
187,
4118,
18435,
27,
2520,
19529,
556,
2959,
5701,
432,
577,
3565,
30628,
253,
4477,
452,
1160,
1077,
7000,
32114,
281,
253,
30628,
5701,
253,
4477,
285,
30628,
574,
3240,
6793,
11985,
846,
841,
11985,
495,
30628,
8521,
5075,
14924,
285,
337,
8521,
18235,
50275,
1542,
253,
38135,
7350,
253,
4477,
19148,
731,
1309,
253,
30080,
22559,
253,
30628,
452,
671,
8521,
10941,
342,
3332,
24705,
26405,
3082,
970,
362,
953,
5816,
14023,
943,
320,
2908,
275,
253,
2457,
2715,
1690,
14023,
342,
50275,
18,
6429,
1269,
17761,
248,
1162,
355,
298,
9821,
4872,
27998,
20494,
4116,
5723,
2824,
43425,
374,
43938,
3288,
278,
44023,
1162,
355,
10669,
282,
47612,
17825,
29380,
10669,
1320,
323,
10556,
5723,
2824,
43425,
495,
259,
86,
340,
6968,
9041,
1162,
355,
268,
19,
85,
39694,
45900,
39707,
323,
6200,
4685,
26332,
1796,
246,
81,
7588,
1384,
1423,
50275,
7483,
37317,
2046,
80,
25,
32636,
45210,
18235,
253,
4477,
452,
1160,
3240,
247,
7000,
30080,
22559,
533,
359,
452,
417,
3735,
432,
253,
37317,
846,
253,
30080,
22559,
50276,
40622,
253,
913,
651,
751,
281,
5583,
14924,
50276
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
authors introduce a new pseudo distance on attributed graphs the tree movers distance tmd they first introduce then authors introduce a distance between trees td which aims at recursively comparing their roots through their respective attributes and their subtrees using optimal transport ot to get hard assignments between these induced subtrees to pursue the recursion ie comparing trees of smaller respective depth tmd then naturally comes from td by modeling graphs as multisets of trees rooted in each node of each graph tmd is shown to define a proper pseudometric for which the axiom of discernability is closely related to the weisfeilerlehman graph isomorphism test after investigating the revelance of tmd for graphs classification authors further study its relevance to quantify stability and generalization abilities of the wellknown graph isomorphism networks gin being sota graph neural networks textbfstrengths overall the paper is wellwritten and the design of tmd is elegantoriginal the authors have managed to clearly address a wide variety of concepts from kernels to gnns an obvious pedagogical effort has been made in several proofs of the results provided which is appreciable tmd is interesting by its flexibility suggested by its dependencies to a depthdependent weighting function and to cost functions inherent to used ot on nodes tmd seems competitive as a kernel in graphs classification and mostly shines on the study of gin textbfneutral about without a clear characterization of tmd balls in the space of attributed graphs or at first unattributed graphs i find it difficult to envision how tmd can guide gnn to further improvements but given the difficulty of this task the empirical evidence provided in sections 5 and 6 supports the relevance of tmd for this purpose textbfweaknesses points to clarify authors exploit specific properties of the kantorovich formulation of ot especially its relation to monges formulation which are eluded in the paper and clearly not straightforward so to improve the clarity of the document eg the need for definitions 2 and 3 it would be good to mention them also the caption of figure 1 can be improved for this purpose theorem 7 from your proof i would say that the stated implication is actually an equivalence could you elaborate on this on the graphs classification benchmark i am not sure to understand your validation scheme from your explanations from my understanding you did a crossvalidation for tmd while eg authors of fgw reported a 10fold nested crossvalidation in their paper which better quantifies generalization abilities and is more natural for graph kernel methods as the computational bottleneck lies in the computation of the kernel matrix therefore i suggest harmonizing the validation scheme on kernel methods instead of just reporting the performance of the respective papers moreover could you complete the benchmark on graphs classification with a benchmark in terms of runtimes on subsection 51 there is a difference between your formulation of messagepassing and the one from gin see equation 41 of 48 epsilon is not handled in the same way as you set epsilon1 for your experiments they are still valid but the implications of this change for the theoretical results in your paper and the ones in gins paper are not clear to me even if it seems minor could you elaborate on this there is no reference to the figure 4 and 5 in the main paper modification i increased my initial grade from 5 borderline accept to 6 weak accept after a convincing rebuttal and 
discussion by the authors a few limitations of their work have not really been addressed as illustrated by my elaboration on the weaknessespoints for clarification paragraph the authors have adequately addressed the potential negative societal impact of their work in the supplemental material docsepa metric on the set of graphs is defined using concepts from optimal transport it is shown both analytically and by experiment that graph neural networks define lipschitz continuous functions in the metric strength the proposal is clear and well motivated and has evident applications the evaluation is adequate for a first work weakness graph metrics are a much studied field and there is little comparison with previous work technical results with no immediate societal impact docsepthis paper introduces a graph pseudometric based on the hierarchical optimal transport to understand the generalization of machine learning models on graphs they show that the proposed tmd captures properties relevant to graph classification and can be related to generalization of gnns under distribution shifts strengths 1 the paper is wellwritten and easy to follow although there exist some unclear descriptions 2 this paper proposes a new ot distance for graphs using the computation trees of the graph to calculate the distance between two graphs is direct and reasonable 3 the lipschitz constant and stability analysis of gnns seem to be useful and meaningful weaknesses 1 tmd can provably distinguish graphs that are identifiable by the 1dimensional weisfeilerleman graph isomorphism test the authors say it can be further strengthened by augmenting node attributes eg with positional encodings but they have not provided the details since expressive power is very important for graph representation learning 2 the computational complexity of tmd is high the authors only implement it on cpu pot package it is unclear whether the method can be accelerated by gpus 3 some baselines on graph ot are missing for example 1 2 bellow 1 got an optimal transport framework for graph comparison 2 copt coordinated optimal transport on graph yes docsepthe authors first propose tmd a pseudometric for comparing graphs to each other tmd compares graphs to each other by recursively solving a series of optimal transport problems which minimizes distances of subtree patterns the proposed pseudometric is evaluated in graph classification distances fed to an indefinite kernel and then to svm the results indicate that tmd performs on par with the best performing baselines the authors also provide some theoretical results first they bound the lipschitz constant of the gin model with respect to tmd and also analyze the stability of gin under node deletions edge deletions and perturbations of node features finally they provide a result about the generalization error of gin under distribution shifts strengths in my view the originality of the paper is high the proposed tmd distance is novel previous studies have applied optimal transport techniques on the labels produced by the wl algorithm but in my understanding the proposed recursive definition is different from previous work i really like the results about the stability of graph neural networks most previous studies have focused on a different problem whether a graph neural network can distinguish classes of nonisomorphic graphs or not not much work has been done on the distance between the graph representations i am not thought sure how useful the experiments of subsection 53 are whats the purpose of 
showing that the correlation between tmd and a graph neural network is high furthemore does this hold for all datasets weaknesses the computational complexity of the proposed distance function is very high since it needs to solve a series of optimal transport problems this renders the method practically infeasible for datasets that contain large graphs such as redditbinary and datasets that contain many samples such as the ogb graph property prediction datasets of course for the optimal transport problem one could use some approximate algorithm but still i dont think the proposed method can be applied to large datasets as one can see in table 1 the proposed method provides only marginal improvements in accuracy over the baselines furthermore tmd is not compared against several stateoftheart graph neural networks it is only compared against gin and gcn which are at most as expressive as 1wl it would be nice if the authors could also report the running time of tmd and compare it against that of the wasserstein wl function the message passing scheme of gin given in subsection 51 is different from the one provided in the original paper ie zvl phil 1epsilonlzvl1 sumu in mathcalnv zul1 furthermore both for the experiments and for the proof of theorem 8 the authors set epsilon1 gin is known to be less expressive than 1wl when epsilon1 furthermore the message passing scheme of gcn given in appendix b1 is not correct thus i wonder whether the lipschitz constant of any message passing graph neural network can be bounded under tmd or are there some conditions that need to be satisfied even though tmd is sufficiently different from the wasserstein wl pseudometric i would suggest the authors provide more details about how the two methods differ from each other and also compare the complexities of the two approaches to each other the authors discuss the limitations and potential negative societal impact of their work
### Summary: | this paper proposes a new similarity measure between graphs based on computing optimal transport between distributions of trees extracted from graphs the method benefit from the fast solvers of ot between trees and the proposed metric has been shown to be interesting for computing a lipshitz constant related to the generalization of message passing gnn the experiments were appreciated but lack of comparison with existing graph distances and gnn was noted by the reviewers on the graph classification experiment the authors did a very good reply to the reviewers which was much appreciated for instance the new experiments are very interesting and should be included in the paper or supplementary the fact that the performance does not depend too much on the classifier svm vs knn is also interesting during discussion the consensus was that the paper deserves to be published at neurips but that the authors are requested to include the new results and discussionsclarifications in the paper and supp | [
(input_ids token-ID sequence omitted) ] | [ (attention_mask sequence, all 1s, omitted) ] | [ (labels token-ID sequence omitted)
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this is an excellent analysis paper of a very interesting phenomenon in deep neural networks quality clarity originality as far as i know the paper explores a very relevant and original question studying how the learning process of different examples in the dataset varies in particular the authors study whether some examples are harder to learn than others examples that are forgotten and relearned multiple times through learning we can imagine that such examples are support vectors for neural networks helping define the decision boundary the paper is very clear and the experiments are of very high quality i particularly appreciated the effort of the authors to use architectures that achieve close to sota on all datasets to ensure conclusions are valid in this setting i also thought the multiple repetitions and analysing rank correlation over different random seeds was a good additional test significance this paper has some very interesting and significant takeaways some of the other experiments i thought were particularly insightful were the effect on test error of removing examples that arent forgotten to examples that are forgotten more in summary the harder examples are more crucial to define the right decision boundaries i also liked the experiment with noisy labels showing that this results in networks forgetting faster my one suggestion would be to try this experiment with noisy data instead of noisy labels as we are especially curious about the effect of the data as opposed to a different labelling task i encourage the authors to followup with a larger scaled version of their experiments its possible that for a harder task like imagenet a combination of easy and hard examples might be needed to enable learning and define good decision boundaries i argue strongly for this paper to be accepted to iclr i think it will be of great interest to the communitydocsepupdate 2 nov 19 2018 the paper has improved very substantially since the initial submission and the authors have addressed almost all of my comments i have therefore increased my score to an 8 and recommend acceptance update nov 16 2018 in light of the author response i have increased my score to a 6 this paper aims to analyze the extent to which networks learn to correctly classify specific examples and then forget these examples over the course of training the authors provide several examples of forgettable and unforgettable examples demonstrating among other things that examples with noisy examples are more forgettable and that a reasonable fraction of unforgettable examples can be removed from the training set without harming performance the paper is clearly written and the work is novel to my knowledge this is the first investigation of example forgetting over training there are an interesting and likely important set of ideas here and portions of the paper are quite strong in particular the experiment demonstrating that examples with noisy examples are more forgettable is quite nice however there are several experimental oversights which make this paper difficult to recommend for publication in its current form major points 1 the most critical issue is with the measurement of forgetting itself the authors do not take into account the chance forgetting rate in any of their experiments simply due to chance some examples will be correctly labeled at some point in training especially in the datasets analyzed which only contain 10 classes this makes it difficult to distinguish whether a forgotten example was actually ever 
learned in the first place in order to properly ground this metric measurements of chance forgetting rates will be necessary for example what are the forgetting rates when random steps are taken at each update step 2 were the networks trained on mnist permutedmnist and cifar10 trained for the same number of epochs related to point 1 the forgetting rate should increase with the number of epochs used in training as the probability of each example being correctly classified should increase if the cifar10 models were trained for more epochs this would explain the observation that more cifar10 examples were forgettable 3 in the experiment presented in figure 4b it is difficult to tell whether the never forgotten set suffers less degradation in the third training regime because the examples were never forgotten or because the model had twice has much prior experience please include a control where the order is flipped eg forgotten never forgotten forgotten in addition to the included never forgotten forgotten never forgotten order currently present 4 the visual inspection of forgettable and unforgettable examples in figure 2 is extremely anecdotal and moreover do not even appear to clearly support the claims made in the paper minor points 1 in the discussion of previous studies which attempted to assess the importance of particular examples to classification decisions a citation to 1 should be added 2 the point regarding similarity across seeds is absolutely critical especially wrt major comment 1 and should be included earlier in the paper and more prominently 3 the histograms in figure 1 are misleading in the cropped state while i appreciate that the authors included the full histogram in the supplement these full histograms should be included in the main figure as well perhaps as an inset 4 the inclusion of a space after the commas in numbers eg 50 245 is quite confusing especially when multiple numbers are listed as in the first line on page 4 1 koh pang wei and percy liang understanding blackbox predictions via influence functions icml 2017 docsepthis paper studies the forgetting behavior of the training examples during sgd empirically it shows there are forgettable and unforgettable examples unforgettable examples are like support examples one can achieve similar performance by training only on these support examples the paper also shows this phenomenon is consistent across different network architectures pros this paper is written in high quality clearly presented it is original in the sense that this is the first empirical study on the forgettability of examples in during neural network training comments and questions on the experiment details 1 is the dataset randomly shuffled after every epoch one concern is that if the order is fixed some of the examples will be unforgettable simply because the previous batches have similar examples and training the model on the previous batches makes it good on some examples in the current batch 2 it would be more interesting to also include datasets like cifar100 which has more labels the current datasets all have only 10 categories 3 an addition figure can be provided which switches the order of training in figure 4b namely start with training on b2 cons lack of insight subjectively i usually expect empirical analysis papers to either come up with unexpected observations or provide guidance for practice in my opinion the findings of this work is within expectation and there is a gap for practice overall this paper is worth publishing for the 
systematic experiments which empirically verifies that there are support examples in neural networks
### Summary: | this paper is an analysis of the phenomenon of example forgetting in deep neural net training the empirical study is the first of its kind and features convincing experiments with architectures that achieve near stateoftheart results it shows that a portion of the training set can be seen as support examples the reviewers noted weaknesses such as in the measurement of the forgetting itself and the training regiment however they agreed that their concerns we addressed by the rebuttal they also noted that the paper is not forthcoming with insights but found enough value in the systematic empirical study it provides | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
310,
271,
7126,
1783,
2929,
273,
247,
1077,
4722,
11562,
275,
3676,
11454,
6928,
50276,
15177,
19843,
3236,
414,
347,
2080,
347,
891,
871,
253,
2929,
33826,
247,
1077,
4623,
285,
3236,
1953,
50276,
14091,
3184,
849,
253,
4715,
1232,
273,
1027,
6667,
275,
253,
10895,
16149,
275,
1798,
253,
4477,
1263,
1880,
690,
6667,
403,
12150,
281,
3037,
685,
2571,
6667,
326,
403,
14454,
285,
1693,
1596,
264,
2709,
2069,
949,
4715,
359,
476,
8564,
326,
824,
6667,
403,
1329,
11390,
323,
11454,
6928,
9073,
4853,
253,
3061,
7548,
50276,
783,
2929,
310,
1077,
2590,
285,
253,
4679,
403,
273,
1077,
1029,
3290,
891,
3782,
14109,
253,
3434,
273,
253,
4477,
281,
897,
35615,
326,
5115,
2810,
281,
256,
5503,
327,
512,
15302,
281,
5416,
11815,
403,
3588,
275,
436,
4758,
891,
671,
1869,
253,
2709,
49495,
285,
5127,
272,
5958,
5921,
689,
1027,
3632,
12922,
369,
247,
1175,
3081,
1071,
50276,
9188,
40348,
436,
2929,
556,
690,
1077,
4722,
285,
1534,
1379,
42287,
690,
273,
253,
643,
4679,
891,
1869,
497,
3782,
47860,
497,
253,
1055,
50276,
251,
1071,
2228,
273,
11922,
6667,
326,
403,
2649,
14454,
281,
6667,
326,
403,
14454,
625,
275,
6010,
253,
12150,
6667,
403,
625,
9560,
281,
4853,
253,
987,
3061,
13674,
891,
671,
10490,
253,
3368,
342,
27620,
13301,
4645,
326,
436,
1543,
275,
6928,
37264,
7938,
50276,
2577,
581,
14876,
651,
320,
281,
1611,
436,
3368,
342,
27620,
941,
3185,
273,
27620,
13301,
347,
359,
403,
3340,
14338,
670,
253,
1055,
273,
253,
941,
347,
10066,
281,
247,
1027,
46684,
4836,
50276,
74,
11907,
253,
4477,
281,
956,
484,
342,
247,
4067,
24337,
2715,
273,
616,
4679,
697,
1896,
326,
323,
247,
12150,
4836,
751,
4440,
257,
292,
247,
5019,
273,
3477,
285,
1892,
6667,
1537,
320,
3058,
281,
8046,
4715,
285,
4853,
1175,
3061,
13674,
50276,
74,
9059,
7052,
323,
436,
2929,
281,
320,
7607,
281,
17857,
32888,
891,
1158,
352,
588,
320,
273,
1270,
1600,
281,
253,
3114,
7152,
33032,
11183,
374,
22458,
655,
4765,
253,
2929,
556,
5520,
1077,
9619,
1580,
253,
3302,
19529,
285,
253,
4477,
452,
9713,
2761,
512,
273,
619,
5701,
891,
452,
3103,
2559,
619,
4868,
281,
271,
854,
285,
5583,
14924,
50275,
11183,
22458,
1668,
4765,
50276,
249,
1708,
273,
253,
2488,
2380,
891,
452,
2559,
619,
4868,
281,
247,
721,
50275,
2520,
2929,
13698,
281,
12106,
253,
6070,
281,
534,
6928,
3037,
281,
9113,
30215,
2173,
6667,
285,
840,
7740,
841,
6667,
689,
253,
2282,
273,
3733,
253,
4477,
2085,
2067,
6667,
273,
7740,
2420,
285,
42439,
788,
2420,
6667,
17227,
2190,
643,
1841,
326,
6667,
342,
27620,
6667,
403,
625,
7740,
2420,
285,
326,
247,
5272,
6919,
273,
42439,
788,
2420,
6667,
476,
320,
5176,
432,
253,
3733,
873,
1293,
5237,
272,
3045,
50275,
783,
2929,
310,
4518,
3542,
285,
253,
789,
310,
4460,
50276,
936,
619,
3640,
436,
310,
253,
806,
5839,
273,
1650,
37264,
689,
3733,
627,
403,
271,
4722,
285,
2779,
1774,
873,
273,
5697,
1060,
285,
11821,
273,
253,
2929,
403,
3240,
2266,
50276,
249,
1798,
253,
3368,
17227,
326,
6667,
342,
27620,
6667,
403,
625,
7740,
2420,
310,
3240,
5322,
2299,
627,
403,
2067,
5661,
689, 84, 4380, 534, 1056, 436, 2929, 2834, 281, ..., 12082, 16774, 1263, 352, 3400
] | [
1, 1, 1, 1, ..., 1, 1, 1
] | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, ..., 12082, 16774, 1263, 352, 3400
] |
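Read together, the bracketed number blocks above form one tokenized training record for the review shown earlier: a token-id sequence, an attention mask that is all ones (no padding), and label ids covering the prompt, the review, and the target summary. The sketch below is a minimal, hypothetical illustration of how a record with this layout is typically consumed for causal-LM summary fine-tuning; the model name, tokenizer, and the short id list are placeholders rather than values taken from this dump.

```python
# Minimal sketch: feeding one tokenized record (ids, all-ones mask, labels)
# to a Hugging Face causal language model. All names and values here are
# illustrative placeholders, not taken from the dump above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")      # placeholder tokenizer
model = AutoModelForCausalLM.from_pretrained("gpt2")   # placeholder model

record = {
    "input_ids": [2520, 2929, 310, 271, 1783],         # toy id list, not a real row
    "attention_mask": [1, 1, 1, 1, 1],
    "labels": [2520, 2929, 310, 271, 1783],
}

input_ids = torch.tensor([record["input_ids"]])
attention_mask = torch.tensor([record["attention_mask"]])
labels = torch.tensor([record["labels"]])

# With labels supplied, the model returns the shifted next-token
# cross-entropy loss used for fine-tuning.
out = model(input_ids=input_ids, attention_mask=attention_mask, labels=labels)
print(out.loss.item())

# Ids decode back to readable text only under the tokenizer that produced them.
print(tokenizer.decode(record["input_ids"]))
```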
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper studies the effect of batch normalization via a physics style meanfield theory the theory yields a prediction of maximal learning rate for fullyconnected and convolutional networks and experimentally the max learning rate agrees very well with the theoretical prediction this is a wellwritten paper with a clean novel result when we fix the batchnorm parameter gamma a smaller gamma stabilizes the training better allowing a greater range of learning rates though in practice the batchnorm parameters are also trained this result may suggest using a smaller initialization a couple of things i was wondering as a baseline how would the max learning rate behave without batchnorm would the theories again match the experimental result there is the presence of momentum important if i set the momentum to be zero it does not change the theory about the fisher information and only affects the dependence of eta on the fisher information in this case would the theory still match the experimentsdocsepinteresting application of mft on fim to understand batch normalization this paper applies mean field analysis to networks with batch normalization layers analyzing maximum eigenvalue of the fisher information matrix the authors provide theoretical evidence of allowing higher learning rates and faster convergence of networks with batch normalization the analysis reduces to providing lower bound for maximum eigenvalue of fim using meanfield approximation authors provide lower bound of the maximum eigenvalue in the case of fullyconnected and convolutional networks with batch normalization layers lastly authors observe empirical correlation between smaller gamma and lower test loss pro clear result providing theoretical ground for commonly observed effects experiments are simple but illustrative it is quite surprising how well the maximum learning rate prediction matches with actual training performance curve con while mean field analysis apriori works in the limit where networks width goes to infinity for fixed dataset size the analysis of fisher and batch normalization need asymptotic limit of dataset size although some interesting results are provided the content could be expanded further for conference submission the prediction on maximum learning rate is interesting and the concrete result from mean field analysis while correlation between batch norm gamma parameter and test loss is also interesting the provided theory does not seem to provide good intuition about the phenomenon comments the theory provides the means to compute lower bound of maximum eigenvalue of fim using meanfield theory in figure 1 is bar lambdamax computed using the theory or empirically computed on the actual network it would be nice to make this clear in figure 2 the observed eta2 of dark bands in heatmap is interesting while most of networks without batch norm performance is maximized using learning rates very close to maximal value often networks using batch norm the learning rate with maximal performance is not the maximal one and it would be interesting to provide theoretical i feel like section 32 should cite xiao et al 2018 although this paper is cited in the intro the mean field analysis of convolutional layers was first worked out in this paper and should be credited docsepin this paper the effect of batch normalization to the maximum eigenvalue of the fisher information is analyzed the techinique is mostly developed by karakida et al 2018 the main result is an informal bound of the maximum eigenvalue which is given 
without proof though the numerical result corresponds to the derived bound the paper is basically well written but the technical part has several notational problems for example there is no definition of otimes odot and hess operators the use of the meanfield theory is an interesting direction to analyze batch normalization however in this paper it seems failed to say some rigorous conclusion indeed all of the theoretical outcomes are written as claims and no formal proof is given also there is no clear explanation of why the authors give the results in a nonrigorous way where is the difficult part to analyze in a rigorous way etc aside from the rigor issue the paper heavily depends on the study of karakida et al 2018 the derivation of the bound 44 is directly built on karakidas results such as eqs 782022 which reduces the papers originality the paper also lacks practical value can we improve an algorithm or something by using the bound 44 or other results
### Summary: | this paper presents a mean field analysis of the effect of batch norm on optimization assuming the weights and biases are independent gaussians an assumption thats led to other interesting analysis they propagate various statistics through the network which lets them derive the maximum eigenvalue of the fisher information matrix this determines the maximum learning rate at which learning is stable the finding is that batch norm allows larger learning rates in terms of novelty the paper builds on the analysis of karakida et al 2018 the derivations are mostly mechanical though theres probably still sufficient novelty unfortunately its not clear what we learn at the end of the day the maximum learning rate isnt very meaningful to analyze since the learning rate is only meaningful relative to the scale of the weights and gradients and the distance that needs to be moved to reach the optimum the authors claim that a higher learning rate leads to faster convergence but this seems false and at the very least would need more justification its wellknown that batch norm rescales the norm of the gradients inversely to the norm of the weights hence if the weight norm is larger than 1 bn will reduce the gradient norm and hence increase the maximum learning rate but this isnt a very interesting effect from an optimization perspective i cant tell from the analysis whether theres a more meaningful sense in which bn speeds up convergence the condition number might be more relevant from a convergence perspective overall this paper is a promising start but needs more work before its ready for publication at iclr | [
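The stability claim in this summary is the standard curvature argument: the maximal-eigenvalue analysis matters because, on a locally quadratic model of the loss, the largest curvature eigenvalue caps the usable step size. The bound below is textbook background stated for plain gradient descent (the paper's version substitutes the Fisher information as the curvature proxy), not a reproduction of the paper's derivation.

```latex
% Gradient descent on a local quadratic model with curvature matrix H:
%   \theta_{t+1} = \theta_t - \eta H (\theta_t - \theta^{*})
% The iteration contracts only if every eigenvalue of (I - \eta H) has
% magnitude below one, which requires
\[
  0 < \eta < \frac{2}{\lambda_{\max}(H)} .
\]
% Hence a lower bound on \lambda_{\max} of the curvature proxy (here the
% Fisher information) translates directly into an upper bound on the
% learning rate at which training remains stable.
```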
30003, 310, 1677, 2278, 273, 247, 2561, 2929, ..., 9311, 387, 17857, 32888, 50276
] | [
1, 1, 1, 1, ..., 1, 1, 1
] | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, ..., 9311, 387, 17857, 32888, 50276
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
recommendation 2 serious ethical issues that need to be addressed in the final version ethics review the authors introduce the tgea 20 dataset which is a chinese dataset where the examples are generated by various pretrained language models the dataset has been annotated such that the machineauthored texts can be assessed on various tasks within the broad categories of diagnosis tasks and pathology mitigation tasks the main issue raised by reviewers is the risk of erasure and invisibility of linguistic variability in chinese language training data a recommendation was formulated in this regard ethics documentation to address the reviewers concerns on erasure of specificities in the chinese language the authors offered to contact the developers of the publicly available models they are using with the idea of asking for information on the training data used for these models the authors also offered to include data cards for th emodels and datasets especially with respect to the varieties of chinese the goal is to identify varieties of chinese other than mandarin more generally authors propose to provide a clear description with respect to the variety of chinese in the revised version of the paper na docsepthis paper presents a largescale and curated dataset in chinese along with two benchmarks for diagnosis 5 tasks and pathology mitigation 2 tasks to improve quality of generated texts from language models the authors designed a thorough annotation process including data collection training annotators in the preannotation phase and quality control by the feedback loop the selected sentences for annotation cover 3 aspects model decoding strategy and prompt they also provided a detailed analysis on the distributions of erroneous sentences produced by a variety of models with different sizes ranging from 110m to 26b parameters the experimental results on the proposed benchmarks show that the diagnosis tasks are challenging and training a language model here gpt2 on their dataset help reducing errors in the generated texts 1 data collection phase is wellimplemented the authors use different models decoding strategies and prompt types nominal phrasal and sentential with various domains news wikipedia and web fictions to diversify types of erroneous sentences 2 the annotation process and quality control are welldesigned to annotate the largescale dataset while maintaining its quality 3 the dataset has potential usages large language models in chinese can benefit from training on this dataset to mitigating erroneous sentences also the discriminative models can be trained to automatically detect errors made by language models thus this work can be valuable for future research 1 i have some concerns regarding the quality control l188 who trained the first 4 reviewers i would like to evaluate the quality of this dataset carefully since their annotations are used as groundtruths to train other annotators l199 what is average performance of 7 welltrained reviewers are they trained by the annotations produced by the first 4 reviewers the reason i asked is that they are the ones who guarantee the highquality outcomes for this dataset 2 although alphabalanced loss was used some diagnostic tasks such as erroneous text detection misew extraction suffer a heavy unbalance that may affect model training and evaluation so the results on these datasets are not quite convincing 3 no statistics reported for the proposed tasks in two sets of benchmarks 4 the word prediction task in the pathology mitigation benchmark 
does not properly evaluate the ability of language models because there can be many correct predictions for the last token given each sentence 5 no qualitative examples ie model predictions for each benchmark task in the main text and the appendix 6 some minor issues in presentation numbers in table 1 are quite small that makes it hard to read l240 error correction task is missing l267 it should be macbert instead of macbeert docsepthe authors introduce the tgea 20 dataset which is a chinese dataset where the examples are generated by various pretrained language models the dataset has been annotated such that the machineauthored texts can be assessed on various tasks within the broad categories of diagnosis tasks and pathology mitigation tasks the main strength of this dataset is its scale it substantially extends tgea 10 to now consist of 195629 annotated sentences such a dataset will be particularly useful in devising methods to assess the quality of the generated text from pretrained language models also the authors have taken great care for a sophisticated quality control process in order to ensure that the annotations for the various benchmarking tasks can be trusted further explanation or clarification is required for the following points the authors claim that the examples generated are diverse due to the different decoding strategies and also using 4 different pretrained language models however are 4 pretrained language models representative in the mistakes they make for future pretrained language models that have come out recently and to come out which are far larger in size and may have different pathological weaknesses it would be great if the novelty in the tasks beyond tgea 10 could be clearly spelled out beyond just the scale of the dataset how valid is it to compare the pathological weaknesses of models in chinese to english examples as in scarecrow docsepthis paper proposes tgea 20 the largest dataset for diagnosing typed errors made by pretrained language models it is an extended version of tgea with various large language models and downstream tasks the paper mainly compared its contribution with tgea and scarecrow several experiments are performed using the dataset and experimental results on various downstream tasks show that there are large rooms exploring the proposed dataset it nicely expands previous tgea in terms of scalability annotation richness etc strict quality control on the construction process proposed misew and pathology mitigation which can assist the annotation richness the intention of misew extraction is plausible but is somewhat overlapped with erroneous span location as a downstream task thus the necessity of the task should be more justified in some manner such as qualitative analysis proposed pathology mitigation should also be more explained in terms of why it should be jointly considered in future works docsepthis paper contributes to understanding and reducing the text generation errors made by large pretrained language models the authors have created the largest chinese language dataset of machineauthored texts and a substantial subset of the texts 195k were manually annotated at a finegrained level and corrected for text quality issues grammaticality and semantic coherence the authors use the annotated data both to benchmark the best performance of 4 major plms with variable architecture and scale against each toplevel error category and to test the extent to which finetuning with the humancorrected texts reduces the prevalence of these errors 
in the plm outputs the size of the dataset and multiple annotations erroneous spans and minimal set of errorrelated words and corrections for these words provide an excellent basis for studying the nature of grammatical and semantic coherence errors in chinese language automated text generated by current sota models performing at their best as such this dataset should be of considerable interest to researchers seeking to understand the persistence of certain error types in machinegenerated text and also to understand from objective evidence vs subjective human evaluations cf clark 2021 what kinds of errors may be indicative of machinegenerated outputs and therefore may support the detection of machinegenerated text the further interest of this paper is the attempt to use the errorpluscorrection misew pairs to improve the quality of the generated text by reducing errors of these kinds in the output another strength of the paper is the clear and comprehensive description of the research process including the annotation process which adheres to annotation best practices in many ways including an indicator of annotator confidence pretraining to support annotator convergence to a high level of interannotator agreement and iterative retraining during annotation etc it would have been good to see cohens kappa statistics or similar cited for the interannotator agreement in section 33 average accuracy of annotators is mentioned average performance increases from 589 to 797 and also interannotator disagreement but not measure of the latter is explicitly provided docsepthe work builds on by releasing a larger and higher quality version of a previously released dataset named tgea it is a comprehensive collection of machine authored texts in chinese language that have been annotated for errors based on a novel ontology of errors this ontology is based on data mining for frequently occurring forms of errors followed by supervision by expert annotators furthermore systematic analysis to the annotated errors is performed to reveal patterns that are helpful in gauging the capability of various plms on different datasets lastly the work goes on to validate whether errors found can be fixed automatically with preexisting large language models and find it hard to solve by modern means this paves the way for 1 a dataset to analyze the kind errors plm makes 2 developing automated methods that can automatically correct the errors made by plms because existing sota is not enough to rectify it 3 benchmark of the performance of sota on both diagnostic as well as pathological errors for future works to compare against improvement over previous work where authors used the stochastic decoding strategy and repetition penalty which reduced the redundancy related errors frequency and hence also allowed energy to be focused on harder errors the number of annotated samples is large enough to gain confidence and mitigate risk of incorrect conclusions due to spurious correlations benchmark results are shared on the presented dataset using sota models which allows the research community to have solid baselines to compare their research and findings against furthermore by evaluating numerous plms on diverse datasets the work also helps users of plms in deciding appropriate model for any given nlg task the reasoning behind choosing three point scale for asking confidence in annotation instead of standard likert scale is not provided any inspiration that was taken from related work is missing for the choice of annotator training 
methodology given that beyond a certain threshold plms show emergence capabilities the work done here is called into question as whether or not the errors patterns in small plms 5b parameters are also prevalent in large plms 100b like pangu 200b wu dao 20 175t grammar errors 1 l168 think it 2 l190 in three times
### Summary: | the reviewers all liked the paper the authors response clarified most points raised by the reviewers in view of that the authors are strongly invited to take the feedback on board for the final version the main ethical issue raised by reviewers is the risk of erasure and invisibility of linguistic variability in chinese language training data data cards need to be added to the final version | [
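One of the reviewers above mentions an alpha-balanced loss for the heavily imbalanced diagnostic tasks; in most implementations this is simply a class-weighted cross-entropy in which the rare class (here, erroneous text) receives the larger weight alpha. The snippet below is a minimal sketch of that idea for a binary error-detection head; the value of alpha and the example tensors are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of an alpha-balanced binary cross-entropy for an
# imbalanced error-detection task. alpha and the tensors are illustrative.
import torch
import torch.nn.functional as F

def alpha_balanced_bce(logits, targets, alpha=0.9):
    t = targets.float()
    # weight the rare positive (erroneous) class by alpha, the rest by 1 - alpha
    weights = alpha * t + (1.0 - alpha) * (1.0 - t)
    per_example = F.binary_cross_entropy_with_logits(logits, t, reduction="none")
    return (weights * per_example).mean()

logits = torch.tensor([2.1, -0.3, -1.7, 0.4])   # raw scores for four spans
targets = torch.tensor([1, 0, 0, 0])            # only the first span is erroneous
print(alpha_balanced_bce(logits, targets).item())
```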
6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, ..., 2879, 281, 253, 2457, 2715
] | [
… attention_mask: 2,048 entries, all 1 (no padding; individual values omitted) …
] | [
… labels: 2,048 integer token IDs (individual values omitted) …
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
i have read the authors responses to all reviews and ultimately elected to leave my score as it is weak accept i think the empirical results are strong and while i am not as troubled by the motivation and framing of the work as reviewers 3 and 4 i think their more conceptual and methodological critiques have merit dampening my enthusiasm for the submission this submission proposes a modeldriven data augmentation strategy that aims to improve the calibration and reduce the overconfidence of a variety of bayesian neural network architectures when dealing with outofdistribution ood samples it involves adding a generator network that aims to generate plausible ood samples during training along with an objective term that tries to force the predictor network to make high entropy low confidence predictions for these samples the paper does a fairly thorough empirical comparison with ten datasets eight regression two image classification and half a dozen baselines most of which can be combined with pad the results indicate that pad usually improves both calibration and accuracy by at least a small amount this is a solid paper the proposed method seems sensible if pretty complex and appears to be modestly effective in the included experimental results the introduction summarizes the papers contributions as 1 it proposes a modeldriven data augmentation technique aimed at improving calibration and reducing overconfidence for ood samples 2 it adapts and extends the technique to regression problems which the paper argues is unprecedented 3 it demonstrates empirically that the proposed approach improves the ood accuracy and calibration of four different strong bayesian neural net models i lack the broad familiarity with the data augmentation literature required to verify claim 2 i suspect that if this simple claim is true then it may be trivially so its hard to believe that no one has applied data augmentation to regression tasks so perhaps folks havent bothered to publish it the authors can always modify or remove this claim if needed the other two contributions seem supported although the empirical improvements are for the most part small and probably not statistically significant i lean weakly toward acceptance i would not oppose its inclusion in the iclr 2021 proceedings but i wouldnt enthusiastically endorse it ill explain below the papers motivation as laid out in sections 1 and 2 is sound calibration and proper quantification of uncertainty are increasingly important in a wide range of applications where machine learning has real world consequences for safety fairness etc what is more existing techniques based on neural networks increasingly widespread do seem to suffer significant flaws especially exhibiting overconfidence when they should not the paper offers a diagnosis in the form of a conjecture section 22 failure to revert to the prior ptheta for regions of the input space with insufficient evidence to warrant low entropy predictions figure 1 effectively visualizes this phenomenon in a toy setting but no further proof is offered further the assumption that prior reversion is the correct thing to do isnt examined though thats a basic tenet bayesian modeling so well set that aside the proposed technique seems sensible if complicated add a generator network to produce ood pseudosamples during training and penalize the prediction network for making high confidence low entropy predictions on these pseudosamples we can consider this a form of modeldriven vs heuristic data augmentation the 
generator loss given in equation 5 looks correct to my nonexpert eye and i suspect its immediately comprehensible to readers familiar with gans vaes and bayesian neural nets the intuition for the ood samples resonates with me they should be close enough to real data to be plausible but far enough away that the predictor would be unjustified in assigning a high confidence or departing from the prior the regularization term in equation 7 is a bit more arcane at first glance but its intuitive the conditional prediction distribution should be close to the prior for pseudodata points far from the real training distribution the derivations of the kl term for regression and categorical classification are given in the appendix but these arent critical details for judging the significance of the paper theyre quite straightforward the design of the experiments is sound they simulate ood settings by clustering each dataset and using distinct clusters for training and test splits and measure both accuracy and calibration i dont have an opinion about the choice of the kuleshov metric for calibration the chosen baselines look strong but i am not uptodate on the relevant literature so i would not be able to identify a nonobvious missing baseline the experimental results are promising a pad variant is usually but not always the best for each task and metric exceptions include gp for navalaccuracy and r1bnn for powercalibration perhaps more important pad does generally seem to improve both accuracy and calibration across both variants de mc swag etc and datasets so in other words if a modeler chooses to use one of the compatible bayesian neural networks in most cases they should also use pad the work and manuscript have a few weaknesses that prevent me from more strongly recommending acceptance for one some of the exposition around training is unclear in particular how the objectivees in equations 5 and 8 are combined during training i praised the results above but i think the manuscripts interpretation of its results section 43 is still more generous than mine pad does consistently improve accuracy and calibration but the margin is sometimes small raising the question aboout whether the added complexity is worthwhile in all cases the paper argues that pad consistently improves the calibration curves in figure 3 at least for poorly calibrated models but that does not seem obvious to me this might be because the curves are somewhat cluttered but i see a number of exceptions where the pad look potentially worse energy naval swag and yacht i think perhaps the real problem is not the results themselves which overall are strong but rather the manuscripts rather cursory discussion of the results and its failure to offer any insights or guidance about eg when pad should be expected to help based on task or dataset or which baselines it works best with one last note i dont want to overindex on a toy figure included for illustration purposes but i dont find the results in figure 1 convincing perhaps i am misunderstanding what behavior we desire if so please correct me i agree that the pad distribution does a better job of capturing the uncertainty in the central lowdata area but at the left and righthand ends the baselines actually look preferable in that the uncertainty is often appropriately wider and contains or at least follows the true function it looks like pad might be overregularizing things in these cases here are some actionable suggestions for improvements clarify how the objectives are combined and how 
training proceeds consider adding eg an algorithm summary figure expand the results discussion beyond simply restating the results which are displayed in the tables for the raw accuracy and calibration numbers perhaps you could compute some summary statistics for the baseline vs pad differences so readers could get a quick sense of whether pad usually beats the baseline maybe also some counts for how often a pad variant has the best performance also in the discussion try to distill out some illustrative patterns that could be turned into insights or practical guidance for the calilbration curve plots figure 3 consider reducing the clutter by removing redundant curves for most of the tasks most baselines and corresponding pad variants are quite similar so perhaps you could show a representative subset for each task and then put the complete figures in the appendix heres a laundry list of questions how does training proceed is it a typical alternating adversarial optimization ie optimize generator then discriminator repeat what additional computational complexity does pad introduce during training under which conditions pad should be expected to help most type or distribution of data task structure baseline model etcdocsepthis paper proposes a data augmentation scheme named pad to improve accuracy and calibration of nns the idea is to generate ood data close to the training data where the model is overconfident and force a higher entropy for their corresponding predictions this topic is very relevant to the iclr community the paper is clear and i was excited with the goal in a first place however the paper as it is has major drawbacks 1 the biggest drawback is that the proposed approach is adhoc a heuristic with no guarantees that it will work as desired in fact recent work has shown that data augmentation on top of ensembles can be harmful the authors should discuss this in the paper see wen etal 2020 combining ensembles and data augmentation can harm your calibration for this paper to be accepted the authors should explore the properties of the proposed approach with careful controlled toy scenarios and bring further insights on when the approach is expected ideally guaranteed to work 2 using pad on top of other probabilistic approaches destroys the probabilistic interpretation 3 experimental results are extensive but not convincing figure 1 lacks the gp reference and shows bad performance on the left extreme the ablation study suggest that equation 5 could be simplified finally results in table 24 suggest that the proposed approach hurts in highdimensional scenarios energy and kin8nm datasets the reported numbers also strongly depend on model selection and tuning of pad and other baselines information which is currently missing 4 the authors do not compare nor mention recent advances on calibrating dnns for example antoran etal 2020 depth uncertainty in neural networks liu etal 2019 simple and principled uncertainty estimation with deterministic deep learning via distance awareness more comments the proposed model does not seem to scale to highdimensions as filling the gaps with the ood data generator becomes infeasible this is reflected in the tables where both accuracy and calibration are systematically worse for the highdim kin8nm dataset up to how many dimensions would this approach be useful the ood dataset produce an equally sized pseudo dataset yet one might think that the amount of data needed to robustify uncertainty would depend on the manifold geometry location of ood samples 
is chosen as an interpolation of latent representations for the observed data that means that many generated datapoints will not bee outofsample from the ablation study tables 4 and 5 without ab gives similar results to regular always within the reported error bars of regular that seems to indicate that terms a and b are not that relevant am i missing something figure 1 the authors should include one column for the gp behavior since the authors claim that the observed behavior is similar to that otherwise it is unclear by eye what is best in particular pad could the proposed approach suffer from the opposite issue ie deliver too high uncertainty in the augmented ood data how do you avoid this issue how does the proposed approach compare to a dnn whose last layer is gp or bayesian rbf network see httpwwwgatsbyuclacukbalajiudl2020acceptedpapersudl2020paper009pdf the proposed method encourages a reversion to a specific prior 0 mean functions minor the authors mention limited expressiveness of gps but this is subject to a simple kernel if the kernel is complicated enough then gps are as expressive as we would like to see equivalences between dnn and gps in neil 1997 and lee etal 2017 please clarify this statement figure 3 is hard to read i suggest to highlight the pad curves by changing the color schemedocsepoverview the authors propose a data augmentation scheme that generates samples out of distribution and helps with uncertainty estimates comparisons are to various bayesian methods in uci regression and mnistcfar for classification pad seems to give some improvements in out of distribution uncertainty quantification the major concern is that the gains seem relatively small and the objective is adhoc it would be nice to see either more substantial uniform gains so that the authors can justify the procedure on the results alone or more solid conceptual motivation of the method especially from the bayesian side it seems like the motivation and intro is clear and section 3 onwards becomes very adhoc and loses much of this it would be nice to be convinced that there are a set of assumptions and conditions under which this is the right way to do uncertainty quantification positives the evaluations are extensive and its commendable that they include both positive and negative results in their regression evaluations uncertainty estimation out of distribution is an important and timely problem negatives minor im not sure why there is a claim that the problems with uncertainty estimation comes from pthetad and not pthetayx the fact that nonbayesian methods have similar issues with uncertainty quantification would suggest that the latter is certainly an issue figure 1 doesnt seem like a compelling argument for the narrative in the paper mc dropout and deep ensembles both have decent behavior outside the support x0 x1 but suffer in the gap between 025 to 075 which is arguably due to overaggressive interpolation term a in equation 5 is justified as generating data where f is overconfident but i dont see how this is true its just generating data where theres low prediction entropy this includes areas where it is confident for the right reasons somewhat minor but the sum of term a and term b seems a bit problematic since a will be in terms of discrete entropy in classification and term b is going to be differential entropy in general rescaling x also seems like it would arbitrarily shift the weights between ab and c the weird thresholding on the c penalty for regression problems does not inspire 
confidence overall equation 5 gives a sense of a fairly adhoc criterion id like to be convinced that this is actually the right way of doing things especially from a bayesian perspective looking at equation 7 it seems like learning the distribution of tilde x is alot of work to regularize the kl towards the marginal with a squaredexponential penalty away from the training data is it really not possible to postprocess the model distribution to achieve the same thing the experiments are extensive but a bit mixed the dataset construction for regression seems like it would naturally favor padtype methods because the clustering occurs on the basis of feature distances and pad enforces uncertainty based on feature distances via the squaredexp term in equation 7 in terms of results i think overall pad gives gains but its not uniform and in cases like swag on table 2 seems to hurt more than it helps the corruptions in mnist cifar must also be pretty aggressive as the accuracy numbers are quite low for both does pad do similarly well on milder or no distribution shift settings i am slightly concerned that the evaluations here focus so much on the large distribution setting and that pad is tuned to that case minor inline equation involving sines is missing a closing parenthesis notation gphi is a generative model in section 2 but it seems to be the output of an autoencoder in section 31 qphi seems to be the actual generative modeldocsepi think the paper is interesting and wellwritten i agree with the miscalibration can be caused by outofdistribution data even though it is still commonly observed without such discrepancy addressing ood data is an important direction and i think the author proposed a reasonable approach to prevent models from overfitting on data points that are rarely observed during training however i believe there are some limitation of the method at the current stage and the experiments did not fully convince me strength 1 the paper addresses an important question of models being overconfident on outofdistribution data the method is practical for applications where uncertainty estimation is needed 2 the adversarial generation of ood data is intersting and the rationale is well explained 3 the authors included a good selection of datasets and experiments the padbased methods are also compared to a good variety of baselines weakness 1 to determine whether a data point is outofdistribution both the adversarial model and main model rely on the l2 distance and the length scale parameter ell my concern here is about 1 the heterogeneity in different dimensions and 2 ell seems particularly important and difficult parameter to tune i would like the authors to give more details on how it is chosen 2 i believe the approach here is to revert to prior when evaluated data points are far away from the observed ones thus the difference in accuracy depends on how good the starting prior is and how much bias the baseline model learned from ood data table 3 shows some of the tradeoff but i think it also be good to show the difference when theres no ood data because its not necessarily known in advance 3 looking at figure 3 i dont think the pad method shows significant improvement in most of the datasets housing seems to be the only one here in figure 5 it looks like the ood data are mostly in the convex hull of observed data at least in this low dimensional embedding it is unclear how to differentiate those from the region where models should be interpolating moreover all the ood data are 
artificially constructed i think it would be more convincing to test the methods on some ood data that arises naturally one such source could be temporal data where the distribution could shift over time minor comments 1 in section 5 excessive computation for large models and datasets i thank the authors for a lot of these responses im still around neutral for this paper but i will raise my score to marginally above acceptance
### Summary: | this paper studies the problem of uncertainty estimation under distribution shift the proposed approach pad addresses this underestimation issue by augmenting the training data with inputs for which the network has unjustified low uncertainty estimates and asking the model to correct this underestimation at those augmented datapoints results show promising improvement over a set of common benchmark tasks in uncertainty estimation with comparisons to a number of existing approaches all the reviewers agreed that the experiments are well conducted and the empirical results are very promising however they also had a shared concern on the justification of the approach reviewers are less willing to accept a paper merely for commending its empirical performance i share the above concern with the reviewers and i personally found the presentation of the approach a bit rushed and disconnected from the motivation for example the current presentation feels like the method is motivated by bnns but it is not clear to me how the proposed objective connects to the motivation also no derivation of the objective is included in either the main text or the appendix in revision i would suggest a focus on improving the clarity and theoretical justification of the proposed objective function
… input_ids: 2,048 integer token IDs (individual values omitted) …
] | [
… attention_mask: 2,048 entries, all 1 (no padding; individual values omitted) …
] | [
351,
941,
2810,
281,
253,
3733,
941,
835,
253,
1566,
310,
689,
8259,
888,
285,
3490,
247,
2169,
15579,
323,
616,
3969,
13650,
50276,
2520,
9400,
310,
1077,
4623,
281,
253,
17857,
32888,
3114,
253,
2929,
310,
2590,
285,
891,
369,
9049,
342,
253,
4736,
275,
247,
806,
1659,
2299,
253,
2929,
347,
352,
310,
556,
2201,
30453,
50276,
18,
253,
5962,
32489,
310,
326,
253,
4081,
2746,
310,
519,
37806,
247,
47641,
342,
642,
23632,
326,
352,
588,
789,
347,
6799,
275,
958,
3332,
789,
556,
2011,
326,
941,
42072,
327,
1755,
273,
49328,
476,
320,
19632,
253,
4477,
943,
2319,
436,
275,
253,
2929,
923,
259,
257,
1162,
267,
9169,
16248,
49328,
285,
941,
42072,
476,
5237,
634,
18543,
323,
436,
2929,
281,
320,
7607,
253,
4477,
943,
8338,
253,
3607,
273,
253,
4081,
2746,
342,
10182,
6537,
20953,
15216,
285,
3324,
2007,
16039,
327,
672,
253,
2746,
310,
3264,
34243,
16293,
281,
789,
50276,
19,
970,
13229,
327,
1755,
273,
643,
37851,
7274,
46340,
253,
37851,
7914,
495,
5661,
1543,
403,
9470,
533,
417,
21414,
4677,
337,
19756,
253,
31025,
3806,
285,
2722,
3076,
3045,
327,
253,
1669,
9559,
253,
28913,
1263,
1804,
326,
5150,
608,
812,
320,
21010,
4720,
1543,
275,
2829,
2164,
1804,
326,
253,
4081,
2746,
31835,
275,
1029,
6967,
15216,
2341,
285,
5708,
25,
10602,
15302,
253,
2361,
3904,
671,
7052,
3469,
327,
1566,
5438,
285,
25184,
273,
13229,
285,
643,
1666,
25379,
1491,
534,
310,
4390,
5816,
577,
253,
4477,
513,
417,
7277,
4543,
3748,
3332,
16424,
327,
24403,
839,
277,
79,
2224,
323,
1650,
28910,
1331,
263,
266,
1162,
267,
9169,
6864,
11649,
275,
11454,
6928,
28910,
632,
86,
1162,
267,
6247,
2969,
285,
3505,
74,
6216,
11649,
13418,
342,
30027,
3676,
4715,
3066,
4181,
11891,
28910,
50276,
3062,
5701,
50275,
783,
4081,
1566,
1057,
417,
1646,
281,
4311,
281,
1029,
4528,
5354,
347,
12868,
253,
18388,
342,
253,
258,
351,
941,
14156,
4916,
275,
36764,
917,
436,
310,
11392,
275,
253,
7180,
835,
1097,
7200,
285,
18543,
403,
24181,
7197,
323,
253,
1029,
4528,
5708,
25,
10602,
10895,
598,
281,
849,
1142,
10103,
651,
436,
2746,
320,
4217,
50275,
783,
258,
351,
10895,
4711,
271,
9696,
25180,
17927,
10895,
2568,
581,
1537,
1158,
326,
253,
2408,
273,
941,
3058,
281,
10237,
1419,
11649,
651,
3469,
327,
253,
16751,
12087,
50276,
12428,
273,
258,
351,
3530,
310,
6777,
347,
271,
30370,
273,
21624,
14237,
323,
253,
2540,
941,
326,
2097,
326,
1142,
4561,
2856,
522,
842,
84,
588,
417,
30747,
562,
1171,
16848,
50275,
4064,
253,
28913,
1263,
7180,
577,
285,
608,
1293,
490,
4245,
2074,
1543,
281,
3963,
1900,
1561,
253,
2361,
2228,
8965,
273,
3963,
326,
3133,
281,
5224,
326,
2426,
247,
285,
270,
403,
417,
326,
4623,
717,
891,
5816,
1633,
50275,
13206,
337,
253,
4477,
943,
2486,
581,
5084,
323,
253,
31025,
3879,
1580,
253,
4477,
1750,
326,
253,
2540,
3879,
310,
2074,
281,
326,
5010,
352,
310,
12744,
407,
5130,
752,
310,
1682,
275,
1798,
13229,
50275,
16534,
253,
4081,
2746,
11089,
432,
253,
7285,
2523,
26332,
7257,
1512,
1029,
11649,
275,
253,
31612,
258,
351,
941,
849,
513,
368,
3693,
436,
2523,
50276,
5430,
1057,
253,
4081,
2746,
7277,
281,
247,
277,
9866,
3692,
1390,
3828,
310,
31025,
390,
17699,
16561,
391,
3342,
2990,
923,
3944,
2700,
72,
1832,
1615,
13340,
317,
2788,
7187,
37848,
438,
77,
14952,
14764,
264,
50004,
438,
77,
14952,
20790,
8972,
9275,
50276,
783,
4081,
1332,
29426,
247,
294,
4149,
281,
247,
2173,
2720,
470,
1599,
3470,
50276,
37585,
50275,
783,
4477,
3748,
3710,
3890,
6460,
273,
305,
793,
533,
436,
310,
2256,
281,
247,
2969,
10295,
604,
253,
10295,
310,
9542,
2217,
840,
305,
793,
403,
347,
43541,
347,
359,
651,
751,
281,
923,
5217,
2979,
875,
277,
9866,
285,
305,
793,
275,
425,
300,
8210,
285,
458,
70,
1162,
267,
4240,
4496,
19148,
436,
3908,
50276,
13206,
495,
310,
1892,
281,
1239,
891,
1804,
281,
6780,
253,
13229,
9191,
407,
6890,
253,
3295,
4436,
1314,
406,
33032,
39930,
50276,
783,
4477,
12661,
247,
941,
42072,
6974,
326,
15693,
3530,
562,
273,
3268,
285,
7729,
342,
11649,
8197,
14023,
403,
281,
2710,
17699,
16561,
3082,
275,
44274,
74,
9077,
285,
278,
79,
382,
7836,
274,
323,
9162,
13229,
3133,
281,
1918,
690,
11701,
275,
562,
273,
3268,
11649,
21652,
50276,
783,
2201,
4468,
310,
326,
253,
15988,
1646,
4942,
1355,
285,
253,
8103,
310,
519,
37806,
352,
651,
320,
5322,
281,
923,
2057,
625,
6832,
6447,
15988,
594,
326,
253,
4477,
476,
15249,
253,
5199,
327,
253,
1543,
3815,
390,
625,
4891,
20178,
16038,
273,
253,
1332,
3340,
432,
253,
17699,
16561,
1930,
352,
3133,
751,
253,
16038,
285,
26432,
310,
2590,
285,
2593,
495,
39210,
4916,
1077,
519,
37806,
285,
25068,
1199,
273,
436,
352,
651,
320,
5322,
281,
320,
13762,
326,
627,
403,
247,
873,
273,
13260,
285,
2515,
762,
534,
436,
310,
253,
987,
1039,
281,
513,
11649,
21652,
50275,
993,
23223,
50276,
783,
27163,
403,
9470,
285,
697,
49638,
494,
326,
597,
2486,
1097,
2762,
285,
4016,
1543,
275,
616,
9077,
27163,
50275,
7157,
1695,
555,
13418,
562,
273,
3268,
310,
271,
1774,
285,
14793,
1895,
50276,
8265,
3993,
50276,
37585,
516,
417,
2119,
2139,
627,
310,
247,
1750,
326,
253,
3237,
342,
11649,
13418,
3249,
432,
268,
783,
85,
324,
285,
417,
268,
783,
85,
333,
89,
253,
958,
326,
1327,
32442,
16561,
3082,
452,
2074,
3374,
342,
11649,
21652,
651,
1804,
326,
253,
6158,
310,
5604,
271,
2523,
50276,
13206,
337,
36908,
1646,
751,
247,
18511,
4154,
323,
253,
14511,
275,
253,
2929,
278,
68,
5926,
483,
285,
3676,
49328,
1097,
452,
12524,
3879,
3345,
253,
1329,
1269,
17,
1269,
18,
533,
11089,
275,
253,
8037,
875,
470,
1099,
281,
470,
1976,
534,
310,
25711,
1955,
281,
689,
356,
11020,
30370,
50276,
3945,
247,
275,
5150,
608,
310,
17285,
347,
11365,
941,
835,
269,
310,
689,
8259,
888,
533,
891,
13414,
923,
849,
436,
310,
2032,
697,
816,
11365,
941,
835,
253,
373,
1698,
10554,
15579,
436,
3797,
3672,
835,
352,
310,
13224,
323,
253,
987,
4606,
50276,
8826,
5371,
5884,
533,
253,
2020,
273,
1307,
247,
285,
1307,
270,
3133,
247,
2372,
20276,
1580,
247,
588,
320,
275,
2426,
273,
13358,
15579,
275,
9162,
285,
1307,
270,
310,
1469,
281,
320,
8967,
15579,
275,
2087,
46595,
272,
1269,
671,
3133,
751,
352,
651,
29607,
5333,
253,
13461,
875,
490,
285,
260,
253,
12504,
7887,
272,
327,
253,
260,
12339,
323,
9077,
3237,
1057,
417,
26761,
7162,
50276,
1189,
455,
5150,
608,
4245,
247,
3282,
273,
247,
9648,
519,
37806,
17705,
2654,
751,
281,
320,
13762,
326,
436,
310,
2686,
253,
987,
1039,
273,
2509,
1841,
3340,
432,
247,
17699,
16561,
8668,
50276,
13565,
387,
5150,
818,
352,
3133,
751,
4715,
253,
3268,
273,
246,
6227,
1269,
310,
47899,
273,
789,
281,
3963,
907,
253,
27451,
4404,
253,
16888,
342,
247,
30044,
4347,
45426,
12339,
1977,
432,
253,
3733,
941,
310,
352,
1663,
417,
1896,
281,
1501,
7404,
253,
1566,
3268,
281,
5115,
253,
1072,
2181,
50275,
783,
4679,
403,
9470,
533,
247,
2372,
6804,
253,
10895,
5140,
323,
9077,
3133,
751,
352,
651,
10748,
3718,
13229,
881,
3082,
984,
253,
17524,
6634,
327,
253,
3720,
273,
4735,
13849,
285,
13229,
546,
36217,
11649,
1754,
327,
4735,
13849,
3066,
253,
30044,
4347,
1307,
275,
5150,
818,
275,
2426,
273,
1543,
891,
1158,
4583,
13229,
4245,
15988,
533,
697,
417,
6447,
285,
275,
2219,
751,
1863,
356,
327,
2829,
374,
3133,
281,
8513,
625,
685,
352,
7729,
50275,
783,
17715,
621,
275,
278,
79,
382,
50276,
46277,
274,
1364,
671,
320,
3965,
13847,
347,
253,
7200,
3904,
403,
3240,
1698,
323,
1097,
1057,
13229,
513,
12014,
973,
327,
2301,
491,
390,
642,
3268,
5333,
7533,
891,
717,
5777,
7514,
326,
253,
27163,
1060,
2770,
594,
1199,
327,
253,
1781,
3268,
4758,
285,
326,
13229,
310,
24251,
281,
326,
1083,
50276,
37585,
50276,
17243,
5150,
7668,
256,
1100,
310,
5816,
247,
11196,
2885,
25232,
50276,
25604,
305,
2162,
310,
247,
1006,
800,
1566,
275,
2593,
374,
533,
352,
3133,
281,
320,
253,
3453,
273,
271,
6753,
36465,
275,
2593,
4562,
2805,
2162,
3133,
281,
320,
253,
4588,
1006,
800,
1566,
7152,
339,
2059,
1158,
253,
2929,
310,
4722,
285,
973,
15720,
891,
5194,
342,
253,
3731,
1179,
11457,
476,
320,
4269,
407,
562,
1171,
35360,
941,
1014,
2167,
352,
310,
1335,
7744,
2540,
1293,
824,
26210,
15974,
258,
351,
941,
310,
271,
1774,
3884,
285,
891,
1158,
253,
2488,
4081,
247,
5272,
2746,
281,
3657,
3210,
432,
689,
31893,
327,
941,
2792,
326,
403,
11766,
2540,
1309,
3733,
2299,
891,
2868,
627,
403,
690,
12291,
273,
253,
1332,
387,
253,
1655,
3924,
285,
253,
4679,
858,
417,
4751,
18578,
479,
50276,
45563,
337,
253,
2929,
12453,
271,
1774,
1953,
273,
3210,
1146,
689,
8259,
888,
327,
562,
1171,
35360,
941,
253,
1332,
310,
8542,
323,
4893,
835,
11649,
13418,
310,
3058,
50276,
19,
253,
48960,
5978,
273,
258,
351,
941,
310,
734,
296,
272,
285,
253,
24775,
310,
973,
5544,
495,
253,
4477,
2908,
247,
1175,
5438,
273,
15302,
285,
4679,
253,
13229,
3169,
3082,
403,
671,
2429,
281,
247,
1175,
5235,
273,
1666,
25379,
50276,
20881,
1255,
337,
281,
3653,
1880,
247,
941,
1127,
310,
562,
1171,
35360,
1097,
253,
48960,
1566,
285,
2022,
1566,
10725,
327,
253,
298,
19,
4181,
285,
253,
2978,
4311,
4764,
11591,
619,
4468,
1060,
310,
670,
337,
253,
19331,
275,
1027,
10103,
285,
374,
11591,
3133,
3782,
1774,
285,
2834,
4764,
281,
19928,
891,
651,
751,
253,
4477,
281,
1918,
625,
4278,
327,
849,
352,
310,
6777,
374,
891,
2868,
253,
2746,
1060,
310,
281,
43004,
281,
2720,
672,
6760,
941,
2792,
403,
2080,
1977,
432,
253,
2540,
4394,
3021,
253,
3064,
275,
7200,
7024,
327,
849,
1175,
253,
4983,
2720,
310,
285,
849,
1199,
8492,
253,
8245,
1566,
6311,
432,
258,
351,
941,
2829,
495,
2722,
690,
273,
253,
5454,
2727,
533,
891,
1158,
352,
671,
320,
1175,
281,
921,
253,
3064,
672,
253,
373,
642,
258,
351,
941,
984,
697,
417,
7933,
1929,
275,
7170,
495,
2819,
387,
4677,
495,
891,
13414,
1158,
253,
13229,
1332,
2722,
1534,
7756,
275,
954,
273,
253,
15302,
8039,
3133,
281,
320,
253,
760,
581,
1060,
275,
4677,
608,
352,
4453,
751,
253,
258,
351,
941,
403,
6571,
275,
253,
17133,
28470,
273,
2540,
941,
387,
1878,
275,
436,
1698,
15759,
21496,
352,
310,
12744,
849,
281,
22629,
1110,
432,
253,
2919,
835,
3210,
943,
320,
20670,
839,
25761,
512,
253,
258,
351,
941,
403,
41544,
8818,
891,
1158,
352,
651,
320,
625,
21414,
281,
1071,
253,
3082,
327,
690,
258,
351,
941,
326,
15877,
10748,
581,
824,
2603,
812,
320,
11935,
941,
835,
3268,
812,
5333,
689,
673,
50276,
37585,
5701,
337,
275,
2593,
608,
13622,
13782,
323,
1781,
3210,
271,
15302,
271,
395,
50274,
47033,
253,
4477,
323,
2257,
273,
841,
6128,
516,
1335,
1475,
9238,
323,
436,
2929,
533,
891,
588,
7164,
619,
4868,
281,
42876,
1840,
14924,
50273,
187,
187,
4118,
18435,
27,
2520,
2929,
2175,
253,
1895,
273,
11649,
13418,
762,
3268,
5333,
253,
4081,
2746,
13229,
12453,
436,
22698,
14508,
2523,
407,
35919,
272,
253,
3733,
941,
342,
14800,
326,
253,
2990,
556,
26694,
1245,
1698,
11649,
8197,
285,
7004,
253,
1566,
281,
3451,
436,
22698,
14508,
387,
1110,
31612,
2856,
522,
842,
84,
1543,
921,
12532,
7756,
689,
247,
873,
273,
1846,
22791,
8892,
275,
11649,
13418,
342,
14023,
281,
247,
1180,
273,
5368,
7274,
50276,
455,
253,
37317,
5821,
326,
253,
4679,
403,
973,
5196,
285,
253,
16774,
1543,
403,
1077,
12532,
2299,
597,
671,
574,
247,
6096,
4468,
327,
253,
22861,
273,
253,
2746,
30628,
403,
1679,
7378,
281,
2997,
247,
2929,
7960,
323,
764,
1946,
697,
16774,
3045,
50276,
74,
3894,
253,
1840,
4468,
347,
253,
30628,
285,
891,
11697,
1119,
253,
9759,
273,
253,
2746,
247,
2372,
16949,
285,
33817,
432,
253,
16038,
323,
1650,
253,
1655,
9759,
9193,
751,
253,
1332,
310,
17194,
407,
270,
79,
2224,
533,
352,
310,
417,
2590,
281,
479,
849,
253,
4081,
8103,
23417,
281,
253,
16038,
671,
642,
28529,
273,
253,
8103,
310,
2908,
275,
2057,
2022,
2505,
390,
30762,
50275,
249,
18520,
891,
651,
1804,
247,
2770,
327,
11138,
253,
19843,
285,
10527,
22861,
273,
253,
4081,
8103,
1159
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper addresses the complexity of the forward pass inference in neural odes the paper proposes to augment training of the neural ode with an auxiliary neural network that dynamically selects the best numerical integrator for a given input sample furthermore the paper also proposes a regularizer that uses the errors of the numerical integrator to reduce the number of function evaluations without sacrificing accuracy the paper is well written and addresses an impediment to utilizing neural odes in practice i did find the paper lacking in detail however for example it is not clear where the regularizer in eq 2 is derived from the authors mention a connection to the finlay reference in sec 23 but it is not clear what this is precisely for the cost of each integrator in eq 4 how should m be chosen what does it mean to say that a prediction is correct what is the criteria being used for this purpose it appears that the authors treat the training of the auxiliary network as a supervised learning procedure why is this appropriate for this task another way of looking at the problem is through a reinforcement learning lens where the objective is to learn a policy mapping inputs to choices of integrators and minimizing longterm costs either discounted or longterm average of course there is perhaps no markov structure to the data in this setting but presumably the inputs in the set t could be viewed as iid samples could the authors comment on such alternate formulations docsep summary this study proposes a method to accelerate the forwardpass in neural odes known to be a significant time bottleneck the study is technically sound the empirical results convincing but the clarity could be substantially improved quality the paper is technically sound and the claims are for the most part appropriately backed by empirical evaluation there is just one minor point i would suggest the authors to address the authors write one interesting point is that rk4 had not been used at all because it does not show any notable difference from the euler method in this task in that regard choosing the euler method is a sensible decision this claim is not really illustrated anywhere in the manuscript and it would be good if the authors show this even if in a supplement clarity the manuscript provides enough information for an expert reader to understand all the steps to reproduce the results however the text contains a substantial amount of grammar errors and imprecisions which i would recommend the authors to tackle here is a nonexhaustive list instead of much work has been actively studied to much work has been actively devoted to instead of neuarl odes and numerical methods neural odes confusing formulation it had been reported that approximating neural networks with differential equations can be done by many researchers instead of as shown in fig 2 consist of three parts as shown in fig 1 instead of and the step size is decided by a function and the step size is determined by a function or and the step size is a function instead of dupont et al said that by dupont et al showed that by instead of which is not our main interest which is not our setting instead of neural odes have one critical drawback that it requires neural odes they require instead of step size is decided by an inverse function step size is an inverse function instead of because the average step size will decrease shouldnt it be because the average step size will increase instead of the auxiliary integrator selection network v is stabilized 
and we can deploy them the auxiliary integrator selection network we can deploy it confusing sentence which is our main experimental stage maybe delete it for clarity instead of in average on average instead of in the paper or in their github repository in the paper or in the respective github repository instead of it is note that it is worth noting that instead of the taskspecific loss is to maximize ie ltask the taskspecific loss is to maximize ie minimize ltask originality the novelty of the study is two fold 1 it proposes a regulariser to speedup the dopri ode numerical solver 2 it trains an auxiliary neural network to choose the most appropriate numerical solver for the neural ode between dopri fourthorder rungekutta rk4 and forward euler significance of the work the results suggest that the developed approach is a solid step towards developing faster neural odesdocsepthe authors make two suggestions in the context of neural odes 1 a regularization term based on the error estimate of an adaptive step size ode solver dormandprince 2 an auxiliary predictor to recommend an integrator to use for a specific sample based on minimizing the required number of function evaluations in the numerical integrator based on their suggestions the authors show that it is possible to obtain improved neural ode accuracy results at less computational cost for three tasks 1 mnist image classification 2 physionet mortality classification 3 continuous normalizing flows the paper can be significantly improved in two major areas 1 there is already important related work that the authors should take into account the paper learning differential equations that are easy to solve httpsarxivorgabs200704504 suggests the regularization of the kth order derivatives with respect to time based on the view of the taylor method integrator the higherorder derivatives with respect to time are an error estimate of the current time step and also reflect on the cost of computing the solution up to a certain accuracy the idea in the above paper very similar to the idea of the regularization of the error estimate of an adaptive step size solver such as dormandprince the authors say that in the dormandprince method the error is estimated by the difference between the fourthorder and the fifthorder rungekutta methods rungekutta methods use multiple of the previous function evaluations in order to extrapolate the solution of the next step the higher the order the higher the term of the taylor expansion that is estimated assuming the integrated function is differentiable up to the necessary order so the error estimation of the dormandprince method is related proportional to a higher derivative with respect to time and regularizing it is thus very similar to the more general idea in the above paper the authors could make their analysis more clear and relate it to the previous work in general the work would benefit from a clearer exposition about adaptive step size solvers and the smoothness of the ode at hand 2 principled reasoning and explanation of the auxiliary integrator recommendation system the purpose of an adaptive step size solver is already to make large steps where the integrated function allows this given the effort it takes to properly tune the auxiliary network architecture in a task specific way it is not clear to me that there is a truly general purpose advantage to quote the authors the neural network architecture for v should be carefully designed for each application furthermore the objective function of the 
auxiliary network is based on a discrete quantity number of function evaluations that is not differentiable with respect to the input as far as i can see the paper does not directly explain how this objective can efficiently be trained as gradients should not be available i do not recommend to accept the paper since the described large changes are required for the paper to become a serious contribution further recommendations give references to the claims made in the abstract already in the abstract even if the references follow in the text later especially for big statements like significantly reduce number of parameters that statement could also be improved by making it more quantitatively specific how much is the reduction define the term procrastinated in the context of neural odes the finding that a nontrivial percentage of cases can be solved with simple integrators seems to somewhat contradict the previous claim that advanced integrators have to be used also for simple in numerical analysis terms less stiff cases an adaptive solver should already use much fewer steps and hence number of function evaluations introduction two advantages neural odes can interpret the neural network layer as a continuous variable and a hidden vector at an arbitrary layer can be calculated why is this an advantage what is this useful for section 21 the statement it had been reported that approximating neural networks with differential equations can be done by many researchs can be read in two ways maybe find a different formulation table 1 why not also list wallclock time of inference as that is what we are truly interested in section 22 dopri is one of the most powerful integrators what do you mean by powerful how is that measured clearly explain the adaptive step size scheme of dopri instead of just saying inversely proportional to error if i evaluate with step size h1 and get error estimate e1 do i then choose h2 1 e1 how does that work exactly perhaps say something about the differentiability assumptions of the higherorder rungekutta methods perhaps differentiate between explicit and implicit euler method instead of just saying euler method implicit integrators are not as unstable for stiff problems but can require many more function evaluations since they perform a nonlinear system solve at every time step in equation 2 you could make more specific what range i is summed over section 32 solving for h0 we solve with h0 as initial data but we solve for htfinal how is the alpha in the exponent of the auxiliary loss chosen and for what reason docsepsummarizing the paper claims the paper addresses the question of reducing on average the number of function evaluations nfes for the forwardpass inference through the neural ode the proposed approach includes two main components the first one is a direct regularization of solvers dopri estimated errors during training the second one is an auxiliary neural network that is learned to predict which solver from the predefined set of solvers dopri and fixedstep rk4 euler should be used during inference for a given input sample the paper claims that these components and their combination yield to reduce nfes strong points the paper attracts attention to the fact that neural ode architecture shouldnt stick to the sole usage of the most powerful solver during inference depending on the input data less powerful solvers can be applied the proposed approach is evaluated on a variety of tasks image classification mortality prediction continuous normalizing flows weak 
points some important details concerning the experimental setup are omitted which makes it hard to correctly evaluate the benefits of the proposed approach and reproduce the results please see below for a wider explanation particularly the following points need to be clarified to understand the fairness of the provided comparisons 1 were the models from the same table table1table5table8 trained using the same random seeds neural ode performance can significantly depend on architecture initialization and hence using the same random seeds is required for a fair comparison 2 were the models from the same table table1table5table8 computed only once or provided data correspond to the mean value across several experimental runs if the mean is provided what is the corresponding standard deviation without knowing the standard deviation its not clear if there is a significant improvement of one method over another 3 how many steps of rk4 and euler are done during forward what are hyperparameters for dopri eg tolerance in the paper i didnt find an explanation of how the number of steps for fixedstep size solvers has been picked and how the tolerance for dopri has been set the quality as well as nfes and dise predictions can vary significantly depending on these parameters recommendation accept or reject for the current stage of the review i tend to reject the paper however i find the topic of the paper important to the neural ode community and will make the final score decision after the authors clarification on crucial experimental setups questions that would be helpful for understanding to see the test performance of pretrained with dopri neural ode when only rk4 or only euler is used will we observe the same behavior if we perform a comparison with adaptive methods of smaller order does the dise strategy to choose an appropriate solver outperforms the strategy when we randomly sample solver for the next input during inference if sampling uses the same probabilities as obtained with dise if uniform sampling is done what is a time overhead for the training using introduced regularization that would be nice to see plots nfesforward vs epochs wall clock time and nfesbackward vs epochs wall clock time dependence for different methods during training
### Summary: | this paper proposes two methods to speed up the evaluation of neural odes regularizing the ode to be easier to integrate and adaptively choosing which integrator to use these two ideas are fundamentally sensible but the execution of the current paper is lacking in addition to writing and clarity issues the main problem is not comparing to finlay et al the kelly et al paper could potentially be considered concurrent work i also suggest broadening the scope of the dise method to ode sde pde solvers in general in situations where many similar differential equations need to be solved amortizing the solver selection will be worthwhile even if there are no neural nets in the differential equation i also encourage the authors to do experiments that explore the tradeoffs of different approaches rather than aiming just for bold lines in tables | [
247, 1159, 285, 253, 3213, 1979, ... (input_ids, truncated) ] | [ 1, 1, 1, ... (attention_mask, truncated) ] | [ 247, 1159, 285, 253, 3213, 1979, ... (labels, truncated) ] |
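The record above asks how the Dormand–Prince step size follows from the error estimate (whether h2 is simply 1/e1). For an embedded Runge–Kutta pair the usual rule instead rescales the current step by a power of the tolerance-scaled error. The sketch below is a minimal illustration of that textbook controller only — the function name, safety factor, and clamping bounds are editorial defaults, not values taken from the reviewed paper or its code.

```python
def dopri_step_control(h, error_norm, order=4, safety=0.9,
                       min_factor=0.2, max_factor=10.0):
    """Classical step-size controller for an embedded RK pair (e.g. DOPRI 5(4)).

    `error_norm` is the error estimate already scaled by the tolerance, so a
    value <= 1 means the step is accepted.  The next step is the current one
    rescaled by safety * error_norm**(-1/(order+1)), clamped so it can neither
    shrink nor grow too aggressively; `order` is the order of the lower-order
    solution of the embedded pair (4 for Dormand-Prince 5(4)).
    """
    if error_norm == 0.0:
        factor = max_factor                      # error vanished: grow as much as allowed
    else:
        factor = safety * error_norm ** (-1.0 / (order + 1))
    factor = min(max_factor, max(min_factor, factor))
    accepted = error_norm <= 1.0                 # otherwise reject and retry with the smaller h
    return h * factor, accepted


# A step whose scaled error is 2.5 gets rejected and retried with a smaller h:
h_next, ok = dopri_step_control(h=0.1, error_norm=2.5)   # ok is False, h_next < 0.1
```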
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
authors propose a method that takes a video with multiple foreground objects as input, along with the corresponding background frames/image and rough segmentation masks for each object. it then outputs an alpha decomposition of the video where one layer corresponds to the background and the others to all visual effects from each object, so e.g. one layer includes a person plus their shadows and reflections. the method is trained for the task of missing-frame reconstruction and relies on inductive biases (ease of learning) to ensure the correct shadows etc. are paired with their respective objects, rather than any sophisticated constraints. synthetic data with plenty of shadows and reflections is used for training; at test time the method is further tuned to optimise the decomposition of a given video. it is demonstrated on both synthetic and real data; quantitative results on synthetic data are better than a recent baseline and qualitative results on real data look reasonable.

strengths:
- the proposed approach is novel
- the proposed method achieves significantly better quantitative results on synthetic data than a recent baseline
- the method is also demonstrated on real-world data, where it often achieves visually acceptable results
- the paper is clear and easy to read throughout; it is well structured and the figures are appropriate

weaknesses:
- the assumption of known background seems rather restrictive and makes the problem considerably easier
- given the background is assumed known a priori, what is the actual practical purpose/use of the proposed method? authors should identify this clearly in the introduction and add an experimental evaluation that shows performance on this task
- on real data the method is not significantly better than the baseline, though it is faster if one disregards training time
- there is no attempt to measure quantitative performance on real-world data; while i appreciate that this would require some manual annotation, it would only need to be a handful of frames from the few videos, indicating which regions are indeed moving with which object
- the technical contribution is small compared with the baseline [12], and also considering similar-in-spirit works such as video centrifuge, particularly given the method uses fairly standard architectures and doesnt have a strong justification for its own successes (see below)
- there is no convincing reason presented for the method to work: the correct result is one of several equivalent local optima (assigning shadows to arbitrary layers), and the arguments about schelling points do not resolve why the correct result is found, merely why the layers should find some arbitrary valid joint decomposition. it seems that the correct optimum is found simply because it is easier for the network to learn, but itd be nice to see a proper analysis of this; at minimum, does the model training ever converge to incorrect solutions, and for what fraction of training runs if so?
- the method assumes (i think) data is in srgb normalised to [0, 1] and that different object contributions can be combined by alpha blending. however this is not true in general: reflections should be treated as strictly additive in linear color space, and shadows darken surfaces rather than alpha-overlaying. this may limit applicability to scenes where the lighting and exposure are fairly well-behaved
- resolution is limited (only 128x128, even training/testing on tpus), thus limiting applicability in practice

there is minimal discussion of limitations; the paper would benefit from adding an explicit subsection for this. there is adequate discussion of broader impacts.

[second review] the goal of this work is to decompose videos into different layers, for example objects of interest and their shadow, reflection and other visual effects. it is a challenging problem due to the complexity of the 3d geometry and lighting conditions in the real world, as well as the difficulty of getting the ground truth. this paper proposes a self-supervised method to solve this problem; they borrow the idea from game theory and train networks to achieve this focal point. the experiments show the effectiveness of their design choice.

strengths:
- the idea of using a focal point is interesting and reasonable
- this method achieves promising visualization results for video decomposition
- the paper is easy to follow and the method is demonstrated in detail

weaknesses:
- object number: in this paper the authors only show the results of 2 or 4 object scenarios. i am not sure this method can handle scenarios with arbitrary objects; it might limit the generalization ability of this method
- it is better to show how this network achieves this focal point

please refer to the weaknesses.

[third review] this paper presents a novel frame for video layer decomposition, where they borrow the idea from the game theory concept of focal points to frame this problem as a coordination game and let the networks reach consensus on their predictions.

strengths:
- the presented idea is both novel and interesting
- the paper is well written and easy to read
- extensive experiments are conducted and improved results are shown

weaknesses:
- the results on the real dataset are a little bit worse than the ones on synthetic data
- the presented method might fail with more complex scenes with heavy occlusion

[fourth review] given a short video with some moving objects and a rough mask for each object, this paper tackles the problem of generating a per-object color and alpha mask for each object, containing all the effects on the image caused by that object, including e.g. shadows and reflections. this paper achieves this via a network which plays a coordination game: each copy of the network is supplied with a different object's input mask and attempts to reconstruct the mask and pixels corresponding to this object. the network is trained via a self-supervised reconstruction loss.

originality: the problem being tackled is not original, but the proposed solution is, to the best of my knowledge, novel and interesting.

quality: really nice to see experiments with real data rather than just relying fully on synthetic experiments. i would have liked to have seen more simple heuristic-driven baselines; for example, given the videos are from a fixed camera, i could imagine using simple median differencing to find a background image for each video, then this background image could be used to find per-frame per-pixel differences from the background image (these must mostly be due to effects of foreground objects), and finally each of these pixels can be associated to the foreground masks via e.g. nearest neighbour assignment (a sketch of this baseline is given after this review). i dont expect this simple heuristic to beat the proposed approach, but it might give better context to the numbers. i would also have liked to see more ablations on the components of the algorithm; for example, what would happen if wt was set to ones everywhere? what about varying the hyperparameters in line 158, 2*sigmoid(5x)?

clarity: the overall writing of the paper is clear and easy to follow. i enjoyed section 3.5, an explanatory view of the model being used. i was very pleased to see the video results; i felt these made the overall system and quality of results clear. i would have liked a little more justification of the formulation of wt (line 155); the idea of this is to give the network more emphasis on reconstructing areas outside the mask of the current object.

significance: the problem tackled is an interesting one and the authors propose a thought-provoking solution. i hope that the paper shows other researchers that this solution of multiple identical networks playing a coordination game can be of use in these types of scenarios.

yes, i think they have.
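the heuristic baseline suggested in the fourth review is concrete enough to sketch. the snippet below is only an illustration of that suggestion, not anything from the reviewed paper: the array shapes, the 0.05 change threshold and the centroid-based nearest-neighbour assignment are all assumptions made for the sketch.

```python
import numpy as np

def heuristic_layer_baseline(video, masks):
    """Median-background heuristic from the review (illustrative only).

    video: float array (T, H, W, 3), frames from a fixed camera, values in [0, 1].
    masks: float array (T, K, H, W), rough per-object masks in [0, 1].
    Returns binary per-object effect masks of shape (T, K, H, W).
    """
    T, H, W, _ = video.shape
    K = masks.shape[1]

    # 1. Fixed camera: the per-pixel temporal median is a background estimate.
    background = np.median(video, axis=0)                      # (H, W, 3)

    # 2. Per-frame, per-pixel difference from the background; large differences
    #    are mostly caused by foreground objects, their shadows and reflections.
    diff = np.linalg.norm(video - background[None], axis=-1)   # (T, H, W)
    changed = diff > 0.05                                      # threshold is a guess

    # 3. Assign each changed pixel to the "nearest" object, here crudely via
    #    distance to the mask centroid (a simple nearest-neighbour assignment).
    ys, xs = np.mgrid[0:H, 0:W]
    effect_masks = np.zeros((T, K, H, W), dtype=np.float32)
    for t in range(T):
        centroids = []
        for k in range(K):
            m = masks[t, k]
            w = m.sum() + 1e-8
            centroids.append(((ys * m).sum() / w, (xs * m).sum() / w))
        d = np.stack([(ys - cy) ** 2 + (xs - cx) ** 2 for cy, cx in centroids])
        owner = np.argmin(d, axis=0)                           # (H, W)
        for k in range(K):
            effect_masks[t, k] = changed[t] & (owner == k)
    return effect_masks
```

as the reviewer notes, such a baseline is unlikely to beat a learned decomposition, but it would put the reported numbers in context.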
### Summary:
all reviewers found that the paper provides a novel, interesting solution and is well written. they appreciated that the proposed method outperforms prior work on synthetic experiments and shows reasonable results on real data; the video results were particularly helpful in judging the results. the majority of the reviewers were concerned about the convergence of the proposed coordination game to the correct solution; while the authors provided some empirical evidence, a more formal analysis could alleviate concerns much more easily and would provide a strong justification for the proposed method. the requests by reviewers for more ablations, simple heuristic baselines, and quantitative results on real data were simply ignored by the authors; this does not induce confidence that any of these requests will be addressed in a final version.
Below is a review of a research paper from a conference/journal. Please write a summary of the review.
### Review:
the idea of creating an intermediate model of the eye for tracking is an interesting one. the weakest part of the paper seems to be the evaluation and the lack of real-world testing; with such validation the paper would be much more compelling and much stronger. some specific questions:
- you say you are doing 82 above sota; if you could convert this into a real-world number (e.g. x degrees tighter track)?
- it seems like you are testing 9 possible gaze extrema, but this doesnt give a really good idea of how well tracking would occur in a real-world application
- is your statement about too much disk space and slow interface speeds a factor of the hardware you are running on? is it something moores law will address, or is there something fundamental to the problem?
- you are modeling occlusion as a random stipple noise pattern; this is not really accurate, is it? occlusion is typically a group of pixels, and the random noise is actually the best possible case for reconstruction. did you test with larger dropout regions? (the two kinds of masks are illustrated after this review)
- in your efficiency arguments: frame-to-frame tracking is typically a perturbation problem; how do you perform when you know the last frame?

[second review] the authors present a novel approach to reconstructing the complete eye region from a noisy partial eye scan. they describe the shortcomings in existing rgb-based approaches and point out that 3d semantic surface modeling can lead to a more practical use case, i.e. gaze estimation. the authors describe their methods and evaluation in sufficient detail and show that they achieve excellent performance for both tasks. they also propose a simple way to build a dataset for semantic completion of eyes based on unityeyes meshes. the manuscript is relatively easy to read and makes arguments relatively well. they present evaluation results that are compelling in terms of performance and time and, in the case of gaze estimation, accuracy. overall i found the manuscript to be very good.
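the first reviewer's occlusion comment contrasts random per-pixel dropout with structured occlusion. the toy snippet below is not from the paper (the paper works on eye scans, here a plain 2d grid is used for simplicity); it only illustrates the two kinds of masks the reviewer has in mind.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stipple_mask(shape, drop_frac=0.2):
    """Drop individual pixels at random (the easy case for reconstruction)."""
    return rng.random(shape) > drop_frac                 # True = keep

def block_occlusion_mask(shape, block_frac=0.2):
    """Drop one contiguous rectangular region (closer to real occlusion)."""
    h, w = shape
    bh, bw = int(h * np.sqrt(block_frac)), int(w * np.sqrt(block_frac))
    top = rng.integers(0, h - bh + 1)
    left = rng.integers(0, w - bw + 1)
    mask = np.ones(shape, dtype=bool)
    mask[top:top + bh, left:left + bw] = False
    return mask
```

evaluating reconstruction under block occlusion, rather than stipple noise alone, would address the reviewer's concern about the best-case nature of random dropout.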
### Summary:
this work proposes a neural point-cloud architecture for reconstructing the eye geometry based on noisy partial observations. reviewers have found the problem addressed interesting and the experimental results relatively compelling. some of the reviews raised several questions that can be adequately addressed in the camera-ready version. overall the paper has received positive feedback that suggests acceptance.
Below is a review of a research paper from a conference/journal. Please write a summary of the review.
### Review:
the paper proposes a generative model for images which explicitly separates the within-class variation (the covariant part) from the across-class variation (the invariant part). functionally this achieves a similar result as various recent works on incorporating invariances in neural nets, but the fact that it is able to explicitly construct models for both parts of the distribution is nice. results on mnist are good, but of course this is a very simple dataset; it would be very interesting to see how the model performs on a more realistic problem. admittedly i am not an expert in generative models. this is a clean paper with a clear goal; it is hard for me to judge how original the idea is. "covariant" might not be the best word to use here, because it has a very specific meaning in the context of some other neural networks, related to how quantities transform according to representations of a symmetry group; this is a potential source of confusion.

[second review] this paper is well written and the quality of the figures is good. in this paper the authors propose an invariant/covariant idea, which should be dated back at least to the bilinear models. the general direction is important and should be pursued further. however, the literature is not well addressed: eslami et al. 2018 have been cited, but some very important and related earlier works, like [1] kulkarni et al. 2015, deep convolutional inverse graphics network, and [2] cheung et al. 2015, discovering hidden factors of variation in deep networks, were not discussed at all. the authors should certainly make an effort to discuss the connections and new developments beyond these works. at the end of section 1 the authors argue that the covariant vector could be more general, but in fact these earlier works can achieve further equivalence, which is much stronger than the proposed covariance. there is also an effort to compare this work to sabour et al. 2017 and the general capsule idea. i would like to point out that the capsule concept is a much more fine-grained what/where separation, rather than a coarse-grained class/pose separation in one shot; in a hierarchical representation, what/where can appear at any level, as one class can consist of several parts, each with a geometrical configuration space. so the comparison of this work to the generic capsule network is only superficial if the authors can not make the proposed architecture into a hierarchical separation. besides different capsule network papers, i found another potentially useful reference on a fine-grained separation: [3] goroshin et al., learning to linearize under uncertainty. in the paper it is argued several times that the latent vector ry contains a rich set of global properties of class y rather than just its label, and the aim is that it can learn what the elements of the class manifold have in common; but this point is not supported well, since we can always make a label and this latent vector ry equivalent by a template. i think this point could be meaningful if we look at ry's for different y, where each of the dimensions may have some semantic meaning; additional interpretation is certainly needed. under equation 3, note that v is inferred from ry; it should be inferred from both ry and x, which is pretty clear from fig. 5 (a generic sketch of this is given after the review). related to this, i could imagine some encoder can extract the style directly from x, but here both ry and x are used; i couldnt find any guarantee that v only contains the style information based on the architecture, even with this additional complication. could the authors comment on this? equation 5 is not really a marginalization, and further, equation 6 may not be a lower bound anymore; this is probably a relatively minor thing and a little extra care is probably enough. the numbers in table 2 seem a little outdated. to conclude, i like the general direction of separating the identity and configurations; natural signals have hierarchical structures, and the class manifold concept is not general enough to describe the regularities and provide a transparent representation, rather it is a good starting point. if the authors could carefully address the related prior works and help us understand the unique and original contributions of this work, this paper could be considered for publication.

[third review] the paper presents a vae that uses labels to separate the learned representation into an invariant and a covariant part. the method is validated using experiments on the mnist dataset. the writing in this paper is somewhat problematic: although it is hard to put the finger on a particularly severe instance, the paper is filled with vague and hyperbolic statements; words like "efficiently", "meaningful", "natural" etc. are sprinkled throughout to confer a positive connotation, often without having a specific meaning in their context or adding any information. where the meaning is somewhat clear, the claims are often not supported by evidence; sometimes the claims are so broad that it is not clear what kind of evidence could support such a claim. a relatively large amount of space is used to explain the general concept of invariant/covariant learning, which as a general concept is widely understood and not novel. there are other instances of overclaiming, such as "the goal of covae is to provide an approach to probabilistic modelling that enables meaningful representations"; in fact covae is a rather specific model class rather than an approach to probabilistic modelling. the paper is at times meandering; for instance, the benefits of and motivation for the proposed approach are not simply stated in the introduction and then demonstrated in the rest of the paper, but instead the paper states some benefits and motivations, explains some technical content, mentions some more benefits, repeats some motivations stated before, etc. many researchers working on representation learning hope to discover the underlying learning principles that lead to representations that seem natural to a human being; in this paper, labels are used to guide the representation into the right representation. it is in my opinion not very surprising that one can use labels to induce certain qualities deemed desirable in the representation. to conclude: because of the writing, limited novelty and limited experiments, i think this paper currently does not pass the bar for iclr.
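the second reviewer's comment on equation 3, that the covariant code v should be inferred from both ry and x, can be made concrete with a generic sketch. this is not the paper's actual architecture; the layer sizes and names below are placeholders chosen only to illustrate the conditioning the reviewer describes.

```python
import torch
import torch.nn as nn

class StyleEncoder(nn.Module):
    """Generic amortised inference network q(v | x, r_y).

    The point being illustrated: the "style" code v is conditioned on the
    image x *and* the class representation r_y, not on r_y alone.
    """

    def __init__(self, x_dim=784, r_dim=64, v_dim=16, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + r_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * v_dim),   # mean and log-variance of q(v | x, r_y)
        )

    def forward(self, x, r_y):
        h = self.net(torch.cat([x, r_y], dim=-1))
        mu, logvar = h.chunk(2, dim=-1)
        # reparameterisation trick
        v = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        return v, mu, logvar
```

note that, as the reviewer observes, conditioning on both inputs by itself gives no guarantee that v captures only style information; that separation has to come from the rest of the model and the training objective.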
### Summary:
the paper presents a new approach to learn separate class-invariant and class-equivariant latent representations by training on labeled and, optionally, additional unlabelled multi-class data. empirical results on mnist and svhn show that the method works well. reviewers initially highlighted the following weaknesses of the paper: insufficient references and contrasting with related work, given that this problem space has been much explored before; limited novelty of the approach; limited experiments (mnist only). one reviewer also mentioned a sometimes vague, overly hyperbolic and meandering writeup. the authors did a commendable effort to improve the paper based on the reviews, adding new references, removing and rewriting parts of the paper to make it more focused, and providing experimental results on an additional dataset (svhn). the paper did improve as a result, but, while attenuated, the initial criticisms remain valid: the literature review and discussion remains short and too superficial; the peculiarities of the approach, which grant it modest originality, are insufficiently theoretically and empirically justified and not clearly enough put in context of the whole body of prior work; consequently the proposed approach feels very ad hoc. finally, the additional experiments are a step in the right direction, but experiments on only mnist and svhn are hardly enough in 2018 to convince the reader that a method has universal potential and is more generally useful. given the limited novelty and in the absence of theoretical justification, experiments should be much more extensive, both in the diversity of data/problems and in the range of alternative approaches compared to, to build a convincing case.
3884,
273,
23694,
253,
6489,
285,
16012,
253,
3626,
6298,
452,
24498,
5289,
285,
253,
966,
16751,
4473,
310,
417,
2087,
2217,
281,
6266,
253,
3963,
1005,
285,
2085,
247,
13955,
6779,
2581,
697,
247,
1175,
4983,
1127,
604,
253,
4477,
812,
9257,
2953,
253,
2905,
2720,
2987,
285,
1361,
441,
2096,
253,
4451,
285,
3236,
9021,
273,
436,
789,
436,
2929,
812,
320,
2783,
323,
9311,
7152,
339,
431,
248,
2929,
10262,
247,
362,
3348,
326,
4648,
13301,
281,
4858,
253,
6311,
6779,
715,
271,
13727,
285,
247,
43359,
629,
253,
1332,
310,
17618,
970,
4679,
327,
253,
278,
79,
382,
10895,
50276,
783,
4028,
275,
436,
2929,
310,
8489,
20276,
3738,
352,
310,
1892,
281,
1691,
253,
9185,
327,
247,
3782,
5460,
4227,
253,
2929,
310,
6898,
342,
21248,
285,
28095,
7234,
3000,
751,
14556,
14282,
3626,
3966,
403,
8689,
34269,
4768,
281,
32039,
247,
2762,
345,
25604,
2223,
1293,
1907,
247,
2173,
4495,
275,
616,
3634,
390,
6240,
667,
1491,
835,
253,
4495,
310,
8489,
2590,
253,
3916,
403,
2223,
417,
4516,
407,
1941,
4536,
253,
3916,
403,
594,
3862,
326,
352,
310,
417,
2590,
752,
2238,
273,
1941,
812,
1329,
824,
247,
1750,
50276,
66,
4942,
1781,
2408,
273,
2317,
310,
908,
281,
5513,
253,
2087,
4473,
273,
13727,
31485,
6410,
4715,
534,
347,
247,
2087,
4473,
310,
7561,
7192,
285,
417,
4460,
627,
403,
643,
10872,
273,
689,
43759,
824,
347,
253,
4736,
273,
9383,
3348,
310,
281,
2085,
271,
2746,
281,
37851,
26278,
326,
13276,
14282,
14237,
50276,
249,
958,
9383,
3348,
310,
247,
2581,
2173,
1566,
2437,
2581,
685,
271,
2746,
281,
37851,
26278,
50276,
783,
2929,
310,
387,
2069,
479,
48299,
323,
4227,
253,
5373,
273,
285,
16038,
323,
253,
4081,
2746,
403,
417,
3365,
4767,
275,
253,
10199,
285,
840,
5183,
275,
253,
1551,
273,
253,
2929,
533,
3185,
253,
2929,
3054,
690,
5373,
285,
42852,
11424,
690,
7681,
2600,
25957,
690,
625,
5373,
24510,
690,
42852,
4767,
1078,
3966,
50276,
20415,
8607,
2444,
327,
6779,
4715,
3524,
281,
9413,
253,
6944,
4715,
9241,
326,
1421,
281,
14237,
326,
1646,
3626,
281,
247,
1966,
1146,
275,
436,
2929,
13301,
403,
908,
281,
7102,
253,
6779,
715,
253,
987,
6779,
352,
310,
275,
619,
4743,
417,
1077,
10084,
326,
581,
476,
897,
13301,
281,
10808,
2176,
18701,
14320,
11408,
275,
253,
6779,
50276,
936,
7525,
984,
273,
253,
4028,
3710,
38135,
285,
3710,
4679,
891,
1158,
436,
2929,
4390,
1057,
417,
1509,
253,
2534,
323,
17857,
32888,
187,
187,
4118,
18435,
27,
783,
2929,
10262,
247,
747,
2746,
281,
3037,
4858,
966,
25168,
285,
502,
284,
2346,
400,
6410,
21624,
14237,
407,
3733,
327,
13130,
285,
15266,
3081,
440,
47728,
4471,
966,
941,
16774,
1543,
327,
278,
79,
382,
285,
18504,
13107,
921,
326,
253,
1332,
2987,
973,
30628,
8523,
16318,
253,
1563,
32213,
273,
253,
2929,
12497,
10414,
285,
42455,
342,
2905,
789,
1677,
326,
436,
1895,
2317,
556,
644,
1199,
14859,
1078,
50276,
15870,
38135,
273,
253,
2746,
3710,
4679,
278,
79,
382,
760,
581,
37317,
671,
5393,
247,
4536,
21248,
27662,
28095,
285,
479,
48299,
3630,
484,
50276,
43355,
858,
247,
49638,
494,
3434,
281,
3157,
253,
2929,
1754,
327,
253,
10123,
6240,
747,
10414,
11922,
285,
294,
17695,
4243,
273,
253,
2929,
281,
1056,
352,
625,
7106,
285,
5277,
5661,
1543,
327,
271,
3081,
10895,
18504,
13107,
253,
2929,
858,
3157,
347,
247,
906,
533,
1223,
26513,
253,
3302,
43680,
3464,
3588,
253,
6239,
2278,
285,
5955,
4558,
2159,
285,
1512,
28019,
253,
19532,
1005,
273,
253,
2746,
534,
4098,
352,
16453,
3236,
414,
403,
12497,
314,
28055,
285,
45190,
17285,
285,
417,
4518,
2217,
1691,
275,
3634,
273,
253,
2644,
2133,
273,
2720,
789,
17912,
253,
4081,
2746,
9193,
1077,
519,
37806,
4720,
253,
3081,
4679,
403,
247,
3213,
275,
253,
987,
3884,
533,
4679,
327,
760,
278,
79,
382,
285,
18504,
13107,
403,
10693,
2217,
275,
4765,
281,
18578,
253,
9414,
326,
247,
1332,
556,
247,
10898,
2442,
285,
310,
625,
3839,
4217,
1677,
253,
3710,
38135,
285,
275,
253,
5928,
273,
10527,
22861,
4679,
943,
320,
1199,
625,
9470,
1097,
275,
9991,
273,
2856,
522,
287,
23042,
285,
275,
253,
2491,
273,
5795,
7274,
2429,
281,
281,
1973,
247,
21414,
1083,
209
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the reliance of deep learning models on spurious correlations in particular the authors look at the quality of feature representations learned by models trained via erm versus models trained using group robustness methods they evaluate these feature representations by utilizing the deep feature retraining dfr procedure retraining the last layer of the model on a heldout set which would likely not contain spurious correlations present in the training set this procedure helps reveal how much information about causal factors is present in the learned representations the authors further explore how these feature representations are influenced by the model architecture pretraining task and strategy regularization via weight decay data augmentation training length and whether or not model has been trained on the target data they find that the quality of these representations often depends heavily on choice of data augmentation model architecture and the pretraining strategy involved while regularization and training time may not be as helpful in improving the quality of said representations the authors share results on celeba waterbirds wildsfmow cxr multinli and civilcomments datasets overall the paper presents interesting results and insights i have some minor comments which i hope the authors will address during the response period strengths this work looks at an interesting and well motivated problem the experimental setup is well designed and results offer insights that would be useful to future researchers the paper is well written and organized weaknesses the experiments on nlp datasets are based on a bert model while i understand that the goal here is not to create a state of the art model but to analyze representations learned by a model significantly better models deberta ernie t5 etc are out there that the authors could have used there are several works in nlp that have looked at the problem of spurious correlations 1234 are just a few examples addressing them and understanding when models weigh causal features vs non causal features the paper currently does not position itself well in that literature additional comments section 3 preliminaries lines 99102 this appears to be incorrect in light of 1 in fact as most machine learning tasks are anticausal models will rely on spurious correlations regardless as 1 show in their anticausal setup a model will rely on spurious factors most of the time unless the spurious features observe higher noise compared to the causal features 1 kaushik d setlur a hovy e h lipton z c explaining the efficacy of counterfactually augmented data iclr 2021 2 eisenstein j 2022 uninformative input features and counterfactual invariance two perspectives on spurious correlations in natural language arxiv preprint arxiv220404487 3 veitch v damour a yadlowsky s eisenstein j counterfactual invariance to spurious correlations in text classification neurips 2021 4 kaushik d hovy e lipton z learning the difference that makes a difference with counterfactuallyaugmented data iclr 2020 as this is an analysis paper it is hard to understand the limitations and potential negative social impact but i would urge the authors to think about potential negative impacts arising from misinterpretation of their analysis docsepthis paper considers deep learning in the common case where the training data contains spurious correlations the main takeaway is that empirical risk minimization alone is sufficient to obtain stateoftheart performance specialized group 
robustness methods do not appear to provide a significant benefit this is demonstrated on six datasets spanning both vision and text problems the effect of the architecture pretraining strategy and regularization is also considered spurious correlations are a concern when fitting neural networks so this paper tackles an important problem overall i found the presentation to be quite good and the experiments fairly convincing my main issue with the work in its current form is that the scope and therefore potential impact of the work is more limited than the title and introduction suggest the spurious correlations studied are labeled properties of the inputs rather than latent spurious features in most cases one does not have access to labeled attributes or even class labels when fitting a neural network and therefore this work has a narrower scope than expected this being the case i think it is notable that specialized group robustness methods appear to perform no better than erm when it comes to learning the in the presence of spurious correlations in addition the empirical observations regarding regularization and other effects of the base model are interesting although many of them rely on dfr which afaik is not peer reviewed the dfr procedure is somewhat similar to whats done in contrastive learning eg supervised contrastive learning except there the second stage is performed on the original dataset this often results in improved performance thanks to the contrastive objective im curious how supervised contrastive learning would impact the results both using the original dataset or the reweighted one i would have liked to see the analysis of pretraining to include additional experiments with text bert a smaller concern is that this paper leans heavily on deep feature reweighting dfr which appears in a recent arxiv preprint kirichenko et al 22 unfortunately reading that preprint is necessary to understand this work reading the short description in s3 was not sufficient to follow along it would be better to make this paper selfcontained especially given how simple the dfr idea is eg define the reweighting dataset i would have liked to see the scope of the paper defined a bit more clearly in addition while both vision and text datasets are used the majority of the experiments are only on the image datasets docsepthis is primarily an empirical paper about learning in the presence of spurious correlations they run a lot of tests that show training a model with empirical risk minimization erm followed by deep feature reweighting dfr which is retraining last layer on some on a heldout set that doesnt have spurious correlations yields results that are not too different from group robustness training group dro thanks to the authors for the hard work on this paper i like it overall and think it is a valuable contribution strengths a large amount of insightful experiments clear writing sensible comparisons and conclusions weaknesses the conclusion of this paper hinges on the idea that if two different methods get roughly the same heldout performance then they must be learning the same kind of thing i dont think thats necessarily true and your experiments dont really prove it the conclusion in lines 207211 could be made stronger with other types of analyses to have more certainty its that better weighting of the learned features rather than learning better representations of the core features you could for example actually extract the representations learned from ermdft and gdro and see if they look 
similar this experiment could be tricky but if you make sure to have identical weight initialization and run the experiment 10 times you could show that the feature spaces learned are really similar or not why is early stopping important for rwy rwg and gdro are there some simple experiments you could do to elucidate this not required here
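As a hedged illustration of the two procedures the reviewers keep returning to — retraining only the last layer on held-out features (the DFR idea) and checking whether two trained backbones produce similar feature spaces — here is a minimal sketch. It assumes feature matrices have already been extracted from frozen backbones; the scikit-learn classifier, the linear-CKA similarity measure, and all variable names are illustrative choices by the editor, not the implementation used in the paper under review.

```python
# Minimal sketch (not the paper's code): DFR-style last-layer retraining on
# held-out features, plus a linear-CKA check of how similar two feature
# spaces are. Inputs are (n_examples, feature_dim) NumPy arrays.
import numpy as np
from sklearn.linear_model import LogisticRegression


def retrain_last_layer(feats_holdout, y_holdout, feats_test, y_test):
    """Fit a fresh linear head on frozen held-out features; report test accuracy."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(feats_holdout, y_holdout)
    return clf.score(feats_test, y_test)


def linear_cka(x, y):
    """Linear CKA similarity between two feature matrices over the same examples."""
    x = x - x.mean(axis=0, keepdims=True)
    y = y - y.mean(axis=0, keepdims=True)
    num = np.linalg.norm(y.T @ x, "fro") ** 2
    den = np.linalg.norm(x.T @ x, "fro") * np.linalg.norm(y.T @ y, "fro")
    return num / (den + 1e-12)
```

Under the experiment suggested above, a high CKA between the ERM and group-DRO feature spaces would support the reading that the difference lies mainly in the last-layer weighting, while a low value would point to genuinely different representations.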
### Summary: | the paper shows that empirical risk minimization is sufficient to obtain good worstgroup accuracies and specialized group robustness methods do not appear to provide additional benefits the reviewers pointed out that the current work depends on dfr which seems to require some additional data compared to group robustness methods the reviewers also note that the nlp experiments did not use more recent models and the authors addressed these issues generally the reviewers think this is a wellexecuted paper on an important problem and are unanimous in accepting it | [
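The decision above is framed around worst-group accuracy; as a brief hedged aside, this is how that metric is typically computed (group labels on the evaluation split are assumed to be available, which, as one reviewer notes, is not always the case in practice):

```python
import numpy as np


def worst_group_accuracy(preds, labels, groups):
    """Minimum per-group accuracy over all groups present in the evaluation split."""
    accs = [(preds[groups == g] == labels[groups == g]).mean()
            for g in np.unique(groups)]
    return min(accs)
```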
… (input_ids, attention_mask, and labels token-ID arrays omitted — tokenized duplicates of the review and summary text above) ] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper focuses on the understanding of the effects of adaptive learning rate and momentum in particular it proves that the adaptive learning rate can escape saddle points efficiently and cannot select flat minima as sgd does it also shows that momentum helps the training process by passing through saddle points and without affecting the minima selectionthe paper also proposes a new adaptive algorithm named adai algorithm 2 which uses parameterwise adaptive intertia to accelerate the training and finds flat minima as well as sgd finally the paper provides extensive numerical testing showing the benefits of adai i believe that the main ideas of the paper are interesting however i find that the presentation of this work is not very clear and somehow confusing in particular the structure of the paper and the results presented in sections 2 and 3 are difficult to absorb see my comments below i understand the motivation of the authors and what they tried to communicate but i find that there is no satisfactory explanation of the results presented in sections 2 and 3 the authors assumed that the reader is familiar with the closely related recent work on the sgd diffusion theory and do not provide enough details on the framework for example they use terminology like fokkerplanck equation divergence operator and diffusion matrix that are not really standard in the area of adaptive methods in addition assumption 1 on the second order taylor approximation near critical points is given without providing some interesting problems where it is satisfied also in section 3 assumptions 2 and 3 are used without further explanation of what exactly they mean the authors provide a few details in the appendix on what is the quasiequilibrium approximation and low temperature approximation but this is not sufficient how these assumptions are related to standard concepts in the area what are the mathematical expressions of these assumptions how are related to stochastic gradients and the noise also i find it a bit surprising that there is no formal presentation of the problem that we are interested to solve and the assumptions that one requires to be able to prove convergence a statement of the minimization or maximation problem under study with the main assumptions is missing from the paper one of the most important contributions of the paper is the analysis of the new algorithm adaptive inertia optimization adai proposed in section 5 however if one focuses on theorem 4 which provides the convergence guarantees of the adai it is clear that the analysis hold under very strong conditions assumptions for example the authors assumed both bounded variance and bounded gradient of the objective function which rarely hold in practical scenarios note that these conditions have already been proved to contradict special classes of nonconvex problems like functions satisfying the polyaklojasiewicz condition the combination of these assumptions lead to an empty set of problems thus the theorem cannot hold for all nonconvex smooth problems as i mentioned in the main review i believe that the main ideas of the paper are interesting however i find that the presentation of this work is not very clear and somehow confusing in particular the structure of the paper and the results presented in sections 2 and 3 are difficult to absorb see my comments below docsepthis paper disentangles the effects of adaptive learning rate and momentum in adam learning dynamics and proves that adaptive learning rate is good at escaping saddle points 
but not good at selecting at minima while momentum helps escape saddle point and matters little to escaping sharp minima based on the analysis the authors propose a novel optimizer adai compared to sgdm adai parameterwisely adapts the momentum hyperparameter to the approximated hessians of saddle points and is proved to fast escape saddle points and sharp minima the theoretical analysis of the adam optimizer is based on the sgd diffusion theory and the results confirms and explains the observation that adam can sometimes converge faster but generalize worse than sgdm the proposed adai optimizer is theoretically sound and demonstrates slightly better generalization performance than sgdm and significantly better than adam on image classification tasks despite estimating the moments in a similar way as adam the proposed adai optimizer seems more akin to sgdm with the only difference being its adaptive momentum and it doesnt use adaptive learning rates which is a main feature of adam moreover as shown in figs 1 to 3 and 10 the training curves of adai very much resemble those of sgdm therefore to further improve this work more comparisons should be made between adai and sgdm rather than between adai and adam in particular it would be interesting to see if the performance gap between adai and sgdm results from faster convergence as suggested by the theory and therefore a convergence comparison between adai and sgdm as the one conducted between adai and adam fig 11 should be helpful this paper provides new insights into the performance of adam and proposes a novel optimizer that both converges fast and generalizes well further improvements can be made by comparing the proposed method to sgdm more thoroughly docsepthis work analyzes the dynamics of momentum sgd and adam on escaping saddle points and sharp minima which is based on the diffusion theoretical framework proposed in xie et al 2021b the authors prove that momentum provides a drift effect around saddle points and does not affect flat minima selection for sgd and while adam escapes saddle points efficiently it does not favor flat minima as well as sgd the analysis explains some empirical observation of sgdm and adam motivated by the analysis the authors propose adaptive inertia adai method which can approximately achieve hessianindependent momentum drift escapes saddle points fast and favors flat minima as well as momentum sgd this paper is generally wellwritten the diffusion theoretical analysis does provide some insight on the empirical performance of momentum sgd and adam the authors also put in efforts to conduct numerical verifications to their theoretical statements which is highly appreciated however i think that this work does not completed disentangle the effects of adaptive learning rate and momentum since the work analyzes adam which fuses these two algorithmic components it would be better to discuss the effect of each component in adam separately probably by setting some parameters to zero the authors then propose adai which achieves approximately hessianindependent momentum drift without damaging flat minima selection btw the proof of proposition 3 is missing if it is a direct consequence of theorem 2 it is better to mention it somehow the construction of adai is interesting and its effectiveness is justified by the empirical experiments however it seems to me that this contribution is a bit disconnected to the main story as adai does not use adaptive learning rate some revision probably changing the title might be good to 
make the story clearer and more fluent typos missing reference on page 7 note that an existing adaptive momentum method the last sentence better than popular adam and sgd this work provides some new theoretical insights for momentum sgd and adam which are interesting and important the authors then propose adaptive inertia based on the insights which shows good performance some revision is needed to make the story clearer see main review docsepthis paper studies the behaviors of some algorithms when the iterate is at a critical point via sdes a variant of adam is given in the end the paper draws some conclusions about adaptive learning rate and momentum claiming that quote momentum matters little to escaping sharp minima unquote quote adaptive learning rate is not good at selecting flat minima unquote here are some of the questions and concerns from my perspective the theoretical analysis is not rigorous and skips a lot of details for example in section 4 page 5 the paper claims that the continuoustime dynamic of adam can be written as equation 6 however a detail derivation to support the claim and the proof of showing that the sde is a valid one for adam are missing in the paper indeed it can be seen from page 5 where the authors said they are analyzing an idealized adam it is not clear why this is an idealized one and the writing is like hiding something under the rug we can see that some statements about adam are made but these conclusions are drawn from a sde which does not really correspond to adam on the other hand the sde of adam does exist in the literature eg httpsarxivorgpdf201005627pdf if this paper really wants to claimargue something about adam then a careful analysis regarding the discretization error between the solution of the proposed sde and the discretetime adam should be provided in the paper for another example on page 3 it states quote the diffusion matrix d is independent of theta near critical points unquote but the proof is not provided how is it independent it looks like some approximations were used there the paper should prove the independence another concern is about 8 the paper should provide a detailed derivation of showing that 8 is really about how the distribution evolves when the underlying dynamic is 7 currently the equation 8 is like jumping out of nowhere how does assumption 1 help to show 8 also some places in the paper are not clear a second paragraph on page 1 quote all previous works have not touched the saddlepoint escaping property of the dynamics unquote apparently there are quite a few works regarding saddlepoint escaping by sgd sgd with momentum and adam the authors might want to explain what they meant here b there are two approximations on 3 but it would be more helpful to explain how the approximations are made in detail there are some descriptions below 3 but are not very clear c last paragraph in section 3 quote the momentum does not affect flat minima selection in terms of the escape time unquote this is another confusing statement what does momentum does not affect flat minima selection mean does it mean momentum and sgd without momentum converge to the same point what is the definition of flat minima d second to last paragraph in section 4 quote adam has logtauohae12 sgd and sgdmomentum both have logtau ohae1 unquote it seems that the conclusion right below this sentence would be reversed if hae1 something wrong e theorem 1 and 2 the authors show some guarantees about the variance at time t when the iterate of the algorithm follows the 
gaussian distribution but does a higher variance of the gaussian distribution imply a faster saddle point escape the authors might want to add some discussions about the connection to the notion of saddle point escape in the literature eg jin et al 2017 f after reading the paper i am still not sure how the effect of the learning rate and momentum was disentangled in the analysis i see some analysis about the behavior of sgd adam sgdmomentum at critical points it would be more helpful if the authors can explain why the effect of the learning rate and momentum can be isolated the presentation and statements are confusing in my opinion some steps in the analysis are not transparent
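To make the adaptive-inertia idea debated above concrete, here is a hedged sketch of one plausible parameter-wise update of this kind: Adam-style second-moment statistics modulate the momentum (inertia) coefficient, while the step size stays a single non-adaptive scalar. The specific formula for the inertia coefficient and the toy saddle objective are illustrative assumptions by the editor, not the update rule analyzed in the paper.

```python
# Hedged sketch of an "adaptive inertia" (Adai-like) step: the second-moment
# estimate sets a per-parameter momentum coefficient; the step size is a
# plain scalar as in SGD with momentum. The formula for beta is illustrative.
import numpy as np


def adaptive_inertia_step(theta, grad, m, v, lr=0.1, beta0=0.1, beta2=0.99, eps=1e-3):
    v = beta2 * v + (1.0 - beta2) * grad ** 2             # second-moment estimate
    v_hat = v / (v.mean() + 1e-12)                        # parameter-wise normalization
    beta = np.clip(1.0 - beta0 * v_hat, 0.0, 1.0 - eps)   # per-parameter inertia
    m = beta * m + (1.0 - beta) * grad                    # momentum with adaptive inertia
    return theta - lr * m, m, v                           # non-adaptive learning rate


# Toy usage on f(x, y) = 0.5 * x**2 - 0.05 * y**2, which has a saddle at the origin.
theta = np.array([1e-3, 1e-3])
m, v = np.zeros(2), np.zeros(2)
for _ in range(200):
    grad = np.array([theta[0], -0.1 * theta[1]])
    theta, m, v = adaptive_inertia_step(theta, grad, m, v)
```

The only point of the sketch is the division of labor the reviewers describe: curvature-related statistics act on the inertia term rather than on a per-parameter learning rate.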
### Summary: | the paper is aimed at explaining the perceived lack of generalization results for adam as compared to sgd to this end the paper decouples the effect of the adaptive per parameter learning rate and the momentum aspect of adam the paper shows that while adaptive rates help escape saddle points faster they are worse when considering the flatness of the minima being selected further momentum has no effect on the flatness of minima but again leads to better optimization by providing a drift that helps evade saddle points they also provide a new algorithm adai based on inertia targeted at better generalization of adaptive methods the paper definitely provides an interesting perspective and the approach of decoupling the effect of momentum and adaptive lr and studying their efficacy in escaping saddle points and selecting flat minima seems very useful the primary reason for my recommendation is the presentation of the paper in terms of the rigor of the assumptions used to establish the results these aspects have been highlighted by the reviewers in detail i suggest the authors carefully revisit the paper and improve the presentation of the assumptions adding rigor to the presentation as well as adding justifications where appropriate especially in light of the nonstandardness of these assumptions in the optimization literature | [
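For readers less familiar with the two mechanisms being decoupled here, the standard update rules are worth restating; these are the textbook forms of heavy-ball SGD and Adam, not notation taken from the paper under review.

```latex
% SGD with momentum: an inertia term only, no per-parameter scaling of the step
m_t = \beta_1 m_{t-1} + g_t, \qquad \theta_{t+1} = \theta_t - \eta\, m_t
% Adam: a first-moment (momentum) estimate plus a second-moment estimate that
% rescales each coordinate, i.e. an adaptive per-parameter learning rate
m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2, \qquad
\theta_{t+1} = \theta_t - \eta\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
```

Here the hats denote the usual bias-corrected moment estimates.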
… (input_ids token-ID array omitted — tokenized duplicate of the review text above)
2792,
7938,
50276,
9328,
403,
7197,
672,
1908,
253,
6507,
1255,
273,
46836,
1146,
4236,
2007,
10254,
556,
642,
1055,
327,
253,
6507,
1255,
273,
46836,
533,
969,
5644,
281,
1805,
13757,
407,
5277,
247,
16924,
4283,
281,
26759,
1127,
612,
4930,
597,
671,
2085,
247,
747,
5933,
519,
2284,
1754,
327,
41299,
10522,
387,
1805,
26647,
273,
17825,
3082,
50275,
783,
2929,
7964,
3400,
271,
4722,
8668,
285,
253,
2746,
281,
34430,
713,
253,
1055,
273,
10254,
285,
17825,
298,
83,
285,
1263,
616,
10307,
275,
34528,
26759,
2792,
285,
6507,
1255,
273,
46836,
3133,
247,
1077,
4217,
8668,
253,
3625,
1921,
323,
619,
17401,
310,
253,
9759,
273,
253,
2929,
275,
2426,
273,
253,
8132,
263,
697,
13260,
281,
5100,
253,
1543,
841,
7794,
452,
644,
16318,
407,
253,
30628,
275,
2508,
891,
1804,
253,
4477,
281,
9257,
45735,
253,
2929,
285,
3157,
253,
9759,
273,
253,
13260,
6240,
8132,
263,
281,
253,
9759,
347,
973,
347,
6240,
816,
6787,
835,
4569,
3340,
275,
1708,
273,
1327,
15291,
1255,
273,
841,
13260,
275,
13757,
6239
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
1677,
1293,
5277,
690,
4722,
3237,
835,
352,
310,
10048,
50276,
12563,
275,
2593,
495,
13260,
374,
285,
495,
403,
908,
1293,
2007,
8813,
273,
752,
4555,
597,
1599,
253,
4477,
2085,
247,
1643,
4278,
275,
253,
30762,
327,
752,
310,
253,
21582,
466,
48494,
11193,
285,
1698,
3276,
11193,
533,
436,
310,
417,
4209,
50276,
5430,
841,
13260,
403,
2905,
281,
2629,
12342,
275,
253,
2170,
752,
403,
253,
15965,
12091,
273,
841,
13260,
849,
403,
2905,
281,
19191,
27935,
285,
253,
6046,
50275,
12563,
891,
1089,
352,
247,
2372,
10084,
326,
627,
310,
642,
7473,
9759,
273,
253,
1895,
326,
359,
403,
6110,
281,
8415,
285,
253,
13260,
326,
581,
4419,
281,
320,
2104,
281,
5276,
14940,
247,
3908,
273,
253,
41458,
390,
11903,
318,
1895,
762,
1263,
342,
253,
2022,
13260,
310,
5816,
432,
253,
2929,
50275,
531,
273,
253,
954,
1774,
9021,
273,
253,
2929,
310,
253,
1783,
273,
253,
747,
5933,
17825,
41299,
13757,
519,
2284,
4081,
275,
2593,
608,
2299,
604,
581,
16633,
327,
10012,
577,
534,
3400,
253,
14940,
23632,
273,
253,
519,
2284,
352,
310,
2590,
326,
253,
1783,
2186,
762,
1077,
2266,
2515,
50276,
515,
360,
6372,
323,
1650,
253,
4477,
8025,
1097,
11542,
11041,
285,
11542,
11786,
273,
253,
8103,
1159,
534,
11766,
2186,
275,
8542,
15216,
3877,
326,
841,
2515,
452,
2168,
644,
8058,
281,
17343,
2714,
5971,
273,
1327,
44181,
3237,
751,
3470,
14127,
253,
3488,
518,
4213,
33583,
48446,
1617,
253,
5019,
273,
841,
13260,
1421,
281,
271,
6325,
873,
273,
3237,
3021,
253,
10012,
2550,
2186,
323,
512,
1327,
44181,
6032,
3237,
347,
891,
5393,
275,
253,
2022,
2278,
891,
2868,
326,
253,
2022,
5697,
273,
253,
2929,
403,
4722,
2299,
891,
1089,
326,
253,
9759,
273,
436,
789,
310,
417,
1077,
2590,
285,
10380,
21643,
275,
1798,
253,
2605,
273,
253,
2929,
285,
253,
1543,
3559,
275,
7118,
374,
285,
495,
403,
2834,
281,
15816,
923,
619,
5701,
2708,
50276,
7152,
33032,
2520,
2929,
557,
290,
19236,
253,
2538,
273,
17825,
4715,
2281,
285,
10254,
275,
38622,
4715,
8062,
285,
19539,
326,
17825,
4715,
2281,
310,
1175,
387,
34528,
26759,
2792,
533,
417,
1175,
387,
17221,
387,
46836,
1223,
10254,
7729,
8773,
26759,
1127,
285,
8213,
1652,
281,
34528,
9479,
46836,
1754,
327,
253,
1783,
253,
4477,
12661,
247,
4460,
5556,
6081,
519,
2284,
2429,
281,
48237,
17670,
519,
2284,
4764,
88,
9299,
5223,
84,
253,
10254,
4373,
19484,
281,
253,
34930,
344,
859,
2458,
273,
26759,
2792,
285,
310,
8058,
281,
3809,
8773,
26759,
2792,
285,
9479,
46836,
253,
10527,
1783,
273,
253,
38622,
5556,
6081,
310,
1754,
327,
253,
256,
35333,
12393,
3762,
285,
253,
1543,
23849,
285,
11424,
253,
8310,
326,
38622,
476,
4536,
29623,
7938,
533,
39970,
7197,
685,
48237,
17670,
253,
4081,
519,
2284,
5556,
6081,
310,
28055,
3590,
285,
14371,
5777,
1805,
26647,
3045,
685,
48237,
17670,
285,
3012,
1805,
685,
38622,
327,
2460,
9162,
8892,
50276,
3229,
3784,
26230,
253,
9506,
275,
247,
2074,
1039,
347,
38622,
253,
4081,
519,
2284,
5556,
6081,
3133,
625,
33917,
281,
48237,
17670,
342,
253,
760,
3064,
1146,
697,
17825,
10254,
285,
352,
36908,
897,
17825,
4715,
4142,
534,
310,
247,
2022,
4735,
273,
38622,
25761,
347,
2011,
275,
3036,
84,
337,
281,
495,
285,
884,
253,
3733,
9191,
273,
519,
2284,
1077,
1199,
28788,
1110,
273,
48237,
17670,
3103,
281,
2007,
3157,
436,
789,
625,
14023,
943,
320,
1160,
875,
519,
2284,
285,
48237,
17670,
2581,
685,
875,
519,
2284,
285,
38622,
275,
1798,
352,
651,
320,
4722,
281,
923,
604,
253,
3045,
8037,
875,
519,
2284,
285,
48237,
17670,
1543,
432,
7938,
14940,
347,
5125,
407,
253,
3762,
285,
3103,
247,
14940,
5301,
875,
519,
2284,
285,
48237,
17670,
347,
253,
581,
5196,
875,
519,
2284,
285,
38622,
3036,
1903,
943,
320,
9371,
436,
2929,
3400,
747,
16039,
715,
253,
3045,
273,
38622,
285,
29328,
247,
4460,
5556,
6081,
326,
1097,
26414,
3809,
285,
2087,
4219,
973,
2007,
11701,
476,
320,
1160,
407,
10941,
253,
4081,
1332,
281,
48237,
17670,
625,
16575,
5474,
33032,
2520,
789,
3537,
13505,
253,
8062,
273,
10254,
256,
35333,
285,
38622,
327,
34528,
26759,
2792,
285,
9479,
46836,
534,
310,
1754,
327,
253,
12393,
10527,
7792,
4081,
275,
1269,
466,
1162,
355,
43425,
67,
253,
4477,
5276,
326,
10254,
3400,
247,
16924,
1055,
1475,
26759,
2792,
285,
1057,
417,
2818,
6507,
46836,
5438,
323,
256,
35333,
285,
1223,
38622,
44716,
26759,
2792,
14556,
352,
1057,
417,
3718,
6507,
46836,
347,
973,
347,
256,
35333,
253,
1783,
11424,
690,
16774,
8310,
273,
48237,
17670,
285,
38622,
17194,
407,
253,
1783,
253,
4477,
12661,
17825,
41299,
519,
2284,
1332,
534,
476,
5512,
5115,
344,
859,
757,
17777,
10254,
16924,
44716,
26759,
2792,
3809,
285,
32955,
6507,
46836,
347,
973,
347,
10254,
256,
35333,
50275,
2520,
2929,
310,
3839,
973,
15720,
253,
12393,
10527,
1783,
1057,
2085,
690,
12288,
327,
253,
16774,
3045,
273,
10254,
256,
35333,
285,
38622,
253,
4477,
671,
1691,
275,
6031,
281,
2589,
10704,
2336,
6787,
281,
616,
10527,
7234,
534,
310,
4122,
14109,
2299,
891,
1158,
326,
436,
789,
1057,
417,
6312,
557,
290,
2134,
253,
2538,
273,
17825,
4715,
2281,
285,
10254,
1580,
253,
789,
3537,
13505,
38622,
534,
269,
5123,
841,
767,
5933,
280,
4295,
352,
651,
320,
1805,
281,
2319,
253,
1055,
273,
1016,
4445,
275,
38622,
11794,
3164,
407,
4758,
690,
3602,
281,
5058,
50275,
783,
4477,
840,
12661,
519,
2284,
534,
33526,
5512,
344,
859,
757,
17777,
10254,
16924,
1293,
24038,
6507,
46836,
5438,
270,
7553,
253,
4737,
273,
13989,
495,
310,
5816,
604,
352,
310,
247,
1480,
9936,
273,
10012,
374,
352,
310,
1805,
281,
3748,
352,
10380,
253,
5140,
273,
519,
2284,
310,
4722,
285,
697,
12510,
310,
17285,
407,
253,
16774,
4679,
2299,
352,
3133,
281,
479,
326,
436,
7680,
310,
247,
2372,
33817,
281,
253,
2022,
2926,
347,
519,
2284,
1057,
417,
897,
17825,
4715,
2281,
690,
18520,
3164,
6890,
253,
4060,
1537,
320,
1175,
281,
1056,
253,
2926,
30909,
285,
625,
2938,
290,
50276,
555,
993,
50276,
33722,
3806,
327,
3239,
818,
3877,
326,
271,
5368,
17825,
10254,
1332,
50274,
783,
1390,
6197,
1805,
685,
4633,
38622,
285,
256,
35333,
436,
789,
3400,
690,
747,
10527,
16039,
323,
10254,
256,
35333,
285,
38622,
534,
403,
4722,
285,
1774,
253,
4477,
840,
12661,
17825,
41299,
1754,
327,
253,
16039,
534,
2722,
1175,
3045,
690,
18520,
310,
3058,
281,
1056,
253,
2926,
30909,
923,
2022,
2278,
5474,
33032,
2520,
2929,
2175,
253,
13576,
273,
690,
11333,
672,
253,
35388,
310,
387,
247,
4619,
1127,
3066,
256,
3229,
247,
12955,
273,
38622,
310,
1677,
275,
253,
990,
253,
2929,
21354,
690,
11815,
670,
17825,
4715,
2281,
285,
10254,
15081,
326,
50274,
21049,
10254,
8213,
1652,
281,
34528,
9479,
46836,
440,
21049,
50276,
21049,
17825,
4715,
2281,
310,
417,
1175,
387,
17221,
6507,
46836,
440,
21049,
50276,
1568,
403,
690,
273,
253,
3533,
285,
7350,
50276,
4064,
619,
8668,
253,
10527,
1783,
310,
417,
26565,
285,
1629,
2824,
247,
2257,
273,
4278,
323,
1650,
275,
2593,
577,
3239,
608,
253,
2929,
3916,
326,
253,
44351,
26202,
553,
7870,
273,
38622,
476,
320,
3542,
347,
5150,
721,
2299,
247,
2508,
28529,
281,
1329,
253,
1750,
285,
253,
4737,
273,
4645,
326,
253,
256,
615,
310,
247,
3588,
581,
323,
38622,
403,
5816,
275,
253,
2929,
6296,
352,
476,
320,
2326,
432,
3239,
608,
835,
253,
4477,
753,
597,
403,
18918,
271,
7445,
1025,
38622,
352,
310,
417,
2590,
2139,
436,
310,
271,
7445,
1025,
581,
285,
253,
4028,
310,
751,
17197,
1633,
762,
253,
16051,
359,
476,
923,
326,
690,
7234,
670,
38622,
403,
1160,
533,
841,
11815,
403,
8392,
432,
247,
256,
615,
534,
1057,
417,
1663,
2723,
281,
38622,
327,
253,
643,
1133,
253,
256,
615,
273,
38622,
1057,
2226,
275,
253,
6239,
24088,
5987,
39962,
2061,
9275,
1252,
361,
3208,
1630,
9275,
604,
436,
2929,
1663,
5605,
281,
1750,
1662,
489,
1633,
670,
38622,
840,
247,
10182,
1783,
5001,
253,
35132,
1320,
2228,
875,
253,
2900,
273,
253,
4081,
256,
615,
285,
253,
35132,
7816,
38622,
943,
320,
2530,
275,
253,
2929,
50275,
1542,
1529,
1650,
327,
3239,
495,
352,
3054,
14430,
253,
12393,
4315,
277,
310,
3907,
273,
39116,
2822,
4619,
2792,
440,
21049,
533,
253,
4737,
310,
417,
2530,
849,
310,
352,
3907,
352,
4453,
751,
690,
34754,
497,
908,
627,
253,
2929,
943,
5276,
253,
14275,
50276,
23955,
4468,
310,
670,
854,
253,
2929,
943,
2085,
247,
7000,
28529,
273,
4645,
326,
854,
310,
1663,
670,
849,
253,
3268,
43279,
672,
253,
6944,
7870,
310,
818,
4390,
253,
5150,
854,
310,
751,
22802,
562,
273,
17663,
849,
1057,
9376,
337,
1361,
281,
921,
854,
50276,
12563,
690,
5053,
275,
253,
2929,
403,
417,
2590,
50276,
66,
1273,
12494,
327,
3239,
337,
14430,
512,
2045,
2987,
452,
417,
14435,
253,
26759,
3659,
34528,
2867,
273,
253,
8062,
440,
21049,
8505,
627,
403,
3240,
247,
1643,
2987,
5001,
26759,
3659,
34528,
407,
256,
35333,
256,
35333,
342,
10254,
285,
38622,
253,
4477,
1537,
971,
281,
5513,
752,
597,
5486,
1060,
50276,
67,
627,
403,
767,
34754,
327,
495,
533,
352,
651,
320,
625,
9371,
281,
5513,
849,
253,
34754,
403,
1160,
275,
2508,
627,
403,
690,
20121,
2708,
495,
533,
403,
417,
1077,
2590,
50275,
68,
1390,
12494,
275,
2593,
495,
14430,
253,
10254,
1057,
417,
2818,
6507,
46836,
5438,
275,
2426,
273,
253,
8773,
673,
440,
21049,
436,
310,
1529,
21643,
3908,
752,
1057,
10254,
1057,
417,
2818,
6507,
46836,
5438,
1599,
1057,
352,
1599,
10254,
285,
256,
35333,
1293,
10254,
29623,
281,
253,
1072,
1127,
752,
310,
253,
5426,
273,
6507,
46836,
50276,
69,
1273,
281,
1390,
12494,
275,
2593,
577,
14430,
38622,
556,
2412,
3115,
1368,
3348,
805,
50276,
8433,
69,
285,
48237,
17670,
297,
290,
360,
1097,
452,
2412,
3115,
50276,
1368,
3348,
18,
440,
21049,
352,
3133,
326,
253,
6452,
987,
2708,
436,
6197,
651,
320,
13891,
604,
419,
70,
18,
1633,
3430,
50276,
70,
10012,
337,
285,
374,
253,
4477,
921,
690,
23632,
670,
253,
11041,
387,
673,
246,
672,
253,
35388,
273,
253,
5933,
3637,
253,
305,
12064,
3268,
533,
1057,
247,
2169,
11041,
273,
253,
305,
12064,
3268,
16084,
247,
7938,
26759,
1127,
8773,
253,
4477,
1537,
971,
281,
823,
690,
11985,
670,
253,
4602,
281,
253,
10732,
273,
26759,
1127,
8773,
275,
253,
6239,
24088,
50276,
37525,
1162,
355,
4240,
50276,
71,
846,
4361,
253,
2929,
891,
717,
1335,
417,
2119,
849,
253,
1055,
273,
253,
4715,
2281,
285,
10254,
369,
557,
290,
33195,
275,
253,
1783,
891,
923,
690,
1783,
670,
253,
3879,
273,
256,
35333,
38622,
48237,
17670,
297,
290,
360,
387,
4619,
2792,
50276,
262,
651,
320,
625,
9371,
604,
253,
4477,
476,
5513,
2139,
253,
1055,
273,
253,
4715,
2281,
285,
10254,
476,
320,
7011,
50275,
783,
9759,
285,
7234,
403,
21643,
275,
619,
4743,
690,
5018,
275,
253,
1783,
403,
417,
13955,
50276,
187,
187,
4118,
18435,
27,
783,
2929,
310,
11205,
387,
5277,
271,
15571,
253,
12351,
3480,
273,
26647,
1543,
323,
38622,
347,
2429,
281,
256,
35333,
281,
436,
990,
253,
2929,
34430,
1868,
253,
1055,
273,
17825,
591,
4764,
4715,
2281,
285,
253,
10254,
4809,
273,
38622,
253,
2929,
2722,
326,
253,
1223,
17825,
4142,
1361,
8773,
26759,
2792,
7938,
50276,
9328,
403,
7197,
672,
1908,
253,
6507,
1255,
273,
46836,
1146,
4236,
2007,
10254,
556,
642,
1055,
327,
253,
6507,
1255,
273,
46836,
533,
969,
5644,
281,
1805,
13757,
407,
5277,
247,
16924,
4283,
281,
26759,
1127,
612,
4930,
597,
671,
2085,
247,
747,
5933,
519,
2284,
1754,
327,
41299,
10522,
387,
1805,
26647,
273,
17825,
3082,
50275,
783,
2929,
7964,
3400,
271,
4722,
8668,
285,
253,
2746,
281,
34430,
713,
253,
1055,
273,
10254,
285,
17825,
298,
83,
285,
1263,
616,
10307,
275,
34528,
26759,
2792,
285,
6507,
1255,
273,
46836,
3133,
247,
1077,
4217,
8668,
253,
3625,
1921,
323,
619,
17401,
310,
253,
9759,
273,
253,
2929,
275,
2426,
273,
253,
8132,
263,
697,
13260,
281,
5100,
253,
1543,
841,
7794,
452,
644,
16318,
407,
253,
30628,
275,
2508,
891,
1804,
253,
4477,
281,
9257,
45735,
253,
2929,
285,
3157,
253,
9759,
273,
253,
13260,
6240,
8132,
263,
281,
253,
9759,
347,
973,
347,
6240,
816,
6787,
835,
4569,
3340,
275,
1708,
273,
1327,
15291,
1255,
273,
841,
13260,
275,
13757,
6239
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper derives a counter term to the gradient flow ode formulation that reduces the discretization error from eulers method which is gradient descent when this correction term is expanded as a taylor series adding a select number of terms reduces the discretization order accordingly this is then used to analyze the behavior of gd under symmetry constraints specifically scale and translationinvariant parameters specifically this adds learningratedependent correction terms to the decay rates of certain quantities which matches gradient descent in practice
pros quite an interesting take which attempts to correct the theoretical ode formulation in order to match practice as opposed to bridging this gap by using higher order solvers for example the motivation process and theoretical results are presented very well i could follow and understand every result just the results not the proofs despite not being an expert in the theory of gradient descent
cons i imagine the theory of estimating discretization error of odes has been done before though perhaps not in this exact context the zeroth order term eq 9 has shown up in machine learning studies before its not clear to me if we gain anything from using the higher order terms in the derivation as the analysis and experimental results all use only the zeroth order term

docsepthe authors derive an equation of motion ie a continuous differential equation that matches the discrete time dynamics of gradient descent more clearly they do so by adding a counter term to gradient flow that cancels out higher order discretization errors in dnns and this counter term is derived using backward error analysis more precisely it is the solution to equation 6 in the paper given they are using backward error analysis they are also able to quantify the discretization error for gd approximation of gf along with the counter term and hence also provide a bound on the learning rate such that this discretization error is small the authors apply their equation of motion to translation and scale invariant layers and show that their theoretical predictions better match gd my main concern is that this paper does not provide any interesting new result the main novelty of the paper is the general form of the counter term which is derived in theorem 3.3 as opposed to previous work for example barrett and dherin implicit gradient regularization which only uses the first order term as their regularization term while the authors mention that they derive the discretization error in corollary 4.1 the precise formulation is not provided and the rate is provided as an upper bound using big-o which i believe is an artifact of standard series expansion results furthermore for most of the results on the discretization error bounds and the upper bound on the learning rate the authors assume that the counter term is either equal to zero or assume the first order counter term ie the term in equation 9 with the counter term equal to 0 the analysis matches a lot of the previous work for example elkabetz and cohen 18 and with the first order term the main theoretical results are very similar to the ones already established in barrett and dherin who introduce the first order counter term as the implicit regularizer and the error is discussed in theorem 3.1 besides this the authors do apply their analysis to characterize the learning dynamics of scale and translation invariant layers and show that with the inclusion of the first order term (adding higher order counter terms is going to be computationally expensive) they are able to better predict the decay of parameter norm which is interesting but not that surprising yes the authors have discussed the limitations of their work

docsepthis paper deals with the discrepancy between the actual discretized gradient descent and its continuous version ie a gradient flow for describing the equation of motion of learning dynamics more precisely the discrepancy error is formally introduced by using the backward error in numerical analysis the authors derive a counter term which can compensate for such a discrepancy of the gradient flow thus can describe the actual discretized trajectories in a continuous manner while the derived counter term is a complicated functional integral equation it can be analytically solved for all orders by assuming the underlying solution is a power series as an application the authors use the derived dynamics with the proposed counter term for investigating scaling and translationinvariant layers note because i am not an expert on learning theory my evaluation might not be exhaustive also i did not read the proofs in the supplementary material carefully
strengths to me the derived counter term is novel and seems to be useful to predict and interpret complicated learning dynamics of deep neural network models although there have been previous studies that incorporate some correction terms with respect to the backward analysis error to my knowledge they are restricted to 1st order compensation $\frac{1}{4}\nabla\|\nabla f(\theta)\|^2$ which is generally called an implicit gradient regularization the proposed counter term is generalized for higher orders and can recover the previous studies well as in eq 9 while the main result eq 8 seems to be a known technique in the numerical analysis field i would like to give appropriate credit to the authors for the contribution of introducing such a technique to the deep learning field well
weaknesses the authors address a fullbatch gradient descent only the authors also mention such a limitation of this work in conclusion and limitations i am not sure whether the approach can be easily generalized for the minibatch stochastic gradient descent method while the authors theoretically prove highorder corrections are required to cancel the leading order of discretization error it will be great if the authors 1 experimentally show the discrepancy between the gf with the proposed correction and that with a first order correction and 2 demonstrate the former can approximate gd well compared to the latter eg in figure 2 or figure 4 the proposed method is not guaranteed for gd with a large learning rate thus cannot be used for explaining some interesting phenomena eg the regularization effect of an initial large learning rate however i think it is not a crucial drawback of this paper considering the essential assumption of gf the paper is very dense and thus hard to read a journal format might be more suitable for a clear representation of this work the authors discuss the limitations of the proposed method eg the lack of discussion concerning the minibatch stochastic gd and other optimizers beyond gd in conclusion and limitations section it will be nice if the authors also address the questions raised above

docsepthis paper is concerned with a theoretical understanding of modelling the dynamics of gradient descent with a differential equation previous work gradient flow describes the differential equation as $\frac{d\theta}{dt} = -\nabla_\theta L(\theta)$ which is the euler discretisation of gradient descent $\theta_{t+1} = \theta_t - \eta \nabla_\theta L(\theta_t)$ however discretisation error exists such that gradient flow and gradient descent diverge this paper derives a counter term to gradient flow labelled by $\xi$: $\frac{d\theta}{dt} = -\nabla_\theta L(\theta) + \eta\,\xi(\theta)$ the counter term is a functional integral the paper approximates the counter term with a series solution in $\eta$ with a recursive relationship existing to get from term k to term k+1 the new dynamics are called equation of motion eom a limit for the learning rate is also derived which allows accurate simulation of gradient descent using eom with larger step sizes finally these findings are tested on scaleinvariant layers and translationinvariant layers and the results support the theoretical findings
strengths this is not an area i have significant expertise in however overall this paper is very good in my opinion specifically the paper shows many impressive theoretical results the experiments are extensive and support the theory the motivation for the paper is very clear
weaknesses again i believe the paper is very good i think it presents good theoretical results with sufficient experiments to support this the only weaknesses overall are in the writing style and presentation i think the paper is quite math heavy currently which makes it less accessible however this is down to personal preference for example lines 239-245 definitions feel like quite a complicated way of saying most of the meanings one definition is $\alpha_{\mathcal{A}} = \alpha\, i_{\mathcal{A}} + i_{\mathcal{A}^c}$ but it is easier in my opinion to say $\alpha_{\mathcal{A}}$ is $\alpha$ for the parameters in layer $\mathcal{A}$ and 1 for the others this can be extended for most of the definitions in this paragraph i think the best way the paper can be improved is by having as much intuition as possible in the main text with theorems included and then possibly having more mathematical detail in the appendix other small points are the presentation of results figure captions dont have figure n in line with the caption but over to the left which feels weird table 1 could also be improved rather than listing the decay rates it might be more informative to list the differences and relative differences in decay rates between gf gd and eom gd the authors have been upfront with the limitations of their work these are given in the conclusion and provide a nice avenue for future research they are about different optimizers and how using minibatches is not accounted for in the current work i cannot think of any further limitations
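To make the gradient-flow-versus-gradient-descent discussion in the review above concrete, here is a minimal numerical sketch. It is an editor's illustration on an assumed toy quadratic loss, not code from the paper and not its full equation of motion: only the first-order counter term of the implicit-gradient-regularization form the reviewers cite is included, and the matrix `A`, learning rate `eta`, and step count are arbitrary illustrative choices.

```python
# Minimal sketch (assumed toy problem, not the paper's EOM): compare discrete
# gradient descent, plain gradient flow, and gradient flow plus the first-order
# counter term -(eta/4) * grad ||grad L||^2 on L(theta) = 0.5 * theta^T A theta.
import numpy as np

A = np.diag([1.0, 10.0])      # assumed curvature matrix (illustrative only)
eta = 0.05                    # assumed GD learning rate
theta0 = np.array([1.0, 1.0])
steps = 40

def grad_L(th):
    return A @ th

def corrected_flow(th):
    # d(theta)/dt = -grad L(theta) - (eta/4) * grad ||grad L(theta)||^2,
    # which for this quadratic reduces to -A @ theta - (eta/2) * A @ A @ theta
    return -(A @ th) - 0.5 * eta * (A @ (A @ th))

def integrate(flow, th, t_end, dt=1e-3):
    # crude Euler integration of the continuous-time flow with a tiny inner step
    for _ in range(int(round(t_end / dt))):
        th = th + dt * flow(th)
    return th

th_gd = theta0.copy()
for _ in range(steps):        # the discrete GD iterates we want to track
    th_gd = th_gd - eta * grad_L(th_gd)

t_end = steps * eta           # continuous time matching the GD steps
th_gf = integrate(lambda th: -grad_L(th), theta0, t_end)
th_eom = integrate(corrected_flow, theta0, t_end)

print("||GD - plain gradient flow||    =", np.linalg.norm(th_gd - th_gf))
print("||GD - flow with counter term|| =", np.linalg.norm(th_gd - th_eom))
```

In this toy setting the corrected flow should track the GD iterates noticeably better than the plain flow, which is the qualitative behaviour the reviewers describe; higher-order terms of the series would be expected to tighten the match further.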
### Summary:
reviewers were unanimous in recommending that the paper be accepted and i accordingly recommend the same i encourage the authors to take into account suggestions made by reviewers so as to further improve the text in the cameraready version
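The reviews above also single out scale- and translation-invariant layers, where the gap between the continuous flow and discrete updates shows up directly in how the parameter norm changes. The sketch below is again only an illustration on an assumed scale-invariant toy loss (not taken from the paper): it checks the standard identity that one GD step grows the squared norm by eta^2 * ||grad L||^2 while gradient flow leaves it unchanged, which is the kind of learning-rate-dependent correction to decay rates the counter term is meant to capture.

```python
# Minimal sketch (assumed toy loss, not from the paper): for a scale-invariant
# loss, L(c * theta) = L(theta) for c > 0, the gradient is orthogonal to theta,
# so gradient flow conserves ||theta|| while a single GD step satisfies
# ||theta_next||^2 = ||theta||^2 + eta^2 * ||grad L(theta)||^2 exactly.
import numpy as np

u = np.array([1.0, 0.0, 0.0])   # fixed unit vector defining the toy loss (assumption)

def grad(th):
    # gradient of L(theta) = 1 - (theta . u) / ||theta||, a scale-invariant loss
    n = np.linalg.norm(th)
    return -(u / n - (th @ u) * th / n**3)

eta = 0.1
th = np.array([0.0, 1.0, 1.0])
for t in range(5):
    g = grad(th)
    predicted = th @ th + eta**2 * (g @ g)  # GD prediction; the flow would keep th @ th
    th = th - eta * g
    print(f"step {t}: ||theta||^2 = {th @ th:.6f}  (predicted {predicted:.6f})")
```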
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes an approach for using data relabelling for metarl for better sample efficiency and to enable training on sparse reward environments specifically the proposed method combines pearl1 with a modified version of hipi2 where the trajectories chosen for relabelling are effective for adaptation and not necessarily high in reward themselves 1 efficient offpolicy metarl vis probabilistic context variables rakelly et al 2 rewriting history with inverse rl esyenbach et al strengths 1 usefulness of relabelling the problem considered is an important one since even though hindsight relabelling is standard in multitask rl and has been shown to enable learning on sparse reward environments which are otherwise very difficult to solve this approach hasnt been applied to the metarl setting yet this is despite the fact that metarl also considers a multitask distribution and can benefit from explicitly using data for a different task and relabelling it under the corresponding reward function the mathematical formulation of the approach closely follows hipi2 with the difference that postadaptation trajectory return is considered instead of current trajectory return to be aligned with the metalearning objective the authors show experimentally that current metarl approaches do not begin to make progress on sparsereward tasks showing the importance and effectiveness of relabelling 2 extent of evaluation and analysis the paper includes evaluation on 5 different sparse reward environments 3 dense reward environments which show that the relabelling scheme offers benefits over current metarl approaches mainly in the sparse reward setting the authors also include ablationsanalysis of specific components such as using a learned reward instead of the true reward using hardmax instead of softmax for sampling the relabelling task etc the paper is well written the presentation is clear and wellmotivated weaknesses 1 small performance gap with hipi simplistic environments out of the 8 experimental domains chosen the performance of the proposed approach hfr is significantly better than hipi on only two domains antgoal and sawyerpush this indicates that most of the benefit is coming from the relabelling scheme for most environments either because the adaptation procedure doesnt actually lead to better performance or because the environments are too simple to require a lot of adaptation to heldout tasks given that performance is much better than hipi on the hardest environments antgoal and sawyerpush i am inclined to think the issue is the latter instead of the former which can be addressed by evaluating on harder environments these could include other singlefamily robotic tasks from metaworld eg sawyerdooropen sawyerboxclose etc even better would be metatraining across task families using metaworld ml10 or ml45 this would test adaptation to tasks that are semantically different and would make the paper a lot more compelling the approach introduces relabelling which has already shown to be important in multitask rl in metarl and shows superior performance on sparse reward environments the paper would be more compelling if it included evaluations on more challenging environments to establish the importance of the adaptation component docsepthis paper proposes a way to share data across different tasks in metareinforcement learning metarl where the data from one task is reused in another task by relabeling the rewards based on the hipi method 1 the authors construct a relabeling distribution to relabel the 
preadaptation trajectories from one task to be used for another task the relabeling probability of a trajectory is chosen to be proportional to the exponentiated utility function which is defined as the expected return after the agent uses that trajectory to adapt in practice the postadaptation return is approximated using the learned q function the authors apply this relabeling distribution to pearl an existing offpolicy actorcritic style metalearning algorithm the authors conduct experiments on simulated robotics experiments the results suggest that the proposed method outperforms prior methods on sparse reward tasks while performing roughly the same on dense reward tasks references 1 eysenbach benjamin et al rewriting history with inverse rl hindsight inference for policy improvement arxiv preprint arxiv200211089 2020 overall i think this paper presents an interesting idea for sharing data between tasks of a metarl problem the paper is well written and the ideas are presented clearly pros 1 i find the main insight of the paper simple and intuitive the idea that we need to relabel not according to how much return we achieve but according to how much information we can gather for task identification gives us a clear distinction between multitask rl and metarl the derivation of relabeling according to the exponentiated post adaptation return follows naturally 2 the ablation study in the paper is very informative the ablation study gives us clear comparisons showing us which component is more important from the ablation study it seems that using the partition function and softmax relabeling distribution are the most important components 3 the paper is well written the insights ideas algorithms and experiments are easy to follow cons 1 i am somewhat skeptical about the approach of using the learned q function to estimate return after adaptation in the base metarl algorithm pearl the context encoder is trained to identify the task from a distribution of tasks producing a posterior distribution of context z corresponding to that task this means that given the relabeled trajectory even if the context encoder predicts the context corresponding to a wrong task as long as the produced context is within the distribution of tasks the expected return will still be high because the policy is trained to do well also on that wrong task therefore it is not clear to me why using the learned q function is a good way to estimate return on that specific task in order to verify this id like to ask the authors to include the following experiments first train the proposed algorithm to convergence and freeze the weights of the context encoder then train a nway classifier on top of the context encoder to classify the context into one of n training tasks without using any relabeled trajectories and then report the accuracy of the classifier finally relabel the trajectories according to the proposed method and report the classifiers accuracy on top of the relabeled trajectories if the relabeling mechanism using the learned q function is correctly capturing the task information we would see that the classifiers accuracy on the relabeled trajectories is comparable to that on the true trajectories in fact this experiment could also lead to an even simpler relabeling strategy directly use the true tasks probability under the classifiers prediction as the source of relabeling signal 2 the empirical performance of the proposed method does not seem very strong only in 3 of 5 sparse reward tasks the proposed method 
significantly outperforms the baselines and the proposed method does not show much improvement on dense reward tasks given these limitations im leaning slightly towards not accepting the paper id highly encourage the authors to conduct the experiment i suggested in order to verify that the proposed method is indeed capturing the task information correctly update after author response the authors conducted additional classifier experiments i requested and the results suggest that using learned q function to estimate returns is highly informative about the task therefore my main concern about the proposed method has been addressed and im now leaning towards accepting the paper the paper presents an interesting idea about reusing data across tasks in metarl the idea is very intuitive and the paper is well written however im not sure about whether the approach used to implement the idea of the paper really does what the authors claim it does therefore id like to see more evidence before i can recommend accepting the paper docsepthis paper studies task relabelling in hindsight to increase the efficiency of metareinforcementlearning the authors propose a strategy for calculating a distribution of tasks for which a particular batch of data would be useful for adaptation and sample from this distribution to construct a relabelled batch which augments the training data the authors show empirically that this improves sample efficiency over more naive relabelling schemes particularly for sparse reward tasks a series of ablations further justifies several design decisions or investigates robustness to hyperparameters i like this paper overall the motivation is sound metarl is almost by definition slow by using a slower timescale for metalearning than the fast learning or adaptation and so dataefficient methods are key task relabelling like in multitask or goalconditioned rl makes a lot of sense in this context the particular proposed method seems reasonable although i have some concerns about the detail of the exposition i found section 41 fairly difficult to follow first its not 100 clear to me how to map eq1 to the objective written in terms of utilities because eq1 does not define where taupre come from presumably this is just following pitheta and the conditioning of qtau psi on psi only results in differing rewards in tau not differing stateaction sequences then most importantly for understanding this section i dont follow why the objective for theta phi should be maximised by adjusting this q for fixed theta phi the paper says this facilitates alignment with the goals of the metalearner but im not sure what this means the derivation then continues in a very brusque manner im not a fan of it is easy to show in general if it is easy rather write it yourself in an appendix if need be for space or cite appropriately eventually we arrive at an optimal relabeling distribution but i dont understand in what sense it is optimal due the previous confusion it could be im missing something simple or these things are all straightforward and clear to a reader with the right context however i encourage the authors to substantially clarify and elaborate this section to engage with a broad audience the issue that i have more intuitively with the method is that the optimal task inference should depend on the true distribution of tasks by altering this distribution through relabeling it seems it would change the optimal theta phi can the authors elaborate on whether or not this should be a consideration perhaps by 
clarifying the exposition given in s41 the implementation of the approach is quite neat i like the use of pearls particular type of value function to efficiently estimate the value of the postadaptation policy without sampling any fresh transitions i also like the empirical study the performance gains seem substantial in several tasks and i appreciate the credible baselines which are more naive but not just vanilla pearl without any task relabeling i appreciated the informative ablations minor comments or questions paragraph 2 of the intro says metarl is inherently onpolicy this is incorrect why relabel with just one task sampled from qpsitau why not several samples or weighted samples in algo 1 maybe use a different letter to distinguish n in getlogpartition and in n in computeutility the paper would benefit from more details on the setup with a learned reward function the authors were able to clarify the points that had confused me in my initial reading i am persuaded that the optimal metalearned solution will not be biased by the proposed relabelling and that the derivation is sound optimising the relabelling distribution for the immediate postadaptation returns makes sense as a somewhat myopic heuristic to accelerate metalearning i also appreciate the additional experiment carried out for hrwg further while the connection to prior work is close in many ways i believe the adaptation of the method for this context is sufficiently novel and effective to warrant acceptance the work is wellmotivated intuitively but the mathematical justification for the specific method is difficult to follow so i cannot quickly verify its soundness the empirical study is well done overall so i lean to accept the paper but would likely increase my score and confidence if the authors can clarify the theoretical motivation for their relabeling strategy docsepthe paper proposes a trajectory relabeling method for meta reinforcement learning metarl aiming to share some of the collected trajectories to improve sample efficiency during metatraining the relabeling method is built on hipi eysenbach et al 2020 instead of relabeling the trajectory based on the total reward as in hipi the paper argues that in metarl the metric of interest for trajectories from different tasks is their usefulness for taskidentification rather than returns the paper further proposes a metarl algorithm based on pearl rakelly et al 2019 the experimental results on several sparsereward tasks show that the method outperforms other relabeling methods as well as pearl a i am a little concerned with the novelty of the paper the authors made an interesting point that compared to multitask rl the objective in metarl is to learn to learn a new task so the metric of interest for relabeling trajectories in metarl is their usefulness for taskidentification rather than the returns like in multitask rl section 4 page 4 the papers main contribution is a trajectoryrelabeling algorithm in metarl setting based on this intuition however the proposed algorithm still seems to try to relabel trajectories based on their returns on other tasks specifically the resulting learning objective equation 8 is quite similar to that of hipi eysenbach et al 2020 as the utility function similarly aims to maximize the expected return of the trajectory on the new task which i think would be exactly the total reward of trajectory on the new task in the multitask setting this seems to be contradictory to what the papers says about the difference of relabeling trajectories of 
previous work in multitask rl and relabeling trajectories in metarl proposed in this work plus the actual implementation looks like a straightforward combination of hipi and pearl rakelly et al 2019 to me b the proposed algorithm makes an important assumption that the actual reward function is known for each task in this metarl setting this makes the metarl problem setting confusing as the paper also says the tasks share the same dynamics and only differ in the reward function but like the paper mentions there may exist some scenarios where this assumption is reasonable the experimental results show that the proposed algorithm improves performance compared with other relabeling methods hipi and random in such settings first i think there might exist methods that under the same assumptions are able to do better than metarl algorithms for instance one can relabel all the collected data with the new task reward and run some kind of offline rl algorithms on it without metalearning in my opinion that would be another good baseline to strengthen the authors claim under the same assumptions secondly in the experiment section the paper mentions a variant of the proposed method that also considers the scenario where the true reward function cannot be queried for individual transitions this is a more interesting setting and i think the authors should elaborate on this part eg how do you learn the reward functions is there anything specifically designed for metarl settings and it would be better to show more experimental results under these settings for instance the author can compare to the stateoftheart metarl algorithms under such more common settings especially in sparsereward environments where the proposed algorithm is more competent potential baseline 1 metacure meta reinforcement learning with empowermentdriven exploration zhang et al icml 2021 3 towards effective context for metareinforcement learning an approach based on contrastive learning fu et al aaai 2021 c i have some minor comments listed below 1 equation 4 on the left hand side j should be superscript instead of subscript 2 figure 4 the actor pi and critic q should be parameterized using different denotations instead of jointly using theta 3 figure 6 could the authors explain why randomly relabeling the trajectories can achieve competitive performance or even better performance than hipi and pearl 4 the adaptation procedures listed on page 6 before equation 9 confuse me could the authors provide an algorithm bar for the metatest phase maybe in appendix the idea in the paper is well presented and carefully investigated the proposed method is simple and effective however i am not quite convinced about the novelty of the proposed idea and i think the experimental settings can be improved to strengthen the papers claim
### Summary: | this paper proposes hindsight foresight relabeling hfr an approach for reward relabeling for meta rl the main contribution is a measure of how useful a given trajectory is for the purpose of metatask identification as well as the derivation of a task relabeling distribution based on this measure reviewers agreed that the paper tackles an interesting problem and found the main insight to be simple and intuitive while the initial reviews raised some concerns regarding novelty the performance gap and using the learned qfunction to estimate postadaptation returns the rebuttal did a good job of addressing these concerns overall the paper proposes a nontrivial extension of hindsight relabeling to meta rl and while the results could be stronger i think the paper provides useful ideas and insights so i recommend acceptance as a poster | [
…input_ids token sequence omitted…
] | [
…attention_mask omitted (all values 1)…
] | [
…labels token sequence omitted…
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper introduces a task of joint visuallinguistic grammar induction from parallel imagetext data presents models and metrics for the task and shows strong empirical results strengths as far as i know this is the first paper that proposes joint visuallinguistic grammar induction in a realworld setting in contrast to synthetic settings hong et al 2021 the approach and the evaluation process are solid and make a lot of sense to me the visually grounded parsing results are quite impressive weakness my major concern is about the model selection process and the potential unfair comparisons to existing work model selection if i understood correctly for text parsing the best models are selected wrt to the parsing performance on a 1000example dev set appendix f this is an unrealistic setting see httpsaclanthologyorg2020emnlpmain614pdf for discussions in short for any fancy unsupervised parsing model that uses a labeled development set a supervised model trained on these development examples should be considered as a strong baseline introducing unsupervised criteria for model selection is more important than our initial impression unfair comparison cliora the model proposed in this paper uses diora as initialization which uses elmo to initialize word embeddings and the ptb labeled development set for model selection this means that cliora has seen far more text than other baselines vgnsl cpcfg vcpcfg and so on and human language learners this issue also undermines the authors arguments about potential links to how humans learn language i expect either a cliora trained from scratch without diora initialization or weakened arguments about the relationship between the current cliora and human language learning there seem to be some confusion on basic linguistic concepts eg nonterminal vs terminal symbols and a few typos that affects smooth understanding please see also detailed comments below other comments and questions introduction these works however fail to consider a unified vl structure nor have they demonstrated impact on visual understanding i dont think i necessarily agree with this statement especially regarding hong et al 2021 despite that there is a clear gap between their dataset and the real world settings they are aligning the visual grammars to language grammars yielding an arguably unified vl structure introduction the nonterminal symbol of a conventional constituency structure is a category label from a limited set eg the set of partofspeech pos tags hopcroft et al 2001 do you mean terminal symbols here we usually refer to pos tags to clarify phrase tags are not pos tags by preterminal or terminal depending on whether the phrasestructure grammar is lexicalized ie whether its considering real words or just pos tags and refer to the phrase nodes by nonterminal nodessymbols eg np pp it seems that this is not a typo i have the same questions for the following task definition section on page 3 task definition evaluation metrics if i understood correctly ccra requires some extra annotation of critical concepts how did you collect such annotations to determine which nps are critical very minor based on the full name ccra should really be ccrr what does a stand for here section 32 feature extraction the yoon kim et al 2019b paper is not relevant to image features at all did you mean shi et al 2019 table 1 what is the dagger after vgnslhi section 43 did you mean augments by arguments some more thoughts regarding motivation limitations humans arguably learns how to parse concrete sentences 
first and can then generalize to abstract domains that are not visually groundable in this work it seems that the model only works when both the text and image are available as there is a need to infuse visual features into text spans do you have any thoughts on enabling a trained cliora model to parse pure text without grounding signals missing reference kojima et al 1 has strengthened the vgnsl model by simplifying the architecture and argued that such visually grounded models are potentially biased towards concrete noun phrases however the paper neither cited it nor discussed the relevant issues 1 httpsaclanthologyorg2020aclmain234pdf there have been a lot of relevant work earlier than 2019 on visualsemantic embeddings or structured visual understanding with text to name a few older work on structured imagetext representations 2 httpsopenaccessthecvfcomcontenticcv2015papersmamultimodalconvolutionalneuraliccv2015paperpdf 3 httpsopenaccessthecvfcomcontentcvpr2018papersyouendtoendconvolutionalsemanticcvpr2018paperpdf contrastive loss for visualsemantic embeddings 4 httpsarxivorgpdf14112539pdf minor editing comments i was confused about what ccra is when reading the abstract would be good to include the full name and give an intuitive description of the metric yoon et al rightarrow kim et al shi et al 2019 proposes rightarrow shi et al 2019 propose in my opinion putting section 34 before 33 would better streamline the paper this paper introduces the task of joint visuallinguistic grammar induction and presents models metrics and empirical results on it while i appreciate the impressive results i am concerned about the unrealistic model selection process comparing model outputs to a large set of groundtruth parse trees and the unfair comparison the proposed model has access to much more unlabeled text data than baselines docsepthis paper presents a new model for grammar induction for text with help from the coupled images the model was built on top of an existing unsupervised grammar induction model used for text without image information the experimental results show the approach was effective the work essentially demonstrates some effective ways of leveraging the additional image information for improving the grammar induction task the paper also discussed some weaknesses of the approach and future work the topic of grammar induction has been there for a very long time in nlp and is a very fundamental topic the model was largely built based on an existing model for purely textbased grammar induction the model essentially makes use of neural networks to learn good latent representations using a reconstruction loss where the latent representation is defined with neural networks which yield scores for constituents and vector representations of them the approach adopts the classical insideoutside process for the computing of the scores the paper essentially investigates what might be the effective methods for integrating image information into text for improved grammar induction the execution of the paper was quite good and the results are convincing however i feel the overall model is essentially a way to use image information to regularize the grammar induction process little can be said about in what precise manner the image is actively contributing to the induction process indeed the authors also acknowledged something along with what i thought in the final section nevertheless i think it is an interesting piece that might inspire future research on multimodal processing for image language 
i think this is a reasonable piece with good writing and a nice set of experiments it would be helpful for future research in this domain docsepthe paper proposed a new method cliora to do unsupervised parsing and visionlanguage grounding cliora is based on diora model but different from previous unsupervised parsing methods cliora also induces alignment between constituents and image regions in order to train the model the author introduces a contrastive loss experiment results show that the proposed method outperforms baseline unsupervised parsing methods and it also induces meaningful alignment between image regions and constituents strengths the idea of jointly inducing structure in natural language and grounding the constituents with realworld images is intuitively correct the ablation study also shows that both featurelevel fusion and scorelevel fusion including the contrastive loss if i understand correctly helps in improving the parsing results weakness 1 the image features are only used for computing the inside pass the image feature should contain information that can help predict the missing word such that it could be used in the outside pass too selecting the best image region for predicting the missing word is also an intuitively correct way to build the visionlanguage alignment 2 the compute of simi cij includes a max operator this could lead to a biased gradient 3 as the author mentioned in the discussion section the model doesnt consider the latent hierarchical structure of the image for example the sentence describes the entire image while each phrase describes part of the image overall the proposed method is interesting and inspiring the idea should be interesting to both unsupervised parsing and multimodel communities
### Summary:
This paper proposes to perform unsupervised grammar induction over image-text pairs, using shared structure between the modalities to improve grammar induction on both sides. Reviewers find the paper clear, creative, and interesting, and recommend acceptance without hesitation.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper proposes a self-supervised idea for unsupervised anomaly detection (AD). Specifically, the framework enables high-performance AD without any labels via SRR, an ensemble approach that proposes candidate anomaly samples to be refined away from training. This allows more robust fitting of the anomaly decision boundaries and also better learning of data representations. Multiple examples are used to demonstrate the effectiveness and robustness of the proposed method.

Strengths: (1) The authors fully explore the power of unsupervised AD and show improved results by using the proposed SRR scheme based on the GOAD framework. (2) The performance is improved by leveraging an ensemble of OCCs, which works well but may result in additional computational cost. (3) Enough details, such as the sensitivity of hyperparameters, are provided to reproduce the experiments on multiple tasks, including tabular and image datasets.

Weaknesses: (1) The idea sounds promising, but this may not be the first such work; the performance is enhanced by an ensemble of multiple tricks, and it would be helpful to see a detailed ablation study, which may show more insights. (2) The baselines mainly consist of GOAD, OC-SVM, etc.; some recent SOTA baselines are missing, for example NeuTraL [1].
[1] Qiu, Chen, Timo Pfrommer, Marius Kloft, Stephan Mandt, and Maja Rudolph. Neural transformation learning for deep anomaly detection beyond images. ICML 2021.
(3) The baseline varies case by case; for example, CutPaste is used for the MVTec datasets. Have you considered the SOTA performance, for example in this link httpspaperswithcodecomsotaanomalydetectiononmvtecad, where CutPaste only ranks 9th? Have you compared with the other SOTA baselines? I therefore have a concern: how do you choose the baseline method for different datasets? I suggest choosing more baselines rather than a specific one.

Overall, the paper is well written and the results and performance look solid. My major concern is the novelty, specifically compared with the GOAD method. In addition, the code is not uploaded, so I am not sure how it works in terms of reproducibility and time cost.

docsep

This paper proposes an ensemble approach called SRR (Self-supervise, Refine, Repeat) for robust unsupervised anomaly detection. The proposed approach trains an ensemble of K detectors together with a joint self-supervised feature extractor g on K disjoint subsets of the data. The ensemble is then used to filter the training data, keeping only the data points that are collectively deemed normal by the K detectors. This training-data filtering process is repeated until the self-supervised feature extractor has converged; a final detector is then trained using the refined data and the converged self-supervised feature extractor (a rough sketch of this loop is given below). Experiments on tabular and image datasets are presented which show that the proposed ensemble approach is more robust at high anomaly contamination ratios than the respective state-of-the-art single detectors.

Pros: the experimental results demonstrate significant anomaly detection performance improvements for the proposed SRR approach, especially at high anomaly ratios. The paper is overall presented and written well, as well as technically sound, and it is placed well in the existing literature, up to and including recent works.

Cons: the methodological novelty of the proposed SRR approach is rather low; ensemble learning is standard for improving robustness, and the individual detection method from Sohn et al. (2020) is not new. The experiments do not contain a comparison to any specifically robust AD approach, e.g., robust PCA for the tabular data and robust autoencoders for the image data.
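For concreteness, here is a rough sketch of the self-supervise / refine / repeat loop as described above. The function names, the score-aggregation rule, and the fixed rejection ratio `gamma` are my own simplifications rather than the authors' exact procedure; `fit_representation` and `fit_occ` stand in for the self-supervised encoder training and the one-class classifier training.

```python
# Sketch of the refine-and-repeat loop (illustrative, not the paper's code).
import numpy as np

def srr_refine(x, n_iters, k=5, gamma=0.1,
               fit_representation=None, fit_occ=None):
    """x: (N, d) unlabeled data; fit_representation(x) -> callable encoder;
    fit_occ(z) -> model with .score_samples(z) (higher = more normal)."""
    keep = np.ones(len(x), dtype=bool)
    for _ in range(n_iters):
        encoder = fit_representation(x[keep])      # update self-supervised features
        z = encoder(x)
        votes = np.zeros(len(x))
        kept_idx = np.random.permutation(np.flatnonzero(keep))
        for fold in np.array_split(kept_idx, k):   # one OCC per disjoint subset
            occ = fit_occ(z[fold])
            votes += occ.score_samples(z)          # aggregate normality scores
        cutoff = np.quantile(votes, gamma)         # reject the gamma fraction
        keep = votes > cutoff                      # deemed-anomalous points excluded
    final_encoder = fit_representation(x[keep])
    final_occ = fit_occ(final_encoder(x[keep]))    # final detector on refined data
    return final_encoder, final_occ
```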
While the paper includes ablation studies on key components and hyperparameters (ensemble size, data rejection confidence, updating the self-supervised feature extractor), I think there should also be a comparison to just using the final ensemble on the converged self-supervised extractor vs. training an additional final model: is there much of a difference left?

Some additional minor points: I don't agree with the framing that most prior AD works all depend on some labeled data, as expressed in the abstract and introduction; the bulk of AD research is on unsupervised methods (see Chandola et al. 2009 and the recent reviews by Ruff et al. 2021 and Pang et al. 2021). However, I agree that most methods assume fairly clean training data; this framing should be updated, in my mind. The GDE abbreviation is used before it is explained. p6: "for the n setting" is a typo.

Though I think the methodological novelty of the proposed approach is rather low and that the experimental comparison should be somewhat extended (where I expect the improvements of SRR to hold up), I am overall positive towards accepting this work, since robust anomaly detection is a relevant problem of high practical significance for which the proposed SRR approach demonstrates significant improvements over current state-of-the-art methods.

docsep

The paper tackles an unsupervised anomaly detection problem where the training set contains an unknown portion of anomalies. When anomalies are contained in the training set, it is known that the performance of classical AD approaches degrades. The idea is to filter out potential anomaly samples (data refinement) with an ensemble model: each model in the ensemble is trained on a disjoint set of training data and then used as a classifier to determine potential anomalies; the data refinement process then uses a hard assignment, excluding anomalies from the training set. Refinement and ensemble training are repeated iteratively until convergence. The proposed framework is validated on four tabular datasets and four image datasets.

Strengths: the effectiveness of the proposed framework is validated on top of contrastive learning-based models, which are state of the art. Extensive experiments on both tabular and image datasets support the effectiveness of the framework. Ablation studies decouple the effects of each hyperparameter; also, the representation update study shows that retraining the representation with the refined dataset is important.

Weaknesses: Although the proposed framework is tested on contrastive models, the idea of data refinement itself is independent of these models; the framework could be applied to other types of anomaly detectors, but this is not shown. Iterative refinement has been studied previously [1, 2, 3], yet no comparison or discussion is given with respect to those approaches. Although they use AE-based models, the idea of iterative refinement can be deployed on top of contrastive models as well. What makes SRR more effective than these methods? What is the main factor that makes SRR more competitive than these methods?
[1] Xia, Yan, et al. Learning discriminative reconstructions for unsupervised outlier removal. Proceedings of the IEEE International Conference on Computer Vision, 2015.
[2] Beggel, Laura, Michael Pfeiffer, and Bernd Bischl. Robust anomaly detection in images using adversarial autoencoders. arXiv preprint arXiv:1901.06355, 2019.
[3] Pang, Guansong, et al. Self-trained deep ordinal regression for end-to-end video anomaly detection. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020.
Hyperparameter selection requires tuning:
although it is shown in Figure 7 that any value of gamma improves over the baseline, it is still important how to choose this value. The paper suggests Otsu's method as a solution to predict the approximate anomaly ratio of the dataset, but this is not shown in the main experiments (an MVTec experiment is provided in A5). In the main experiments, the hyperparameter gamma is tuned to two times the anomaly ratio, and this setting requires prior knowledge of the anomaly ratio; I wonder how effective Otsu's method is in other scenarios (see the sketch after this review for what such a data-driven threshold could look like). The training also requires heavy computation, as the framework requires ensemble learning on top of contrastive learning.

Questions: What is the convergence condition? Is it necessary to train with the framework until the data refinement gives only marginal changes to the data, and how long does it take? What is the important difference of SRR from the previous refinement methods, and is SRR more effective than those previous refinement approaches? How does SRR perform when trained with Otsu's method rather than using the true anomaly ratio?

Post rebuttal: the authors addressed all my concerns. I raise the score assuming the rebuttal materials will be included in the revised version; rebuttal materials here means the Otsu's-method experiment, a detailed discussion on the difference between SRR and the previous iterative works, and the convergence analysis (maybe this one in the appendix).

The paper proposes a framework to refine data and train contrastive models for unsupervised anomaly detection problems. The extensive experiments show the effectiveness of the method; however, the main experiments require prior knowledge of the true anomaly ratio, which is unavailable in real-world problems. Discussions about the important difference between the proposed method and the previous iterative methods would make the paper more convincing.

Post rebuttal: the authors addressed all my concerns. I raise the score assuming the rebuttal materials will be included in the revised version; rebuttal materials here means the Otsu's-method experiment, a detailed discussion on the difference between SRR and the previous iterative works, and the convergence analysis (maybe this one in the appendix).

docsep

The authors propose a data refinement approach combined with self-supervised representations for robust one-class classification, which is commonly used in the anomaly detection scenario. The proposed data refinement approach is designed based on an ensemble of one-class classifiers. The authors propose a novel AD framework, called SRR, that enables inspecting defects with a one-class model and is applicable to unlabeled datasets. SRR employs an ensemble of multiple OCCs to identify potential anomalies to be refined out of training. SRR brings the advantages of making the anomaly decision boundaries more robust and giving better data representations. The proofs and experiment results are well organized; the paper is ready for acceptance.
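Since several reviewers ask about Otsu's method for choosing the rejection threshold without knowing the true anomaly ratio, here is a toy version of what such a data-driven cutoff could look like on one-dimensional anomaly scores. This is only an illustration of the idea; the paper's exact procedure may differ.

```python
# Toy Otsu threshold on anomaly scores: pick the cutoff that maximizes the
# between-class variance of the score histogram (illustrative only).
import numpy as np

def otsu_threshold(scores, n_bins=256):
    """Return the score cutoff that maximizes between-class variance."""
    hist, edges = np.histogram(scores, bins=n_bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for i in range(1, n_bins):
        w0, w1 = p[:i].sum(), p[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (p[:i] * centers[:i]).sum() / w0
        mu1 = (p[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

# e.g. keep only samples whose anomaly score falls below the cutoff:
# refined = data[anomaly_scores < otsu_threshold(anomaly_scores)]
```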
### Summary:
The paper worked on fully unsupervised anomaly detection and proposed to use self-supervised representation learning to improve the performance of one-class classification. This is a borderline case, close to acceptance, but it cannot make it. Specifically, it is useful, but its novelty is the main issue: it is not surprising that self-supervised representation learning can improve one-class classification without representation learning (this part is still much of the taste of ICLR), and an ensemble of multiple models can improve upon a single model, which is just bootstrap aggregating, or bagging, used every day in practice and known to the machine learning and statistics societies a very long time ago. After seeing the rebuttal, the concerns were not really addressed well and the issues were only partially solved; thus, the paper is not strong enough to guarantee an acceptance to ICLR, unfortunately.
266,
1162,
355,
4715,
20741,
800,
49866,
6477,
323,
440,
35421,
562,
3623,
8570,
10061,
273,
253,
26332,
1796,
5213,
8059,
327,
4382,
8113,
4104,
50276,
19,
28332,
293,
826,
5650,
278,
44023,
268,
453,
36271,
285,
270,
1808,
69,
270,
17291,
77,
10237,
30207,
5481,
275,
3888,
970,
48960,
6753,
2083,
351,
398,
549,
32693,
638,
3845,
549,
32693,
746,
520,
3071,
22341,
6247,
50276,
20,
268,
606,
1149,
507,
543,
1162,
355,
11329,
649,
11273,
3676,
50144,
9077,
323,
990,
936,
423,
3492,
30207,
5481,
10061,
273,
253,
26332,
70,
886,
39985,
8059,
327,
4382,
8113,
285,
3102,
8981,
9169,
50275,
27049,
19484,
5438,
4419,
25184,
3738,
352,
310,
2011,
275,
4677,
818,
326,
667,
1318,
273,
17356,
19132,
689,
8245,
352,
310,
1335,
1774,
849,
281,
5206,
436,
1318,
253,
2929,
5125,
258,
1641,
316,
1332,
347,
247,
2900,
281,
3283,
253,
16851,
30207,
4313,
273,
253,
10895,
533,
352,
310,
417,
2011,
275,
253,
2022,
4679,
278,
87,
39766,
3368,
310,
2530,
275,
247,
22,
275,
253,
2022,
4679,
4373,
19484,
17356,
310,
24251,
342,
767,
2069,
273,
30207,
4313,
534,
436,
4758,
4419,
2720,
3640,
327,
30207,
4313,
891,
4282,
849,
3576,
258,
1641,
316,
1332,
310,
275,
643,
15216,
50275,
783,
3733,
4419,
5536,
13782,
347,
253,
7792,
4419,
19862,
4715,
327,
1755,
273,
4499,
422,
4715,
50275,
34974,
50276,
5371,
310,
253,
14940,
1617,
310,
352,
3309,
281,
6194,
342,
253,
7792,
1919,
253,
941,
29646,
4245,
16888,
1818,
281,
253,
941,
849,
1048,
1057,
352,
1379,
50275,
5371,
310,
253,
1774,
3064,
273,
256,
2676,
432,
253,
2045,
29646,
3082,
310,
256,
2676,
625,
3576,
685,
253,
2045,
29646,
7274,
50275,
5430,
1057,
256,
2676,
1347,
672,
10166,
342,
258,
1641,
316,
1332,
2581,
685,
970,
2032,
30207,
4313,
50275,
5996,
30080,
22559,
253,
4477,
9713,
512,
619,
7350,
891,
7164,
253,
4868,
7384,
253,
30080,
22559,
4753,
588,
320,
2908,
275,
253,
17265,
2715,
30080,
22559,
4753,
1060,
2097,
50276,
1502,
316,
1332,
3368,
50276,
5992,
7193,
5955,
327,
253,
3064,
875,
256,
2676,
285,
253,
2045,
34560,
2987,
50276,
585,
41801,
1783,
5046,
436,
581,
275,
253,
30762,
253,
2929,
29328,
247,
7792,
281,
39494,
941,
285,
6194,
4499,
422,
3210,
323,
440,
35421,
30207,
5481,
3237,
253,
9470,
4679,
921,
253,
12510,
273,
253,
1332,
2299,
253,
2022,
4679,
2430,
2720,
3640,
273,
253,
2032,
30207,
4313,
534,
310,
29356,
275,
1524,
10186,
3237,
11985,
670,
253,
1774,
3064,
875,
253,
4081,
1332,
285,
253,
2045,
34560,
3082,
651,
1056,
253,
2929,
625,
21414,
50276,
5996,
30080,
22559,
253,
4477,
9713,
512,
619,
7350,
891,
7164,
253,
4868,
7384,
253,
30080,
22559,
4753,
588,
320,
2908,
275,
253,
17265,
2715,
30080,
22559,
4753,
1060,
2097,
50276,
1502,
316,
1332,
3368,
50276,
5992,
7193,
5955,
327,
253,
3064,
875,
256,
2676,
285,
253,
2045,
34560,
2987,
50276,
585,
41801,
1783,
5046,
436,
581,
275,
253,
30762,
5474,
339,
431,
248,
4477,
12661,
247,
941,
29646,
2746,
5678,
342,
1881,
35421,
6779,
281,
10237,
581,
2437,
9162,
534,
310,
7744,
908,
275,
253,
30207,
5481,
10076,
253,
4081,
941,
29646,
2746,
310,
4158,
1754,
327,
271,
19862,
273,
581,
2437,
49996,
50276,
783,
4477,
12661,
247,
4460,
519,
7792,
281,
8046,
16030,
12834,
342,
581,
2437,
534,
310,
1925,
256,
2676,
285,
7763,
327,
440,
22027,
10895,
859,
2676,
27532,
271,
19862,
273,
2709,
1609,
84,
281,
1918,
253,
2442,
30207,
22407,
3530,
432,
3733,
256,
2676,
10316,
253,
11361,
273,
2403,
253,
30207,
3061,
13674,
625,
10237,
285,
4933,
1805,
941,
14237,
253,
4737,
285,
3368,
1543,
403,
973,
10932,
50276,
783,
2929,
310,
4704,
323,
14924,
2490,
187,
4118,
18435,
27,
783,
2929,
4307,
327,
4751,
440,
35421,
30207,
5481,
285,
4081,
281,
897,
1881,
35421,
6779,
4715,
281,
3157,
253,
3045,
273,
581,
2437,
9162,
436,
310,
247,
45210,
1083,
2810,
281,
14924,
533,
2550,
1056,
352,
5742,
352,
310,
4217,
533,
697,
38135,
310,
253,
2022,
2523,
1580,
352,
310,
417,
10084,
326,
1881,
35421,
6779,
4715,
476,
3157,
581,
2437,
9162,
1293,
6779,
4715,
436,
629,
310,
1335,
1199,
273,
253,
9075,
273,
17857,
32888,
285,
271,
19862,
273,
2709,
3210,
476,
3157,
2220,
247,
2014,
1566,
534,
310,
816,
28551,
9406,
839,
390,
7351,
3390,
908,
15363,
275,
3946,
285,
1929,
281,
5145,
4715,
285,
9990,
21262,
247,
1077,
1048,
673,
3622,
846,
6523,
253,
30080,
22559,
253,
7350,
497,
417,
1663,
9713,
973,
285,
253,
3374,
497,
760,
10571,
14042,
3021,
253,
2929,
310,
417,
2217,
281,
12215,
271,
14924,
281,
17857,
32888,
19235
] |
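The bracketed number sequences attached to each row look like tokenizer ids for the prompt/summary text, plus an attention mask that is all ones (no padding). Below is a minimal sanity-check sketch; the choice of tokenizer (a GPT-NeoX-style BPE such as "EleutherAI/gpt-neox-20b") is an assumption, since the dump itself does not name the tokenizer that produced these ids.

```python
# Sanity-check sketch for one row of this dump. The tokenizer is an assumption
# (a GPT-NeoX-style BPE such as "EleutherAI/gpt-neox-20b"); the dump does not
# say which tokenizer produced these ids.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

row = {
    # First few ids copied from the sequence above; the full row is much longer.
    "input_ids": [30003, 310, 1677, 2278, 273, 247, 2561, 2929],
    "attention_mask": [1, 1, 1, 1, 1, 1, 1, 1],  # all ones: every position is a real token
}

# Decoding should give back the start of the row's prompt text if the tokenizer guess is right.
print(tok.decode(row["input_ids"]))

# An attention mask of all ones just means the sequence carries no padding.
assert all(m == 1 for m in row["attention_mask"])
```

If the decode matches, the other bracketed id sequence in each row (which appears to repeat the same ids) can be inspected with the same call; the mask column needs no decoding.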
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addressed an interesting problem of reducing the kernel to achieve cnn models which is important and attracts lots of research work however the methods dont have very good justifications for example in section 31 the authors mentioned that specifically in normal cnns it is quite common to have multiple stagesblocks which contain repeated patterns such as layers or structures it is still unclear why it is better to replace these socalled repeated patterns the defined information field is not clearly explained and the benefits are also not demonstrateddocsepstandard dense 2d convolution dense in space and channels may waste parameters this paper points out the many ways that sparser convolutional operators kernels may be combined into a single combined operator that may be used in place of dense convolution the paper waxes grandiose about the exponentially many ways that operations may be combined but then defines and tries only four while trying four alternatives may be quite interesting the paper could have avoided grandiose language by just stating we tried four things if you restrict yourself to kernels with 3x3 receptive field and no repeated operations and probably other assumptions there are only four unique combinations to be tried perhaps a page of text could have been saved the paper also defines information field as the product of the operators spatial receptive field and the number of channels that each unit can see authors proceed to make broad claims about how information field is an important concept that predicts performance while this may indeed turn out to be an important concept it is not shown as such by the paper claims we identify a easily measurable quantity named information field behind various sparse kernel designs which is closely related to the model accuracy during the process to reduce the design space we find an unified property named information field behind various designs which could directly indicate the final accuracy but the paper does not substantiate these claims since information field is defined as the product of the receptive field and the number of channels seen it would seem necessary to show say at least some experiments with varying receptive field sizes and number of channels then it might be shown for example that across a wide array of network sizes widths depths holding all but information field constant information field is predictive of performance but these experiments are not done receptive fields the paper only ever tries 3x3 receptive fields table 2 3 4 so absolutely no support is given for the relevance of two out of the three components i size j size comprising information field number of channels as far as i can tell table 2 and 3 contain the only results in this direction reading off of table 2 for networks of the same depth 98 info size 256 works a bit better than 128 and 512 works a bit better than 256 see also table 3 lines 4 vs 5 show the same 256 vs 128 effect cool but two comparisons are not even close to enough to support the statement we find an unified property named information field behind various designs it is enough to support the statement for this single network we tried and using 3x3 receptive fields we found that letting units see more channels seemed to help unfortunately this conclusion on its own is not a publishable result to make this paper great you will have to close the gap between what you believe and what you have shown 1 you believe that information field is predictive of accuracy so show it is 
predictive of accuracy across sufficiently many wellcontrolled experiments 2 it may also be that the pwgconvdwpwgconv combination is a winning one in this case show that swapping it in for standard convolution helps in a variety of networks not just resnet and tasks not just imagenet other minor notes equations are critical in some parts of some papers but eg triple nested sums probably arent the easiest way of describing group convolution the part about regexes seemed unnecessary if 1000 different designs were tried in a large automated study where architectures were generated and pruned automatically this detail might be important but put it in si but if only four are tried this detail isnt needed we can see all four are different figure 1 is a great diagram how efficient are these kernels to compute on the gpu include computation time efficiency given the total amount of parameters these equations and scaling properties seemed to miss the point for example it can be easily verified that given the total number of parameters the greatest width is reached when the best efficiency is achieved this is just saying that standard convolution scales poorly as f infinity this doesnt seem like the most useful definition of efficiency a better one might be how many params do you need to get to x accuracy on imagenet then show curves of params vs accuracy for variants of a few popular model architectures like resnet or xception with varying width and depth 332 define m and n docsepthe paper considers sparse kernel design in order to reduce the space complexity of a convolutional neural network in specifics the proposed procedure is composed of following steps 1 remove repeated layers 2 remove designs with large degradation design and 3 further remove design for better parameter efficiency the paper proposed the composition of group convolution pointwise convolution and depthwise convolution for the sparse kernel design which seems pretty promising in addition the authors discussed the efficiency of each convolution compositions i failed to appreciate the idea of information field i didnt understand the claims that for one output tensor sizes of information fields for all activations are usually the same when introducing a new concept its very important to make it clear and friendly the author could consider more intuitive high level explanation or some graphic demonstrations also i couldnt see why this notion is important in the rest of the paper personally im so confused by the theorem it looks like a mathematical overclaim to me it claims that the best efficiency is achieved when m n c however is it always the case what is m n neq c what does the theorem mean for real applications all the reasoning and derivation are assuming the 3 x 3 spatial area and 4 way tensor i would assume these constant are not important the paper could be much stronger if there is a clear notion of general results
### Summary: | this paper points out methods to obtain sparse convolutional operators the reviewers have a consensus on rejection due to clarity and lack of support to the claims | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 3916] | [1, 1, 1, …, 1] | [30003, 310, 1677, 2278, …, 3916] |
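The reviews in the row above revolve around two quantities: the parameter cost of composing group, pointwise and depthwise convolutions, and the paper's "information field", which the reviewers paraphrase as the spatial receptive field times the number of channels each output unit can see. The bookkeeping sketch below just makes that arithmetic concrete; the layer sizes (channels, kernel size, group count) are illustrative assumptions, not configurations from the reviewed paper.

```python
# Rough parameter bookkeeping for the kernel designs debated above.
# Channel counts, kernel size and group count are illustrative assumptions.
def dense_conv_params(c_in, c_out, k=3):
    return k * k * c_in * c_out

def group_conv_params(c_in, c_out, k=3, groups=4):
    return k * k * c_in * c_out // groups

def depthwise_conv_params(c_in, k=3):
    return k * k * c_in            # one k-by-k filter per input channel

def pointwise_conv_params(c_in, c_out):
    return c_in * c_out            # 1x1 convolution mixing channels

c = 256
print("dense 3x3:      ", dense_conv_params(c, c))
print("group 3x3 (g=4):", group_conv_params(c, c, groups=4))
print("pw -> dw -> pw: ", pointwise_conv_params(c, c)
      + depthwise_conv_params(c)
      + pointwise_conv_params(c, c))

# "Information field" as the reviewers paraphrase it: spatial receptive field
# area times the number of input channels each output unit can see.
def information_field(kh, kw, channels_seen):
    return kh * kw * channels_seen

print(information_field(3, 3, c))       # a dense 3x3 layer sees every channel
print(information_field(3, 3, c // 4))  # a grouped 3x3 layer sees only its own group
```

This kind of count also makes it easy to probe the reviewers' complaint: with a fixed 3x3 receptive field, only the number of channels seen moves the information field, so experiments varying the receptive field would be needed to support the broader claim.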
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
paper summary this paper proposed a simple regularization technique for domain generalization tasks termed mixstyle based on the observation that domains are determined by image styles by mixing styles of different instances which generates synthesized domain samples while preserving the content features the proposed method achieves the generalizability of the trained model the mixstyle was applied to numerous applications such as category classification instance retrieval and reinforcement learning and attained the stateofthe arts the mixstyle is relatively simple to implement but effective paper strength simple methodological design so it is easy to implement understanding the domain shift problems as a style variation makes sense randomizing the styles might be the solution to alleviate the domain generalization problems but searching all the possible styles and applying them would be challenging and not feasible so using different instance samples to extract the styles was nice it makes sense that introducing the lambda to mix the styles itself and ones of different instances the paper is well organized and written paper weakness i have no major comments on this paper but minor comments as follows even though the authors have shown the ablation study to analyze the levels where the mixstyle should be applied it is not clear for me yet the authors applied the mixstyle after 1st 2nd and 3rd residual blocks for category classification problems but applied the mixstyle after 1st and 2nd residual blocks for category classification problems for instance retrieval task in 34 analysis they only showed the ablation studies on the category classification thus one think the optimal combinations may vary according to the applications in addition another combination eg conv34 conv25 would be more interesting fig 4 is hard to understand what do the corresponding style statistics mean why does d only represent different legends in table 1 some experimental settings eg cartoon or photo have shown that mixstyle w random shuffle was better the discussion on this might be interesting docsepthis work proposes a technique for domain generalization by mixing style of images from different domains this work adopts a mix up style approach a for domain generalization different from a the paper proposes to conduct mixup in the intermediate layers in particular instance normalization layers the proposed approach diversifies the data implicitly and the experimental results show that the mixstyle can improve domain generalization overall the paper is wellwritten with plenty of details i also appreciate the experimental analysis in sec 34 and the variance reported in table 1 however i have several concerns regarding the paper the technical novelty seems rather incremental this method is an extension of a to the instance normalization layer similar strategies have been discussed in other works such b and c however these works are not discussed in terms of main similaritiesdifferences i also found the experimental validation not fully sufficient to grant publication currently the validation is only conducted on pacs the improvement also seems limited i believe validation on more datasetssuch as digits officehome as used in l2aot can further confirm the effectiveness of the proposed method i suspect that interpolating the style parameter might cause performance drop on the domains that have been seen during training would it be possible to report performance on the domains that have been seen in the training a vikas 
verma et al manifold mixup better representations by interpolating hidden states in icml 2019 b rui gong et al dlow domain flow for adaptation and generalization in cvpr 2019 c seonguk seo learning to optimize domain specific normalization for domain generalization in eccv 2020 i have read authors response and other reviews some of my concerns are addressed in the response especially the added discussion with related work is helpful thus i would increase my rating to 6 docsepsummary the paper proposes a simple method for domain generalization where multiple source domains are given for a certain task like image classification and testing happens on an unseen domain the authors are inspired by normalizationbased styletransfer techniques adaptive instancenorm and propose to mix the styles of different source domains to effectively increase diversity of domains during training pros overall this is a well written paper with a clear idea that is simple but intuitive the idea is well described put into context of prior work and empirically validated to improve results over various baselines it is good to see experiments outside of plain image classification to validate the proposed idea the analysis where to apply mixstyle is good and makes intuitive sense cons the relation to mixup needs to be explained in more details while related to the proposed mixstyle mixup creates a convex combination of both input and output spaces i can believe that mixup as a standard data augmentation gives worse results than a vanilla cnn table 1 but i would not fully agree with the statement which demonstrates the advantage of mixing style statistics at the feature level over mixing images at the pixel level from page 4 mixup also interpolates the output label space so the advantage cannot be only attributed the placement of the mixing within the network instead of at the pixel level as an additional baseline one could use mixup with a sampled lambda that is larger than 05 in all cases like in fixmatch sohn et al neurips20 but keeping the label from sample x rather than interpolating with hatx i do not understand why the suffix x is added to the analysis in table 3 is mixstyle applied after each convolutional layer or after each block in a resnet architecture specifically for conv234x how often is the mixstyle layer added 3 times or 3 numconvsinblock times for the reid experiments i think it should be better highlighted that the crossdataset setup is the key difference to evaluations in prior work this somehow gets almost unnoticed because the default setting of reid is already considered a valid domain generalization task due to the new label space and camera views this left me a bit confused about how randomerase can be a widely used data augmentation technique for reid when it gives worse results in the experiments from table 2 this became clear to me only after reading the discussion in the last paragraph of section 32 i would not make the statement that mixing is clearly better than replacing on page 7 see table 4 while also stating that with alpha increasing from 01 to 04 the accuracy slightly slides from 828 to 817 that slight change is larger than the clear gap before other notes and open questions mixup was used successfully as regularization for semisupervised learning ssl mixmatch berthelot et al nips19 can mixstyle also be used for ssl
### Summary: | all three reviewers recommend acceptance after the rebuttal stage and the ac found no reason to disagree with them the proposed method is simple and effective and the concerns raised about experimental validation and novelty seem well addressed in the rebuttal | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, …, 209] | [1, 1, 1, …, 1] | [30003, 310, 1677, 2278, …, 209] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper unifies several variants of the graph convolutional networks gcns into a regularized quadratic optimization framework basically the function to be optimized considers both to preserve node information and to perform graph laplacian regularization whose optimal solution gives a convolutional layer the unification is given by equations 3 and 18 and elaborated in section 3 which includes several methods including gcn graph attentions residual connections concatenation etc this is not surprising as a gcn layer without activation is a linear transformation surely it is the optimum of a quadratic function broadly any linear layer can be trivially formulated as a quadratic optimization problem still i appreciate the authors delicate work on unifying these diverse methods from an optimization perspective which is useful and could lead to new methods from a technical perspective the main novelty is that the authors further extend this framework by adding another feature variance term so that the learned features have a certain variance this is similar to the idea of batch normalization this is reasonable because gcn tends to correlate the learned features with the graph laplacian embedding the optimal solution of the 2nd term in the authors framework this is interesting but empirical i would like to see how this additional regularization can be equivalent to transforming the original graph with some formal arguments unfortunately this technic is mainly introduced as a heuristic and more detailed analysis is missing as in any regularization framework there is an additional parameter involved that is the regularization strength alpha3 in 21 therefore the performance improvement is not surprising as the model is enlarged in the experiments or supplementary material there should be a sensitivity study of this parameter on three citation graphs that are commonly used to evaluate graph neural networks and semisupervised node classification tasks the authors showed that the regularizer can bring marginal performance improvement regarding clarity there are some typos in several places and rarely used phrases overall i dont feel excited after reading the article although the contents are useful as a large part of this work is on summarizing existing literature the new bit is mainly on the additional regularization term that is introduced as a heuristic based on the novelty a more proper venue for publishing this work could be relevant journals overall this submission presents a borderline case and i recommend weak acceptance as a minor comment equation 21 why not set alpha11 after rebuttal novelty my assessment remains the same it is not nontrivial enough to combine several linear operators into a unified optimization framework although the unification is useful it is not a major novelty thank you for the additional experiments on testing the hyperparameter as you mentioned instability it is worth to have some toy example to demonstrate the instability and study the cause of such instability and show how to avoid such instability using the proposed regularizer clearly 19 is bounded when alpha3 is large enough the solution will be trivial regarding nonlinearity the authors framework is for unifying a graph convolution operator that is one layer in a graph neural network nonlinear activation is another operator this is not a major problem from my perspective overall i think this work has some value although the novelty is not strong and still recommend weak acceptancedocsepthis paper presents a 
unified framework for graph convolutional neural networks based on regularized optimization connecting different variants of graph neural networks including vanilla attentionbased and topologybased approaches the authors also propose a novel regularization technique to approach the oversmoothing problem in graph convolution experiments on the standard settings of node classification on citeseer cora and pubmed prove the effectiveness of the proposed regularization techniques overall this is a very interesting paper proposing a unified framework for different variants of convolutionbased graph neural networks however i also have a few concerns 1 the proposed framework is mainly designed for gnns without considering the nonlinear transformation matrix what if we have to consider the nonlinear transformation is the whole framework able to unify different gnns 2 in the case of linear gnns without nonlinear transformation matrix it is actually not surprising formulating gnns as a regularized optimization problem such a regularization framework has already been discussed in the original gcn paper kipf et al 2016 3 in the case of linear gnns the overall framework is also very similar to the traditional label propagation framework zhou et al learning with local and global consistency could you explain the difference 4 the new novel regularization technique seems to be similar to the one proposed in pairnorm zhao et al 2020 could you also explain the difference docsepsummary the paper shows that several graph networks gcn attention gcn ppnp residual can be unified under a common framework of laplacianregularised optimisation subsequently different types of regularisation are combined to propose a new method for graph transduction which is then empirically evaluated significance laplacian regularisation is a classical approach for formulatingjustifying graph transduction algorithms multiple papers by mikhail belkin and xiaojin zhu around 200406 it is interesting to see that so many graph networks can also be unified in the same framework a unified framework does aid in both theoretical analysis and implementation of gcns however the claims and derivation do not seem to account for the nonlinear activation in the networks and hence significance of the work seems limited quality as noted above nonlinearity is not considered which makes the derivation significantly simpler moreover the firstorder approximation is quite misleading since even the proof do not seem to consider nonlinear activation since the proposed method combines multiple types of regularisation it is expected to perform better than other networks however it is not clear if the training time increases due to the complex regularisation clarity and orginality the paper is otherwise well written organised and the theoretical contributions although technically straightforward seem original and somewhat interesting docsepthe paper introduces a unified framework for graph convolutional networks by interpreting filters as regularizers in the graph fourier domain in particular this framework allows to establish the relationships between standard attentionbased and topologybased gnns furthermore the authors propose a regularization technique based upon the proposed framework which tackles the oversmoothing problem of gnns which achieves clear benefits on standard small benchmark datasets the paper is mostly wellwritten although though to understand on first read i especially liked that it tries to establish a systematic view of different gnn 
models and their relations which is a welcome work in the field of graph representation learning especially with the sheer amount of gnn models available in literature in my opinion the proposed framework has the potential to improve our understanding of gcns and inspire better models in return on the other hand it is not exactly clear to me how the proposed regularization technique differs from pairnorm which is build upon similar insights by preventing node embeddings from becoming too similar i would very much welcome a discussion between key differences and similarities between the two approaches furthermore the authors should consider comparing the proposed regularization technique against related approaches eg pairnorm and dropedge overall the empirical evaluation feels a bit shallow by only evaluating on small benchmark datasets but might be sufficient for a work that has mostly theoretical contributions minor comments it is not exactly clear to me how ecc can be viewed as an attentionbased gnn since this operator learns a weight matrix conditioned on the edge features instead of performing weighted normalization does this operator really fit into the proposed unified framework a gcn baseline is missing on the ppi dataset post rebuttal comments i would like to thank the authors for their insightful rebuttal and clarifications sadly i cannot find the newly added section regarding the nonlinearity analysis in the revised manuscript and therefore cannot judge the findings of the authors hence my rating will stay the same
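to make the quadratic view in the reviews above concrete, a minimal numerical sketch follows. it is not the paper's objective or code; the toy graph, the features, and the value alpha = 0.5 are assumed placeholders, and the only claim is the generic one already stated in the first review, namely that the minimiser of a least-squares plus laplacian penalty is a fixed linear propagation of the node features.

```python
import numpy as np

# toy sketch (not the paper's code): minimising
#   ||H - X||_F^2 + alpha * tr(H^T L H)
# over H has the closed form H* = (I + alpha*L)^{-1} X,
# i.e. a fixed linear "propagation" layer applied to X.
rng = np.random.default_rng(0)
n, d, alpha = 6, 4, 0.5

A = rng.integers(0, 2, size=(n, n))          # random undirected toy graph
A = np.triu(A, 1); A = A + A.T
L = np.diag(A.sum(1)) - A                    # combinatorial graph Laplacian
X = rng.normal(size=(n, d))                  # node features

H_closed = np.linalg.solve(np.eye(n) + alpha * L, X)   # closed-form minimiser

def objective(H):
    # least-squares fidelity term plus Laplacian smoothness penalty
    return np.sum((H - X) ** 2) + alpha * np.trace(H.T @ L @ H)

# the closed form is the global minimum of this strictly convex objective,
# so any small perturbation of it can only increase the objective value
perturbed = H_closed + 1e-3 * rng.normal(size=H_closed.shape)
assert objective(H_closed) <= objective(perturbed)
print(objective(H_closed), objective(perturbed))
```

expanding (i + alpha l)^{-1} to first order as i - alpha l is the usual way such a minimiser is related to a single gcn-style propagation step; whether this matches equations 3 and 18 of the paper cannot be verified from the reviews alone.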
### Summary: | four reviewers have reviewed and discussed this submission after rebuttal two reviewers felt the paper is below the acceptance threshold firstly rev 1 and rev 2 were somewhat disappointed in the lack of analysis regarding nonlinearities despite the authors suggesting this was resolved in the revised manuscript eg rev 2 argued the paper without such an analysis is too similar to existing linear models eg appnp sgc and so on while rev 3 was mildly positive about the paper they also noted that combining several linear operators is somewhat trivial overall all reviewers felt there is some novelty in the proposed regularization term but also felt that the contributions of the paper could have been stronger while ac sympathizes with this submission and hopes that the authors can improve this work the submission in its current form appears marginally below the acceptance threshold | [
…
] | [
…
] | [
…
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper mainly studied how the negative samples can affect the model performance in supervised learning cio works through the experiments this work has a few interesting findings including the majority of negative samples are not important for the model learning only a small subset of hard samples determine the model importance these hard examples are also closely related with positive samples more semantically similar we can see from experiments that its very important to fairly treat negative samples in supervised learning tasks however there is no frameworks proposed to help improve the learning representation or speed up the training task in general the readers are more interested in the solutions after realizing the importance of negative samples treatment during the experiments it would be necessary to include the corresponding solutions by automatically setup these negatives samples in cid related taskdocsep this paper argues that in contrastive selfsupervised learning different negative instances have different importance this importance is relevant to the difficulty of negative instances on imagenet and moco2 the authors show that using the most difficult 5 negative instances can achieve similar performance compared with using all negative instances however the most difficult 01 of negative instances yield bad performance i recommend to reject this paper due to the following major concerns 1 study is performed on a single dataset which is not convincing 2 study is performed on a single method which casts doubts on whether the conclusions hold for other methods 3 this study does not seem to have practical value while this study is interesting it lacks rigor in the following aspects 1 the study is only performed on a single contrastive selfsupervised learning method moco2 it is unclear whether the conclusions hold for other contrastive ssl methods such as byol and many others 2 the study is conducted on a single dataset imagenet it is unclear whether the conclusions hold for other datasets 3 another concern is this study does not seem to have practical value in each iteration during training finding the hardest examples for a query needs to calculate the innerproduct between this query and all other training examples which is computationally very heavy 4 in the authors measure of difficulty the difficulty is a function of network weights in early stage of the training the network weights are random which implies that the calculated difficulty may be meaningless can the authors comment on this however the paper does have a few strong points 1 the paper is wellwritten the organization is clear and the paper is easy to follow 2 the studied problem is interesting and novel other comments 1 figure 5a is difficult to interpret the author may consider to reorganize it 2 in figure 3 only three temperature values were considered which may not be very convincing update i read the authors rebuttal the authors didnt address my concern the study is conducted on a single dataset imagenet it is unclear whether the conclusions hold for other datasets sufficiently i would like to keep my original rating docsep the findings of this work are that for contrastive learning most of the negatives deemed easily separable are unnecessary the most important negatives are somewhere in the top 5 closest to the positive sample and that some of the exceedingly hard examples are detrimental in general i felt the main findings of this work to be roughly in line with what we already know about contrastive 
learning we can easily look at this works findings with respect to the soft svm margin in that only the examples close to the decision boundary should matter max margin but some difficult examples the aforementioned exceedingly difficult ones make the data inseparable so we allow some violation slack terms while im not suggesting that slapping a soft svm here would solve the problem there is a large body of svmbased detectionclassification literature that precedes the findings of this work validity of wordnet as a measure of semantic similarity section 4 uses wordnet distances to estimate the semantic similarities between classes by finding their shared subtree root the deeper the subtree the more semantically similar while i do not dispute the claim of the hardest negatives being from semantically similar classes different parts of the wordnet synset tree have semantic hierarchies of varying levels of coarseness a 2 hop distance in one subtree could easily be more of a semantic jump than a 3 hop distance in another the exist prior works dealing with the neglected semantic hierarchies in imagenet by setting up hierarchical classifiers an example is 1 i would further argue that theres some nuance in the correlation between semantic similarity and example hardness in that it really depends on your choice of feature representation visual features will naturally correlate with closer semantic levels in visuallydefined categories however this will not necessarily hold for semantic categories defined by function in that two visually distinct items may fall under close semantic labels the related works section claims object detection works have not explicitly involved negative examples as in cid i have to imagine this statement is poorly phrased as 2 also cited in this paragraph very explicitly mines for facelike nonface patterns there is a very long list of hardnegative mining works in object detection overall i value the empirical impact of this work in that the rather detailed analysis may lead to improvements to future versions of the contrastive feature learning task however i do not find the findings of this work to be sufficiently novel for this conference and therefore cannot recommend this work for acceptance in its current state 1 yan et al hdcnn hierarchical deep convolutional neural networksfor large scale visual recognition iccv 2015 2 sung and poggio examplebased learning for viewbased human face detection tpami 1998docsepin this paper the authors carried out a series of experiments to analyze the impact of negative samples in contrastive learning instance discrimination cid in particular they try to identify which difficulty range is important for representation learning of the many recent selfsupervised learning approaches they chose moco v2 as the testbed they trained the moco model from an imagenet pretrained one various settings which correspond to various ways of filtering our hard or easy negatives were used hardness of samples are measured based on embedding distance to the query ie ones with large distance are easy their main findings are for negative samples 1 using the 5 hardest is enough for downstream tasks 2 the easiest 95 of them were unnecessary and insufficient 3 the hardest 01 is harmful and 4 hard negatives were more semantically similar to the query in general in my opinion this is a paper in which the authors tried to answers many interesting practical questions the author provided experiments and convincing evidences for a number of insights my main 
reservations with this paper are 1 most of the points are not new and are elaborations of what were pointed out before elsewhere for example in semihard mining for distance metric learning 2 the empirical results are only within the context of moco2 and for a linear classification task it is not clear how such numbers as 01 5 or 95 would change when adopting other frameworks such as byol or swav the reported gains seems a little bit sensitive to the temperature parameters of moco 3 the sample hardness is measured based on embedding distance which would be evolved during the training process itself it is not clear how accurate it is especially in the early stage of training my suggestion for improvements is that either to empirically show that their findings numbers are consistent across a number of frameworks and downstream tasks or to provide some theoretical justification for their findings if only moco v2 is used
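to make the hardness ranking discussed above concrete, a minimal numpy sketch follows. it is not the paper's code; the 5 percent cutoff mirrors the number quoted in the reviews, and all names, shapes, and the toy data are assumed placeholders.

```python
import numpy as np

def hardest_negatives(query, negatives, keep_frac=0.05):
    """Rank negative embeddings by similarity to the query and keep the hardest fraction.

    query:     (d,) L2-normalised embedding of the anchor/query view
    negatives: (n, d) L2-normalised embeddings of candidate negatives
    keep_frac: fraction of negatives treated as "hard" (0.05 mirrors the 5% discussed above)
    """
    sims = negatives @ query                  # cosine similarity, since rows are normalised
    k = max(1, int(keep_frac * len(sims)))
    hard_idx = np.argsort(-sims)[:k]          # most similar to the query == hardest negatives
    return hard_idx, sims[hard_idx]

# toy usage with random unit vectors (illustrative only)
rng = np.random.default_rng(0)
q = rng.normal(size=128); q /= np.linalg.norm(q)
neg = rng.normal(size=(1000, 128)); neg /= np.linalg.norm(neg, axis=1, keepdims=True)
idx, top_sims = hardest_negatives(q, neg)
print(idx.shape, top_sims[:3])
```

the dense product against the whole negative bank is the per-query inner-product pass that one of the reviews above flags as computationally heavy once it is repeated for every query at every training iteration.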
### Summary: | this paper empirically studies the impact of different types of negatives used in recent contrastive selfsupervised learning methods results were initially shown on mocov2 though after rebuttal simclr was also added and several interesting findings were found including that only hardest 5 of the negatives are necessary and sufficient while the reviewers saw the benefit of rigorously studying this aspect of recent advances in selfsupervised learning a number of issues were raised including 1 the limited scope of the conclusions given that only two after rebuttal algorithms were used on one datasets 2 limited connections drawn to existing works on hard negative mining which is very common across machine learning including metric learning and object detection and 3 limited discussion of some of the methodological issues such as use of measures that are intrinsically tied to the models weights hence being less reliable early in the training and wordnet as a measure for semantic similarity though the authors provided lengthy rebuttals the reviewers still felt some of these issues were not addressed as a result i recommend rejection in this cycle and that the authors bolster some of these aspects for a submission to future venues i would like to emphasize that this type of work which provides rigorous empirical investigation of various phenomena in machine learning is indeed important and worth doing hence the lack of a new method eg to address the selection of negatives was not the basis of the decision while the paper clearly does a thorough job at investigating these issues for a limited scope eg in terms of datasets a larger contribution is expected for empirical papers such that 1 we can ensure the generality of the conclusions across methods and datasets 2 we have a conceptual framework for understanding the empirical results especially with respect to what is already known in adjacent areas eg metric learning and object detection and 3 we understand some of the methodological choices that were made and why they are sufficiently justified | [
…
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
7194,
5421,
849,
253,
4016,
3530,
476,
2818,
253,
1566,
3045,
275,
22296,
4715,
260,
900,
2987,
949,
253,
4679,
436,
789,
556,
247,
1643,
4722,
4342,
1690,
253,
5020,
273,
4016,
3530,
403,
417,
1774,
323,
253,
1566,
4715,
760,
247,
1355,
8578,
273,
1892,
3530,
3653,
253,
1566,
6349,
841,
1892,
6667,
403,
671,
8244,
2905,
342,
2762,
3530,
625,
3300,
39904,
2074,
50276,
664,
476,
923,
432,
4679,
326,
697,
1077,
1774,
50276,
936,
9648,
1555,
4016,
3530,
275,
22296,
4715,
8892,
2299,
627,
310,
642,
31225,
4081,
281,
1361,
3157,
253,
4715,
6779,
390,
3885,
598,
253,
3733,
4836,
575,
575,
249,
2087,
253,
10668,
403,
625,
6110,
275,
253,
5482,
846,
27017,
253,
6349,
273,
4016,
3530,
1971,
1309,
253,
4679,
50276,
262,
651,
320,
3309,
281,
2486,
253,
3969,
5482,
407,
8356,
9978,
841,
2297,
3993,
3530,
275,
260,
301,
2905,
4836,
7152,
33032,
436,
2929,
8219,
326,
275,
4499,
422,
1881,
35421,
4715,
1027,
4016,
10872,
452,
1027,
6349,
436,
6349,
310,
4623,
281,
253,
10183,
273,
4016,
10872,
327,
4440,
257,
292,
285,
278,
16856,
19,
253,
4477,
921,
326,
970,
253,
954,
2834,
608,
4016,
10872,
476,
5115,
2074,
3045,
2429,
342,
970,
512,
4016,
10872,
2299,
253,
954,
2834,
14805,
273,
4016,
10872,
4917,
3076,
3045,
50276,
74,
5583,
281,
12009,
436,
2929,
1955,
281,
253,
1563,
2201,
7350,
337,
1263,
310,
2684,
327,
247,
2014,
10895,
534,
310,
417,
21414,
374,
1263,
310,
2684,
327,
247,
2014,
1332,
534,
43603,
24626,
327,
1880,
253,
11815,
2186,
323,
643,
3082,
495,
436,
1263,
1057,
417,
1646,
281,
452,
8542,
1318,
50276,
6050,
436,
1263,
310,
4722,
352,
19756,
8132,
263,
275,
253,
1563,
7794,
337,
253,
1263,
310,
760,
2684,
327,
247,
2014,
4499,
422,
1881,
35421,
4715,
1332,
278,
16856,
19,
352,
310,
12744,
1880,
253,
11815,
2186,
323,
643,
4499,
422,
256,
3433,
3082,
824,
347,
407,
311,
285,
1142,
2571,
374,
253,
1263,
310,
5196,
327,
247,
2014,
10895,
4440,
257,
292,
352,
310,
12744,
1880,
253,
11815,
2186,
323,
643,
15302,
495,
1529,
4468,
310,
436,
1263,
1057,
417,
1646,
281,
452,
8542,
1318,
275,
1016,
19502,
1309,
3733,
4560,
253,
31056,
6667,
323,
247,
7316,
3198,
281,
10173,
253,
6703,
7509,
875,
436,
7316,
285,
512,
643,
3733,
6667,
534,
310,
43245,
1077,
5536,
50276,
21,
275,
253,
4477,
2557,
273,
10183,
253,
10183,
310,
247,
1159,
273,
2990,
13461,
275,
2393,
3924,
273,
253,
3733,
253,
2990,
13461,
403,
3632,
534,
8018,
326,
253,
5118,
10183,
778,
320,
34209,
476,
253,
4477,
4385,
327,
436,
50276,
35529,
253,
2929,
1057,
452,
247,
1643,
2266,
2792,
337,
253,
2929,
310,
973,
15720,
253,
6003,
310,
2590,
285,
253,
2929,
310,
3477,
281,
956,
374,
253,
5421,
1895,
310,
4722,
285,
4460,
50274,
977,
5701,
337,
4677,
608,
66,
310,
2834,
281,
4665,
253,
2488,
778,
1908,
281,
294,
7397,
907,
352,
374,
275,
4677,
495,
760,
1264,
3276,
2193,
497,
2783,
534,
778,
417,
320,
1077,
21414,
50275,
11183,
891,
1239,
253,
4477,
30080,
22559,
253,
4477,
42126,
2953,
619,
4468,
253,
1263,
310,
5196,
327,
247,
2014,
10895,
4440,
257,
292,
352,
310,
12744,
1880,
253,
11815,
2186,
323,
643,
15302,
50276,
31031,
314,
891,
651,
751,
281,
1978,
619,
3236,
13716,
50276,
7152,
33032,
253,
4342,
273,
436,
789,
403,
326,
323,
4499,
422,
4715,
954,
273,
253,
2297,
3993,
14320,
4354,
39690,
403,
15279,
253,
954,
1774,
2297,
3993,
403,
9366,
275,
253,
1755,
608,
8642,
281,
253,
2762,
3410,
285,
326,
690,
273,
253,
42508,
1892,
6667,
403,
30078,
50276,
249,
2087,
891,
3543,
253,
2022,
4342,
273,
436,
789,
281,
320,
11467,
275,
1386,
342,
752,
359,
2168,
871,
670,
4499,
422,
4715,
359,
476,
4354,
1007,
387,
436,
2987,
4342,
342,
1675,
281,
253,
2602,
256,
11618,
8459,
275,
326,
760,
253,
6667,
2810,
281,
253,
3061,
7548,
943,
2647,
2781,
8459,
533,
690,
2834,
6667,
50276,
783,
18979,
42508,
2834,
4394,
1056,
253,
941,
275,
16806,
494,
594,
359,
1581,
690,
8411,
37358,
2426,
50276,
6050,
516,
417,
7738,
326,
1499,
5436,
247,
2602,
256,
11618,
1060,
651,
8415,
253,
1895,
627,
310,
247,
1781,
2133,
273,
18504,
1814,
833,
5481,
42070,
6239,
326,
8436,
265,
253,
4342,
273,
436,
789,
50276,
7210,
414,
273,
3159,
3024,
347,
247,
2557,
273,
24705,
14259,
2593,
577,
4648,
3159,
3024,
13849,
281,
6642,
253,
24705,
22620,
875,
5971,
407,
4560,
616,
6096,
8482,
658,
5230,
253,
12861,
253,
8482,
658,
253,
625,
3300,
39904,
2074,
1223,
891,
513,
417,
12027,
253,
1750,
273,
253,
31056,
2297,
3993,
1146,
432,
3300,
39904,
2074,
5971,
1027,
4243,
273,
253,
3159,
3024,
2753,
1178,
5202,
452,
24705,
20258,
447,
273,
11962,
2308,
273,
820,
1032,
8098,
247,
374,
5184,
4181,
275,
581,
8482,
658,
812,
4354,
320,
625,
273,
247,
24705,
6923,
685,
247,
495,
5184,
4181,
275,
1529,
50275,
783,
2226,
2720,
2987,
10620,
342,
253,
22459,
24705,
20258,
447,
275,
4440,
257,
292,
407,
4758,
598,
24498,
49996,
271,
1650,
310,
337,
50276,
74,
651,
2007,
9059,
326,
253,
373,
690,
8794,
593,
275,
253,
5921,
875,
24705,
14259,
285,
1650,
38576,
275,
326,
352,
1663,
7024,
327,
634,
4327,
273,
4735,
6779,
5304,
3386,
588,
10748,
24888,
342,
8003,
24705,
2308,
275,
25910,
7769,
9050,
2299,
436,
588,
417,
7933,
2186,
323,
24705,
9050,
2931,
407,
1159,
275,
326,
767,
25910,
5799,
4957,
778,
2965,
762,
2810,
24705,
13301,
50275,
783,
2905,
2987,
2593,
3916,
1789,
5481,
2987,
452,
417,
11120,
3206,
4016,
6667,
347,
275,
260,
301,
891,
452,
281,
8564,
436,
3908,
310,
15225,
9839,
833,
347,
374,
671,
11106,
275,
436,
12494,
1077,
11120,
27304,
323,
50276,
28402,
44549,
1327,
1664,
6127,
627,
310,
247,
1077,
1048,
1618,
273,
1892,
12373,
15067,
2987,
275,
1789,
5481,
50276,
1189,
455,
891,
1318,
253,
16774,
3486,
273,
436,
789,
275,
326,
253,
2581,
7000,
1783,
778,
1421,
281,
11701,
281,
2852,
9508,
273,
253,
4499,
422,
4735,
4715,
4836,
2299,
891,
513,
417,
1089,
253,
4342,
273,
436,
789,
281,
320,
10481,
4460,
323,
436,
8059,
285,
3103,
2550,
5583,
436,
789,
323,
14924,
275,
697,
1655,
1375,
50274,
18,
340,
266,
1162,
355,
288,
12352,
9866,
24498,
3676,
27311,
267,
11454,
6928,
1542,
1781,
4311,
5304,
8981,
17857,
17312,
4104,
50276,
19,
29502,
285,
268,
19613,
900,
1650,
3169,
4715,
323,
1859,
3169,
1966,
2454,
5481,
246,
81,
7588,
8065,
7152,
339,
9852,
436,
2929,
253,
4477,
4824,
562,
247,
2962,
273,
4679,
281,
12106,
253,
3486,
273,
4016,
3530,
275,
4499,
422,
4715,
4227,
11081,
50276,
68,
301,
275,
1798,
597,
1611,
281,
4271,
534,
10183,
2491,
310,
1774,
323,
6779,
4715,
273,
253,
1142,
3332,
1881,
35421,
4715,
7274,
597,
9703,
278,
16856,
362,
19,
347,
253,
1071,
3026,
597,
10166,
253,
278,
16856,
1566,
432,
271,
4440,
257,
292,
3215,
11273,
581,
2710,
7533,
534,
2723,
281,
2710,
4088,
273,
19690,
776,
1892,
390,
3477,
2297,
3993,
497,
908,
38576,
273,
3530,
403,
4080,
1754,
327,
21496,
4181,
281,
253,
7316,
26332,
4394,
342,
1781,
4181,
403,
3477,
616,
2022,
4342,
403,
323,
4016,
3530,
337,
970,
253,
608,
31056,
310,
2217,
323,
15450,
8892,
374,
253,
24746,
5325,
273,
731,
497,
15279,
285,
12497,
495,
253,
31056,
14805,
310,
19632,
285,
50276,
21,
1892,
2297,
3993,
497,
625,
3300,
39904,
2074,
281,
253,
7316,
50276,
249,
2087,
275,
619,
4743,
436,
310,
247,
2929,
275,
534,
253,
4477,
3597,
281,
9172,
1142,
4722,
8542,
3533,
253,
2488,
2530,
4679,
285,
21414,
20456,
2979,
323,
247,
1180,
273,
16039,
619,
2022,
33196,
342,
436,
2929,
403,
50276,
18,
954,
273,
253,
2792,
403,
417,
747,
285,
403,
14883,
569,
273,
752,
497,
8042,
562,
1078,
11358,
323,
1650,
275,
3300,
6356,
472,
15067,
323,
4181,
7982,
4715,
374,
253,
16774,
1543,
403,
760,
1561,
253,
3634,
273,
278,
16856,
19,
285,
323,
247,
4872,
9162,
4836,
352,
310,
417,
2590,
849,
824,
3904,
347,
14805,
608,
390,
5325,
651,
1818,
672,
25987,
643,
31225,
824,
347,
407,
311,
390,
1863,
580,
253,
2361,
15988,
3133,
247,
1652,
2372,
7996,
281,
253,
3276,
3602,
273,
278,
16856,
495,
253,
3410,
38576,
310,
4080,
1754,
327,
21496,
4181,
534,
651,
320,
16323,
1309,
253,
3733,
1232,
3139,
50276,
262,
310,
417,
2590,
849,
7899,
352,
310,
3340,
275,
253,
2393,
3924,
273,
3733,
50276,
2577,
14876,
323,
11701,
310,
326,
2057,
281,
45190,
921,
326,
616,
4342,
3904,
403,
5185,
2439,
247,
1180,
273,
31225,
285,
15450,
8892,
390,
281,
2085,
690,
10527,
22861,
323,
616,
4342,
604,
760,
278,
16856,
362,
19,
310,
908,
2490,
187,
4118,
18435,
27,
2520,
2929,
45190,
2175,
253,
3486,
273,
1027,
3510,
273,
2297,
3993,
908,
275,
3332,
4499,
422,
1881,
35421,
4715,
3082,
1543,
497,
8523,
2011,
327,
278,
406,
729,
19,
2167,
846,
30080,
22559,
948,
498,
83,
369,
671,
2879,
285,
2067,
4722,
4342,
497,
1119,
1690,
326,
760,
31056,
608,
273,
253,
2297,
3993,
403,
3309,
285,
4209,
1223,
253,
30628,
3047,
253,
5649,
273,
8132,
29689,
12392,
436,
4809,
273,
3332,
16424,
275,
1881,
35421,
4715,
247,
1180,
273,
3374,
497,
5439,
1690,
337,
253,
3710,
7990,
273,
253,
11815,
1677,
326,
760,
767,
846,
30080,
22559,
11333,
497,
908,
327,
581,
15302,
374,
3710,
10291,
8392,
281,
5368,
2987,
327,
1892,
4016,
15067,
534,
310,
1077,
1846,
2439,
5145,
4715,
1690,
7982,
4715,
285,
1789,
5481,
285,
495,
3710,
5955,
273,
690,
273,
253,
35961,
3374,
824,
347,
897,
273,
5593,
326,
403,
45654,
12331,
281,
253,
3210,
13461,
7613,
1146,
1679,
9630,
2393,
275,
253,
3733,
285,
3159,
3024,
347,
247,
2557,
323,
24705,
14259,
2167,
253,
4477,
2530,
24585,
30080,
85,
932,
253,
30628,
1335,
3543,
690,
273,
841,
3374,
497,
417,
9713,
347,
247,
906,
891,
5583,
18235,
275,
436,
5880,
285,
326,
253,
4477,
48404,
690,
273,
841,
7794,
323,
247,
19529,
281,
2852,
28966,
50275,
74,
651,
751,
281,
22175,
326,
436,
1511,
273,
789,
534,
3400,
26565,
16774,
5839,
273,
2710,
16958,
275,
5145,
4715,
310,
6296,
1774,
285,
4409,
2509,
7613,
253,
3480,
273,
247,
747,
1332,
24088,
281,
2953,
253,
5438,
273,
2297,
3993,
369,
417,
253,
3720,
273,
253,
3061,
1223,
253,
2929,
4518,
1057,
247,
11080,
2628,
387,
15686,
841,
3374,
323,
247,
3710,
7990,
24088,
275,
2426,
273,
15302,
247,
4067,
7680,
310,
3264,
323,
16774,
9380,
824,
326,
337,
359,
476,
5416,
253,
31376,
273,
253,
11815,
2439,
3082,
285,
15302,
374,
359,
452,
247,
20178,
7792,
323,
4685,
253,
16774,
1543,
3340,
342,
1675,
281,
752,
310,
2168,
1929,
275,
9701,
3672,
24088,
7982,
4715,
285,
1789,
5481,
285,
495,
359,
2096,
690,
273,
253,
35961,
10165,
326,
497,
1160,
285,
2139,
597,
403,
10481,
17285,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper provides a benchmark to evaluate approximators for wasserstein1 distances as loss functions in the generative adversarial network setting s1 while previous works use discrete distributions for benchmarking solvers this work suggests continuous distributions which is a novel aspect for benchmarking w1 w1 the benchmark contains only one image dataset with a single mode faces the addition of more image datasets especially multimodal ones eg cifar10 would improve the versatility of the benchmark and extend it to conditional models docsepauthors propose a generic methodology to construct benchmark pairs with ground truth ot plan ot cost and ot gradient we can use this tool to evaluate the performance of the neural dual ot solvers approximating the wasserstein1 distance or the gradient of wasserstein1 distance specifically the authors employ the 1lipschitz minfunnel functions to compute transport rays and define the ray monotone map with them we can define a target distribution mathbbq and compute ot cost and ot gradient based on the original distribution mathbbp the authors provide an elaborate introduction to the wasserstein1 and its neural dual ot solvers followed by compact math proof about their benchmark pairs experiments are also reasonable it is also a good point of view to consider the gradient of the wasserstein1 distance some minor concerns is it hard to turn hyperparameters for this method for example when you compute the highdimensional benchmark pairs you choose bn sim mathcaln001 and p 8 how do you choose it how long does it cost for the hyperparameter search the dimension of images in reality is higher than 27 can this tool handle higher dimensions if we carefully choose minfunnel function u instead of randomly picking will the performance be better what will be the effect of increasing n and d paper mentions in wgans the solvers move the generated distribution bad images mathbbq in our construction to the real distribution good images mathbbp however mathbbp is synthetic distribution and mathbbq is computed ground truth real image distribution in the case of images benchmark why do the solvers move mathbbq to mathbbp instead of the opposite authors mention solvers mm mmr takes longer for training compared with gp so and lp is the time gap significant docsepmotivated by the lack of benchmarks for w1 dual methods other than perceptual measures such as fid or is this paper proposes to create a semisynthetic set of benchmark datasets with known optimal transport plans maps and distance to do this the paper first develops theory about maps that are optimal by construction then the paper proposes concrete methods for constructing the necessary functions and computing the necessary plans maps and gradients finally synthetic dataset pairs are generated from truncated gaussian data and celeba data at various dimensionalities and used to evaluate and discuss many existing w1 methods discusses good overview of w1 methods proves theoretical results about how to construct maps that are optimal wrt w1 proposes novel way to construct groundtruth semisynthetic benchmarks for evaluating wasserstein1 dual solvers provides code and datasets for benchmark datasets and algorithms evaluates the gradient of the w1 wrt the parameters which is actually most important for most generative methods only one realworld dataset celeba is considered and the synthetic datasets are quite simple ie truncated gaussians it seems including more realworld datasets even mnist or cifar10 would be useful or 
using interesting realworld tabular data for smaller dimensions eg even something like iris this limitation is mentioned in the text but does seem to be the main limitation it seems the benchmark only considers maps where the samples are grouped more closely together or the reverse maps that expand parts of the space or where some expand and some contract would be better it is unclear whether the benchmark maps properly represent realworld ot maps minor but nonetheless important for final paper all result tables are in the appendix and the figures are in odd places with nonstandard captions at least some summary table of the results and your recommendations for suggested methods based on context would be important to include what methods would you recommend and why the answer may be a combination of easeofuse convergence behavior and overall performance docsepthis paper proposes a benchmark to evaluate the methods of computing the wasserstein1 distance the authors construct 1lipschitz functions and use them to build ray monotone transport plans which yield pairs of continuous benchmark distributions in highdimensional spaces some wgan dual form solvers are evaluated using these benchmark pairs 1 this paper proposed a benchmark to evaluate the methods of computing the wasserstein1 distance the problem is interesting to the community 2 this paper is wellwritten and technically sound the method uses 1lipschitz functions to construct pairs of continuous distributions which is well designed 3 this paper thoroughly evaluates popular wgan dual form solvers in highdimensional spaces using these benchmark pairs 1 the title of this paper is ambiguous and may lead to inappropriate reviewers 2 the theoretical analysis and the intuition of the proposed method is weak it is unclear why the proposed method works well than previous methods 3 evaluating the wasserstein1 distance does not directly validate the superiority of the methods on specific tasks which may need more explanations docsepthis paper proposes a benchmark for computing the wasserstein1 distance the authors first propose to use 1lipschitz functions to build ray monotone transport plans and obtain known ot maps these ground truth maps are then used to benchmark dual ot solvers used in particular in the wasserstein gan framework this papers proposes a method to build known ot maps using 1lipschitz minfunnet functions this choice is clearly justified as these functions are universal approximator of 1lipschitz functions prop2 having known ot maps allows to faithfully compare the ot solvers they carefully build transport ray of these functions the paper is well written and easy to follow the authors tackle an interesting problem and having more comparison like this one is crucial i regret that the results of the benchmarks are only available in the appendices i would recommend the authors to include some of them in the main paper since those are the main results of the paper the restriction to 1lipschitz minfunnet functions seems to be a main limitation of this work it seems that in the experiments only one random start is considered is there any reasons why the authors did not perform multiple runs this seems to impede to assess the methods stability and robustness with regard to the random start and the parameters an and bn in the funnel docsepthis paper proposes a benchmark for methods computing the wasserstein1 distance section 1 summarizes background information on computing w1 often with the dual in eq 4 and 5 and how the w1 is used in 
gan training section 2 summarizes methods estimating the dual potentials and transport maps section 3 describes the benchmark distributions and section 4 shows the results of evaluating the methods on the results which are quantified in section d of the appendix approximating w1 computations is widely used and a difficult setting to benchmark because the groundtruth transport maps and distances are often not known i am not aware of an established w1 benchmarks and papers often have to rely on downstream tasks such as inception scores to justify an algorithmic improvement to the w1 approximation this paper presents nontrivial settings where the groundtruth transport map is known and uses it to the experimental results are thorough and the paper strongly shows that minimax methods solve the benchmark tasks in most settings at least for obtaining a gradient that approximates the true gradient while the paper proposes a new benchmark for approximating the w1 it unfortunately does not present results in established gan settings as the groundtruth maps are not known thus research that is ultimately focused on improving the w1 computations in settings such as gans may be able to use these benchmarks for preliminary experiments but these benchmark tasks may not reflect the true difficulties of these methods thus established and powerful it is not clear how solved w1 ot is how much work remains in the field and how many new directions this benchmark will enable in other words better solutions to this benchmark will not directly enable new methods or new gan results
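To make the construction described in these reviews concrete, here is a minimal sketch of a min-of-funnels 1-Lipschitz potential and the Kantorovich-Rubinstein lower bound it yields on W1. The anchor points, offsets, and Gaussian samples below are made-up stand-ins for illustration; they are not the benchmark's actual parameters or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical anchor points a_k and offsets b_k for the funnels (illustrative only).
anchors = rng.normal(size=(8, 2))
offsets = rng.normal(scale=0.1, size=8)

def min_funnel(x):
    """u(x) = min_k (||x - a_k|| + b_k).
    Each funnel ||x - a_k|| + b_k is 1-Lipschitz, and a pointwise minimum
    of 1-Lipschitz functions is 1-Lipschitz, so u is a valid dual potential."""
    d = np.linalg.norm(x[None, :] - anchors, axis=1) + offsets
    return d.min()

# Kantorovich-Rubinstein duality: for any 1-Lipschitz u,
#   E_P[u(x)] - E_Q[u(y)] <= W1(P, Q),
# so sample averages of u give an estimate of a lower bound on W1.
xs = rng.normal(size=(1000, 2))            # stand-in samples from P
ys = rng.normal(loc=1.0, size=(1000, 2))   # stand-in samples from Q
lower_bound = np.mean([min_funnel(x) for x in xs]) - np.mean([min_funnel(y) for y in ys])
print(f"dual estimate (lower bound on W1): {lower_bound:.3f}")
```

In the spirit of the benchmark, a learned dual potential can then be judged by how close its estimate comes to a distance that is known by construction; in this toy snippet the true W1 is not known, so the printed value only serves as a lower-bound sanity check.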
### Summary: | this paper proposes a new benchmark to evaluate the solution of optimal transport problems the reviewers concur that the benchmark is wellexecuted and novel some are concerned that a better benchmark for ot problems will not drive progress as the successes of wasserstein gans occur despite their failure to solve ot however it seems like a useful intermediate check to deepen understanding of why wasserstein gans and models to come work by at least eliminating nonexplanations | [
(input_ids values omitted, token-id array) ] | [ (attention_mask values omitted, all 1s) ] | [ (labels values omitted, token-id array) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
overall i found the paper wellwritten the methods appear reasonable and the math appears correct i think the case made for the importancesignificance of the method could be improved but think the paper should be accepted regardless comments 1 i thought the introduction did a good job setting up the highlevel problem but did not really establish why a new method is needed i got to the end of the intro and wondered why i couldnt use any one of a dozen dro type methods to get the type of robustness described the experiments do a reasonable job of demonstrating the utility of the proposed method but i recommend adding something to the intro like methods have been proposed that promote robustness to distributional shift however these methods fail to capture xyz shifts because abc 2 i really liked the examples of the different types of shifts that the authors are interested in but thought the paper could do a better job arguing why these specific shifts are important or relevant perhaps swapping the red chair example for something more compelling might do the trick in particular it is worth giving a compelling reason why the drop in average performance might be worth an improvement in for example anomaly detection 3 i found the early parts of section 3 a bit confusing because it was unclear at that point in the paper where the majority and minority groups were coming from i recommend adding a few sentences to the beginning of that section like suppose that we knew the collection of nonsemantic features and thus new the relevant majority and minority groups this will not in general be the case and we will show how to derive such groups from the data but we first establish our method as if such groups were known 4 i found equation 10 very hard to follow first the notation has many problems superscripts from earlier in the section are changed to subscripts no domain is given for alpha it is not clear what it means to index alpha by e mu is not defined gamma is not defined second i would give a few sentences explaining what the pieces of this objective are doing and why it achieves the goal of splitting the data into relevant majority and minority groups minor comments 1 page 2 par 3 line 1 not clear what modeling bias refers to here or why it affects the dataset 2 equations 1 and 2 dont really seem necessary for the rest of the work in particular mathcalc isnt reference anywhere else in the paper i would just define hs and hn and move on 3 i found the notsim notation a bit confusing does it mean sampled from any distribution other than p based on the description it seems more like it means sampled from outside the support of p which matches the examples given in fig 1 4 equation 6 ell is not defined 5 equation 9 and surrounding text i would change reversekl to just kl since in this context this not an obvious forwardreverse direction 6 text under equation 9 it doesnt really make sense for a categorical distribution to be multimodal since there is no inherent ordering to values in the support unless there are literally two or more equivalent modes i would recommend changing unimodalmultimodal to peakedflat or lowentropyhighentropydocsepsummary the paper studies a setting where there are simple correlations with the target variables that are not however robust and more complex but robust features the simple correlation is usually such that in most cases the feature is descriptive of the label but at times it takes values that are part of a minority group that is not descriptive of any one class in 
particular systematicshift generalization is tested using the same spurious features that are present in training in all combinations except the usual pairing of spurious feature and class nonsystematic shifts are tested using novel spurious features anomaly detection is tested with unseen robust features neural networks trained with standard erm notoriously pick up on all features that correlate with the label and the paper compares several methods including irmv1 rex and groupdro to the proposed predictive group invariance the latter is shown to work better than the baselines even when using class conditioning on systematic shifts and anomaly detection and often nonsystematic shifts i found the paper to be mostly very well written with few exceptions and especially sections 1 and 2 very easy to read and understand the experiments setting is clear and not at all trivial my main questions and concern the description of pgi could be improved i would suggest adding an algorithm box that sums up what is described environments and partitions are the same thing for pgi how are environments chosen for the baseliness that need them with the same partition networks used for pgi if not it might be hard to disentangle the role of using the partition networks from the kl objective of pgi the part about partitioning is not detailed enough p13 we use a separate network for each object category what is an object category from page 3 i understood that there might be 2 categories easy and hard is this the case is it correct to think of the role of these partition predicting networks as a sort of clustering that focuses on the most easytofind features i havent seen any mention of early stopping in the experiments it sounds like the performance reported is the one at the end of all training epochs this might explain why for example erm on cocooncolours only achieves a 110 accuracy on systematic shift if this is correct why not using early stopping given how prone to overfitting these networks might be overall based on the results the proposed pgi seems to improve the accuracy over the baselines even though it seems to be more of an incremental improvement than a substantial and conceptual one i look forward to a constructive discussion with the authors and hope my questions and concerns will be clarified minor i think the blanket statement it has been reported that highly competitive performances can often be achieved with a baseline model on such domain generalisation benchmarks gulrajani lopezpaz 2020 similarly as in table 1 is a bit too vague and since it questions the validity of the results in all papers mentioned before it should be either properly justified or adjusted to make sure that what it says actually applies to all of them figure 3 is referenced in the text instead of figure 2 perhaps it was not referenced with ref docsepthis paper shows that group invariance methods across inferred partitions show better generalization in nonsystematic distributional shifts and anomaly detection settings it also suggests a new invariance penalty and empirically shows that it works better on three synthetic datasets viz colouredmnist cocooncolours and cocoonplaces the paper is written well and starts off by giving an intuition of why irmlike methods are important by presenting the results of a simple experiment on colouredmnist table 1 it then goes on to talk about nonsystematic generalization before introducing the proposed method the authors use reverse kl divergence between the group distributions as the 
penalty and use prior work to partition the datasets into groups they use the results look promising across datasets though it is slightly lower in the indistribution setting i am happy to see that they also talk extensively about hyperparameter selection especially in the case where they assume no access to validation sets with a distributional shift overall i like the work and would like to see it presented at the conference one minor point cite work the first time you introduce something not later on it can be a little confusing for the readers i wondered if i missed something for ex we find that a recently proposed method can be effective at discovering irmv1 etc docsepsummary this paper studies the behaviour of deep neural networks in situations where simple but irrelevant correlations exist between input and output and dominate more complex but relevant correlations the authors conduct experiments on synthetic datasets like coloured mnist and show that an invariance penalty helps the network focus on relevant correlations pros the paper studies neural network behaviour with respect to systemic biases that are likely faced by most neural networks in some form or the other to make the study tenable the authors make use of meaningful synthetic datasets and propose an intuitive regularization to overcome the systemic biases the analysis done in the paper is very methodical and the presentation is very clear the numerical simulations are comprehensive and convincing cons it would be nice to see how this would be applicable to real world datasets the paper is interesting even without it and i also appreciate that the authors are honest about it so i would not hold it against the authors but it would further strengthen the paper if some basic experiments are done on real world datasets for instance will one be able to find a partition on imagenet comments section 51 minimization is spelt incorrectly equation 67 i am not entirely sure what is happening with respect to the constraint on theta what does capital theta correspond to and if theta itself is the result of an optimisation argmin then why is there another optimisation on the same theta in the loss function in the text that appears before equation 3 it is mentioned that the predicted features ftheta will be matched for the two partitions but equation 7 matches the predicted output post softmax could you please clarify
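The reviews above describe the proposed method's penalty as a reverse KL divergence between the predictive distributions of inferred majority and minority groups, added on top of the usual ERM loss. As a rough, hedged illustration of that kind of objective (a minimal sketch under assumptions, not the authors' implementation; the two-group partition, the function names, and the weighting factor `lam` are made up here for illustration):

```python
import torch
import torch.nn.functional as F

def group_invariance_penalty(logits, group_ids, eps=1e-8):
    # Average predictive distribution of each inferred group (assumes both groups
    # are non-empty in the batch; this is an illustrative sketch only).
    probs = F.softmax(logits, dim=-1)            # (batch, num_classes)
    p_major = probs[group_ids == 0].mean(dim=0)  # "majority" partition
    p_minor = probs[group_ids == 1].mean(dim=0)  # "minority" partition
    # Reverse KL between the two group distributions: KL(p_minor || p_major).
    return torch.sum(p_minor * (torch.log(p_minor + eps) - torch.log(p_major + eps)))

def total_loss(logits, targets, group_ids, lam=1.0):
    # Standard ERM term plus the group-invariance penalty.
    return F.cross_entropy(logits, targets) + lam * group_invariance_penalty(logits, group_ids)
```

In words: if the model's predictions differ systematically between the two inferred groups, the penalty grows, which discourages relying on features that only predict the label within the majority group.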
### Summary: | all reviewers seem in favour of accepting this paper with the majority voting for marginally above acceptance threshold the authors have taken special heed of the suggestions and improved the clarity of the paper from examination of the reviews the paper achieves enough to warrant publication my recommendation is therefore to accept the manuscript | [
(input_ids: long sequence of integer token ids; omitted for readability)
] | [
(attention_mask: all-ones sequence; omitted for readability)
] | [
(labels: token id sequence; omitted for readability)
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper takes an interesting nonconvex optimization perspective on the continual learning problem more specifically the authors pose continual learning with episodic memory as a smooth nonconvex finite sum problem they then consider the requirements for a theoretical proof of convergence to a stationary point for previously learned tasks this results in the proposed nccl method that leverages these ideas to modulate learning rates for the current and previous tasks to prevent escape from the feasible region overall the strength of this paper is its theoretical analysis and i find the idea of connecting continual learning with the associated nonconvex optimization problem compelling i am not an expert in nonconvex optimization but my understanding is that the analysis itself is not that unique for the field rather what is novel is the interesting application of the ideas to the continual learning problem i find the theoretical aspect of this paper strong but still lean towards rejection in its current form as i am very skeptical that the idea is at all validated by the experiments this potentially suggests that the theory may lack relevance in these domains there are some comparisons to baselines and prior work that i found a bit questionable on the bottom of page 6 the authors state that existing gem based algorithms only focus on canceling the negative direction but actually maximizing transfer even when gradient dot products align was explored in 1 the authors also suggest in section 42 that despite worse empirical results the nccl approach is superior to gem because of its inefficient quadratic program computation however this was already addressed in agem 2 so it is not so clear that there is a significant computational advantage to nccl i would think that the authors should actually compare compute times inline with prior work i also am almost 100 sure that the comparison to reservoir sampling is incorrect if you look at results in 1 and 3 you see that reservoir sampling consistently performs right around gem and sometimes better than gem on exactly these same benchmarks the 10 number seems unfathomable to me and at the very least needs an explanation about how this could be true 1 learning to learn without forgetting by maximizing transfer and minimizing interference riemer et al iclr 2019 2 efficient lifelong learning with agem chaudhry et al iclr 2019 3 on tiny episodic memories in continual learning chaudhry et al 2019 this last point is related to my biggest overall concern which is that it is not clear that the learning rate weighting scheme proposed in this work actually helps in comparison to generic replay for example it would be a really important ablation to try the very same buffer setup but with no learning rate modulation my experience leads me to believe that the gap between the gem based approaches and nccl is likely larger than the gap between these approaches and vanilla replay as a result i am very skeptical that the learning rate modulation component adds value based on the current results additionally it would be very interesting to look deeper into how the model is working to understand its effect on learning for example the authors should detail patterns with the chosen modulated learning rates over time while i appreciate the theoretical analysis of this paper i think the experiment section is too short and leaves many important questions unexplored unfortunately i feel that i must support rejection of this paper in its current form as my doubts about the 
experiments leave me unsure that the approach works at all in practice after the rebuttal i really appreciate the author response and it is a shame that the revisions do not seem to be correctly uploaded unfortunately the responses to my comments rely heavily on references to the revision that i cannot see making it impossible for me to validate if my concerns were actually adequately addressed the other reviewers have mentioned some very valid concerns about the submitted draft as well as such i continue to lean towards rejection of the submitted paper as significant revisions are certainly needed docsepsummary of paper this paper analyses the convergence of episodic memorybased continual learning methods by looking at it as a nonconvex optimisation problem they analyse the convergence rates for the case where all memory from past tasks is stored and then consider the case where there is only a subset of past data leading to overfitting on the episodic memory they then introduce a method that scales the learning rates of the their update method with the goal of tightening the bound obtained in the convergence analysis finally experiments are shown on different benchmarks and the proposed method is compared to some competing baselines summary of review i am recommending rejecting this paper although the goal of the paper is commendable convergence analysis for nonconvex episodic memorybased continual learning i feel like there are many parts of the paper that can be improved see later in the review pros of paper 1 the paper attempts to analyse the convergence of continual learning methods theoretically especially section 31 this is very important to do so that we can understand the problem of nonconvex continual learning better this has not been attempted enough in the literature partly because this is a very difficult problem 2 the work appears to be wellpositioned with related work on convergence rates as far as i am aware 3 the paper builds nicely from introduction to preliminary work to theoretical results to experiments cons of paper questions for the authors 4 although the aim of the paper is great it appears to me as if the methods the paper mentions are not instances of the update that the paper analyses equation 6 specifically gem and ewc mentioned in the first paragraph of section 31 gem has a different optimisation technique quadratic programming algorithm and ewc does not store any episodic memory only stores previous model parameters 5 i am struggling to see the significance of section 32 overfitting to episodic memory it appears like the authors are just pointing out that there is a bias introduced by storing only a subset of past data without sufficiently commenting on the effects or significance of this bias 6 appendix a proof of theorem 1 is incomplete 7 something seems wrong to me with the bwt metric in section 41 a my own experience with finetune and ewc strongly suggests that both methods should have bwt0 this is because the methods first learn the task well and then forget it slowly over time and is fully expected from such algorithms however the authors report bwt0 b finetune on permutedmnist table 1 has an acc of 243 but a bwt of 1210 surely be definition bwtacc always equation 18 c a final point on bwt a bwt0 does not imply that catastrophic forgetting happens final paragraph page 7 although it does imply forgetting this is not necessarily catastrophic forgetting which is only when bwt is extremely negative for example the concept of graceful forgetting will still 
have bwt0 but is usually distinguished from catastrophic forgetting 8 can the authors comment on why the proposed method performs better with 1 epoch per task than with 5 epochs per task tables 1 vs 2 permutedmnist this result appears to indicate that despite the correction terms of the method the method is forgetting tasks as it trains for longer additional minor feedback 9 i would strongly recommend proofreading the paper or else asking a native english speaker to do so 10 figure 1 is a nice sketch visually but i did not see how it shows the benefitkey idea of nccl specifically which is about finding optimal learning rates there is no visualdiagramatic element of how those learning rates might be chosen alternatively put a similar figure could be used to describe eg gem update to review thanks to the authors for responding they did clear up point 5 above for me however i shall keep my score of 4 unfortunately i cannot see the new revision of the paper that the authors refer to meaning i cannot change my score docsepthis paper attempts to provide a convergence analysis for nonconvex continual learning with episodic memories and try to theoretically show the degradation of backward transfer caused by overfitting to memorized samples it further proposes an algorithm for learning rate scheduling in nonconvex continual learning based on these results the reason of the score of the paper is that the theoretical proof is wrong in my understanding and cannot support the main contribution claimed in this paper the main problems are as below the proof of the main theorems is questionable regarding the nonconvex assumption which is the most important contribution claimed in this paper regarding the inequality in eq5 in my understanding it is hold to be true only when f is a convex function 1 and the theorems are based on this inequality which cannot be hold for nonconvex case if this inequality is not true for nonconvex functions if im wrong authors please provide proof of how to get eq5 by lsmooth nonconvex functions moreover in the proof of theorem 1 eq19 appendix a the inequality of the last step cannot be hold unless the inner product of gradients delta f delat g is always positive which cannot be guaranteed otherwise there is no reason to develop gradientbased approaches in continual learning such as gem 2 or agem 3 so even if eq5 can hold for nonconvex case the theorem is still questionable therefore the main claim of this paper is highly suspicious to me if authors cannot clarify these issues this paper would be considered as with significant flaws despite the questions on the main theorem the assumption of the initial state of the model is quite strong as it assumes the initial values of parameters are close to the optimal values which is not very practical unless a pretrained model is applied so the significance of this paper is further limited as the theoretical part is incorrect i havent reviewed the experiments part of this paper if the authors can clarify all above main concerns im willing to make another round of review 1 nesterov yurii introductory lectures on convex programming volume i basic course lecture notes 34 1998 5 2 lopezpaz david and marcaurelio ranzato gradient episodic memory for continual learning advances in neural information processing systems 2017 3 chaudhry arslan et al efficient lifelong learning with agem arxiv preprint arxiv181200420 2018 feedback to authors response im aware of the nonconvex setting is valid but since the corrected proof of the theorem is not 
uploaded i will raise my score to 3 docsepin this paper the authors provide theoretical justifications for memorybased continual learning cl methods and provide a scaling learning rate method nccl to improve the practical performance the results look quite exciting there is quite scant theoretical paper for cl however after looking into the details of the paper i was confused by many places and would say the authors need to further improve their manuscript in order to qualify for the iclr standard 1 the theoretical analysis is not very impressive the theory just split out the catastrophic forgetting term c and demonstrated that performance degradation depends on c however where c comes from i know it is an additional term directly from mathematical derivation but whats the meaning and intuition is not clearly discussed also the theorem based on the unrealistic assumption et is unbiased assumption 2 which can never happen in memorybased cl methods the authors do mention approaches such as nccl without assumption 2 but no theory is provided probability section 32 is on theory without assumption 2 then please provide a complete theorem instead of just waving hands 2 moreover there are many flaws in the proof i just list a few of them here or correct me if i misunderstand in proof of theorem 1 second inequality of eq 19 why does the cross product term disappear ie why nabla f nabla g2 nabla f2 nabla g2 why ct ei tildect when taking an expectation over it ct is defined in eq 7 and there is no randomness over jt already with ej but ei tildect still has randomness over jt why eet2 is written as et2 we also have randomness in et over it see definition of et in proof of lemma 1 why enabla f oenabla g or how do we get the second equality how do we get the relation of enabla g2 obeta2delta sqrtt i see it is directly assumed in corollary 1 expected stationary of g be deltasqrtt but i think we should derive this instead of simply making an assumption actually f and g are equivalent and interchangeable if we assume g already converge does that mean f also assumed converge but if we directly apply results derived from f this will be circular reasoning so i am not sure the authors better make more discussions on this 3 for practical performance if we compare nccl 6852 accuracies in table i with gem 8950 agem 8910 gss 7730 or even ewc 6830 there is no performance improvement at all the authors further claim their methods are faster in computation then please also include a time comparison instead of just mentioning it otherwise it is hard to quantify the contribution of the new method overall speaking i am afraid that such work does not have sufficient theoretical or algorithmic contributions and i doubt the true value of designing a new method without any performance improvement however i still appreciate the motivation of the paper and will be more tolerant since there are quite scant theory papers for cl so i would be happy to adjust my rating if all my concerns were properly addressed if there is any misunderstanding please also let me know update thanks for the response however there is no updated revision in the revision history of this paper based on the flaws that i have previously pointed out it is impossible for me to validate if my concerns were actually adequately addressed without seeing the updated version i will keep my score unchanged
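Several of the concerns above revolve around whether the paper's eq. (5) and the subsequent bounds require convexity. For reference, the standard descent lemma that such analyses usually start from needs only L-smoothness, not convexity; whether this matches the paper's exact eq. (5) cannot be checked from the reviews alone, and the symbols f, L, and eta below are generic notation rather than the paper's:

```latex
% Descent lemma for an L-smooth, not necessarily convex, function f
% (i.e. \|\nabla f(x) - \nabla f(y)\| \le L\,\|x - y\| for all x, y):
\[
  f(y) \;\le\; f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{L}{2}\,\|y - x\|^{2}.
\]
% For a gradient step y = x - \eta\,\nabla f(x) this yields
\[
  f\bigl(x - \eta \nabla f(x)\bigr) \;\le\; f(x) - \eta\Bigl(1 - \tfrac{L\eta}{2}\Bigr)\,\|\nabla f(x)\|^{2},
\]
% which is the usual starting point for non-convex convergence-rate arguments.
```

The lemma itself does not settle whether the paper's overall argument goes through (for example, the cross terms questioned in the reviews above), but it does hold without convexity.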
### Summary: | this work proposes to analyse convergence of episodic memorybased continual learning methods by looking at this problem through the lens of nonconvex optimisation based on the analysis a method is proposed to scale learning rates such that the bounds on the convergence rate are improved pros i agree with the reviewers that this is an interesting and novel perspective on continual learning cons reviewers point out concerns/issues with the clarity of the manuscript with respect to several parts reviewers raise concerns with respect to the significance of the evaluation reviewers point out that the theoretical analysis itself is somewhat standard and not novel in itself and 2 reviewers raise concerns with respect to the analysis made unfortunately the authors seem to have missed the upload of the revised version the reviewers have nevertheless considered the rebuttal by the authors and the consensus is that this manuscript is not ready yet in its current form | [
(input_ids: long sequence of integer token ids; omitted for readability)
] | [
(attention_mask: all-ones sequence; omitted for readability)
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
7141,
285,
840,
1908,
253,
1083,
835,
627,
310,
760,
247,
8578,
273,
2469,
941,
4283,
281,
689,
31893,
327,
253,
6314,
23329,
3541,
597,
840,
9569,
247,
1332,
326,
11498,
253,
4715,
4142,
273,
253,
616,
5731,
1332,
342,
253,
4736,
273,
48594,
253,
3033,
2797,
275,
253,
14940,
1783,
4720,
4679,
403,
2011,
327,
1027,
49602,
285,
253,
4081,
1332,
310,
2429,
281,
690,
11771,
1666,
25379,
50276,
8774,
273,
2278,
50276,
74,
717,
46705,
33944,
436,
2929,
3738,
253,
4736,
273,
253,
2929,
310,
49638,
494,
14940,
1783,
323,
1327,
44181,
6314,
23329,
3541,
3169,
45120,
4715,
891,
1928,
751,
627,
403,
1142,
4243,
273,
253,
2929,
326,
476,
320,
5520,
923,
1996,
275,
253,
2278,
50276,
856,
84,
273,
2929,
50276,
18,
253,
2929,
9437,
281,
30648,
253,
14940,
273,
45120,
4715,
3082,
28055,
3340,
2593,
4562,
436,
310,
1077,
1774,
281,
513,
594,
326,
359,
476,
2096,
253,
1895,
273,
1327,
44181,
45120,
4715,
1805,
436,
556,
417,
644,
9919,
2217,
275,
253,
6239,
13730,
984,
436,
310,
247,
1077,
2834,
1895,
50276,
19,
253,
789,
4620,
281,
320,
973,
3321,
264,
342,
2905,
789,
327,
14940,
4142,
347,
2080,
347,
891,
717,
6600,
495,
253,
2929,
21168,
23395,
432,
10199,
281,
12611,
789,
281,
10527,
1543,
281,
4679,
50276,
5040,
273,
2929,
3533,
323,
253,
4477,
50276,
21,
3738,
253,
4388,
273,
253,
2929,
310,
1270,
352,
4620,
281,
479,
347,
604,
253,
3082,
253,
2929,
25957,
403,
417,
10872,
273,
253,
5731,
326,
253,
2929,
6260,
5150,
721,
5742,
16915,
285,
299,
38212,
5393,
275,
253,
806,
12494,
273,
2593,
4562,
16915,
556,
247,
1027,
5556,
5837,
5853,
21396,
10717,
5933,
285,
299,
38212,
1057,
417,
4657,
667,
6314,
23329,
3541,
760,
10111,
2045,
1566,
3602,
608,
891,
717,
15586,
281,
923,
253,
8453,
273,
2593,
4567,
689,
31893,
281,
6314,
23329,
3541,
352,
4620,
751,
253,
4477,
403,
816,
13458,
562,
326,
627,
310,
247,
8492,
5611,
407,
20073,
760,
247,
8578,
273,
2469,
941,
1293,
10481,
36738,
327,
253,
2538,
390,
8453,
273,
436,
8492,
721,
30762,
247,
4737,
273,
10012,
337,
310,
18464,
818,
1633,
3133,
3430,
281,
479,
342,
253,
270,
17118,
7982,
275,
2593,
7609,
247,
619,
1211,
2793,
342,
1442,
292,
2517,
285,
299,
38212,
7052,
5936,
326,
1097,
3082,
943,
452,
270,
17118,
17,
436,
310,
984,
253,
3082,
806,
3037,
253,
4836,
973,
285,
840,
7740,
352,
7808,
689,
673,
285,
310,
4751,
3264,
432,
824,
11333,
2299,
253,
4477,
1304,
270,
17118,
17,
270,
1442,
292,
2517,
327,
8143,
4525,
16192,
382,
2829,
337,
556,
271,
756,
273,
30188,
533,
247,
270,
17118,
273,
1249,
740,
13353,
320,
5426,
270,
17118,
3649,
1900,
5150,
1283,
260,
247,
2457,
1127,
327,
270,
17118,
247,
270,
17118,
17,
1057,
417,
16084,
326,
36256,
37264,
6569,
2457,
12494,
3239,
818,
3738,
352,
1057,
16084,
37264,
436,
310,
417,
7933,
36256,
37264,
534,
310,
760,
672,
270,
17118,
310,
6685,
4016,
323,
1650,
253,
4473,
273,
45822,
37264,
588,
1335,
452,
270,
17118,
17,
533,
310,
3798,
15622,
432,
36256,
37264,
854,
476,
253,
4477,
4385,
327,
2139,
253,
4081,
1332,
17923,
1805,
342,
337,
23657,
591,
4836,
685,
342,
608,
44540,
591,
4836,
7180,
337,
4632,
374,
8143,
4525,
16192,
382,
436,
906,
4620,
281,
5224,
326,
5747,
253,
10618,
2426,
273,
253,
1332,
253,
1332,
310,
37264,
8892,
347,
352,
18784,
323,
3356,
50276,
38092,
5884,
8680,
50276,
26,
891,
651,
7052,
5583,
4737,
24042,
253,
2929,
390,
2010,
7004,
247,
7925,
48087,
14925,
281,
513,
594,
884,
4677,
337,
310,
247,
5322,
23211,
25910,
533,
891,
858,
417,
923,
849,
352,
2722,
253,
5649,
2364,
2934,
273,
295,
68,
498,
5742,
534,
310,
670,
4560,
8654,
4715,
4142,
627,
310,
642,
5304,
5168,
12068,
1420,
3284,
273,
849,
1110,
4715,
4142,
1537,
320,
6777,
31506,
1691,
247,
2074,
4677,
812,
320,
908,
281,
6266,
24088,
16915,
50276,
11183,
281,
2278,
50276,
35501,
281,
253,
4477,
323,
19392,
597,
858,
2590,
598,
1127,
608,
1840,
323,
479,
2299,
891,
3091,
1978,
619,
4868,
273,
577,
19235,
891,
2550,
923,
253,
747,
18520,
273,
253,
2929,
326,
253,
4477,
3730,
281,
4495,
891,
2550,
1818,
619,
4868,
5474,
33032,
2520,
2929,
9437,
281,
2085,
247,
14940,
1783,
323,
1327,
44181,
45120,
4715,
342,
6314,
23329,
12959,
285,
1611,
281,
28055,
921,
253,
11961,
273,
19265,
3700,
4269,
407,
50276,
1189,
31893,
281,
16407,
1025,
3530,
50276,
262,
2007,
29328,
271,
5933,
323,
4715,
2281,
27387,
275,
50276,
4160,
44181,
45120,
4715,
1754,
327,
841,
1543,
50276,
783,
1921,
273,
253,
4868,
273,
253,
2929,
310,
326,
253,
10527,
4737,
310,
3430,
275,
619,
4685,
285,
2550,
1329,
253,
2022,
7680,
7558,
275,
436,
2929,
50276,
783,
2022,
3237,
403,
347,
2708,
50274,
783,
4737,
273,
253,
2022,
39383,
310,
30455,
5001,
253,
1327,
44181,
9376,
534,
310,
253,
954,
1774,
7680,
7558,
275,
436,
2929,
50276,
1747,
13218,
253,
11370,
275,
50276,
2574,
22,
50276,
249,
619,
4685,
352,
310,
2186,
281,
320,
2032,
760,
672,
269,
310,
247,
17133,
1159,
337,
50276,
395,
253,
39383,
403,
1754,
327,
436,
11370,
534,
2550,
320,
2186,
50276,
1542,
1327,
44181,
1083,
604,
436,
11370,
310,
417,
2032,
323,
1327,
44181,
3470,
50276,
338,
516,
3430,
4477,
4496,
2085,
4737,
273,
849,
281,
755,
16186,
22,
407,
298,
34006,
1327,
44181,
3470,
50274,
3062,
1189,
275,
253,
4737,
273,
10012,
337,
16186,
746,
50276,
50237,
247,
253,
11370,
273,
253,
1390,
3213,
2550,
320,
2186,
5734,
253,
6703,
1885,
273,
27935,
50276,
3005,
269,
50276,
7555,
255,
305,
50276,
261,
1900,
2762,
534,
2550,
320,
16293,
5010,
627,
310,
642,
1921,
281,
1287,
11786,
3169,
7274,
275,
45120,
4715,
824,
347,
50276,
25460,
374,
390,
639,
358,
495,
50276,
601,
1014,
604,
16186,
22,
476,
2186,
323,
1327,
44181,
1083,
253,
10012,
310,
1335,
30455,
3103,
253,
2022,
1750,
273,
436,
2929,
310,
4122,
20634,
281,
479,
50276,
338,
4477,
2550,
19148,
841,
3374,
436,
2929,
651,
320,
2783,
347,
342,
1534,
32138,
50275,
3229,
3784,
253,
3533,
327,
253,
2022,
10012,
253,
9376,
273,
253,
3302,
1375,
273,
253,
1566,
310,
3240,
2266,
347,
352,
19584,
253,
3302,
2193,
273,
3602,
403,
2810,
281,
253,
8654,
2193,
534,
310,
417,
1077,
8542,
5734,
247,
3215,
11273,
1566,
310,
3732,
50276,
601,
253,
50276,
9188,
40348,
273,
436,
2929,
310,
2007,
3710,
50274,
284,
253,
10527,
629,
310,
13583,
891,
419,
2254,
9814,
253,
4679,
629,
273,
436,
2929,
604,
253,
4477,
476,
19148,
512,
1840,
2022,
7350,
516,
7378,
281,
1056,
1529,
3790,
273,
2278,
50275,
18,
295,
9358,
729,
340,
321,
2886,
47649,
29608,
327,
17133,
10717,
4644,
891,
5044,
2282,
22077,
7211,
5910,
8065,
608,
374,
298,
27217,
81,
1370,
34843,
301,
285,
2304,
6357,
459,
28482,
6337,
91,
4611,
11786,
6314,
23329,
3541,
323,
45120,
4715,
16424,
275,
11454,
1491,
5162,
2718,
4240,
495,
448,
5353,
73,
610,
549,
3433,
266,
1162,
355,
5919,
36536,
4715,
342,
639,
358,
549,
32693,
638,
3845,
549,
32693,
1093,
805,
5525,
938,
4765,
50276,
44333,
281,
4477,
2380,
50276,
303,
6600,
273,
253,
1327,
44181,
4758,
310,
3588,
533,
1580,
253,
15045,
4737,
273,
253,
10012,
310,
417,
28228,
891,
588,
7164,
619,
4868,
281,
495,
50273,
7152,
339,
9852,
436,
2929,
253,
4477,
2085,
10527,
816,
6787,
323,
3541,
3169,
45120,
4715,
502,
3082,
285,
2085,
247,
13642,
4715,
2281,
1332,
295,
68,
498,
281,
3157,
253,
8542,
3045,
253,
1543,
1007,
3240,
12302,
627,
310,
3240,
43721,
10527,
2929,
323,
502,
2299,
846,
2819,
715,
253,
4278,
273,
253,
2929,
891,
369,
13477,
407,
1142,
5053,
285,
651,
1333,
253,
4477,
878,
281,
2007,
3157,
616,
7714,
275,
1340,
281,
19478,
323,
253,
17857,
32888,
2629,
50276,
18,
253,
10527,
1783,
310,
417,
1077,
13943,
253,
3762,
816,
8085,
562,
253,
36256,
37264,
1307,
260,
285,
5183,
326,
3045,
11961,
7024,
327,
260,
2299,
835,
260,
3249,
432,
891,
871,
352,
310,
271,
3081,
1307,
3587,
432,
15965,
28529,
533,
47515,
253,
4495,
285,
30328,
310,
417,
4518,
5469,
671,
253,
10012,
1754,
327,
253,
46521,
9376,
1162,
310,
38663,
9376,
374,
534,
476,
1620,
5108,
275,
3541,
3169,
502,
3082,
253,
4477,
513,
3748,
7274,
824,
347,
295,
68,
498,
1293,
9376,
374,
533,
642,
3762,
310,
2530,
5912,
2593,
4567,
310,
327,
3762,
1293,
9376,
374,
840,
4496,
2085,
247,
3426,
10012,
3185,
273,
816,
34475,
3564,
50276,
19,
25761,
627,
403,
1142,
32138,
275,
253,
4737,
891,
816,
1618,
247,
1643,
273,
731,
1060,
390,
3451,
479,
604,
891,
23452,
1676,
50275,
249,
4737,
273,
10012,
337,
1273,
11370,
273,
16186,
655,
2139,
1057,
253,
2831,
1885,
1307,
15529,
26332,
2139,
295,
6348,
269,
50276,
6526,
305,
19,
50276,
6526,
269,
19,
50276,
6526,
305,
19,
50276,
22309,
45830,
50276,
27116,
246,
6227,
291,
672,
3192,
271,
15355,
689,
352,
45830,
310,
2931,
275,
16186,
818,
285,
627,
310,
642,
3632,
1255,
689,
480,
85,
2168,
342,
22317,
533,
22616,
246,
6227,
291,
1335,
556,
3632,
1255,
689,
480,
85,
50276,
22309,
299,
292,
19,
310,
3542,
347,
1162,
19,
359,
671,
452,
3632,
1255,
275,
1162,
689,
352,
923,
5426,
273,
1162,
50276,
249,
4737,
273,
18057,
337,
2139,
546,
6348,
269,
50276,
80,
257,
6348,
305,
390,
849,
513,
359,
755,
253,
1273,
13919,
50276,
5430,
513,
359,
755,
253,
5886,
273,
546,
6348,
305,
19,
50276,
706,
1464,
19,
3005,
50276,
2609,
85,
891,
923,
352,
310,
3587,
8025,
275,
40460,
337,
3264,
17429,
273,
305,
320,
1448,
85,
284,
2274,
85,
533,
891,
1158,
359,
943,
15313,
436,
3185,
273,
3365,
2403,
271,
9376,
2686,
269,
285,
305,
403,
6425,
285,
28961,
494,
604,
359,
5467,
305,
2168,
29623,
1057,
326,
1599,
269,
671,
8025,
29623,
533,
604,
359,
3587,
4647,
1543,
6012,
432,
269,
436,
588,
320,
13765,
14720,
594,
891,
717,
417,
2119,
253,
4477,
1805,
1056,
625,
11985,
327,
436,
50276,
20,
323,
8542,
3045,
604,
359,
7277,
295,
68,
498,
721,
38819,
3933,
19103,
275,
2829,
891,
342,
16915,
11289,
1235,
639,
358,
11289,
740,
305,
859,
10484,
1229,
390,
1014,
299,
38212,
9934,
1229,
627,
310,
642,
3045,
7756,
387,
512,
253,
4477,
2007,
1750,
616,
3082,
403,
7938,
275,
13782,
840,
4496,
671,
2486,
247,
673,
5301,
3185,
273,
816,
29570,
352,
5010,
352,
310,
1892,
281,
22048,
253,
7680,
273,
253,
747,
1332,
50276,
1189,
455,
8288,
891,
717,
9202,
326,
824,
789,
1057,
417,
452,
4209,
10527,
390,
5933,
280,
9021,
285,
891,
5545,
253,
2032,
1318,
273,
20462,
247,
747,
1332,
1293,
667,
3045,
7756,
2299,
891,
1335,
11435,
253,
16038,
273,
253,
2929,
285,
588,
320,
625,
41842,
1580,
627,
403,
3240,
43721,
3762,
9380,
323,
502,
594,
891,
651,
320,
5211,
281,
4575,
619,
13716,
604,
512,
619,
7350,
497,
6283,
9713,
604,
627,
310,
667,
40663,
4496,
671,
1339,
479,
871,
50276,
11183,
6701,
323,
253,
2380,
2299,
627,
310,
642,
9300,
18520,
275,
253,
18520,
2892,
273,
436,
2929,
1754,
327,
253,
32138,
326,
891,
452,
3786,
8042,
562,
352,
310,
7479,
323,
479,
281,
17813,
604,
619,
7350,
497,
2686,
18212,
9713,
1293,
6523,
253,
9300,
2715,
891,
588,
1978,
619,
4868,
19965,
50276,
187,
187,
4118,
18435,
27,
2520,
789,
29328,
281,
30648,
14940,
273,
6314,
23329,
3541,
3169,
45120,
4715,
3082,
407,
2819,
387,
436,
1895,
949,
253,
298,
1215,
273,
1327,
44181,
5556,
5837,
1754,
327,
253,
1783,
247,
1332,
310,
4081,
281,
4311,
4715,
4142,
824,
326,
253,
14493,
327,
253,
14940,
2281,
403,
5520,
50276,
856,
84,
50276,
74,
5194,
342,
253,
30628,
326,
436,
310,
271,
4722,
285,
4460,
8668,
327,
45120,
4715,
50276,
5040,
50276,
15337,
398,
1127,
562,
7350,
22402,
342,
253,
19843,
273,
253,
7714,
342,
1675,
281,
2067,
4243,
50275,
15337,
398,
7164,
7350,
342,
1675,
281,
253,
8453,
273,
253,
7103,
50275,
15337,
398,
1127,
562,
326,
253,
10527,
1783,
3139,
310,
8489,
2629,
285,
417,
4460,
275,
3139,
285,
374,
30628,
7164,
7350,
342,
1675,
281,
253,
1783,
1160,
50276,
328,
9520,
253,
4477,
1646,
281,
452,
9829,
253,
12119,
273,
253,
17265,
2715,
253,
30628,
452,
17837,
2783,
253,
30080,
22559,
407,
253,
4477,
285,
253,
13969,
310,
326,
436,
7714,
310,
417,
4704,
2568,
275,
697,
1655,
830,
209
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper provides theory on paclearnability of outofdistribution ood detection ood detection is classification task but test data may come from unknown classes if test data come from classes known during training we want to classify them into those classes but otherwise we need to detect they belong to unknown classes the authors provide a series of theorems about conditions for ood detection in several interesting setups their results imply that we should not hope for finding an ood detection algorithm that works in general cases but we can still design algorithms for special cases strengths the paper provides rigorous theory on an important machine learning task the paper is excellentlywritten and easy to follow despite its technical content although all the proofs are in the supplemental material the scenarios that the authors consider are not too technical but highly relevant to practical ood detection methods hence it gives useful insights for practitioners as well weaknesses most results are negative ones showing impossibility of ood detection in general cases and the paper does not provide concrete algorithms the theory only handles a few special combinations of distributions and hypothesis spaces although i do not consider this as a very strong limitation because they cover many common practical situations docsepthe outofdistribution detection problem is defined as follows after training on an id joint distribution dx iy i with random variables from mathcalx and labels in mathcaly we need to learn a classifier which can detect a test sample as ood if the sample is drawn from outside of dx iy i while predicting the correct label if the test sample is drawn from id distribution this paper mainly answers the agnostic pac learnability of outofdistribution detection in different scenarios which is known as an open problem in outofdistribution learning theory this paper firstly defines the basic concepts of agnostic pac learnability of ood detection which are natural extensions of agnostic pac learnability of supervised learning then considering the imbalance issue of ood detection the author proposes the priorunknown spaces and indicates that researchers should focus on agnostic pac learnability of ood detection in the priorunknown spaces by discovering a necessary condition condition 1 the author shows that the condition cannot hold in the total space and separate space based on this observation the paper proves that in most general setting total space and separate space ood detection is not agnostic pac learnable next the author proves the necessary and sufficient conditions to show that the separate space can be learnable if and only if the hypothesis space contains almost all classifiers while the paper proves that in the finiteiddistribution space condition 3 is the necessary and sufficient condition for the learnability of ood detection the paper also proves that in the realizability assumption case ood detection is learnable in densitybased space lastly the author considers ood detection in some practical hypothesis spacefcnnbased and scorebased the paper shows that in the separate space ood is learnable in fcnnbased spaces or scorebased spaces iff the feature space is finite in theorem 11 the paper shows that condition 1 condition 3 and realizability assumption and learnability are equivalent in theorem 12 the author also reveals that overlap will lead to the failure of ood detection this paper is important to understand when and how ood can work in real applications as this 
also gives insight and guidance to ood detection algorithm designing strengths 1 the issue is definitely relevant to the neurips as well as icml alt and colt when ood detection can be learnable is an open issue in ood learning due to missing necessary information from the ood data the learnability of ood detection is very difficult despite plenty of applied work there is still few theory to be established for this issue to address this issue it requires the author to dig and discovery unknown necessary conditions from scratch this paper does make an effort to address this problem and make great progress 2 this paper is sound i am interested in this topic but the paper is long so i spend several days to check the proofs carefully all of the results in this paper are supported by proofs from what i have checked all proofs are correct 3 the paper answers negatively and positively the question of agnostic pac learnability of ood and introduces sufficient assumptions to recover it such as assumption 1 these assumptions are practical and mild and can be satisfied by many practical cases for example fcnns cnns and kernel space therefore the theory can be tightly connected with practical applications 4 plenty applied work has been proposed to address this ood but theoretical works discussing when ood detection work is lacking the paper theoretical shows when ood can work in practical cases i think the contribution are significantly important and this work can give a good guidance for the development of ood detection this paper has the potential to achieve a long term impact to ood learning field 5 the paper is written well enough to understand weaknesses 1 the appendix is long and the proofs are complicated although i have check almost all important proofs and believe they are correct i still spend three days to check them it is better for the author to provide proof sketch and intuitions for important theorems 2 it seems that the description of theorem 4 in main text is slightly different from the description of theorem 4 in appendix i have checked it and found that the description theorem 4 in appendix is more rigorous although you have explained why they are different because of the space limitation in appendix g2 i still suggest that the author should use the description of theorem 4 in appendix to replace theorem 4 in main text because the description in appendix is correct 3 typosgrammar 1 in line 305 k should be lambda 2 in line 340 dxyy in should be dxiyi 3 in line 171 dxi should be dxiyi 4 after checking your proof i think condition 2 can be removed from theorems 7 and 10 although condition 2 is weak and meaningful i still think it is better to remove condition 2 the idea about how to remove condition 2 can be motivate from the proof of theorem 9 the second part the paper focuses on theory for ood detection and gives the first theoretical support to understand when ood detection can work there is no any potential negative social impact docsepthis paper explores the theoretical foundation of learnability of outofdistribution detection based on the pac learning theory the paper proved several impossibility theorems for the learnability of ood detection under some scenarios and finds some conditions that ood detection is paclearnable also the paper demonstrate the theory in real practice using fcnn and ood scores as examples recently there are loads of papers proposed empirical methods for ood detection but the theory is rarely exploredthis paper is the first to investigate the theory of ood 
detection so thoroughly which is meaningful to this field strengths the paper is clear and wellwritten and the proofs are generally correct this paper is one of the few theoretical works focusing on ood detection which plays a significant role in this field the theory is intuitive and have some practical impacts it can somewhat guide the design of ood detection algorithms weakness some notations and expressions can be refined in section 2 for example s or dxyn in eq2 can be explained minor some typos in section 2 definition 1 if there exist an algorithm if there exists an algorithm some experiments can be added to show the correctness of the theorems the practical impacts may not be large enough yes docseprecently reliable ai plays important role in designing an intelligent machine learning system how to let ai system tell do not know is critical for reliable ai systems which is the focus of this paper in this paper the authors consider a practical scenario where outofdistribution data the system should not know is unseen during the training process in this scenario the authors want to investigate if the ood detection is learnable the theoretical part is easy to follow i find that the theoretical contributions are completed and interesting at first this paper shows that ood detection is not learnable in the most general case which does make sense due to the unavailability of ood data then this paper points out a necessary condition sometimes as a necessary and sufficient condition of the learnability of ood detection which directly induces a lot of necessary and sufficient conditions of learnability of ood detection in my opinion this is a significant contribution to the field finding necessary and sufficient conditions is always a core and the most important part when studying a problem from the practical part several theorems are considered using networks or finite indistribution domains making the whole paper also fit the taste of practitioners in many practical scenarios we cannot expect ood data is the ones we have already seen which is exactly the problem this paper studies besides the theorem regarding finite id distributions is also practical if i understand correctly in this practical scenario this paper gives a better result which is very interesting to me and significant to the field we often only have finite id distributions in practice pros 1 this paper is the first to characterize the learnability of ood detection which makes a significant contribution to the field there are many ood detection papers targeting the problem this paper considers the problem is very difficult yet very important in practice previously no theoretical works are proposed for this problem in this paper a completed theory is proposed for this problem including when ood detection will fail and when ood detection will succeed a lot of necessary and sufficient conditions of learnability of ood detection are exciting to this field 2 for practitioners this paper relieves some big concerns regarding existing ood detection methods before this work one could intuitively think that ood detection is not learnable which is true in the most general case yet our common datasets are not such general however this paper gives a theoretical boundary between learnability and unlearnability of ood detection by proving some necessary and sufficient conditions thus we can know on what kind of datasets ood detection is learnable this contribution is significant and meaningful 3 fig 1 is very helpful in understanding the key 
necessary condition of ood detection which seems that it can motivate a bunch of papers in this research direction 4 i can see that there are three research topics regarding that let ai say dont know 1 classification with reject option 2 pq learning and 3 ood detection the first two have already had some theories but the last one does not have this paper fills up this gap making ood detection method which might be more practical than the other two possible in theory 5 although the proofs of this paper are not easy to follow the logic and organizations of proofs are clear i have read most proofs and have not found unrecoverable errors for important results the proofs are soundness cons 1 i have read some papers regarding pq learning and feel that pq learning is totally different from ood detection pq learning focuses on scenarios where ood data are somehow available yet ood detection focuses on the opposite scenarios however it is better to demonstrate their difference deeply does pq learning have limitations when meeting different ood data in the future i am interested to see some discussions regarding this part 2 similar to pq learning classification with reject option could be deeply compared to ood detection instead of just comparing both using plain words i know they are very different and ood detection theory is more difficult but giving more detailed comparation is better for this paper 3 i have some questions regarding figure 1 which i hope that the authors can confirm with me in my opinion the solid line is the groundtruth line do we expect that the estimated lines dash lines get closer to the solid line if so when overlap exists why is the solid line not straight can you bring me to the specific part regarding this it seems that the solid line will be straight if there are no overlaps which makes ood detection learnable is that correct 4 more explanation like figure 1 could be added for understanding the theorems better brief proofs might be also useful 5 in line 26 there are too many separate citations in my opinion it is not necessary 6 line 148 should not be a new paragraph 7 the densitybased space is very important and interesting especially the theorem 11 is one of the spotlights can you give more explanations or applications regarding densitybased space theorems 9 and 11 8 the mathematic expression in definition 1 about pac learnability is different with the normal expression of pac learnability although line 118 has told us that they are equivalent and i also realize that they are equivalent by paper 2130 exercise 45 in 21 can prove it the paper will be improved and more clear if a brief proof for the equivalent descriptions is given in the final version it is a pure theoretical paper so i think there is no negative social impacts
### Summary: | this paper studies generalization and learnability questions in the realm of outofdistribution ood detection specifically it applies pac learning to the theory of ood detection the contributions include new conceptual definitions of agnostic pac learnability of ood detection then the authors argue for studying priorunknown spaces under certain necessary conditions this leads to a number of novel results both in theory and in terms of possible practical impact eg when ood detection will succeed vs fail the reviewers found the paper sound insightful clearlywritten and novel this paper benefits the community because it is one of the few theoretical studies of ood detection for the final version the reviewers have many comments regarding definitions terminology and some of the technical details i encourage the authors to incorporate as much of this feedback as possible to make the paper easier to read for future audiences for example please add the full proof of how eq 2 relates to paclearnability add and clarify the realizability assumption in the revision use the description of theorem 4 in appendix g2 to replace theorem 4 in main text the authors should also provide proof sketches for the main results either in the main paper or the appendix this paper contains many theoretical results as well as ways to unpack them in the context of more practical scenarios all of this would benefit from clear exposition there are also a handful of typos to fix in the notationequations and in the exposition given the large number of small questionsissues it is important to address these in the final version of the paper the reviewers all vote positively toward acceptance of this paper and therefore i also recommend acceptance | [
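As a side illustration of the score-based detection rule that the reviews in this row keep referring to (score-based hypothesis spaces, thresholding, and the overlap case where detection must fail), here is a minimal sketch. The scoring function, classifier, threshold, and class names are hypothetical placeholders chosen for the example, not taken from the reviewed paper.

```python
import math

def detect_and_classify(x, score_fn, classifier, threshold):
    # Score-based OOD rule (illustrative sketch, hypothetical names):
    # a low confidence score means "reject as out-of-distribution",
    # otherwise return the predicted in-distribution label.
    s = score_fn(x)
    if s < threshold:
        return "OOD"          # unknown class: abstain from the known ID labels
    return classifier(x)      # known class: ordinary classification

# Toy usage: 1-D inputs, two known classes {0, 1}, ID data concentrated near 0.
def toy_score(x):
    return math.exp(-abs(x))  # stand-in confidence, high only near the ID region

def toy_classifier(x):
    return int(x > 0)

for x in (0.3, -0.8, 7.5):
    print(x, detect_and_classify(x, toy_score, toy_classifier, threshold=0.05))
```

The learnability questions discussed above concern when a rule of this form with low risk is guaranteed to exist in a given hypothesis space; the overlap situation mentioned in the summary is exactly the case where no threshold can separate ID from OOD inputs.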
[numeric columns for this row omitted: a long integer token-id array, an all-ones array (apparently an attention mask), and a second integer array, cut off at the end; like the block above, they only duplicate the text fields in encoded form]
7681,
4278,
891,
11907,
253,
4477,
281,
19071,
347,
1199,
273,
436,
8680,
347,
1896,
281,
1056,
253,
2929,
6927,
281,
1239,
323,
2852,
23886,
323,
1650,
4496,
50275,
1911,
253,
2120,
4737,
273,
849,
16186,
374,
7033,
281,
268,
6929,
1596,
1430,
50276,
1911,
285,
19148,
253,
42924,
1430,
9376,
275,
253,
18520,
50276,
2327,
253,
5740,
273,
10012,
577,
275,
30762,
305,
19,
281,
8171,
10012,
577,
275,
2022,
2505,
50276,
783,
4477,
943,
671,
2085,
4737,
46159,
323,
253,
2022,
1543,
2057,
275,
253,
2022,
2929,
390,
253,
30762,
436,
2929,
4428,
1142,
10527,
1543,
347,
973,
347,
4088,
281,
45737,
731,
275,
253,
3634,
273,
625,
8542,
15216,
512,
273,
436,
651,
5649,
432,
2590,
47284,
627,
403,
671,
247,
17167,
273,
963,
993,
281,
4993,
275,
253,
14951,
2655,
569,
285,
275,
253,
47284,
1677,
253,
1781,
1180,
273,
1355,
3533,
22402,
352,
310,
1774,
281,
2953,
841,
275,
253,
2457,
2715,
273,
253,
2929,
50276,
783,
30628,
512,
6273,
14962,
2584,
14924,
273,
436,
2929,
285,
3103,
891,
671,
5583,
14924
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper proposes an alternative to softmax attention using the sinc function the authors were wellmotivated using the sinc function from the fourier integral estimator and provided theoretical support for approximation error they have done experiments on two datasets showing improvement over softmax attention strengths 1 the paper is a pleasant read and clear to understand 2 the established connection between nonparametric kernel density estimator and selfattention 3 the authors have provided wellmotivated intuition for their proposed approach 4 related work has been moderately covered 5 the evaluation is convincing to show the benefit of sincbased attention weaknesses 1 the authors only experimented with one choice of phi it would be great to see what other suitable candidate of phi is possible 2 how do they determine the value of r is it datasetspecific 3 there are no quantitative results on the runtime of the proposed attention mechanism authors have not adequately commented on the known limitations docsepthe authors demonstrate the fourierformer a new class of transformers in which the novel generalized fourier integral kernels replace the dotproduct kernels the fourierformer can capture correlations between query features and key selfattention vectors the authors empirically corroborate the advantages of fourierformers over the baseline transformers in various practical applications including language modeling and image classification strengths the ideas that the authors put forward are novel and the mathematical arguments are complete and ingenious weaknesses the experiments in this paper are insufficient and therefore not convincing enough to demonstrate the effectiveness of the fourierformer the experiment only involves two basic tasks based on wikitext103 and imagenet although the authors have given detailed proof mathematically due to the poor interpretability of the transformer itself i still need to see more experimental results to agree with their point of view the current experimental results are insufficient and not persuasive docsepin this paper the authors provide a new perspective to interpret the selfattention mechanism in transformers in particular with the assumption that the query and key vectors are normalized the selfattention mechanism coincides with the wellknown nonparametric kernel regression with kernel density estimation motivated by this the authors instead use the generalized fourier integral theorem to build more powerful estimators for capturing the interaction between features in different dimensions experiments on some benchmarks are conducted strengths the interpretation of seeing the selfattention mechanism as using the isotropic gaussian kernels for kernel density estimation and nonparametric regression estimation seems to be novel which provides a new perspective to the community to understand the behavior of selfattention the motivation seems to be reasonable to use the generalized fourier integral theorem to capture the feature interaction instead of using the multivariate gaussian kernels with proper covariance matrices the theoretical analysis is thorough including approximation error of the generalized fourier density estimator theorem 1 and the generalized fourier nonparametric regression estimator theorem 2 weaknesses regarding the background the authors should consider adding a preliminary section to introduce the background knowledge on the nonparametric kernel regression kernel density estimation and the generalized fourier integral 
theorem which could help the readers easily follow the derivation of section 2 and understand the motivation to use the fourier integral theorem as a guide to developing a new selfattention mechanism regarding the experimental evaluation the issues are threefold 1 since the authors provide an analysis of the approximation error between estimators and true functions theorem 1 and 2 it is informative to provide an empirical evaluation of these quantities on real data as further verification 2 the experiments should be more comprehensive and general for both the language modeling task and image classification task the model size is limited and the baselines are restrictive 3 since the fourierformer need customized operators for implementation the authors should also provide the memorytime cost profiling compared to popular transformer architectures based on these issues the efficiency and effectiveness of the fourierformer are doubtful after rebuttal thank authors for the detailed response most of my concerns have been addressed i have updated my scores to 6 no negative societal impact docsepthis paper proposes the fourierformer in which the dotproduct kernels are replaced by the generalized fourier integral kernels unlike the dotproduct kernels where we need to choose a good covariance matrix to capture the dependency of the features of data the generalized fourier integral kernels can automatically capture such dependency and remove the need to tune the covariance matrix this paper theoretically prove that the proposed fourier integral kernels can efficiently approximate key and query distributions and verify this point through experiments on two transformerbased tasks 1this paper introduces a new angle to interpret transformer and its key module this work provides a nonparametric regression interpretation to study selfattention in transformers and formulate selfattention from the viewpoint of kernel regression 2this work adopts the generalized fourier integral estimators to replace the traditional dotproduct selfattention and provide theoretical guarantees for the estimator 3overall the paper is well organized and technically sound the experimental results on multiple transformerbased tasks verify the efficiency of the proposed fourier former weaknesses 1the derivation process and the presentation need to be improved some important symbol annotations or explanation is missing during the algorithm description which make readers hard to follow the derivation process for example in the equation 9 some important symbol annotations are missing eg s r it is difficult for readers to catch up the derivation and the derivation of pk is crucial to the following interpretation 2some pitfalls in the paper a in line 100 are iid samples from b in equation 6 one of psi is written as phi cline 185 the c in text are in wrong format the paper didnt address the limitation and potential negative societal impact of the work
### Summary: | overall the reviews about this paper are very positive the authors spent great effort engaging in discussions and improving the paper with clarifications and additional experiments we recommend accepting the paper | [
... (token ID sequence omitted for readability) ...
] | [
... (attention mask omitted; every value in this sequence is 1) ...
] | [
... (token ID sequence omitted for readability) ...
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the paper details a way of investigating the space of preimages that lead to a particular output from a relu layer with the goal of using the inverted representations as a way to understand the deep neural networks three experiments are proposed where the authors claim that the computed preimages help interpret the network decision making overall the paper is interesting however i am not certain of the novelty as some related work is not discussed additionally although the practical application of the method is interesting the clarity could be improved for the last experiment positives understanding the invariances of neural networks can potentially lead to more interpretable models and one way to investigate this is by looking at the preimages for a network the paper is a nice mix of theoretical results which lead to practical applications questions and concerns the authors state that maxpool can be rewritten in terms of a linear component and a relu but this is non obvious if this is true a mathematical formulation should be explicitly included in the paper the paper is missing some potentially related references previous work has investigated how multiple stimuli can get mapped onto approximately the same point in recognition networks by inverting the representations via iterative gradient descent mahendran vedaldi 2015 and recent work including invarianced based adversarial examples in jacobsen et al 2019 or model metamers in feather et al 2019 how does the proposed preimage computation help improve model interpretability beyond this previous work especially given the authors statement that the method is intractable for large networks the paper does not discuss invertible networks which have a bijective mapping between the input and output ie invertible residual networks in behrmann et al 2019 discussing this work seems relevant if the goal is to make models such that one can start with hypothetical outputs and understand the inputs that lead to them the final example of using this method in practice for acas systems is interesting but it is difficult to follow what success would mean for this experiment minor points the following sentence on page 3 seems to be missing something preimages are most insightful and useful when the inputs and outputs have definite interpretation application areas where the need for massive networks is less there it a typo in the last sentence of page 3 bubut docsepthere are many issues in the paper that can be improved the title is not appropriate this work does not address safety applications it is worth noting that the word safety is not defined and not used in the main body of the paper it is difficult to follow the presentation of the paper since mainly the applications are presented and then some contributions given in the same presentation as the abstract a major issue it that the paper is missing some important theoretical analysis of particular interest is the existence of the preimages because not all outputs have inputs moreover the uniqueness of the solution needs to be studied these properties should depend on the used nonlinearities and architecture of the neural network there are many spelling and grammatical errors such as suprising have been been coincidentlly configuationsdocsepthe paper presents a method to verify if a nn is performing as expected in sensitive applications although the general area is very important in machine learning the paper is not very well presented the problem is not well stated the approach is not very clear and 
the results are not well justified the presentation and the writing of the paper should be improved unfortunately with the current format it is hard to glean the idea of the paper there are some typos eg bu in page 3 plot in page 4 etc there are some concepts that are not defined early on and maybe never in the paper for example what is the problem that the paper tries to solve mathematically it is not very clear what is the mathematical definition of a preimage the authors say preimages are most insightful and useful when the inputs and outputs have definite interpretation application areas where the need for massive networks is less its hard to fully understand but it seems that the method suffers scalability issues can this be formally analyzed what is the complexity of the algorithm in time and space why is there a scalability issue is it a fundamental problem how does this limit the scope and applicability of the method also what does definite interpretation mean the nn used in the experiments are very tiny i would consider experiments that reflect more realistic situations in the realworld current setup significantly limits the scope of the method it is not clear how to verify the performance of the method results in figure 1 and 2 does not show us the quality of the method is it doing good or bad i found the results in figure 1 surprising as the moon data is fairly symmetric while the preimage is biased towards one class is there a reason for thatdocsepdeep neural networks are known to be brittle and can lead to dangerous consequences if left unverified forward reach set computation can be used as a basic primitive to verify properties of deep neural networks used in a robotic setting there has been a rising interest in verifying larger neural networks used in safety critical setting in this paper the authors propose a way to compute reachable sets for a neural network in a backward sense starting from the outputs of the neural network and then work its way to the inputs this is an interesting way to look at the problem itself but as the authors point out it is an intractable problem my concern about this paper is i dont see the use of a preimage computation algorithm as being very useful a forward reachability tool works pretty well for the size of neural networks considered in the paper preimage computation does not provide any advantage in terms of scalability as is apparent from the experiments moreover almost any safety constraint that needs to be verified with system dynamics in the loop always should ideally work forward in time thus for the neural network controller from the inputs to the outputs cartpole example the authors come up with rules about which output behaviors are correct for a few of the input regions then use this as a specification for the verification algorithm but the very specifications comes from reasoning about the forward behavior of the system dynamics itself the idea of forward reach sets computation would generalize much better to a wide range of examples therefore without the need to come up with such handcrafted rules the authors do make a convincing case for the acasxu example but this example is less interesting given the amount of attention it has received recently
### Summary: | thank you for your submission to iclr the reviewers and i unanimously felt even after some of the clarifications provided that while there was some interesting element to this work ultimately there were substantial issues with both the presentation and content of the paper specifically the reviewers largely felt that the precise problem being solved was somewhat poorly defined and the benefit of the proposed preimage technique wasnt always clear and while the acas system was a nice application it seems to be difficult to quantify the real benefit of the proposed method in this setting especially given that other techniques can similarly be used to verify nns for this size problem the answer that this paper provides seems to be something along the lines of ease of visual interpretation of the preimage conditions but this needs to be quantified substantially more to be a compelling case | [
... (token ID sequence omitted for readability) ...
] | [
... (attention mask omitted; every value in this sequence is 1) ...
] | [
... (token ID sequence omitted for readability) ...
] |
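Note on the omitted sequence columns: each complete row above pairs its review/summary text with three parallel arrays, a token ID sequence, an attention mask consisting entirely of 1s, and a second token ID sequence that opens with the same IDs as the first. The sketch below shows one common way such columns are produced for causal language model fine-tuning. It is illustrative only: the tokenizer checkpoint, the 2048-token cap, and the build_row helper are assumptions made for this example, not details documented in this dump.

```python
# Illustrative sketch (assumptions noted above): turning a prompt/summary pair into
# token IDs, an all-ones attention mask, and label IDs for causal-LM fine-tuning.
from transformers import AutoTokenizer

# Assumed checkpoint; the tokenizer actually used for these rows is not stated here.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

def build_row(prompt_text: str, summary_text: str, max_length: int = 2048) -> dict:
    """Tokenize the concatenated prompt and target into parallel sequence columns."""
    full_text = prompt_text + " " + summary_text
    enc = tokenizer(full_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],            # token IDs for the whole example
        "attention_mask": enc["attention_mask"],  # 1 for every real token (no padding here)
        "labels": list(enc["input_ids"]),         # targets mirror the inputs for causal LM loss
    }

example = build_row(
    "Below is given review of a research paper ...\n### Review:\n<review text>\n### Summary:",
    "<one-paragraph meta-review>",
)
print(len(example["input_ids"]), set(example["attention_mask"]))
```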
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper develops a mean field theory for batch normalization bn in fullyconnected networks with randomly initialized weights there are a number of interesting predictions made in this paper on the basis of this analysis the main technical results of the paper are theorems 58 which compute the statistics of the covariance of the activations and the gradients comments 1 the observation that gradients explode in spite of bn is quite counterintuitive can you give an intuitive explanation of why this occurs 2 in a similar vein there a number of highly technical results in the paper and it would be great if the authors provide an intuitive explanation of their theorems 3 can the statistics of activations be controlled using activation functions or operations which break the symmetry for instance are bsb1 fixed points good for training neural networks 4 mean field analysis although it lends an insight into the statistics of the activations needs to connected with empirical observations for instance when the authors observe that the structure of the fixed point is such that activations are of identical norm equally spread apart in terms of angle this is quite far from practice it would be good to mention this in the introduction or the conclusionsdocsep this paper investigates the effect of the batch normalization in dnn learning the mean field theory in statistical mechanics was employed to analyze the progress of variance matrices between layers as the results the batch normalization itself is found to be the cause of gradient explosion moreover the authors pointed out that nearlinear activation function can improve such gradient explosion some numerical studies were reported to confirm theoretical findings the detailed analysis of the training of dnn with the batch normalization is quite interesting there are some minor comments below in page 3 2line above eq2 what is delta in the variance of the multivariate normal distribution the notation q appeared in the middle part of page 3 before the definition of q is shown in the last paragraph of p3 the randomized weight is not very practical though it may be the standard approach of mean field some comments would be helpful to the readers docsepthis paper provides a new dynamic perspective on deep neural network based on gaussian weights and biases the paper investigates the evolution of the covariance matrix along with the layers eventually the matrices achieve a stationary point ie fixed point of the dynamic system local performance around the fixed point is explored extensions are provided to include the batch normalization i believe this paper may stimulate some interesting ideas for other researchers two technical questions 1 when the layers tends to infinity the covariance matrix reaches stationary fixed point how to understand this phenomenon does this mean that the distribution of the layer outputs will not change too much if the layer is deep enough this somewhat conflicts the commonsense of the deeper the better 2 typos the weight matrix in the end of page 2 should be nl times nl1 also the xis in the first line of page 3 should be bold
### Summary: | this paper provides a meanfieldtheory analysis of batch normalization first there is a negative result as to the necessity of gradient explosion when using batch normalization in a fully connected network they then provide further insights as to what can be done about this along with experiments to confirm their theoretical predictions the reviewers and random commenters found this paper very interesting the reviewers were unanimous in their vote to accept | [
…input_ids, attention_mask, and labels token-ID sequences omitted… ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper proposes a new algorithm for solving neural odes each numerical solver step of the neural ode is implemented as an invertible neural network via a variant of the asynchronous leafprog integrator while still computing an accurate gradient this allows memory savings by discarding intermediate data from the numerical integration steps since it can be reconstructed using the inverse a theoretical stability analysis is provided the experimental results show that the algorithm achieves similar performance to previous methods eg aca while using less memory strengths identifies a nice connection between invertibility and memory efficiency beyond neural odes this could enable use of larger models where invertible networks are useful eg normalizing flows the theoretical analysis of stability is useful to build intuition concerns weaknesses most experiments in the paper use dampingfactor eta 1 since theoretically this is not stable it would be nice to see if the empirical improvements hold up for eta 1 where stable regions do exist the naive method seems too naive why are all results from all m evaluations being saved it is obvious that m1 of these are unnecessary for gradient computation since they dont affect zt the related claim about the computation graph being deeper for the naive method also seems incorrect other comments in algorithm 1 shouldnt errorest inf be inside the while loop in algorithm 4 shouldnt at be the partial derivative of l wrt zt instead of total derivative in theorem 32 what is sigma should it be sigmai in various locations notation like on1 is used should this just be on since i assume n is at least omega1 it seems a bit strange that we have to do a local forward and local backward pass in algorithm 4 could this be solved by making each layer of f invertible in the same vein it seems that the adjoint method needs to do a separate solve of the reverse ode because of loss of information if we were to assume invertibility of the forward map is there a way to modify the adjoint method to exactly retrace the path backwardsdocsepsummary this paper presents a memoryefficient asynchronous leapfrog integrator for numerically solving neural odes referred to as mali the method comes with a constant memory guarantee like the adjoint method and also guarantees reversetime accuracy like the adaptive checkpoint adjoint aca method the authors also give a rigorous theoretical analysis of mali and also discuss a damped version with an increased stability region the method is evaluated on a variety of tasks which includes classification dynamical modelling and generative modelling pros the theoretical analysis looks correct noting that i havent worked out all the details experimental evaluation is very exhaustive and mali achieves nearthebest performance in all tasks the method is proven as accurate as the standard numerical ode solvers thanks to its reduced memory cost compared to aca mali can then be treated as an offtheshelf replacement cons and questions looking at the results im having difficulty seeing any significant improvement upon aca then the main contribution in addition to the theoretical analysis is the reduced memory consumption which makes me rethink whether iclr is a suitable venue although the memory costs of the adjoint method and mali are onf and onf1 this doesnt really reflect in figure 4c where the blue bar doubles the red one id be happy if the authors can briefly explain why looking at table2 why does the test performance of a node trained with mali 
increase when we switch from mali to rk4 it would be much nicer to see some error variance estimate i would be happy to see an experimental evaluation of the astability as mentioned by the authors the stability analysis is asymptotic and t could be arbitrarily small in eg continuoustime flows however thats not the case in timeseries modelling so i wonder if the stability claim can be verified on a scenario in which eg 100 observations arrive uniformly in time with 10 secs gaps to generate table2 did you train a resnet without any differentialsintegrals involved and try to evaluate the test performance using an ode solver simply using the trained resnet as the drift function if so i dont think this makes any sense except euler1 solver and the entire resnet row in table2 could go away additional comments figure4 caption could include some more detail at least mentioning the experiment name why is there a local forward step within the for loop in the backward routine in alg4 it would be nice to see a brief description of the mujoco dataset typo in the title of section b32 note after rebuttal i increase my overall score from 6 to 7docsep1 summary the manuscript proposes a reversible integration scheme for approximately estimating the gradient of neural ordinary differential equations these odes represent dnns with continuous rather than discrete values for the number of layers the solver is theoretically analysed and empirically compared to other solvers 2 strengths the paper is mostly well written the reversibility property of the solver leads to a memory footprint that does not depend on integration time the model is applied to standard datasets 3 concerns the concept of neural ode models their scope and their expected usefulness should be better motivated it is not obvious which role these models play and what they offer as potential strengths the integrations scheme seems already known and well established it does not seem that the paper makes code and data available to the public 4 remarksquestions a algorithm 1 it seems that the stepsize h should be initialized upon every step otherwise the steps can only get smaller b references capitalization not correct eg neural information processing systems odenet lennardjones c what benefits does the neural ode model have in the context of image classification what is the intuition behind the continuous depth idea in this scenariodocseppaper summary there are typically two methods for estimating the gradients with respect to the loss for neural odes the naive method directly backpropagates through the steps of the ode solver leading to accurate gradients but very large memory cost the adjoint method in contrast does not store the entire trajectory in memory but has reverse trajectory errors ie the numerical solution in the reverse direction will not be the inverse of the numerical solution in the forward direction in this paper the authors propose a method that is both reverse accurate and has low memory cost to achieve this the authors take advantage of the asynchronous leapfrog solver this numerical method is reversible solving an ode numerically in the reverse direction is the inverse of solving the ode numerically in the forward direction this is not generally true for the ode solvers typically used rk4 and so on in the neural ode literature as the numerical solver is explicitly invertible the authors can from only the final state and not the entire trajectory locally reconstruct each ode solver step to get the local gradient of the parameters 
with respect to the loss they can then calculate these gradients along the entire reverse trajectory to obtain an accurate estimate of the gradient with respect to the parameters as each step of the numerical solver is reversible they do not need to store the entire trajectory the authors analyse the stability and numerical error of their proposed method and provide a toy example to show how well their method estimates gradients compared the naive adjoint and adaptive checkpoint methods the authors then perform experiments on a variety of tasks to test their model they test their model on image classification experiments both on cifar10 and imagenet and achieve good results compared to the baselines in addition they perform adversarial robustness experiments on imagenet and also show good performance finally the authors test their method both for time series modeling and continuous normalizing flows again showing good performance compared with naive integration methods positives the motivation and core idea of the paper is clear numerical solvers are in general not reversible and this can lead to inaccurate gradient estimates when using the adjoint method the authors explain this clearly and then propose a method that effectively solves this the experimental results are quite impressive the model performs on par with the adaptive checkpoint method in terms of accuracy but is much more memory efficient and notably memory is independent of the number of integration steps this allows the authors to run their model on large scale datasets like imagenet which was not previously possible with most neural ode methods further the authors achieve good performance on quite a wide variety of tasks image classification adversarial attacks time series modeling generative modeling which is nice the authors perform a thorough analysis of the runtime of their integration method compared to others which is very helpful negatives the presentation of the method and results is not always very clear for example the section about damping for the alf integrator is not clear the authors mention that alf is not stable for eta1 but as far as i can tell never mention what value of eta they use in practice and whether choosing this value is difficult further it is not clear if alf is still reversible with this eta parameter presumably you would have to use 1eta in the reverse step for it to remain invertible in which case the reverse is not stable the authors should be more clear about this the toy example is confusing how come the integration time starts from t20 is this because the error only grows after t20 as you use t1 for all experiments and the rtol and atol are also roughly the same for all experiments it would be nice to see if this actually makes a difference also for t1 in figure 4 the authors also mention the derivative dldy0 but this derivative is never mentioned in the text do you mean dldz0 the plots of memory consumption are nice and clear though the alf solver already exists so the main contribution of the paper is simply to apply the alf solver to neural odes this means that the novelty of the method is somewhat limited but i do not think that this is a major issue as the method works well and is clearly motivated the section about using different numerical solvers for resnets does not make much sense resnets are not designed as flows and do not behave as flows in practice so we should not expect them to work at all with other numerical solvers than euler with timestep1 i dont really think these 
experiments show anything interesting and should be removed for clarity recommendation overall the paper has a clear motivation provides a nice and simple solution to an interesting problem and has good experimental results however there are some clarity issues which make some aspects of the model and method confusing i therefore recommend a weak accept but would increase my score if the clarity issues are solved questions the model achieves extremely good bitsdim on mnist 087 however it seems from the appendix that the samples are fairly poor compared to vanilla ffjord for example log likelihood and sample quality are not always correlated but the difference seems particularly jarring here do you know why this is typos and small comments in many places the authors use latex math to write words that should either just be written in italics wrt or using text in math mode eg atol rtol there are several typos in the script so i think it would be a good idea for the authors to read through the script again to fix those in several places the authors write onf 1 which instead should be onf the authors often write constant memory cost with respect to integration time i think it would be more helpful to say number of solver steps or something along those lines as integration time typically refers to the upper limit of the integral when solving an ode
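Since several of the reviews above hinge on the invertibility of the asynchronous leapfrog (ALF) step, a small self-contained sketch may help. This is not the authors' implementation: the dynamics f, the step size, and the undamped form of the update (i.e., ignoring the damping factor eta discussed in the reviews) are assumptions; it only illustrates why an exactly invertible step lets a backward pass reconstruct the trajectory instead of storing it.

```python
import numpy as np

def f(z, t):
    # Illustrative dynamics (an assumption, not the paper's model): a lightly damped rotation.
    A = np.array([[0.0, -1.0], [1.0, -0.1]])
    return A @ z

def alf_step(z, v, t, h):
    """One (undamped) asynchronous-leapfrog step; algebraically invertible."""
    z_half = z + 0.5 * h * v
    u = f(z_half, t + 0.5 * h)
    v_new = 2.0 * u - v
    z_new = z_half + 0.5 * h * v_new
    return z_new, v_new

def alf_step_inverse(z_new, v_new, t, h):
    """Exact inverse of alf_step: recovers (z, v) from the state one step later."""
    z_half = z_new - 0.5 * h * v_new
    u = f(z_half, t + 0.5 * h)
    v = 2.0 * u - v_new
    z = z_half - 0.5 * h * v
    return z, v

# Integrate forward, keeping only the final (z, v); then walk the trajectory
# backwards by inverting each step, as a reverse-accurate method can.
h, steps = 0.05, 200
z0 = np.array([1.0, 0.0])
v0 = f(z0, 0.0)
z, v = z0.copy(), v0.copy()
for i in range(steps):
    z, v = alf_step(z, v, i * h, h)

for i in reversed(range(steps)):
    z, v = alf_step_inverse(z, v, i * h, h)

print("round-trip reconstruction error in z:", float(np.max(np.abs(z - z0))))
print("round-trip reconstruction error in v:", float(np.max(np.abs(v - v0))))
```

Because each step can be undone exactly, up to floating-point round-off, a backward pass can rebuild every intermediate state on the fly, which is what gives the constant-memory property the reviewers highlight.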
### Summary: | this paper introduced a new ode integration scheme that allows constantmemory gradient computation i was concerned that the low order of convergence of this method would make it impractical but the authors performed extensive experiments and got impressive results overall the paper addresses one of the main practical difficulties with large neural ode models the authors satisfactorily addressed the reviewers concerns in the discussion | [
…input_ids, attention_mask, and labels token-ID sequences omitted… ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper revisits the design of ensemble critics in offline rl the authors argue that the common design where critics in the ensemble sharing the same pessimistic target function in learning can lead to actually optimistic critics the authors analyze this phenomenon theoretically under the ntk assumption and present toy simulation examples the authors further use this insight to design an offline rl algorithm msg which gives sota results on common offline rl benchmarks in these experiments they show that the separating the targets is a key to the algorithms superior performance comments for rebuttal thanks for the rebuttal and the clarification while they address some of my concerns my main concern stay the same as stated in the original review the main issue i have is whether the proposal here is sufficient to design a full offline rl algorithm or just provide an important note on implementation choice the rebuttal also points out it is the objective of our work to advocate for relying on uncertainty estimation as the main source of pessimism for offline rl lets examine this question from two aspects based on the paper and rebuttal from the theoretical side the paper provides that theorem 31 which compares the iterates of qlcb of independent targets and the shared targets however it does not show how good pessimistic the estimate of independent targets is in the review i mentioned in general optimizing a pessimistic critic or being more pessimistic does not imply good performance in offline rl because whether a pessimistic critic is useful or not depends on how tight the under estimation is and where it is pessimistic being more pessimistic is not always good eg estimating all values as vmin is pessimistic but its obviously useless the current theory does not provide enough insights to how good the pessimistic estimate is or how good the learned policy based on such value estimate will be this is why i said the significance of the theoretical results are rather limited in the review for empirical side i think to demonstrate the authors claim it is necessary to show that sota can be achieved with alpha0 however the current results do not support that fully while i agree that in figure 3 alpha0 is among the best performing results i also think that figure 3 does not provide a conclusive answer as it is missing results of larger alpha value for beta0 as there is an increasing trend this was pointed out in the review its also hard to compare fig 3 and fig 2 directly i think the failure of using alpha0 in simpler mujoco domains is actually showing that the proposed approach alone might not be sufficient to provide enough pessimism broadly i do not agree with the rebuttals statement that these environments can be solved by behavior cloning all the methods the authors listed there are not pure behavior cloning which mimics all actions as all of them perform some reasoning what actions are better based on the rewards therefore i dont think this is a good excuse of using the cql term here yes i agree that the proposed method works with alpha0 with antmaze which is considered as a harder domain but is it because of the reason that the authors mentioned if this is the case why does not perform well in simpler domain or is it because of something that is related to the structure of antmaze environment and dataset currently we dont have sufficient evidence to tell thus i think that currently the paper provides insufficient results to show that the proposed uncertainty estimation alone can achieve 
sota offline rl results nonetheless i also think that this paper provides more than sufficient reasons to show that it would be a good design choice to improve an existing valuebased offline rl algorithm therefore i keep my original recommendation strengths 1 this paper presents an overlooked finding 2 it provides some theoretical reasons to back it up it also provides empirical validation 2 the writing of this paper is clear and the experimental results are thorough weakness 1 while the authors provide explanations of why the common usage of shared targets may lead to optimism the current results are not conclusive in particular while existing offline rl implementations use shared targets the pessimism of shared targets is not the main source of pessimism but rather an implementation detail therefore it is unclear whether the proposal here is sufficient to design a full offline rl algorithm or just provide an important note on implementation choice this factor limits the significance of the paper 2 it is unclear what data assumptions are needed for the proposed method to work properly as an offline rl method the authors well discuss the limitation such as the extra complexity needed by the proposed method however in my view the results here are more limited than what the authors claim for the reasons above docsepthis paper studies ensemblebased pessimism in offline rl from both theoretical and empirical aspects giving an analysis mathematically through ntk to show shared pessimistic targets can paradoxically lead to qestimates which are in fact optimistic proposing msg that trains each qnetwork independently and conducts experiments in d4rl and rl unplugged tasks strengths formally analyze the offline rl methods based on qensemble pessimism on infinitewidth neural networks and shows the pessimism term may become positive in shared targets empirically verifies the effectiveness of the algorithm by combining qensemble with cql weaknesses the ntk assumptions in section 31 and the gaussian assumptions in section 32 seems limited in broader value iteration the qensemble needs to combine with cql to obtain reasonable performance the use of cql makes it difficult to analyze the source of performance improvement however there exist several methods eg edac in neurips 2021 and pbrl in iclr 2022 that perform qensemble for offlinerl with purely uncertainty the experiment results are not complete for d4rl benchmark na docsepthis paper discusses the uncertainty estimation in rl which is an alternative to induce pessimism in offline rl though uncertainty estimation through ensembles has been proposed in the offline rl literature this paper points out a critical flaw in how to incorporate the lcb in the actorcritic based algorithm it shows theoretically that the previous algorithms which regress different q functions to the shared pessimistic target values and does policy evaluation based on the lcb could sometime leads to overestimation of the q function to address this the paper proposes a simple fix that is in the bellman backup stage instead of regressing the different q values to the shared lcb estimate just regressing them to independent q target empirically it shows better performance in challenging tasks that require stitching beyond this the paper also examines how different efficient ensemble method work in rl setting and it seems there still exist a large gap in the performance compared with deep ensemble which opens up more interesting questions in efficient ensemble methods in the rl setting 
strength 1 the paper is wellwritten and very easy to follow 2 it discusses a major flaw in how to incorporate lcb estimate in offline rl algorithms which seems being overlooked in the literature but empirically seems make a big difference the theoretical claim is wellsupported 3 it has a comprehensive set of empirical studies which covers various aspects about the applicability of the method such as the ensemble size the hyperparameter sensitivity i do appreciate the authors effort in discussing how to transfer efficient ensemble method in supervised learning setting to the rl setup to make it more computationally efficient though some negative results there weakness 1 maybe i am misunderstanding something but i do feel some of the claims and findings are not explained in a super clear way see questions section for details for this 2 the experiment section does show that incorporating the independent target leads to the better performance in the challenging tasks it would be great to see that this is result from better lcb estimate we see the overestimation issues in the toy task and it would be really helpful to see in the challenging tasks that shared target does lead to overestimation which is the reason that the method helps 3 section 42 seems a little bit out of picture as it seems alpha0 works great in most cases the authors state that it might help in some narrow data regime is any empirical study supporting this yes docsepthe paper observes a problem in existing pessimism estimation in offline rl using ensembles using shared targets for all ensembles updates the paper instead proposes to update each ensemble individually and apply the pessimism at policy updates the paper derives the update form of both methods in the ntk setting and shows that the update method with shared target could even result in optimism which is also shown with some synthetic simulation data finally the paper evaluates the proposed method in several offline rl benchmarks and show its empirical competitiveness strength 1 overall the paper is well written easy to follow and the technical part seems correct 2 the paper makes a good observation of the existing methods for offline rl when they update the qvalues for ensembles from hindsight one really does not need to incorporate the pessimism into the function update procedure but instead just apply pessimism during policy update this also seems to agree to the theory rl algorithms one can just perform the regular bellman updates or perform elimination in version space algorithms and define policy with lcb or take minimum over the remaining set of functions for pessimism for example 1 3 although it may not be obvious under what kind of conditions such that in the ntk setting using the shared target could result in optimism the following subsection provides good evidence that that indeed could happen it could be better to provide some more intuitive scenario or even a closedform construction 4 the paper provides extensive and convincing experiments including a good ablation experiments which contains the different kinds of shared target updating methods such as sharedlcb ens sharedmin deep ens and with a different number of ensembles b the paper tries many different hyperparameters for the baselines so the baselines seem to be finetuned for the final presentation of the results c the experiments are performed on extensive benchmarks weakness 1 the theoretical results provide very good intuition into the problems of the previous pessimism estimation in offline 
deep rl methods but since the result is based on the ntk setting it still has some gap between the practical situations 2 the result presented in table 1 has different hyperparameter for different tasks which likely undermines the empirical merits of the proposed algorithm references 1 xie tengyang et al bellmanconsistent pessimism for offline reinforcement learning advances in neural information processing systems 34 2021 66836694 1 the overall algorithm has good intuition and motivation but the introduction to the additional term in section 42 looks irrelevant to the rest of the paper from the experiments this term seems crucial to a good performance of the algorithm and thus unavoidable in the current version although it makes sense that some kind of regularization may be needed for unseen action this additional term indeed undermines the overall message a little bit 2 the ablation of using a more condensed surrogate for ensemble is a good experiment and as the paper already suggests it would be better if a more efficient way of pessimism could be derived which seems beyond the scope of this paper
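for readers who want the policy step described in these reviews made explicit, a hedged sketch is written out below; the objective is reconstructed from the reviews' description (pessimism applied only when the policy is improved, over independently trained critics), not taken from the paper itself, so the notation is purely illustrative and the extra alpha-weighted regulariser discussed above is omitted

```latex
% illustrative notation only: N independently trained critics Q_{\theta_i},
% with pessimism entering only at the policy-improvement step
\[
\mu(s,a) = \frac{1}{N}\sum_{i=1}^{N} Q_{\theta_i}(s,a), \qquad
\sigma(s,a) = \Big(\frac{1}{N}\sum_{i=1}^{N}\big(Q_{\theta_i}(s,a)-\mu(s,a)\big)^{2}\Big)^{1/2}
\]
\[
\pi \leftarrow \arg\max_{\pi}\;
\mathbb{E}_{s\sim\mathcal{D},\, a\sim\pi(\cdot\mid s)}
\big[\mu(s,a) - \beta\,\sigma(s,a)\big]
\]
```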
### Summary:
the paper identifies a common flaw in pessimistic algorithms related to the use of shared targets and propose an alternative based on independent targets that mitigate the overly optimistic estimates the rebuttal has addressed a number of concerns raised by the reviewers and in particular the negative reviewer qbbn acknowledged that the proposed idea here would make an existing algorithm that uses eg double q networks which is quite common and also other main pessimism like value penalty or closeness to behavior policy to perform better thus the insight here can be quite useful in practice that said the reviewer is still concerned about the framing of the work the paper does not provide sufficient evidence theoretically or empirically that the proposed pessimistic estimate based on independent training alone is sufficient to design a sota offline rl algorithm which the paper claims to i think that the paper needs to provide stronger evidences or changes the framing given the strong support from other reviewers the ac is leaning towards acceptance but strongly recommend that the authors change the framing of the paper to honestly reflect the contributions of the work
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper suggests a method for approximating the 2wasserstein gradient flow for the relative entropy the proposed particlebased method uses a neural network function approximationbased approach to estimating the necessary density ratios experiments verify reasonable performance compared to mala and ula the paper appears to miss the fact that the 2wasserstein gradient flow for the relative entropy defines a markov process which is exactly the langevin dynamics this can be seen by comparing eq 4 to the fokkerplanck equation for an ito diffusion eg eq 41 in 1 see also section 35 in 2 for a relevant discussion from the ml literature indeed the fokkerplanck equation determines the diffusion uniquely because the differential operator in the fokkerplanck equation is the adjoint of the infinitesimal generator of the diffusion thus the proposed algorithm is a rather unorthodox way of approximating a langevin distribution the paper makes the said approximation more difficult than it should be by using a particle approximation that requires estimating density ratios a notorious tricky problem the algorithm ends up having some ability to handle multimodality because the use of weighted particles and density ratio estimation allows the algorithm to effectively compute the relative volumes of different modes of the distribution the proposed method for estimating the density ratios appears to be the same as geyers reverse logistic regression method 3 with a neural net replacing the inner product thus i expect similar results could be obtained more directly and with only a single round of volume estimation by 1 running mala many times and 2 then estimating the relative volume of each chain note there are numerous other methods for doing this other than reverse logistic regression such an approach should work well on the kinds of lowdimensional examples considered in the numerical experiments further issues arise in the experimental evaluation first the experiments seem to show very similar performance to mala all within the standard errors when provided eg in table 2 second im concerned about the quality of the mala implementation for which code was not included the lack of convergence in one example suggests mala was not run with appropriate step size adaptation targeting the optimal acceptance rate 45 as is standard in the literature if so then the comparison is not appropriate third for a fair comparison the mala chains should also be reweighted based on volume estimates for each chain as described above is it possible there are some gains from using the proposed method on multimodal distribution yes but i remain skeptical moreover if the goal is prediction i expect combining mcmc with stacking will be more effective 67 1 pavliotis g a stochastic processes and applications springer 2014 2 liu q stein variational gradient descent as gradient flow in neurips 2017 3 geyer c estimating normalizing constants and reweighting mixtures technical report 1994 4 roberts g o rosenthal j s optimal scaling of discrete approximations to langevin diffusions journal of the royal statistical society series b statistical methodology 60 255268 1998 5 roberts g o rosenthal j s optimal scaling for various metropolishastings algorithms statistical science 16 351367 2001 6 yao y vehtari a gelman a stacking for nonmixing bayesian computations the curse and blessing of multimodal posteriors arxivorg arxiv200612335 2020 7 yao y vehtari a simpson d gelman a using stacking to average bayesian predictive distributions bayesian 
analysis 13 9171007 2017 the paper seems to have a fundamental misunderstanding of the wasserstein gradient flow for the relative entropy and the experimental evaluations may not be appropriate docsepthis paper considers the problem of sampling from an unnormalized distribution the unnormalized target distribution can be regarded as a stationary point of the wasserstein gradient flow of the corresponding relative entropy functional which can be equivalently identified from a microscopic perspective by defining a timevarying velocity field of the particles while the exact timevarying velocity field is not exactly available the authors propose to estimate such a quantity by approximating the corresponding logarithmic density ratio through minimizing the bregman score such an approximation requires only samples from the variable distribution which can obtain by simulating particles following the estimated velocity field wasserstein gradient flow has proved to be a useful tool for sampling from an unnormalized distribution section 2 to 4 of this work follow the standard derivation of the work along this research line and well explain the particle evolution strategy since the underlying velocity field of the wasserstein gradient flow requires the access to the variable distribution qt which is in general not available a key step in methods along this research line is to estimate such a quantity to estimate such a quantity this work proposes to estimate the log density ratio between the potential function and the variable distribution qt by minimizing the bregman score which is described in section 5 however i find section 5 difficult to understand it would be vary helpful if the authors could explain the intuition of the bregman score in fact i think there should be an individual section in the preliminary that describes the bregman score and all the statements below equation 15 so that the reader can follow this very important step i think section 5 is the part that differs this work from previous work like 1 and is where the novelty of this paper lies it need to be very clearly explained 1 degond pierre and franciscojos mustieles a deterministic approximation of diffusion equations using particles siam journal on scientific and statistical computing 11 no 2 1990 293310 this paper leverage the microscopic equivalence of the wasserstein gradient flow of the relative entropy to sample from an unnormalized distribution but the derivation of the key step in the proposed approach is not well explained docsepthe paper proposes a novel way to sample from unnormalized distributions this is helpful when calculating or estimating the normalizing constant is untractable the main idea is to track the gradient flow of the relative entropy in the wasserstein space of probability distributions it is known that the flow converges to the target distribution and the paper introduces a variational characterization of the discretized steps the main benefit of this characterization is that it bypasses the need to know the normalizing constant as well as being amenable to estimation by using a combined particle evolution the benefits of the new algorithm are demonstrated through several numerical simulations i enjoyed reading the paper it is well written the motivation is clear and it is easy to follow the main ideas however i find it hard to assess the actual contribution of the paper on one hand while the proposed algorithm makes sense there is no guarantee in the paper either for the sampling accuracy or even 
for the fact that the algorithm will converge to the target measure for example is there any guarantee that the discretized flow does not add bias to the obtained measure there are many tunable parameters in the algorithms s the discretization step k the time horizon n the number of particles what is the interplay between those parameters given some distribution how should i choose those i would have expected to see a bit more of the underlying theory behind this algorithm on the other hand from the perspective of actual results i find that the numerical experiments are somewhat restricted and artificial coupled with the computational overhead its not clear to me when one will actually prefer to use the new algorith the idea is elegant interesting but the paper lacks evidence for its usefulness both from theoretical and applied perspectives docsepthe paper addresses the issue of sampling from an unnormalized distribution the sampling problem is cast as the numerical simulation of the gradient flow associated with the kl divergence between the target unnormalized distribution and the approximating distribution the challenging part is to estimate the density ratio that appears in the gradient term the authors propose to use a deep neural network to estimate the density ratio numerical results show the usefulness of the proposed method overall the paper is well written contributions are clearly stated relation to previous is presented the proposed method is well explained and a somewhat extended comparative numerical evaluation of the proposed method is given i find the overall approach interesting the difficulty of sampling being cast as a density ratio estimation problem this new problem however is not easy to solve and your approach of using a neural network to estimate the density ratio seems to work at least in the considered examples that being said i do have some issues with the paper you mention in the conclusion that you hope to establish the convergence properties of the proposed method i understand that it is not trivial to establish convergence however not presenting at least some intuition about the convergence of the algorithm is a strong drawback the way i see it is that there are two sources of error that could hamper convergence the discretization error of the numerical implementation of the continuous gradient flow and the approximation error of the density ratio i didnt find anything about either in the paper if not a proof at least some intuition about how they affect the results about how they interact etc in the numerical experiments section you present a fair amount of examples which show that the proposed method is capable of outperforming the competing algorithms the results are interesting however there are no results to show how the performances of the proposed method vary with the different parameters notably the number of considered particles and the choice of distribution w such results would cast some light into the inner workings of the proposed method and would be useful for anyone interested in using it in section 64 you do mention that the improved performances of the proposed method come with a higher computational cost however you do not perform any analysis of the tradeoff between computation time and performances it would have also been interesting to compare the performances of the different algorithms for the same computational budget another aspect that struck out to me is the choice of competing algorithms the algorithms that you choose as 
competitors are valid however their choice is questionable my first remark was why didnt you choose the smc algorithm as a competitor it also uses particles in order to estimate the unnormalized target distribution also hmc could have also been considered i spotted some typos here and there for example chians instead of chains on page 8 in the bottom paragraph denote the ula and mala with k chians repeats instead of repeat on page 9 we repeats the random partition 10 times another small issue is with figure 4 its hardly readable i understand that there is a limit on the page count however thats not a justification for having figures that are hard to read more so as there are some redundancies in the text that could have been eliminated for ex equations 11 and 13 are the same is the presence of both necessary for understanding the idea that is presented in section 4 i find the approach interesting however there are some issue with the paper as it is both theoretical and empirical from a theoretical point of view there is no discussion about conditions for convergence of the algorithm from an empirical point of view the numerical experiments are not complete enough with respect to the comparison analysis that is carried out but also with respect to compensating missing theoretical analysis overall the paper is interesting but in its current form is not ripe enough for publication
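to make the comparison these reviews keep invoking concrete, here is a minimal, self-contained sketch of the ula baseline on a toy unnormalized target; it is not the paper's method, and the target, step size and particle count are invented for the example. the deterministic-particle alternative discussed above would replace the injected gaussian noise with an estimate of the score of the current particle distribution, which is exactly where the density-ratio (bregman-score) estimation enters

```python
# minimal sketch of the ULA baseline (not the paper's method): simulate
# dx = -grad U(x) dt + sqrt(2) dW on an unnormalized target pi(x) ~ exp(-U(x));
# the toy target, step size and particle count below are invented for illustration
import numpy as np

rng = np.random.default_rng(0)

def grad_U(x):
    # gradient of the potential of a standard 2-D Gaussian target
    return x

step, n_particles, n_steps = 1e-2, 1000, 2000
x = 3.0 * rng.normal(size=(n_particles, 2))   # deliberately dispersed initialization

for _ in range(n_steps):
    noise = rng.normal(size=x.shape)
    x = x - step * grad_U(x) + np.sqrt(2.0 * step) * noise

print(x.mean(axis=0), x.var(axis=0))  # should be close to (0, 0) and (1, 1)
```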
### Summary: | the paper proposes a sampling technique for unnormalized distributions the main idea is to gradually transform particles by following the gradient flow of the relative entropy in the wasserstein space of probability distributions the paper tackles an important problem and provides an interesting new perspective however even putting aside the concerns on the theoretical analysis raised by the reviewers the experimental evaluations does not seem sufficient to demonstrate the benefits of the proposed approach | [
8091,
11193,
326,
4419,
26230,
4038,
11878,
247,
32566,
28190,
1895,
253,
5933,
7637,
598,
1907,
690,
3745,
281,
6016,
23390,
351,
1319,
984,
253,
897,
273,
17375,
6353,
285,
4038,
4313,
13418,
4483,
253,
5933,
281,
8069,
11897,
253,
4103,
14118,
273,
1027,
10006,
273,
253,
3268,
253,
4081,
1332,
323,
26230,
253,
4038,
11878,
4620,
281,
320,
253,
1072,
347,
305,
2653,
398,
8107,
21535,
9077,
1332,
495,
342,
247,
11454,
2036,
15706,
253,
6703,
1885,
3021,
891,
1902,
2074,
1543,
812,
320,
2797,
625,
3587,
285,
342,
760,
247,
2014,
3790,
273,
4644,
13418,
407,
337,
3515,
4691,
66,
1142,
2069,
285,
374,
840,
26230,
253,
4103,
4644,
273,
1016,
5931,
3877,
627,
403,
7418,
643,
3082,
323,
2509,
436,
643,
685,
8107,
21535,
9077,
824,
271,
2746,
943,
789,
973,
327,
253,
9351,
273,
1698,
6967,
6667,
2783,
275,
253,
10704,
4679,
50275,
44295,
3374,
12893,
275,
253,
5661,
7103,
806,
253,
4679,
1646,
281,
921,
1077,
2074,
3045,
281,
4691,
66,
512,
1561,
253,
2629,
6332,
672,
2530,
24088,
275,
2829,
374,
1273,
516,
7514,
670,
253,
3290,
273,
253,
4691,
66,
7092,
323,
534,
2127,
369,
417,
2908,
253,
3480,
273,
14940,
275,
581,
1650,
5936,
4691,
66,
369,
417,
1408,
342,
4569,
3213,
1979,
15644,
12262,
253,
8654,
14924,
2281,
5329,
347,
310,
2629,
275,
253,
6239,
604,
594,
840,
253,
5301,
310,
417,
4569,
2626,
323,
247,
4344,
5301,
253,
4691,
66,
13178,
943,
671,
320,
294,
24676,
1754,
327,
4644,
8197,
323,
1016,
5931,
347,
2529,
1840,
50275,
261,
352,
1896,
627,
403,
690,
15988,
432,
970,
253,
4081,
1332,
327,
23390,
26306,
3268,
4754,
533,
891,
3464,
33872,
25761,
604,
253,
4736,
310,
10554,
891,
1902,
16248,
278,
3591,
68,
342,
37444,
588,
320,
625,
3576,
9963,
50274,
18,
47299,
965,
302,
261,
305,
247,
19191,
4870,
285,
4893,
7203,
254,
4059,
50276,
19,
632,
86,
2805,
2870,
249,
39762,
11786,
18499,
347,
11786,
2685,
275,
5723,
2824,
4240,
50276,
20,
305,
21776,
260,
26230,
2622,
3006,
14637,
285,
294,
6712,
272,
24170,
7681,
1304,
9354,
50275,
21,
687,
589,
1641,
305,
258,
50276,
2921,
45541,
480,
256,
8654,
13642,
273,
13358,
34754,
281,
298,
912,
8498,
2171,
16723,
6698,
273,
253,
17292,
7605,
5948,
2962,
270,
7605,
16182,
3925,
15215,
22913,
8065,
50276,
22,
687,
589,
1641,
305,
258,
50276,
2921,
45541,
480,
256,
8654,
13642,
323,
2710,
1313,
18427,
763,
42118,
11333,
7605,
5859,
1668,
4791,
1012,
2251,
6585,
50276,
23,
340,
8500,
340,
1670,
384,
1792,
247,
50276,
11500,
1342,
247,
37444,
323,
1327,
24706,
272,
17699,
16561,
30745,
253,
28401,
285,
26569,
273,
23390,
26306,
20731,
17327,
549,
32693,
2061,
549,
32693,
8603,
805,
22519,
9169,
50276,
24,
340,
8500,
340,
1670,
384,
1792,
247,
948,
10836,
277,
50276,
11500,
1342,
247,
970,
37444,
281,
3388,
17699,
16561,
15970,
10670,
17699,
16561,
1783,
2145,
898,
1166,
31182,
4240,
253,
2929,
3133,
281,
452,
247,
7936,
40663,
273,
253,
369,
2152,
6339,
11786,
2685,
323,
253,
4103,
15579,
285,
253,
5661,
27163,
778,
417,
320,
4569,
50276,
7152,
33032,
2520,
2929,
19401,
253,
1895,
273,
10491,
432,
271,
440,
6320,
1025,
3268,
253,
440,
6320,
1025,
2303,
3268,
476,
320,
12258,
347,
247,
17429,
1127,
273,
253,
369,
2152,
6339,
11786,
2685,
273,
253,
3969,
4103,
15579,
5164,
534,
476,
320,
39406,
3636,
432,
247,
22973,
8668,
407,
13947,
247,
673,
39381,
272,
7602,
1673,
273,
253,
6353,
1223,
253,
3242,
673,
39381,
272,
7602,
1673,
310,
417,
4555,
2130,
253,
4477,
12661,
281,
6642,
824,
247,
10671,
407,
4020,
839,
253,
3969,
32643,
4038,
4313,
949,
28699,
253,
1517,
72,
1342,
4868,
824,
271,
11193,
4419,
760,
3530,
432,
253,
4778,
3268,
534,
476,
4044,
407,
948,
8287,
6353,
1563,
253,
5998,
7602,
1673,
369,
2152,
6339,
11786,
2685,
556,
8058,
281,
320,
247,
4217,
4968,
323,
10491,
432,
271,
440,
6320,
1025,
3268,
2593,
374,
281,
577,
273,
436,
789,
956,
253,
2629,
28529,
273,
253,
789,
2112,
436,
2561,
1386,
285,
973,
5513,
253,
8091,
5606,
5700,
1580,
253,
6944,
7602,
1673,
273,
253,
369,
2152,
6339,
11786,
2685,
4419,
253,
2289,
281,
253,
4778,
3268,
2805,
85,
534,
310,
275,
2087,
417,
2130,
247,
2234,
3213,
275,
3082,
2112,
436,
2561,
1386,
310,
281,
6642,
824,
247,
10671,
281,
6642,
824,
247,
10671,
436,
789,
29328,
281,
6642,
253,
2412,
4038,
4313,
875,
253,
2442,
1159,
285,
253,
4778,
3268,
2805,
85,
407,
28699,
253,
1517,
72,
1342,
4868,
534,
310,
2529,
275,
2593,
608,
50276,
35529,
891,
1089,
2593,
608,
2834,
281,
2096,
352,
651,
320,
6889,
9371,
604,
253,
4477,
812,
5513,
253,
30328,
273,
253,
1517,
72,
1342,
4868,
275,
958,
891,
1158,
627,
943,
320,
271,
2060,
2593,
275,
253,
12611,
326,
8631,
253,
1517,
72,
1342,
4868,
285,
512,
253,
7234,
2708,
5150,
1458,
594,
326,
253,
9414,
476,
956,
436,
1077,
1774,
3213,
50275,
74,
1158,
2593,
608,
310,
253,
629,
326,
19986,
436,
789,
432,
2045,
789,
751,
337,
285,
310,
835,
253,
38135,
273,
436,
2929,
8696,
352,
878,
281,
320,
1077,
4518,
5544,
50275,
18,
6797,
857,
18753,
250,
285,
38996,
23538,
36326,
1364,
928,
265,
247,
30027,
11193,
273,
12393,
7424,
970,
6353,
4927,
312,
6698,
327,
8249,
285,
7605,
12672,
1903,
642,
374,
7901,
3285,
1610,
740,
436,
2929,
25057,
253,
22973,
19945,
273,
253,
369,
2152,
6339,
11786,
2685,
273,
253,
4103,
15579,
281,
3410,
432,
271,
440,
6320,
1025,
3268,
533,
253,
28529,
273,
253,
2234,
3213,
275,
253,
4081,
2746,
310,
417,
973,
5544,
50276,
7152,
339,
431,
248,
2929,
29328,
247,
4460,
1039,
281,
3410,
432,
440,
6320,
1025,
10670,
436,
310,
9371,
672,
18899,
390,
26230, 253, 2622, 3006, 3638, ..., 273, 253, 4081, 2746  (token-ID sequence, 1,100 values)
] | [
1, 1, 1, ..., 1  (attention-mask sequence, 2,048 values, all 1)
] | [
8091, 11193, 326, 4419, 26230, ..., 273, 253, 4081, 2746  (token-ID sequence, 2,048 values)
] |
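Each row in this dump combines the review/summary text with tokenised arrays: token-ID sequences plus an all-ones attention mask of the same length. For readers who want to rebuild rows in this shape, the sketch below shows one plausible way to produce such columns with a Hugging Face tokenizer; the checkpoint name ("gpt2"), the 2048-token cap, the exact field names, and the choice to copy the token IDs into a labels field are illustrative assumptions, not details confirmed by this dataset.

```python
# Plausible construction of one row: review + summary text alongside
# token-ID / attention-mask / label arrays like the ones shown above.
# The tokenizer checkpoint and max_length are assumptions for illustration.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # hypothetical choice of tokenizer

review_text = (
    "Below is given a review of a research paper from a conference journal. "
    "Please write a summary of the review.\n### Review:\n..."
)
summary_text = "### Summary: ..."

# Tokenise prompt and target together, as causal-LM fine-tuning setups often do.
enc = tokenizer(
    review_text + "\n" + summary_text,
    truncation=True,
    max_length=2048,  # the sequences in this dump run to roughly 1.1k-2k tokens
)

row = {
    "Input": review_text,
    "Output": summary_text,
    "input_ids": enc["input_ids"],
    "attention_mask": enc["attention_mask"],  # all 1s because nothing is padded
    "labels": list(enc["input_ids"]),         # labels as a copy of input_ids (assumption)
}
print(len(row["input_ids"]), row["attention_mask"][:5])
```

With no padding applied, the attention mask comes out as a run of 1s with the same length as the token-ID sequence, which matches the all-ones runs visible in this dump.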
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposed a novel graph representation circuit graph integrating the heterogeneous circuit information from logic synthesis and placement to facilitate the eda design process the proposed graph structure considers both topological cell connection in the netlist and geometric information positioning of the standard cells on the layout a corresponding graph neural network gnn structure is proposed for extracting circuit representation for various downstream tasks the experimental results demonstrated the effectiveness of the graph in congestion and net wirelength prediction tasks with efficient nn computation strengths 1 heterogeneous information fusion across multiple eda design stages typically circuit designs are divided into multiple phases each phase may have its own unique representation for the same underlying circuit the proposed circuit graph brings two representations netlist and cell placement into a unified graph representation which provided a more informative data structure embedding knowledge from multiple eda design phases 1 the proposed circuit graph is general enough to be extended to inspire future work the paper only touches on congestion and net wirelength prediction tasks for detailed routing and the graph featurization contains only related basic topology information and simple geometric information the reviewer believes the proposed graph can inspire more work in eda areas for example by adding standard cell delay as one new feature in the cell node the proposed graph may also help with the timing analysis of the circuit 1 the overall gnn structure follows the design of the circuit graph which sounds promising the topological and geometric message passing structures preserve the structure of the original circuit graph weaknesses 1 the paper didnt touch on how representative the extracted gnn features are the two tasks congestion prediction and net wirelength prediction in the paper are experimented independently although these two tasks have different readouts they shared the same input graph features and extract gnn feature representation it would be interesting to check if the knowledge can be transferred from one task to another using the proposed gnn 1 although the overall gnn structure sounds promising some detailed formulation or design choice of gnn needs to be further justified detailed comments are made in the questions the paper mentioned that one limitation is to test the proposed method under commercial products and more complex scenarios the reviewer appreciate that the authors bring up this and understand the difficulty behind it docsepthis work constructs a modeling framework that aims to solve various problems in the circuit design process this work incorporates 1 a novel circuit graph that is able to jointly integrate the topological and geometrical information and is claimed to be the first unified circuit representation approach that can be easily compatible across eda tasks and stage 2 a novel messagepassing paradigm circuitgnn that is tailored towards the aforementioned graph dataset structure the structure can conduct messagepassing on both topological and geometrical edges distinctively and then fuse the messages to update cells and nets representations 3 extensive experiments validates the merits of the proposed methods in terms of both the task accuracy and execution speeds strength 1 this work does a good job on analyzing and illustrating the tasks and problems of circuit eda in light of the machine learning methods 2 the 
methodology is described in much detailed but straightforward way 3 overall this work provides decent improvements over the existing methods just per the results alone it is impressive 4 the code provided in the supplementary materials is certainly a plus contributing to the transparency and reproduction of the works in the fields weakness 1 apart from the improvements on the message passing methods one of the key contributions of this work is to be able to jointly integrate the topo and geom information in one model however i do not see clearly the motivation for this point from both the application and results perspectives for actual application is there a significant disadvantage of simply using two sets of models or even methods respectively for logical synthesis and placeandrouting one of the reasons i would perceive is that a joint model may yield a better task performance due to the complementary information however as table 2 suggests the joint models improvements against the proposed method with only geom message passing yes its addressed docsepthe authors propose a unified way to construct graphs in different phases of eda flow and develop a general gnn model for downstream eda tasks specifically the proposed approach first constructs a heterogeneous graph by incorporating cellcell connections geometrical information and cellnet connections topological information the node and edge features are generated based on physical properties of cells pins and nets then a circuit gnn model is proposed to apply message passing on cellcell and cellnet connections separately which produces the representations of cells and nets for downstream tasks the experimental results show that the proposed method increases 167 accuracy on congestion prediction and reduces 169 error on wirelength prediction key strength the paper is clearly written all the technical steps are easy to follow the proposed method can be used to solve multistage eda tasks key weakness although the proposed circuit graph construction and gnn model are all reasonable they lack some technical significance for example for circuit graph construction it is straightforward to construct a bipartite graph based on cellnet connections from netlist in order to produce representations of cells and nets for downstream tasks hence the contribution is limited for the graph construction especially in logic synthesis stage where placement information is not available for gnn model it is a common way to apply message passing individually per edge type for handling heterogeneous graphs eg 2 thus the novelty of the proposed model is limited although the experiments show promising accuracy gains for downstream eda tasks further clarification could make the improvements more convincing missing strong gnn baselines the chosen baselines ie gcn graphsage and gat only consider node features since edge features are important in this paper authors should compare the proposed model against stronger baselines eg mpnn1 that incorporate edge features on the same input graph without a stronger baseline the contribution of the proposed gnn model is unclear not tuning hyperparameters for baselines authors choose the default hyperparameters for baselines from their original papers since the datasets used in those papers eg 3 are different from this paper hyperparameter tuning is necessary not comparing against dreamplace the purpose of wirelength prediction is to speedup eda design closure nonetheless there are no results of the runtime comparison between the 
proposed model and the placement method dreamplace which is a very fast placement method by exploiting gpus without this comparison its unclear about the motivation of wirelength prediction in placement 1 gilmer et al neural message passing for quantum chemistry icml17 2 zhang et al heterogeneous graph neural network kdd19 3 xie et al preplacement net length and timing estimation by customized graph neural network tcad22 thanks authors for mentioning potential limitations of this work one key challenge of deploying ml models into commercial eda tools is the model generalizability authors can evaluate the trained model on more unseen designs to see if it is truly generalizable
### Summary: | this paper proposes a gnn approach to eda using the construction of a circuit graph that combines geometric and topological information as well as features generated from physical properties of circuit components while reviewers have raised certain concerns some addressed already in rebuttal they all settled post rebuttal on recommending weak accept of the paper i agree with them and think the neurips audience would benefit from the inclusion of this work in the program and therefore i recommend acceptance i would like to encourage the authors to take into account the comments and discussion with the reviewers as well as incorporate materials presented in their responses when preparing the camera ready version | [
30003, 310, 1677, 2278, 273, ..., 253, 6568, 4704, 2715  (token-ID sequence, 1,447 values)
] | [
1, 1, 1, ..., 1  (attention-mask sequence, 1,447 values, all 1)
] | [
30003, 310, 1677, 2278, 273, ..., 253, 6568, 4704, 2715  (token-ID sequence, 1,447 values)
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
overall i like this direction since this is an important open problem in rl that does not seem to be widely known i was unaware of it until i looked into the related work and could lead to improved algorithms i encourage the authors to continue to pursue this line of research however i have a few clarifications and questions regarding the experiments which make it unclear how meaningful the results are for now i vote to reject this work but am willing to change my opinion based on the rebuttal strengths the paper investigates and draws further attention to an important open problem that does not seem to widely known based on my reading of nota and thomas it appears most major papers in the field today do not acknowledge the discrepancy of the missing discount factor the paper includes many experiments especially in the appendix each with a robust 10 seeds i do have some issues with the experimental setup that i will detail later but i appreciate the variation in experiments i also think the representation learning experiments in scenario 1 using fhtd are an interesting approach to study the effect of learnt representations the experimental setup and methods used are clearly described and it appears the code will be made available in the final version thereby potentially making the experiments highly reproducible issuespoints of clarification most of the study is done in the setting where gamma1 scenario 1 in the paper this corresponds to the undiscounted objective where the current time index must be included in the state for correct estimation of the value function however the setting that is most widely used in existing literature involves a discount factor1 for instance all of the methods cited in the methodology section henderson et al 2017 ilyas et al 2018 engstrom et al 2019 andrychowicz et al 2020 fujimoto et al 2018 haarnoja et al 2018 use a discount1 andrychowicz et al do not include a discount of 1 in their sweep over discount factors either this is dubbed scenario 2 in the main text and includes only one experiment on the ant task it is fine to try to draw insights and focus on scenario 1 as long as it is well motivated however i do think it is misleading to claim we believe our empirical results are relevant to most practitioners when most of the study does not involve a setting that is actually used by said practitioners my second concern is with the method used to choose hyperparameters for the experiments in particular the learning rate is chosen based on the ant experiment and then the best performing parameters are fixed and transferred to the others while i appreciate the motivation behind this approach im not certain how well these transfer to some tasks in particular the humanoidstandup task seems to involve returns which are an order of magnitude greater than the other tasks i think at least for this one task a small sweep is essential to be confident of the claims there are a few points in the paper where correlation seems to be misinterpreted as causation for instance figures 1113 in the paper indicate that a a discounted critic gammac1 performs better on all tasks b biased updates using td instead of empirical returns performs better on some tasks these two statements alone are insufficient to claim that the advantage of a discounted critic gammac1 is therefore partly due to bias looking at figures 11 and 13 i think a figure similar to figure 12 comparing td and empirical returns can be generated for any discount factor eg gamma0995 perhaps i am missing something 
here and if so clarification from the authors would be much appreciated these discrepancies combine in figure 1 where for gammac099 different values of extra transition samples n are plotted ostensibly increasing n should reduce the variance even further however quite a few of the curves choosing n2 or 4 performs significantly worse could the authors clarify why they think this happens interestingly the only task where the effect of n seems to not matter is the ant task for which a hyperparameter sweep was completed additionally the task where increasing n impacts performance the most is the humanoidstandup task where the returns are quite significantly different to me this result stresses that there might be more at play here and a more detailed study is required to tease apart the various confounding factors in summary while i think the approach is quite interesting there are concerns in some of the claims made in the text i appreciate the effort that went into the current set of results and the experimental setup with that in mind i would be willing to accept this submission if my concerns above are clarified and if the conclusions drawn from the results are tempered given the evidence finally there are minor points of clarification that did not affect my overall review but i nonetheless list below in the discounted infinite horizon setup of scenario 2 the timestep no longer needs to be added to the state however the text indicates that this is still done even in this case i think this does affect bootstrapping and thus learning the value target specifically it may be easier to learn a consistent value function that in this setting when the time index is not included in the state could the authors clarify this point as a minor point for readability it would be good if the algorithm boxes for ppotd and ppotdex etc included colours to highlight the changes to ppo algorithm 1 since these overlap quite a bit this is purely from a presentation perspective of course references peter henderson riashat islam philip bachman joelle pineau doina precup and david meger deep reinforcement learning that matters arxiv preprint arxiv170906560 2017 andrew ilyas logan engstrom shibani santurkar dimitris tsipras firdaus janoos larry rudolph and aleksander madry a closer look at deep policy gradients arxiv preprint arxiv181102553 2018 logan engstrom andrew ilyas shibani santurkar dimitris tsipras firdaus janoos larry rudolph and aleksander madry implementation matters in deep rl a case study on ppo and trpo in international conference on learning representations 2019 marcin andrychowicz anton raichuk piotr stanczyk manu orsini sertan girgin raphael marinier leonard hussenot matthieu geist olivier pietquin marcin michalski et al what matters in onpolicy reinforcement learning a largescale empirical study arxiv preprint arxiv200605990 2020 scott fujimoto herke van hoof and david meger addressing function approximation error in actorcritic methods arxiv preprint arxiv180209477 2018 docsepthe authors examine the commonly used paradigm of not discounting in the policy gradient objective they propose two hypotheses relating to discounting 1 discounting the critic improves representation learning 2 undiscounted policy gradient is similar to discounting an auxiliary loss these hypotheses are studied through a series of empirical tests in the mujoco domain with ppo strengths i believe this paper is asking the right type of questions about common setups there are a lot of choices made in deep rl algorithms which 
dont align with theory and are otherwise unstudied and empirical studies are an important some of the approaches used to answer these questions are quite unique overall there a lot of experiments both in the paper and the appendix which is detailed this is a paper which will benefit from the additional page of content as a lot of key figures can be shifted to the main body weaknesses given the empirical nature of this study it is really important to have robust experimentation to really answer the hypotheses the paper raises i think the paper falls short at this aspect and i wasnt convinced by the arguments made for either hypothesis furthermore the conclusions that could be drawn from the results are generally not that surprising im not sure ppo is the best algorithm to analyze many of these questions for example engstrom et al 2019 showed a lot of very minor implementation level details had a large impact on the performance consequently it may be difficult to disentangle the actual causative factors in performance this is problematic as many of the claims in the paper are supported by empirical tests where the performance is not strikingly different for example figure 1 is meant to justify that for gammac 099 additional transitions improved performance but on several environments increasing n to 2 or 4 seems to hurt performance going against our intuition about variance reduction figure 2 shows that for n neq 0 there is a large performance drop but all values of n neq 0 achieve a very similar performance rather than trending downwards as n increases to me this suggests a very brittle algorithm for section 3 the biasvariance tradeoff is evident from prior work as referenced by the authors so the result is of course not novel i think analyzing it in a deep rl setting is important but because of the problems mentioned prior i didnt find that these results provide anything solid to add to our understanding the results for figure 3 arent convincing 1 because they are overfit by selecting the best possible h for each it seems likely to always arrive at a high performing agent 2 this more suggests that these environments dont require the full horizon to achieve a high performance consider a simple cartpole problem which is optimal using greedy actions but has a horizon of 1 million time steps since were in an approximate setting with deep networks it isnt surprising that the agent can achieve a high performance without considering the full horizon the results from the toy mrp experiment and distributional rl do suggest some kind of connection to representation learning but isnt considering a longer horizon simply a more difficult learning problem is the representation necessarily an important aspect here i didnt find that the authors answered this question the conclusion from section 4 is that gammaa1 is an inductive bias that all transitions are equally important seems entirely selfevident from the mathematical definition given it applies equal weight to all transitions at the same time the main question of hypothesis 2 seems unanswered shouldnt auxppo approx ppo rather than disppo if this was true a single environment for figure 9 is not enough to draw any meaningful conclusions i did not find the discussion in b1 convincing that the other environments were not suitable simply change t0 for the other environments from personal experience the horizon of ant is generally large near 1000 as the terminal condition is hard to achieve meaning the difference between ant and the fixed length 
environments should be small additional comments 1 i do wonder if this paper is better off as two separate documents where each hypothesis is provided much more significant attentionexperimentation for example hypothesis 1 isnt actorcritic specific and is also applicable to qlearning based methods these experiments could be simplified by looking at algorithms with significantly fewer components and more settings 2 for the ppotdex experiment i think its also worth considering extrapolation error fujimoto et al 2019 in td learning since sit1 is sampled from a single transition rather than a full trajectory it is not necessarily contained in the batch as a result hat v is not trained on sit1 and produces an erroneous td target my first impression was that the performance drop for gammac1 was not surprising but the performance gain from n1 for gammac099 was and i think are are unanswered questions here another important reference is bengio et al 2020 which showed td0 generalizes worse than tdlambda and there is clearly a related result here 3 given mujoco environments are timelimited to 1000 time steps 1024 heads for ppofhtd seems like a mistakeoversight 4 why does ppofhtd with h1024 produce different results for the different parametrizations 5 is figure 6 surprising since the value function needs to consider a large space of solutions as the horizon increases 6 given distributional rl provides a large performance gain which to the best of my knowledge we are still missing a conclusive reason as to why im not sure ppoc51 ppotd is a significant result 7 it would be clearer if disppo was described before mentioning figure 15 8 figure 15 seems like an important conclusion and should be contained in the main body of the paper however the yaxis of figure 15 also conflicts with the description in the main body so im not sure what the correct interpretation is 9 i wonder if the result from figure 9 is reproducible if the flipping was done in a different way in the mujoco environments is the agent is rewarded mainly for velocity and the behavior of the agent in these cases would be enlightening does the agent run forward and then attempt to terminate can it move backwards conclusion i think the authors present a lot of interesting ideas and experimental approaches to answer their underlying questions however i felt that the experimentation was not sufficiently robust to justify their conclusions and i cannot recommend acceptance references engstrom logan et al implementation matters in deep rl a case study on ppo and trpo 2019 fujimoto scott et al offpolicy deep reinforcement learning without exploration 2019 bengio emmanuel et al interference and generalization in temporal difference learning 2020 edit nov 23 i have slightly increased my score due to the improvements made to the paper mainly reorganization some clarifications made by the authors but i still dont feel like my main concerns were addressed docsepsummary the paper proposes an empirical study of the discount factor as a regularization parameter in the actorcritic architectures specifically the paper considers the case in which the actor and the critic employ different values of the discount factor two scenarios are considered first the paper analyzes the case in which the true objective is undiscounted and a discount factor is employed in the critic like in trpo and ppo second the case in which the true objective is actually discounted but the discount factor is ignored in the update of the actor a quite large suite of experimental 
results is reported major issues organization the paper presents an extensive experimental evaluation that is split between the main paper and the appendix however in the main paper there are a lot of references and discussions related to experimental results that are provided in the appendix only this happens both in section 3 and in section 4 sometimes these results presented in the appendix only seem to be some fundamental claims of the paper like for figures 11 12 and 13 i think this choice makes affects negatively the readability and clarity of the paper indeed the reader has to continuously jump between the main paper and the appendix similarly the pseudocodes are reported in the appendix only but i think that this is less relevant compared to the plots i think that the paper would greatly benefit from a reorganization making it more selfcontained biasrepresentation tradeoff one of the main claims of the paper is that using a discount factor 1 in the critic when the true objective is undiscounted has a regularization effect not only on the variance but also on the learnability of the value function itself i have to admit that the paper has not convinced me on this point it is hard to say that the representation of the value function becomes more complex as the discount factor approaches one or similarly as the horizon increases in general i think that is possible to devise mdps in which the value function representation becomes simpler as the horizon increases as well as mdps in which it becomes more complex i can imagine that for a class of tasks the statement can be true but the paper does discuss the properties of these tasks can the authors elaborate more on this point auxiliary task perspective the paper proposes a perspective of the critic update without a discount factor for a discounted objective as a sum of two terms however i have some concerns about the application of the clipping technique independently for the two terms why not perform the clipping just once to the original discounted objective minor issues in section 2 the mdp model is introduced assuming finite stateaction spaces is this assumption really necessary the experiments are carried out on mujoco tasks that are characterized by continuous stateaction spaces the plots are very small including the ticks and labels on the axis moreover they are not readable when printing the paper in grayscale i suggest using different linestyles or markers overall i think that the paper addresses a relevant problem that is surely important to bridge the gap between theory and practice however i have some concerns about the organization and about the conclusions especially regarding the biasrepresentation tradeoff that the paper draws from the presented results for these reasons i think that the paper is currently not ready for publication at iclrdocsepin this paper the authors focus on the discounting mismatch in the actorcritic algorithm from comprehensive experiments the authors claim that this mismatch is either a biasvariance representation tradeoff or an auxiliary task for the actor update since the discounting mismatch problem is a wellknown gap between the theoretical analysis and the application their work especially the experiments might have some impact on how to understand this gap however since it does not provide any new analysis technique or practical model to improve the performance of the ac algorithm i would encourage the authors to do more analysis of the choice of gamma like how to choose gamma might lead to 
a good performance either experimentally or theoretically and i believe that would have more impact on both the theoretical analysis and practical algorithm design and meanwhile since in the first scenario the mismatching of gamma is considered to reduce the variance it would be interesting if the authors could compare this kind of variance reduction with the stochastic variance reduction methods for policy gradient algorithms 1 2 3 therefore though this paper lacks a theoretical analysis or a groundbreaking experimental performance this paper has an interesting and comprehensive experimental survey and proposes some new hypotheses on this problem i will suggest borderline accept for this paper i might consider modifying my suggestion after discussion with other reviewers and the authors response 1 papini matteo et al stochastic variance reduced policy gradient arxiv preprint arxiv180605618 2018 2 xu pan felicia gao and quanquan gu sample efficient policy gradient methods with recursive variance reduction arxiv preprint arxiv190908610 2019 3 yuan huizhuo et al stochastic recursive momentum for policy gradient methods arxiv preprint arxiv200304302 2020
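The reviews above repeatedly contrast two ways of building critic targets: bootstrapped TD targets computed with a critic discount gamma_c < 1, versus the undiscounted empirical returns that match the gamma = 1 evaluation objective. The sketch below makes that contrast concrete; it is an illustrative assumption of how such targets are typically computed, not code from the paper under review, and every function and variable name is hypothetical.

```python
# Minimal sketch: two critic-target choices for the same trajectory, assuming the
# true objective is the undiscounted (gamma = 1) return. Illustrative names only.

def undiscounted_returns(rewards):
    """Empirical gamma = 1 returns-to-go: unbiased for the evaluation objective,
    but high variance on long horizons."""
    returns, g = [], 0.0
    for r in reversed(rewards):
        g = r + g
        returns.append(g)
    return list(reversed(returns))

def discounted_td_targets(rewards, values, gamma_c=0.99):
    """One-step bootstrapped targets with a critic discount gamma_c < 1:
    lower variance, but biased relative to the gamma = 1 objective."""
    targets = []
    for t, r in enumerate(rewards):
        next_v = values[t + 1] if t + 1 < len(rewards) else 0.0  # terminal bootstrap = 0
        targets.append(r + gamma_c * next_v)
    return targets

if __name__ == "__main__":
    rewards = [1.0, 0.0, 2.0, 1.0]
    values = [3.5, 2.8, 2.9, 1.1]            # current critic estimates V(s_t)
    print(undiscounted_returns(rewards))      # [4.0, 3.0, 3.0, 1.0]
    print(discounted_td_targets(rewards, values))
```

The gap between these two targets is exactly the bias/variance (and, as the reviews argue, representation-learning) trade-off being debated.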
### Summary: | this paper studies the effect of the discount mismatch in actorcritics the discount used for evaluation often 1 the discount used for the critic and the discount used for the actor theres notably a representation learning argument supported by a series of experiments the initial reviews pointed out that this paper addresses very relevant research questions sometimes in a quite original way with a large set of experiments however they also raised concerns about the organizationclarity of the paper and possible weaknesses about the experimental studies the authors provided a rebuttal and a revision that clarified some points and triggered additional discussions however if the revision improved the initial submission the shared assessment is that the clarity and experiments themselves are still somewhat lacking as such the ac cannot recommend accepting this paper yet this work does have interesting ideas and the problem considered is of interest for the community and under studied the authors are strongly encouraged to submit a revised version to a future venue | [
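The fixed-horizon TD (fhtd) estimator mentioned in the reviews can be stated very compactly: learn one value estimate per horizon h with the recursion V_h(s) ≈ r + V_{h-1}(s') and V_0 ≡ 0. The tabular sketch below is a hedged illustration of that recursion on an assumed toy Markov reward process; the environment, horizon count, learning rate, and all names are assumptions, not the authors' implementation.

```python
import numpy as np

# Hedged sketch of tabular fixed-horizon TD: one value table per horizon h,
# updated as V_h(s) <- r + V_{h-1}(s'), with V_0 fixed at 0.

def fixed_horizon_td(transitions, n_states, max_h=8, lr=0.1, epochs=200):
    """transitions: list of (s, r, s_next) tuples from an undiscounted MRP."""
    v = np.zeros((max_h + 1, n_states))            # v[0] stays 0 by definition
    for _ in range(epochs):
        for s, r, s_next in transitions:
            for h in range(1, max_h + 1):
                target = r + v[h - 1, s_next]      # bootstrap one horizon down
                v[h, s] += lr * (target - v[h, s])
    return v                                       # v[h, s] ~ expected sum of next h rewards

if __name__ == "__main__":
    # toy 3-state cycle: 0 -> 1 -> 2 -> 0 with rewards 1, 0, 2
    transitions = [(0, 1.0, 1), (1, 0.0, 2), (2, 2.0, 0)]
    v = fixed_horizon_td(transitions, n_states=3)
    print(np.round(v[3], 2))                       # approx [3. 3. 3.]: every 3-step return is 3
```

A deep-RL variant would replace each table with a network head (or one multi-head critic), which is presumably what the 1024-head ppofhtd configuration discussed above refers to.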
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper tries to prove that there is a bottleneck in feature learning for longtailed classification and data augmentation can help relieve the issues in longtail feature space three major experiments were done to prove that feature space 1 is more biased than balanced feature space 2 is more disused and less compact than balanced feature space and 3 less localized in terms of feature centroids and data augmentation can help alleviate all three issues the weakness 1 the second objective of this paper is to discuss why data augmentation helps in representation learning however in the paper only positive effects from data augmentation were shown the reasons and mechanisms were not fully discussed 2 the overall paper is based on the unserious term good enough what is this term defined how good is good enough good enough in terms of what generalization and robustness compared to full balanced data sets or in terms of knowledge transfer if it is the first one of course longtailed representations are less generalized and robust compared to balanced representations it is not a new idea and is already discussed in 1 if it is the second one then the later experiments dont make any sense and i think when people say longtail representations are good enough in studies like 2 it is more like it is good enough for longtail learning rather than comparing it to balanced learning 3 all the experiments seem unfair to me for example dlt are representations from longtailed data sets and d are representations from balanced data sets balanced data sets always have much more training samples compared to corresponding longtailed counterparts how do you know these inferior results were not caused by the lack of training samples 4 in the adding unseen samples experiments eg fig 6 fig 7 fig 8 only results on dlt were reported i want to see results when unseen samples are added to d as well only by doing this can you prove d is less diffused and better localized 5 fig 7 needs a more detailed legend so many components dont have explanations 6 by looking at figure 5 i dont see a significant difference between dlt and d in cifar100lt and imagenetlt 1 liu z miao z zhan x wang j gong b yu s x 2019 largescale longtailed recognition in an open world in proceedings of the ieeecvf conference on computer vision and pattern recognition pp 25372546 2 kang b xie s rohrbach m yan z gordo a feng j kalantidis y 2019 decoupling representation and classifier for longtailed recognition arxiv preprint arxiv191009217 i think the intuition of this paper is not clear and the experiments are not persuasive docsepthis paper poses an interesting and important question where are the bottlenecks in longtailed classification the authors use empirical experiments to show their observations 1 representation is more critical than classifier 2 data augmentation is helpful three datasets cifar10 lt cifar100 lt and imagenetlt are employed to work with resnet32 and resnet10 models to demonstrate their observations strength 1 the topic is interesting and the papers pose some unique observations after extensive empirical analysis and experiments 2 the paper defines several simple mathematical and statistical metrics to measure the differences between representations weakness 1 the posed questions are not well addressed the paper shows some observations but did not provide a concrete and reasonable solution such that the longtailed classification issue can be addressed the useful insight from this paper is limited 2 the empirical observations are not 
solid and rigorous the paper only provides some simple metrics but did not explain why the metric is necessary and whats the highlevel intuition i did not get the motivation why the authors come up with these metrics to show the differences between representation in addition there is no rigorous mathematical proof or statistical analysis 3 lack of careful related work discussion the related work section is hard to follow and the authors did not explain their contributions and differences from existing works 4 the writing needs to be improved many typos result in additional difficulties to read the citation format is not consistent for example cui et al in section 3 does not have a year and he et al 2015 followed by resnet32 should be he et al 2015 the google enough and normal in the abstract should be corrected an overall feeling is that the paper is an ongoing work and needs to be carefully written and improved my recommendation is to reject it in the current form docsepthe authors study the longtail dataset problem in order to determine the true bottleneck for the task after performing many ablations and experiments on 3 benchmark datasets they establish that contrary to common belief the bottleneck is in data representation rather than the classifier itself i really enjoyed reading the paper it has a very clear direction from the beginning with good experiments to back it up the writing is clear as well i believe longtailed classification is an interesting problem with clear realworld applications so studying it indepth is necessary for the community overall i dont see any major drawbacks or shortcomings as the experiments and ablations combined with the analysis are solid that said i have few questions the difference in representation between d and dlt is clearly visible however apart from difference in the shape of distribution longtail vs balanced there is also difference in the amount of data between those two which might play an important role especially when considering learned representations it would be good to see what is the difference as well between two datasets that have equal number of examples but different distributions normal ad lt since otherwise the authors conclusion might raise a question additionally yin et al 1 performed a related analysis of classifier magnitude difference which was depending on the distribution of classes the authors analysis seems deeper here however it would be good to address any similaritiesdifferences on top of that few methods 2 3 used adversarial examples in order to modify the learned representations instead of the classifiers in lt task it would be good to see what the authors think about such direction and what impact on the feature space and measured statistics it would have and finally apart from the analysis what are the conclusions here for the future researchers any thoughts on proposed directionsapproaches that could originate from the performed analysis 1 yin xi et al feature transfer learning for face recognition with underrepresented data proceedings of the ieeecvf conference on computer vision and pattern recognition 2019 2 kim jaehyung jongheon jeong and jinwoo shin m2m imbalanced classification via majortominor translation proceedings of the ieeecvf conference on computer vision and pattern recognition 2020 3 kozerawski jedrzej et al blt balancing longtailed datasets with adversariallyperturbed images proceedings of the asian conference on computer vision 2020 good analytical paper on interesting subject of longtailed 
classification it would be good to see authors thoughts on impact of the amount of data in training impact of adversarial augmentations and proposed directions stemming from the analysis docsepthis paper seeks to study what is the bottleneck in longtailed learning based on extensive experiments the authors propose that representation learning is the bottleneck in longtailed classification also this paper analyzes representation learning from the perspectives of intraclass compactness and interclass separation as well as the influence of data mixup on longtailed representation learning positive points 1 this paper seeks to empirically investigate the importance of representation learning which may provide a new understanding of deep longtailed learning to the community 2 this work shows the effectiveness of intraclass compactness and interclass separation on longtailed representation learning 3 this work also analyzes the influence of data augmentation on longtailed representation learning which provides a better understanding of data augmentation in deep longtailed learning negative points 1 this paper mentioned that a commonly held belief in deep longtailed classification is that performance bottleneck is the classification head atop the representation learner however such a belief may not be common note that many recent longtailed learning studies focus on improving representation learning 1 eg kcl 2 hybrid 3 paco4 and drolt 5 moreover in the conclusion this paper stated that the results suggest that the primary problem in longtailed classification may in fact be the fewshot learning problem on the tail classes rather than the imbalance problem however this argument is too strong and the obtained results cannot support it it would be better if the authors had written all the arguments more rigorously and verified them more completely 2 as mentioned by the above question there are many representation learningbased longtailed studies eg 15 therefore it would be better if authors can review mire representation learning based longtailed learning methods in related work 3 the vital problem in this paper is the used balanced set ie cifar10100 and imagenet1k please note that the data number of cifar10100 or imagenet1k is much more than their longtailed variants ie cifarlt and imagenetlt considering using more training samples will lead to significant improvement in representation learning and model performance most empirical comparisons in this paper especially table 1 and figure 2 are unfair and the corresponding arguments are unpersuasive the experiments would have been more persuasive if the balanced training set is a variant of the longtailed training set with a similar total data number but each class has the samesimilar data number like 12 for example a balanced set of imagenetlt can be obtained at httpsgithubcomvanintawesomelongtailedlearningtreemainresourcesdatatxtimagenetlt 4 please discuss more of figure 5 on imagenetlt and cifar100lt the variance trends of longtailed representations and ideal representations are quite consistent such observations seem different from the conclusion of sec 21 since i am confused about the results i guess other readers may also do so therefore i suggest the authors explain them more minor suggestions 1 figure 3 is not clear enough it would be better if the authors can explain it more in the captions moreover what are degrees in fig68 please make them more clear 2 in lines 45 of page 2 imagenetlt appears twice in line 1 of page 6 there should be a full 
stop before in practice references 1 deep longtailed learning a survey arxiv 2021 2 exploring balanced feature spaces for representation learning in iclr 2021 3 contrastive learning based hybrid networks for longtailed image classification in cvpr 2021 4 parametric contrastive learning in iccv 2021 5 distributional robustness loss for longtail learning in iccv 2021 overall i like the goal of this paper ie analyzing the bottleneck of longtailed learning however i cannot champion this paper since the data number of the used balanced set is much larger than the longtailed set which makes the empirical comparisons unfair and the corresponding finding unpersuasive moreover the arguments in this paper should be written more rigorously i am glad to see the response of the authors
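Several of the comments above hinge on feature-space statistics such as intra-class compactness and inter-class separation measured from per-class centroids. The numpy sketch below shows one plausible way to compute such statistics; the exact formulas, the toy head/tail class sizes, and all identifiers are illustrative assumptions rather than the metrics defined in the paper under review.

```python
import numpy as np

# Hedged sketch: per-class centroids, intra-class compactness (mean distance of
# samples to their own centroid) and inter-class separation (mean pairwise
# centroid distance). Illustrative definitions only.

def class_statistics(features, labels):
    """features: (n, d) array of representations; labels: (n,) integer class ids."""
    classes = np.unique(labels)
    centroids = np.stack([features[labels == c].mean(axis=0) for c in classes])

    compactness = np.mean([
        np.linalg.norm(features[labels == c] - centroids[i], axis=1).mean()
        for i, c in enumerate(classes)
    ])

    diffs = centroids[:, None, :] - centroids[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    separation = dists[np.triu_indices(len(classes), k=1)].mean()
    return centroids, compactness, separation

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = np.concatenate([rng.normal(0.0, 1.0, (500, 16)),   # head class: many samples
                            rng.normal(3.0, 1.0, (20, 16))])   # tail class: few samples
    labs = np.array([0] * 500 + [1] * 20)
    _, comp, sep = class_statistics(feats, labs)
    print(f"compactness {comp:.2f}  separation {sep:.2f}")
```

Reporting such statistics for a long-tailed training set, a balanced subset of the same total size, and a fully balanced set would directly address the sample-count confound raised in the reviews.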
### Summary: | this paper investigates the role of representation learning when the distribution over the feature space has a long tail the main motivation is to determine how much of the overall learning in this case is bottlenecked specifically by representation learning the main findings are that vanilla learning gives brittle longtailed representations harming overall performance the paper suggests a form of data augmentation to remedy this reviewers acknowledge that this investigation is worthwhile however many concerns were raised as to whether experiments support the drawn conclusions a more principled approach to the data augmentation methodology is also needed the authors address some of these providing further experiments but these were not enough to sway reviewers since results are fundamentally empirical in nature this shortcoming indicates that the paper is not ready to share with the community just yet stronger experiments with clearer evidence are needed to fully support the thesis of the work | [
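Because the summary points to data augmentation (and the reviews mention mixup) as the proposed remedy for brittle long-tailed representations, a minimal sketch of standard input mixup is included below. It follows the common recipe of convex-combining example pairs and their one-hot labels; it is a generic illustration under assumed shapes and names, not the authors' exact augmentation.

```python
import numpy as np

# Hedged sketch of standard input mixup: convex combinations of example pairs
# and their one-hot labels, controlled by a Beta(alpha, alpha) coefficient.

def mixup_batch(x, y_onehot, alpha=0.2, rng=None):
    """x: (b, ...) inputs; y_onehot: (b, c) labels; returns a mixed batch."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)              # mixing coefficient in (0, 1)
    perm = rng.permutation(len(x))            # random partner for each example
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y_onehot + (1.0 - lam) * y_onehot[perm]
    return x_mix, y_mix

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 3, 8, 8))         # toy batch of 4 "images"
    y = np.eye(5)[np.array([0, 1, 4, 2])]     # one-hot labels over 5 classes
    xm, ym = mixup_batch(x, y, rng=rng)
    print(xm.shape, ym.sum(axis=1))           # soft labels still sum to 1
```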
… input_ids: 2,048 integer token ids (the tokenized form of this row's Input text; full list omitted) …
] | [
… attention_mask: 2,048 ones (every position attended; full list omitted) …
] | [
… labels: 2,048 integer token ids, identical to this row's input_ids (full list omitted) …
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a bayesian model comparison based approach for quantifying the semantic similarity between two groups of embeddings eg two sentences in particular it proposes to use the difference between the probability that the two groups are from the same model and the probability that they are from different models while the approach looks interesting i have a few concerns using the bayesian model comparison framework seems to be an interesting idea however what are the advantages compared to widely used learned models say a learned cnn that takes as input two sentences and outputs the similarity score the latter can fit the groundtruth labels given by humans while its unclear the model comparison leads to good correlation with human judgments some discussion should be provided the von misesfisher likelihood is a very simplified model of actual text data have you considered using other models in particular more sophisticated ones may lead to better performance different information criteria can be plugged in are there comparisons the experiments are just too simple and incomplete to make reasonable conclusions for example it seems compared to sif there is not much advantage even in the online setting docsepthe authors propose a probabilistic model for computing the sentence similarity between two sets of representations in an online fashion that is they do not need to see the entire dataset at once as sif does when using pca they evaluate on the sts tasks and outperform competitive baselines like wmd averaging embeddings and sif without pca but they have worse performance that sif pca the paper is clearly written and their model is carefully laid out along with their derivation my concern with this paper however is that i feel the paper lacks a motivation was it derive an online similarity metric that outperforms sifwithout pca a few experimental questionscomments what happens to all methods when stop words are not removed how far does performance fall i think one reason it might fall in addition to the reasons given in the paper is that all vectors are set to have the same norm for sts tasks often the norms of these vectors are reduced during training which lessens their influence what mechanism was used to identify the stop words and does removing these help the other methods i know in the paper stop words were removed in the baseline did this unilaterally improve performance for these methods overall i do like the paper however i do find the results to be lackluster there are many papers on combining word embeddings trained in various ways that have much stronger numbers on sts but these methods wont be effective with this type of similarity namely because embeddings must have unit norm in their model therefore i think the paper needs some more motivation and experimental evidence of its superiority over related methods like sifpca in order for it to be accepted pros probabilistic model with clear design assumptions from which a similarity metric can be derived derived similarity metric doesnt require knowledge of the entire dataset in comparison to sif pca cons performance seems to be slightly better than sif wmd and averaging word embeddings but below that of sif pca unclear motivation for the model was it derive an online similarity metric that outperforms sifwithout pca requires the removal of stop words but doesnt state how these were defined minor point but tuning this could be enough to cause the improvement over related methodsdocsepmain contribution devising and 
evaluating a theoreticallysound algorithm for quantifying the semantic similarity between two pieces of text eg two sentences given pretrained word embeddings glove clarity the paper is generally wellwritten but i would have liked to see more details regarding the motivation for the work description of the prior work and discussion of the results as an example i could not understand what were the differences between the online and offline settings with only a reference to the arora et al 2016 paper that does not contain neither online nor offline the mathematical derivations are detailed which is nice originality the work looks original it proposes a method for quantifying semantic similarity that does not rely on cosine similarity significance i should start by saying i am not a great reviewer for this paper i am not familiar with the sts dataset and dont have the mathematical background to fully understand the authors algorithm i like to see theoretical work in a field that desperately needs some but overall i feel the paper could do a much better job at explaining the motivation behind the work which is limited to cosine similarity is not backed by a solid theoretical foundation i am not convinced of the practicality of the algorithm either the algorithm seems to improve slightly over the compared approaches and it is unclear if the differences are significant and only in some settings the approach needs to remove stopwords which is reminiscent of good old feature engineering finally the paper claims better average time complexity than some other methods but discussing whether the algorithm is faster for common ranges of d the word embedding dimension would also have been interesting
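For reference, the averaged-word-embedding cosine baseline that the reviews above compare against can be sketched as follows. The sketch assumes GloVe-style pretrained vectors held in a plain dict and an externally supplied stop-word set; every name in it is illustrative rather than taken from the paper under review.

```python
import numpy as np

def sentence_embedding(tokens, word_vectors, stopwords=frozenset()):
    """Average the pretrained vectors of the non-stop-word tokens.

    `word_vectors` is assumed to map token -> np.ndarray (e.g. loaded GloVe);
    tokens without a vector are simply skipped.
    """
    vecs = [word_vectors[t] for t in tokens
            if t not in stopwords and t in word_vectors]
    return np.mean(vecs, axis=0) if vecs else None

def cosine_similarity(a, b):
    # Cosine similarity between the two averaged sentence embeddings.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical usage:
# sim = cosine_similarity(sentence_embedding(s1_tokens, glove),
#                         sentence_embedding(s2_tokens, glove))
```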
### Summary: | this paper presents a novel family of probabilistic approaches to computing the similarities between two sentences using bagofembeddings representations and presents evaluations on a standard benchmark to demonstrate the effectiveness of the approach while there seem to be no substantial disputes about the soundness of the paper in its current form the reviewers were not convinced by the broad motivation for the approach and did not find the empirical results compelling enough to serve as a motivation on its own given that no reviewer was willing to argue that this paper makes an important enough contribution to be accepted it is unfortunate that one of the assigned reviewersby their own admissionwas not well qualified to review it and that a second reviewer did not submit a review at all necessitating a late fillin review thank you anonymous emergency reviewer however the paper was considered seriously i can attest that both of the two higherconfidence reviewers are well qualified to review work on problems and methods like these | [
… input_ids: 1,167 integer token ids (the tokenized form of this row's Input text; full list omitted) …
] | [
… attention_mask: 1,167 ones (every position attended; full list omitted) …
] | [
… labels: 1,167 integer token ids, identical to this row's input_ids (full list omitted) …
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary of the paper this work presents a novel method for similarity function learning using nonlinear model the main problem with the similarity function learning models is the pairwise component of the loss function which grows quadratically with the training set the existing stochastic approximations which are agnostic to training set size have high variance and this inturn results in poor convergence and generalisation this paper presents a new stochastic approximation of the pairwise loss with reduced variance this is achieved by exploiting the dotproduct structure of the leastsquares loss and is computationally efficient provided the embedding dimensions are small the core idea is to rewrite the leastsquares as the matrix dot product of two psd matrices grammian the grammian matrix is the sum of the outerproduct of embeddings along the training samples the authors present two algorithms for training the model 1sagram by maintaining a cache of all embedding vectors of training points onk space whenever a point is encountered its cache is replaced with its embedding vector 2 sogram this algorithm keeps a moving average of the grammian estimate to reduce the variance experimental results shows that this approach reduces the variance in the grammian estimates results in faster convergence and better generalisation review the paper is well written with clear contribution to the problem of similarity learning my only complain is that i think the evaluation is a bit weak and does not support the claim that is applicable all kinds of problems eg nlp and recommender systems this task in wikipedia does not seem to be standard kind of arbitrary there are some recommendation results in the appendix but i think it should have been in the main paper overall interesting but i would recommend evaluating in standard similarity learning for nlp and other tasks perhaps more than one there are specific similarity evaluation sets for word embeddings it can be found in following papers httpsarxivorgpdf13013781pdf httpwwwaclweborganthologyd151036docsepthis paper proposes an efficient algorithm to learn neural embedding models with a dotproduct structure over very large corpora the main method is to reformulate the objective function in terms of generalized gramiam matrices and maintain estimates of those matrices in the training process the algorithm uses less time and achieves significantly better quality than sampling based methods 1 about the experiments it seems the sample size for sampling based experiments is not discussed the number of noise samples have a large influence on the performance of the models in figure 2 different sampling strategies are discussed it would be cool if we can also see how the sampling size affects the estimation error 2 if we just look at the sampling based methods in figure 2a uniform samplings gramian estimates is the worst but the map of uniform sampling on validation set for all three datasets are not the worst do you have any comments 3 wheter an edge whether an edge docsepthis paper proposes a method for estimating nonlinear similarities between items using gramian estimation this is achieved by having two separate neural networks defined for each item to be compared which are then combined via a dot product the proposed innovation in this paper is to use gramian estimation for the penalty parameter of the optimization which allows for the nonlinear case two algorithms are proposed which allow for estimation in the stochastic online setting experiments are presented 
which appear to show good performance on some standard benchmark tasks overall i think this is an interesting set of ideas for an important problem i have two reservations first the organization of the paper needs to be addressed in order to aid user readability the paper often jumps across sections without giving motivation or connecting language this will limit the audience of the paper and the work second and more importantly i found the experiments to be slightly underwhelming the hyperparameters batch size learning rate and architecture dont have any rationale attached to them it is also not entirely clear whether the chosen comparison methods fully constitute the current state of the art nonetheless i think this is an interesting idea and strong work with compelling results editorial comments the organization of this paper leaves something to be desired the introductions ends very abruptly and then appears to begin again after the related work section from what i can tell the first three sections all constitute the introduction and should be merged with appropriate edits to make the narrative clear where x and y are nodes in a graph and the similarity is wheter an edge typo and sentence ends prematurely
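For reference, the Gramian reformulation these reviews describe can be illustrated as below: the summed pairwise least-squares term over all item pairs equals a Frobenius inner product of two small d-by-d Gramians, and that Gramian can then be tracked with a moving average, in the spirit of the SOGram algorithm the first review summarizes. This is only a sketch based on the reviews' description, not the authors' code; the decay rate alpha and the array shapes are assumptions.

```python
import numpy as np

def gramian(E):
    # E is an (n, d) matrix of embeddings; returns the d x d PSD matrix sum_i e_i e_i^T.
    return E.T @ E

rng = np.random.default_rng(0)
U = rng.normal(size=(1000, 8))   # "left tower" embeddings, d = 8
V = rng.normal(size=(500, 8))    # "right tower" embeddings

# The pairwise term sum_{i,j} (u_i . v_j)^2 equals the Frobenius inner product
# of the two Gramians, costing O((n_u + n_v) d^2) instead of O(n_u n_v d).
brute_force = np.sum((U @ V.T) ** 2)
via_gramians = np.sum(gramian(U) * gramian(V))
assert np.allclose(brute_force, via_gramians)

def update_gramian_estimate(G_est, batch, alpha=0.01):
    """Moving-average estimate of the mean outer product from a minibatch."""
    return (1.0 - alpha) * G_est + alpha * gramian(batch) / len(batch)
```

The assert is a toy check of the identity; the moving-average update is what allows the estimate to be maintained online from minibatches rather than from the full training set.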
### Summary: | this paper presents methods to scale learning of embedding models estimated using neural networks the main idea is to work with gram matrices whose sizes depend on the length of the embedding building upon existing works like sag algorithm the paper proposes two new stochastic methods for learning using stochastic estimates of gram matrices reviewers find the paper interesting and useful although have given many suggestions to improve the presentation and experiments for this reason i recommend to accept this paper a small note sag algorithm was originally proposed in 2013 the paper only cites the 2017 version please include the 2013 version as well | [
30003, 310, 1677, 2278, 273, … ] | [ 1, 1, 1, … ] | [ 30003, 310, 1677, 2278, 273, … ]  (input_ids; attention_mask of all 1s; labels identical to input_ids)
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a method to improve the generalization of neural networks by training them to be robust to adversarial perturbations in the statistics of the batch normalization bn layers the approach combines gradients computed on unperturbed bn statistics with gradients computed on perturbed statistics perturbations or noise in the bn statistics are obtained through 1 signed gradients from the first update and 2 reductions in the batch size for the second update experiments demonstrate improvements over standard training especially in the case of smallerscale datasets ie cifar and timeimagenet the method can also be combined with other techniques such as mixup and sam optimization typically leading to further improvements strengths the method benefits the generalization of neural networks trained on smaller datasets considerably the technical presentation of the method in section 32 is detailed and sufficiently clear the method can be combined with other training methods such as sam weaknesses the paper claims to bridge the gap between robustness and generalization experiments are focused mainly on the generalization ability of the learned networks and robustness experiments are restricted to perturbations of the bn statistics this is quite limited and it is unclear if the learned networks are robust to various other adversarial attacks indeed it is unclear what the relevance of sections 44 and 45 are regarding the robustness of the networks in practice another contribution of the paper is a new at paradigm termed modelbased at it appears that the main idea of perturbing model parameters has been explored in various prior works eg 8 28 it is not clear what the generic formulation in eq 2 contributes or what novel insights are provided the benefits of the method seem to disappear during largescale experiments on imagenet this is somewhat concerning and it might be good to investigate this issue further section 33 is somewhat confusing l206 claims mathcalr0 but then mathcalr appears in the perturbation computation of 7 it is also unclear if a term similar to gphi exists in this case as mentioned above it might be good to further address the performance on larger scale datasets if this turns out to be a limitation also depending on how robust the method is to other adversarial perturbations this could also be mentioned in the limitations docsepwhile adversarial training is one of the most successful methods to increase robustness it usually degrades performance of the models on clean images the authors attribute this to distributional discrepancy in batch norm statistics they propose adversarially perturbed batch normalization apart to achieve robustness against bn statistics noise and to bridge the gap between models generalization and robustness they perform backward passes twice over each batch of clean samples the first backward pass produces two gradient computations a normal gradient that helps update parameters of model and a statistics gradient that is used to perturb the statistics parameters in bn the second pass is performed to generate the defensive gradient that helps the model resist the adversarial statistics perturbation the normal and defensive gradients are combined to improve both generalization and robustness of the model experiments are performed on cifar tinyimagenet and imagenet and show improved clean accuracy over standard training and sam 28 originality and significance the paper presents a new way of bridging the gap between models generalization and robustness 
it is known in the literature that there is discrepancy between batch norm statistics of clean and adversarial examples 13 as well as the statistics from different batches advprop proposes using two batch norm statistics one for clean images and one auxiliary for adversarial examples 13 rather than creating a separate layer to deal with this discrepancy the paper attempts to make the models robust to the bn statistics noise this approach is interesting and novel to the best of my knowledge the method can be combined with other augmentations to further boost performance the proposed combination with sam as one of the stateoftheart methods is particularly promising quality overall the paper is wellstructured and wellwritten the proposed approach is sound and is described clearly experiments are performed on various datasets including cifar10 cifar100 tinyimagenet and imagenet overall experimental results are convincing they demonstrate improvements on clean accuracy over the baselines as well as robustness against perturbed bn statistics comparing with baselines using the same budget is important given the additional cost of the proposed approach the authors report detailed experimental results in the supplementary material and show that arapt is relatively insensitive to hyperparameters clarity scalability of arapt to large datasets and models is not clearly supported in the experiments the authors use the relatively small resnet18 model on imagenet1k arapt underperforms standard training on imagenet at 2x budget and outperforms it at 4x budget table 2 the authors note that apart employed on the largescale dataset requires more steps to show its promise but do not provide further explanation or experiments on this all experiments are performed on the resnet family on imagenet the achieved accuracy of 7214 table 2 is far from the stateoftheart itd be good to include experiments on other architectures eg efficientnet and see if the gains are significant the authors have addressed limitation of the work in terms of suffering from potential degeneration in case of the combination with other training methods implicitly involving bn the authors can address potential limitation of their work on largescale datasets and models there are no potential negative societal impact that need to be specifically addressed docsepthis paper proposes to add adversarial noise on the bn statistics to improve classification accuracy on indistribution images strength 1 the paper is wellwritten and easy to follow the related works are thoroughly discussed weakness 1 the novelty is limited the proposed method is almost identical with advbn 15 neurips21 although the authors mentioned three differences in related work section i still think they are all minor differences 2 in experiments no results on 11 or 15 are reported this makes it hard to evaluate whether the proposed method can outperform previous works please see above docsepthis paper introduces an adversarially perturbed batch normalization to improve the models generalization and robustness experiments on cifar tinyimagenet and imagenet show that the proposed methods can improve the models performance compared with the baseline model strengths compared with the previous advbn 15 the proposed apart is more appliable and easytraining the paper is wellwritten and the theoretical analysis is clear weaknesses experiments from the reviewers view the experiments in this paper are not sufficient 1 as mentioned in lines 6263 the author mentioned that they want to 
bridge the gap between the models generalization and robustness the reviewer thinks experiments on imagenetc or stylized imagenet are needed to show the advantages of robustness 2 the comparison with other methods is missing the reviewer thinks a comparison with normalization methods 1820 and adversarial methods 11 15 is needed 3 mixup experiments on imagenet are missing 4 similar to the experiments on cifar10 and cifra100 the authors are suggested to conduct the experiments on one more backbone on tinyimagenet and imagenet for me the current experiments are not sufficient the authors are suggested to add more experiments to show the advantages of their paper
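For readers trying to picture the training loop the reviews describe, the following is a rough, heavily simplified sketch of the two-pass structure: one backward pass on clean batch-norm statistics, a perturbation of the BN running statistics, a second "defensive" backward pass against the perturbed statistics, and a combination of the two gradients. It is not the paper's exact procedure — the paper derives the perturbation from signed statistics gradients and a reduced second-pass batch, while this sketch simply injects random noise into the running statistics — and the toy model, noise scales, and the 0.5 mixing weight are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

# Toy classifier with one BatchNorm layer (architecture is illustrative only).
model = nn.Sequential(nn.Linear(16, 32), nn.BatchNorm1d(32), nn.ReLU(), nn.Linear(32, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 16)
y = torch.randint(0, 10, (64,))

# Pass 1: ordinary gradient with unperturbed statistics.
opt.zero_grad()
loss_fn(model(x), y).backward()
clean_grads = [p.grad.clone() for p in model.parameters()]

# Perturb the BN running statistics (the paper uses signed statistics gradients
# and a smaller batch here; plain noise keeps the sketch short).
bn_layers = [m for m in model.modules() if isinstance(m, nn.BatchNorm1d)]
saved = [(m, m.running_mean.clone(), m.running_var.clone()) for m in bn_layers]
for m, _, _ in saved:
    m.running_mean.add_(0.1 * torch.randn_like(m.running_mean))
    m.running_var.mul_(1.0 + 0.1 * torch.rand_like(m.running_var))

# Pass 2: "defensive" gradient computed against the perturbed statistics.
model.eval()  # eval mode so the perturbed running stats are actually used
opt.zero_grad()
loss_fn(model(x), y).backward()
model.train()

# Combine the two gradients, restore the original statistics, and update.
for p, g in zip(model.parameters(), clean_grads):
    p.grad = 0.5 * (p.grad + g)
for m, mean, var in saved:
    m.running_mean.copy_(mean)
    m.running_var.copy_(var)
opt.step()
```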
### Summary: | the paper presents a new way of bridging the gap between models generalization and robustness by combining gradients computed on unperturbed bn statistics with gradients computed on perturbed statistics the main goal is to improve the standard generalization but the authors should clarify their definition of robustness as it seems to confuse all reviewers eg questioning adversarial attacks moreover the method itself is very simple and the idea of using adversarial perturbation to stabilize model training isnt new advprop etc reviewers are further concerned about the lack of largescale experiments or on stateoftheart architectures besides there are no comparisons with some of the competing methods such as advprop therefore i find no sufficient ground to recommend acceptance in this papers current shape | [
30003, 310, 1677, 2278, 273, … ] | [ 1, 1, 1, … ] | [ 30003, 310, 1677, 2278, 273, … ]  (input_ids; attention_mask of all 1s; labels identical to input_ids)
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper tackles the challenge of generating adversarial perturbation for a target model with no access to the model or the models training data ie target domain using a trained model and data from a source domain imagenet the authors train a generator to craft perturbations which maximize the cosine distance between the intermediate features of clean and adversarial images this generator is then assisted by two techniques random normalization of the input image and spatial attention on intermediatelayer features used for cosine distance experiments show that this method outperforms prior methods in blackbox setting no access to target domain or model as well as whitebox setting strengths 1 the problem setting no access to target data is of importance in practice access to data is as hard if not harder than access to model 2 the experiments are extensive and clearly show a significant improvement in blackbox attack capability 3 the code provided with the paper along with the appendix help gaining a clearer understanding of the method conversely they further emphasize readability issues of the manuscript weaknesses 1 the manuscript is poorly written grammatical mistakes and semantic mistakes are aplenty some phrases are the opposite of what the method actually does section 34 states specifically we apply a channelwise average pooling to the feature maps at layer l where as the actually operation is crosschannel average pooling refer to 1 other mistakes are highlighted below 2 the novelty of the method is limited in wen zhou et al 2018 2 intermediate feature disruption is used to increasing blackbox transferability in weibin wu et al 2020 3 attention is used for increasing transferability the paper does not mention these works 3 the claims of section 42 are only weakly supported statement the downsampling module has an essential impact on the resulting adversarial examples i fail to see how this can be inferred from visualizing the crosschannel attention outputs other weaknesses 1 since all the experiments are only regarding imagenet target datasets it is unclear how well the method will perform if the source dataset is different especially if the source dataset is small 2 the metrics do not include standard deviation across multiple random runs evaluating the standard deviation in at least one setting will elucidate the significance of the results 3 results with combining the two proposed techniques rn and da should be present in the main draft this is an important question that the manuscript only deals with in the appendix the manuscript with benefit from a discussion on the fact that using these two techniques in tandem is challenging and fails to consistently outperform using just a single module text errors abstract 1 transferability nature transferable nature 2 the only knowledge only the knowledge 3 the coarsegrained domain coarsegrained domains introduction 1 possible to the spotlight possible 2 transparent opaque 3 the query querying 4 but more threatening and more threatening 5 the generator a generator method 1 they can subject to the they can be modeled as samples from the standard normal distribution 2 even the inputs are not even if the inputs are not experiments 1 in the torchvision in the torchvision library 2 another seven seven other questions for the authors 1 how will the generator network perform with its trained with all source models at once see experiments table 3 in konda reddy mopuri et al 2017 4 i suspect that it should further increase the transferability 2 
have the authors tried to increase the rgb jittering when comparing to existing methods i suspect that with significant jittering augmentation may perform similar to random normalization references 1 network in network min lin qiang chen shuicheng yan arxiv 2013 2 transferable adversarial perturbations wen zhou xin hou yongjun chen mengyun tang xiangqi huang xiang gan yong yang eccv 2018 3 boosting the transferability of adversarial samples via attention weibin wu yuxin su xixian chen shenglin zhao irwin king michael r lyu yuwing tai cvpr 2020 4 nag network for adversary generation konda reddy mopuri utkarsh ojha utsav garg r venkatesh babu cvpr 2018 the method proposed in this paper outperforms existing methods and targets an important setting no access to target domain or model however the writing is errorridden and the proposed method is only marginally novel wrt existing works therefore i rate the paper as marginally above accept threshold conditional on the authors correcting the mistakes highlighted above docsepthis work first identifies a more practical threat model for blackbox transfer adversarial attack where the target models domain remains unknown and the attackers surrogate model may be trained in another domain then the bia attack is proposed to enhance transferability whose key idea is to distort lowlevel features captured by dnns intermediate layers instead of perturbing the domainspecific features in the output layer two modules da and rn are further proposed to improve attack success rate experimental results demonstrate that bia is more effective than existing methods see the proscons below pros 1 considering more practical threat model is certainly helpful and important for the transfer attack research 2 the results indeed demonstrate large improvement of bia in terms of error rate cons 1 im a little concerned about the crossdomain statement made in this work to me the target datasets considered in this work cifar stl cub stanford cars are still coming from the same natural imagery domain as imagenet despite they have different label spaces in particular cub is known to have overlap with imagenet 1 where the crossdomain claim certainly does not hold an example case that is more crossdomain would be to transfer from imagenet model to a chestxray model in a similar sense to naseer et al 2 the specific methodology of bia seems not new it is known that perturbing intermediate layer features can yield more transferable adversarial examples eg 2 in fact the formulation of bia appears very similar to the one proposed in 2 while bia minimizes the cosine similarity between intermediated layer features of the clean and adversarial examples 2 maximizes the euclidean distance which essentially is the same feature space attacks are also shown to be more powerful than decision space attacks in more strict blackbox transfer scenarios 3 but sec 32 fails to recognize these existing works clearly identifying the difference between bia and 2 might help address this concern 3 if my above judgement of bia not being new is correct then my further concern comes from the result side in table 2 it seems that the performance gain can be largely attributed to the bia or essentially feature space attack itself rather than the da and rn module this hurts the empirical novelty to some extent as previous works have shown the superiority of feature space attacks in either standard or more strict transfer settings 23 1 httpwwwvisioncaltecheduvisipediacub2002011html 2 feature space perturbations yield 
more transferable adversarial examples 3 perturbing across the feature hierarchy to improve standard and strict blackbox attack transferability this paper indeed identifies a more practical threat model but the experiments do not closely match the proposed crossdomain scenario and the performance gain seems to largely come from existing technique perturbing feature space instead of decision space these issues prevent me from recommending for acceptance docsepthis paper focuses on the transferability of blackbox domains in real life we do not know the relevant information of the deployed model and transfer attacks on blackbox domains can better evaluate the vulnerability of deployed models therefore beyond imagenet attack bia is proposed to investigate the transferability towards blackbox domains unknown classification tasks with the only knowledge of the imagenet domain from the perspective of data and model the authors propose random normalization rn module and domainagnostic attention da module to narrow the gap between the source and target domains finally bia achieves stateoftheart performance in blackbox domains settings strengths 1 bia focuses on disrupting lowlevel features to improve transferability 2 this work proposes random normalization rn module to handle the various distributions between source domains and target domains 3 this work proposes domainagnostic attention da module to produce a more robust feature representation weaknesses 1 rn module and da module are not always mutually reinforcing the reason behind this has not been analyzed 2 in competitors diverse inputs method dim is not new from this perspective why not use the more powerful miditifgsm or a newer transfer attack 3 table 2 and table 3 show the transferability comparisons on classification tasks however the effects of da and rn seem to depend on different models to understand more deeply it is necessary to analyze why different modules have different effects in different models minor questions 1 what are the experiments on the finegrained and coarsegrained classification to prove why distinguish between finegrained and coarsegrained there is no clear explanation in this work 2 does the comparison method differ greatly in training costs i tend to accept this paper because it focuses on more realistic black box attack settings and proposes two modules to improve performance the design of the module is insightful and effective but the module proposed under some models is not always effective which limits its application and requires more adequate analysis
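A minimal sketch of the feature-space objective these reviews discuss — a perturbation generator trained on a frozen surrogate model to minimize the cosine similarity between intermediate-layer features of clean and perturbed images — is given below. The stand-in networks, the epsilon budget, and the choice of layer are assumptions for illustration only, and the RN and DA modules (as well as the paper's actual ImageNet-pretrained backbone and generator) are not modeled here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Frozen surrogate feature extractor and a small perturbation generator (stand-ins).
features = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
generator = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(8, 3, 3, padding=1), nn.Tanh())
for p in features.parameters():
    p.requires_grad_(False)
opt = torch.optim.Adam(generator.parameters(), lr=1e-4)

eps = 10.0 / 255                      # perturbation budget (illustrative)
x = torch.rand(4, 3, 32, 32)          # clean images in [0, 1]

# Bounded adversarial images produced by the generator.
x_adv = torch.clamp(x + eps * generator(x), 0.0, 1.0)

# Intermediate features of clean vs. perturbed inputs on the frozen surrogate.
f_clean = features(x).flatten(1).detach()
f_adv = features(x_adv).flatten(1)

# Minimize cosine similarity = maximize cosine distance between the two.
loss = F.cosine_similarity(f_clean, f_adv, dim=1).mean()

opt.zero_grad()
loss.backward()
opt.step()
```

In this formulation the decision layer of the surrogate is never used, which is what lets the attack stay agnostic to the unknown target domain's label space.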
### Summary: | this paper considers that the models training data may be not accessible when learning the attacking model and thus a more practical blackbox attack scheme beyond imagenet attack bia framework is designed all the reviewers agreed that the setting in this paper is important and helpful when designing attack methods however the method is not totally new nevertheless considering the importance of the problem investigated in this paper the nice design of the overall framework and the extensive experiments the ac recommends accept for this paper | [
30003, 310, 1677, 2278, 273, …  (input_ids)
1698,
5251,
3386,
10848,
407,
277,
79,
2224,
10444,
8090,
3185,
273,
12230,
272,
253,
10625,
29765,
3386,
275,
253,
3453,
3828,
767,
11911,
4204,
285,
391,
79,
403,
2007,
4081,
281,
3157,
2983,
2323,
2281,
5661,
1543,
7568,
326,
270,
571,
310,
625,
3576,
685,
5368,
3082,
923,
253,
5847,
5040,
2708,
50275,
856,
84,
337,
7296,
625,
8542,
4322,
1566,
310,
5604,
9371,
285,
1774,
323,
253,
3700,
2983,
2561,
374,
253,
1543,
6296,
7568,
1781,
7756,
273,
270,
571,
275,
2426,
273,
2228,
2281,
50275,
5040,
337,
516,
247,
1652,
7514,
670,
253,
2831,
13517,
3908,
1160,
275,
436,
789,
281,
479,
253,
2303,
15302,
2783,
275,
436,
789,
260,
338,
274,
331,
77,
12966,
331,
266,
4379,
8458,
403,
1335,
3551,
432,
253,
1072,
3626,
27471,
5028,
347,
4440,
257,
292,
5747,
597,
452,
1027,
5203,
8470,
275,
1798,
12966,
310,
1929,
281,
452,
14787,
342,
4440,
257,
292,
337,
835,
253,
2831,
13517,
1750,
5604,
1057,
417,
2186,
271,
1650,
1083,
326,
310,
625,
2831,
13517,
651,
320,
281,
3700,
432,
4440,
257,
292,
1566,
281,
247,
9081,
89,
1402,
1566,
275,
247,
2074,
3282,
281,
295,
511,
254,
1162,
355,
50276,
19,
253,
2173,
16182,
273,
270,
571,
3133,
417,
747,
352,
310,
1929,
326,
12230,
272,
10444,
3828,
3386,
476,
4917,
625,
3700,
494,
48960,
6667,
24088,
374,
275,
958,
253,
15895,
273,
270,
571,
4620,
1077,
2074,
281,
253,
581,
4081,
275,
374,
1223,
270,
571,
46926,
253,
7349,
460,
14259,
875,
734,
11181,
3828,
3386,
273,
253,
4076,
285,
48960,
6667,
374,
11903,
4219,
253,
299,
26365,
4181,
534,
9093,
310,
253,
1072,
4735,
2317,
8104,
403,
671,
2011,
281,
320,
625,
6422,
685,
3061,
2317,
8104,
275,
625,
7654,
2806,
3364,
3700,
15216,
495,
533,
4706,
4567,
10224,
281,
9446,
841,
5368,
2987,
4518,
12488,
253,
3064,
875,
270,
571,
285,
374,
1537,
1361,
2953,
436,
4468,
50276,
20,
604,
619,
1840,
31536,
273,
270,
571,
417,
1146,
747,
310,
3451,
840,
619,
2007,
4468,
3249,
432,
253,
906,
1930,
275,
2829,
374,
352,
3133,
326,
253,
3045,
6351,
476,
320,
8127,
12877,
281,
253,
270,
571,
390,
9093,
4735,
2317,
2983,
3139,
2581,
685,
253,
4204,
285,
391,
79,
6333,
436,
31835,
253,
16774,
38135,
281,
690,
6070,
347,
2045,
2987,
452,
2011,
253,
34385,
273,
4735,
2317,
8104,
275,
2057,
2629,
390,
625,
7654,
3700,
7533,
3495,
50275,
18,
3944,
2700,
4694,
1179,
442,
1962,
563,
4534,
532,
264,
14427,
538,
1518,
7330,
2974,
50276,
19,
4735,
2317,
26309,
4917,
625,
3700,
494,
48960,
6667,
50276,
20,
12230,
272,
2439,
253,
4735,
19868,
281,
3157,
2629,
285,
7654,
2806,
3364,
2983,
3700,
1430,
50276,
2520,
2929,
6296,
22649,
247,
625,
8542,
4322,
1566,
533,
253,
4679,
513,
417,
8244,
3761,
253,
4081,
2831,
13517,
10076,
285,
253,
3045,
6351,
3133,
281,
8127,
1705,
432,
5368,
5853,
12230,
272,
4735,
2317,
3185,
273,
3061,
2317,
841,
3374,
3657,
479,
432,
46705,
323,
14924,
5474,
33032,
2520,
2929,
16633,
327,
253,
3700,
1430,
273,
2806,
3364,
10625,
275,
1524,
1495,
359,
513,
417,
871,
253,
4623,
1491,
273,
253,
18329,
1566,
285,
3700,
8104,
327,
2806,
3364,
10625,
476,
1805,
7472,
253,
24189,
273,
18329,
3210,
3103,
4457,
4440,
257,
292,
2983,
270,
571,
310,
4081,
281,
7409,
253,
3700,
1430,
4404,
2806,
3364,
10625,
7202,
9162,
8892,
342,
253,
760,
3640,
273,
253,
4440,
257,
292,
5028,
432,
253,
8668,
273,
941,
285,
1566,
253,
4477,
12661,
3632,
21539,
391,
79,
6333,
285,
5028,
1530,
6932,
4116,
4204,
6333,
281,
6891,
253,
8037,
875,
253,
2603,
285,
2303,
10625,
4720,
270,
571,
33526,
1375,
23037,
14387,
3045,
275,
2806,
3364,
10625,
7533,
20544,
337,
270,
571,
16633,
327,
13201,
272,
1698,
5251,
3386,
281,
3157,
3700,
1430,
374,
436,
789,
29328,
3632,
21539,
391,
79,
6333,
281,
6016,
253,
2710,
10670,
875,
2603,
10625,
285,
2303,
10625,
495,
436,
789,
29328,
5028,
1530,
6932,
4116,
4204,
6333,
281,
4711,
247,
625,
10237,
4735,
6779,
50276,
20881,
1255,
265,
337,
391,
79,
6333,
285,
4204,
6333,
403,
417,
1900,
25834,
41894,
253,
1921,
3212,
436,
556,
417,
644,
5867,
374,
275,
21607,
11117,
14800,
1332,
3317,
310,
417,
747,
432,
436,
8668,
2139,
417,
897,
253,
625,
6422,
4260,
262,
338,
72,
3610,
390,
247,
21629,
3700,
2983,
495,
2829,
374,
285,
2829,
495,
921,
253,
3700,
1430,
14023,
327,
9162,
8892,
2299,
253,
2538,
273,
4204,
285,
391,
79,
1646,
281,
3469,
327,
1027,
3210,
281,
2096,
625,
11617,
352,
310,
3309,
281,
12106,
2139,
1027,
11911,
452,
1027,
2538,
275,
1027,
3210,
50276,
37585,
3533,
337,
752,
403,
253,
4679,
327,
253,
4030,
72,
11273,
285,
25319,
72,
11273,
9162,
281,
5276,
2139,
12129,
875,
4030,
72,
11273,
285,
25319,
72,
11273,
627,
310,
642,
2590,
8813,
275,
436,
789,
374,
1057,
253,
5301,
1332,
9184,
10260,
275,
3733,
4815,
50276,
74,
5257,
281,
2997,
436,
2929,
984,
352,
16633,
327,
625,
15958,
2806,
3817,
2983,
7533,
285,
29328,
767,
11911,
281,
3157,
3045,
253,
2216,
273,
253,
6333,
310,
47860,
285,
3576,
533,
253,
6333,
4081,
762,
690,
3210,
310,
417,
1900,
3576,
534,
7787,
697,
2898,
285,
4419,
625,
10599,
1783,
2490,
187,
4118,
18435,
27,
2520,
2929,
19401,
326,
253,
3210,
3733,
941,
778,
320,
417,
12482,
672,
4715,
253,
20362,
1566,
285,
3021,
247,
625,
8542,
2806,
3364,
2983,
6974,
4457,
4440,
257,
292,
2983,
270,
571,
7792,
310,
4158,
512,
253,
30628,
5821,
326,
253,
4758,
275,
436,
2929,
310,
1774,
285,
9371,
672,
20462,
2983,
3082,
2299,
253,
1332,
310,
417,
9106,
747,
17837,
7296,
253,
6349,
273,
253,
1895,
6949,
275,
436,
2929,
253,
5322,
2216,
273,
253,
4583,
7792,
285,
253,
9470,
4679,
253,
913,
32636,
2997,
323,
436,
2929
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
39223,
253,
5691,
273,
11365,
48960,
20452,
323,
247,
2303,
1566,
50276,
3113,
642,
2289,
281,
253,
1566,
390,
253,
3210,
3733,
941,
26332,
2303,
5028,
970,
247,
10166,
1566,
285,
941,
432,
247,
2603,
5028,
4440,
257,
292,
253,
4477,
6194,
247,
14156,
281,
11072,
26309,
534,
22950,
253,
7349,
460,
4181,
875,
253,
10444,
3386,
273,
4076,
285,
48960,
3888,
436,
14156,
310,
840,
21075,
407,
767,
5609,
50276,
14719,
21539,
273,
253,
3280,
2460,
285,
8820,
4116,
327,
26103,
255,
293,
4071,
3386,
908,
323,
7349,
460,
4181,
4679,
921,
326,
436,
1332,
41731,
13015,
2720,
3082,
275,
2806,
3364,
4758,
642,
2289,
281,
2303,
5028,
390,
1566,
347,
973,
347,
3168,
3364,
4758,
50275,
296,
3755,
20556,
50276,
18,
253,
1895,
4758,
642,
2289,
281,
2303,
941,
310,
273,
6349,
50276,
249,
3946,
2289,
281,
941,
310,
347,
1892,
604,
417,
12150,
685,
2289,
281,
1566,
374,
253,
4679,
403,
9470,
285,
4518,
921,
247,
1534,
7756,
275,
2806,
3364,
2983,
14603,
495,
253,
2127,
2530,
342,
253,
2929,
2112,
342,
253,
30762,
1361,
21896,
247,
30909,
4685,
273,
253,
1332,
5636,
600,
597,
2007,
22175,
1239,
1430,
3374,
273,
253,
7714,
50275,
20881,
1255,
265,
50276,
18,
253,
7714,
310,
15225,
3542,
50276,
1710,
2056,
474,
16503,
285,
24705,
16503,
403,
247,
446,
4108,
690,
25491,
403,
253,
7285,
273,
752,
253,
1332,
2686,
1057,
2593,
5910,
3054,
5742,
359,
4647,
247,
5048,
3020,
3388,
45900,
281,
253,
4735,
8115,
387,
3828,
298,
835,
347,
253,
2686,
4254,
310,
2831,
13695,
3388,
45900,
3730,
281,
337,
643,
16503,
403,
16318,
2708,
50276,
19,
253,
38135,
273,
253,
1332,
310,
3710,
275,
259,
257,
1182,
14451,
1162,
355,
4765,
374,
10444,
4735,
20638,
310,
908,
281,
3629,
2806,
3364,
3700,
1430,
275,
359,
487,
249,
259,
86,
1162,
355,
9169,
495,
4116,
310,
908,
323,
3629,
3700,
1430,
253,
2929,
1057,
417,
3748,
841,
2987,
50276,
20,
253,
3916,
273,
2593,
5976,
403,
760,
22112,
4516,
3908,
253,
1066,
48027,
6333,
556,
271,
5667,
3486,
327,
253,
4795,
48960,
6667,
891,
1891,
281,
923,
849,
436,
476,
320,
22245,
432,
5304,
3006,
253,
2831,
13695,
4116,
18012,
50274,
977,
32213,
50276,
18,
1580,
512,
253,
4679,
403,
760,
5001,
4440,
257,
292,
50276,
7831,
15302,
352,
310,
12744,
849,
973,
253,
1332,
588,
1347,
604,
253,
2603,
10895,
310,
1027,
3340,
604,
253,
2603,
10895,
310,
1355,
50276,
19,
253,
17082,
513,
417,
2486,
2629,
11254,
2439,
2709,
3632,
6613,
16344,
253,
2629,
11254,
275,
387,
1878,
581,
4758,
588,
30955,
253,
8453,
273,
253,
1543,
495,
1543,
342,
16248,
253,
767,
4081,
5609,
391,
79,
285,
4204,
943,
320,
1246,
275,
253,
2022,
7482,
436,
310,
271,
1774,
1953,
326,
253,
7714,
760,
13330,
342,
275,
253,
30762,
253,
7714,
342,
5649,
432,
247,
5955,
327,
253,
958,
326,
970,
841,
767,
5609,
275,
30111,
310,
11132,
285,
10224,
281,
12724,
562,
32231,
970,
816,
247,
2014,
6333,
50274,
1156,
6332,
50274,
15834,
337,
3700,
1430,
3753,
50276,
17338,
494,
3753,
374,
253,
760,
3640,
50276,
7483,
253,
3640,
495,
253,
25319,
72,
11273,
5028,
50276,
1940,
10788,
72,
11273,
10625,
50275,
46089,
50276,
18,
1896,
281,
253,
34543,
50276,
24902,
374,
13955,
50276,
412,
13734,
50276,
20,
253,
7316,
50276,
7267,
272,
577,
533,
625,
18844,
50276,
395,
625,
18844,
608,
253,
14156,
50276,
66,
14156,
50275,
9349,
337,
597,
476,
2256,
281,
253,
50276,
9328,
476,
320,
23115,
347,
3530,
432,
253,
2629,
2622,
3268,
374,
1014,
253,
14800,
403,
417,
50276,
9154,
604,
253,
14800,
403,
417,
50275,
16217,
3825,
337,
275,
253,
30162,
4694,
50276,
249,
253,
30162,
4694,
6335,
374,
1529,
5093,
50276,
23587,
643,
50275,
34974,
323,
253,
4477,
50275,
18,
849,
588,
253,
14156,
2990,
1347,
342,
697,
10166,
342,
512,
2603,
3210,
387,
2378,
923,
4679,
50276,
2420,
495,
275,
465,
18782,
28159,
90,
278,
412,
11317,
1162,
355,
4240,
577,
891,
9101,
326,
352,
943,
2007,
2572,
253,
3700,
1430,
50276,
19,
452,
253,
4477,
3597,
281,
2572,
253,
46206,
480,
4069,
272,
672,
10941,
281,
5368,
3082,
891,
9101,
326,
342,
1534,
480,
4069,
272,
42072,
778,
1347,
2074,
281,
3632,
21539,
50275,
250,
3065,
50275,
18,
2990,
275,
2990,
1054,
19169,
2805,
22589,
260,
864,
439,
86,
280,
24176,
340,
266,
549,
32693,
4072,
50276,
19,
3700,
494,
48960,
26309,
259,
257,
1182,
14451,
1269,
249,
288,
276,
340,
543,
30986,
260,
864,
278,
1205,
90,
328,
12717,
1269,
22589,
33980,
30287,
606,
1269,
22589,
36827,
340,
543,
30966,
23746,
87,
4765,
50276,
20,
43124,
253,
3700,
1430,
273,
48960,
3530,
3066,
4116,
359,
487,
249,
259,
86,
340,
2310,
249,
402,
1269,
895,
757,
260,
864,
703,
1251,
3642,
1182,
31035,
3496,
6481,
6963,
278,
44023,
391,
12865,
86,
340,
86,
7706,
43579,
30105,
1087,
9169,
50276,
21,
295,
356,
2990,
323,
34014,
5978,
465,
18782,
28159,
90,
278,
412,
11317,
2780,
76,
7894,
258,
75,
3227,
2780,
47929,
305,
1662,
391,
8097,
76,
684,
73,
5366,
86,
30105,
1087,
4765,
253,
1332,
4081,
275,
436,
2929,
41731,
13015,
5368,
3082,
285,
8571,
271,
1774,
4758,
642,
2289,
281,
2303,
5028,
390,
1566,
2299,
253,
4028,
310,
2228,
83,
5394,
285,
253,
4081,
1332,
310,
760,
42876,
4460,
8772,
5368,
2987,
3103,
891,
2281,
253,
2929,
347,
42876,
1840,
2997,
7887,
17697,
327,
253,
4477,
35827,
253,
16503,
16318,
1840,
5474,
33032,
2520,
789,
806,
22649,
247,
625,
8542,
4322,
1566,
323,
2806,
3364,
3700,
48960,
2983,
835,
253,
2303,
3210,
5028,
4558,
7202,
285,
253,
40567,
35701,
1566,
778,
320,
10166,
275,
1529,
5028,
840,
253,
270,
571,
2983,
310,
4081,
281,
7278,
3700,
1430,
3692,
2234,
2934,
310,
281,
35303,
1698,
5251,
3386,
10848,
407,
277,
79,
2224,
10444,
8090,
3185,
273,
12230,
272,
253,
10625,
29765,
3386,
275,
253,
3453,
3828,
767,
11911,
4204,
285,
391,
79,
403,
2007,
4081,
281,
3157,
2983,
2323,
2281,
5661,
1543,
7568,
326,
270,
571,
310,
625,
3576,
685,
5368,
3082,
923,
253,
5847,
5040,
2708,
50275,
856,
84,
337,
7296,
625,
8542,
4322,
1566,
310,
5604,
9371,
285,
1774,
323,
253,
3700,
2983,
2561,
374,
253,
1543,
6296,
7568,
1781,
7756,
273,
270,
571,
275,
2426,
273,
2228,
2281,
50275,
5040,
337,
516,
247,
1652,
7514,
670,
253,
2831,
13517,
3908,
1160,
275,
436,
789,
281,
479,
253,
2303,
15302,
2783,
275,
436,
789,
260,
338,
274,
331,
77,
12966,
331,
266,
4379,
8458,
403,
1335,
3551,
432,
253,
1072,
3626,
27471,
5028,
347,
4440,
257,
292,
5747,
597,
452,
1027,
5203,
8470,
275,
1798,
12966,
310,
1929,
281,
452,
14787,
342,
4440,
257,
292,
337,
835,
253,
2831,
13517,
1750,
5604,
1057,
417,
2186,
271,
1650,
1083,
326,
310,
625,
2831,
13517,
651,
320,
281,
3700,
432,
4440,
257,
292,
1566,
281,
247,
9081,
89,
1402,
1566,
275,
247,
2074,
3282,
281,
295,
511,
254,
1162,
355,
50276,
19,
253,
2173,
16182,
273,
270,
571,
3133,
417,
747,
352,
310,
1929,
326,
12230,
272,
10444,
3828,
3386,
476,
4917,
625,
3700,
494,
48960,
6667,
24088,
374,
275,
958,
253,
15895,
273,
270,
571,
4620,
1077,
2074,
281,
253,
581,
4081,
275,
374,
1223,
270,
571,
46926,
253,
7349,
460,
14259,
875,
734,
11181,
3828,
3386,
273,
253,
4076,
285,
48960,
6667,
374,
11903,
4219,
253,
299,
26365,
4181,
534,
9093,
310,
253,
1072,
4735,
2317,
8104,
403,
671,
2011,
281,
320,
625,
6422,
685,
3061,
2317,
8104,
275,
625,
7654,
2806,
3364,
3700,
15216,
495,
533,
4706,
4567,
10224,
281,
9446,
841,
5368,
2987,
4518,
12488,
253,
3064,
875,
270,
571,
285,
374,
1537,
1361,
2953,
436,
4468,
50276,
20,
604,
619,
1840,
31536,
273,
270,
571,
417,
1146,
747,
310,
3451,
840,
619,
2007,
4468,
3249,
432,
253,
906,
1930,
275,
2829,
374,
352,
3133,
326,
253,
3045,
6351,
476,
320,
8127,
12877,
281,
253,
270,
571,
390,
9093,
4735,
2317,
2983,
3139,
2581,
685,
253,
4204,
285,
391,
79,
6333,
436,
31835,
253,
16774,
38135,
281,
690,
6070,
347,
2045,
2987,
452,
2011,
253,
34385,
273,
4735,
2317,
8104,
275,
2057,
2629,
390,
625,
7654,
3700,
7533,
3495,
50275,
18,
3944,
2700,
4694,
1179,
442,
1962,
563,
4534,
532,
264,
14427,
538,
1518,
7330,
2974,
50276,
19,
4735,
2317,
26309,
4917,
625,
3700,
494,
48960,
6667,
50276,
20,
12230,
272,
2439,
253,
4735,
19868,
281,
3157,
2629,
285,
7654,
2806,
3364,
2983,
3700,
1430,
50276,
2520,
2929,
6296,
22649,
247,
625,
8542,
4322,
1566,
533,
253,
4679,
513,
417,
8244,
3761,
253,
4081,
2831,
13517,
10076,
285,
253,
3045,
6351,
3133,
281,
8127,
1705,
432,
5368,
5853,
12230,
272,
4735,
2317,
3185,
273,
3061,
2317,
841,
3374,
3657,
479,
432,
46705,
323,
14924,
5474,
33032,
2520,
2929,
16633,
327,
253,
3700,
1430,
273,
2806,
3364,
10625,
275,
1524,
1495,
359,
513,
417,
871,
253,
4623,
1491,
273,
253,
18329,
1566,
285,
3700,
8104,
327,
2806,
3364,
10625,
476,
1805,
7472,
253,
24189,
273,
18329,
3210,
3103,
4457,
4440,
257,
292,
2983,
270,
571,
310,
4081,
281,
7409,
253,
3700,
1430,
4404,
2806,
3364,
10625,
7202,
9162,
8892,
342,
253,
760,
3640,
273,
253,
4440,
257,
292,
5028,
432,
253,
8668,
273,
941,
285,
1566,
253,
4477,
12661,
3632,
21539,
391,
79,
6333,
285,
5028,
1530,
6932,
4116,
4204,
6333,
281,
6891,
253,
8037,
875,
253,
2603,
285,
2303,
10625,
4720,
270,
571,
33526,
1375,
23037,
14387,
3045,
275,
2806,
3364,
10625,
7533,
20544,
337,
270,
571,
16633,
327,
13201,
272,
1698,
5251,
3386,
281,
3157,
3700,
1430,
374,
436,
789,
29328,
3632,
21539,
391,
79,
6333,
281,
6016,
253,
2710,
10670,
875,
2603,
10625,
285,
2303,
10625,
495,
436,
789,
29328,
5028,
1530,
6932,
4116,
4204,
6333,
281,
4711,
247,
625,
10237,
4735,
6779,
50276,
20881,
1255,
265,
337,
391,
79,
6333,
285,
4204,
6333,
403,
417,
1900,
25834,
41894,
253,
1921,
3212,
436,
556,
417,
644,
5867,
374,
275,
21607,
11117,
14800,
1332,
3317,
310,
417,
747,
432,
436,
8668,
2139,
417,
897,
253,
625,
6422,
4260,
262,
338,
72,
3610,
390,
247,
21629,
3700,
2983,
495,
2829,
374,
285,
2829,
495,
921,
253,
3700,
1430,
14023,
327,
9162,
8892,
2299,
253,
2538,
273,
4204,
285,
391,
79,
1646,
281,
3469,
327,
1027,
3210,
281,
2096,
625,
11617,
352,
310,
3309,
281,
12106,
2139,
1027,
11911,
452,
1027,
2538,
275,
1027,
3210,
50276,
37585,
3533,
337,
752,
403,
253,
4679,
327,
253,
4030,
72,
11273,
285,
25319,
72,
11273,
9162,
281,
5276,
2139,
12129,
875,
4030,
72,
11273,
285,
25319,
72,
11273,
627,
310,
642,
2590,
8813,
275,
436,
789,
374,
1057,
253,
5301,
1332,
9184,
10260,
275,
3733,
4815,
50276,
74,
5257,
281,
2997,
436,
2929,
984,
352,
16633,
327,
625,
15958,
2806,
3817,
2983,
7533,
285,
29328,
767,
11911,
281,
3157,
3045,
253,
2216,
273,
253,
6333,
310,
47860,
285,
3576,
533,
253,
6333,
4081,
762,
690,
3210,
310,
417,
1900,
3576,
534,
7787,
697,
2898,
285,
4419,
625,
10599,
1783,
2490,
187,
4118,
18435,
27,
2520,
2929,
19401,
326,
253,
3210,
3733,
941,
778,
320,
417,
12482,
672,
4715,
253,
20362,
1566,
285,
3021,
247,
625,
8542,
2806,
3364,
2983,
6974,
4457,
4440,
257,
292,
2983,
270,
571,
7792,
310,
4158,
512,
253,
30628,
5821,
326,
253,
4758,
275,
436,
2929,
310,
1774,
285,
9371,
672,
20462,
2983,
3082,
2299,
253,
1332,
310,
417,
9106,
747,
17837,
7296,
253,
6349,
273,
253,
1895,
6949,
275,
436,
2929,
253,
5322,
2216,
273,
253,
4583,
7792,
285,
253,
9470,
4679,
253,
913,
32636,
2997,
323,
436,
2929
] |
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper proposes a new threat model called the stability attack. The goal of a stability attack is to hinder a model from becoming robust to adversarial attacks. The authors propose the hypocritical perturbation as a method for the stability attack, show that hypocritical perturbations are harmful in terms of adversarial robustness in a simple Gaussian mixture setting, and finally show that 2-epsilon adversarial training (i.e., adversarial training with an enlarged perturbation budget) is enough to protect against the stability attack.

Pros:
- As far as I know, this is the first work that studies the robustness of adversarial training against training-data poisoning (the stability attack).
- As [1] noted, the hypocritical perturbation is known to be a weak attack in terms of standard accuracy, but this paper illustrates that it can be a strong attack in terms of robust accuracy.
- The intuition behind the stability attack's effect on the vulnerability of adversarially trained models is well demonstrated.

Cons:
- The hypocritical perturbation seems to be the main content but is not well described; in particular, a comparison of the adversarial perturbation and the hypocritical perturbation is lacking.
- It seems that the "stability attack" and "adversarial poisoning" in Table 2 are identical to "Hyp" and "Adv" in Tables 3-4, respectively; consistent naming of the attacks is necessary. The description of the poisoning methods in Table 2 is not given.
- In Table 3, only Adv and Hyp are considered as data-poisoning methods; it would be more appropriate to evaluate against the various attacks as in Table 2.
- The final message of the paper is that, by using a larger perturbation size, the hypocritical perturbation can be defended against; however, this message would be obvious, so the novelty would not be high.

[1] Lue Tao, Lei Feng, Jinfeng Yi, Shengjun Huang, and Songcan Chen. Better safe than sorry: Preventing delusive adversaries with adversarial training. In NeurIPS, 2021.

As mentioned in the manuscript, the standard accuracy of a robustly learned prediction model trained on data poisoned by the stability attack is better than the standard accuracy of a robustly trained model trained on data poisoned by other methods. In this sense, the stability attack is less serious than other poisoning methods, which degrade the standard accuracy as well as the robust accuracy simultaneously.
docsep
This paper presents the stability attack against the conventional adversarial training process, aiming to reduce the eventual robust accuracy of the resulting model. Specifically, the corresponding hypocritical perturbations are applied to the training data as a training-time attack (a hedged sketch of this crafting step appears after this review). Theoretical analysis is provided to support the idea of hypocritical perturbations, and experimental results on commonly used classification datasets like CIFAR-10 demonstrate the effectiveness of the proposed method.

Strengths:
1. This paper is technically sound and clear around the theoretical analysis; the experimental results are significant and support the theory well.
2. The writing quality of the paper is good overall; specifically, the background of the problem is smoothly introduced.
3. The proposed method is sufficiently evaluated: key attacks like FGSM, PGD, CW, and AutoAttack are present for robustness evaluation, and the experiments cover four datasets, namely CIFAR-10/100, SVHN, and Tiny-ImageNet, which is sufficient to demonstrate the effectiveness of the proposed method.
4. The proposed method successfully compromises adversarial training methods; moreover, a countermeasure (adaptive defense) is also proposed and evaluated.

Weaknesses:
1. Overall this is a good paper and I did not find many problems. Limitations are discussed, as pointed out in the checklist.
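The reviews describe the hypocritical perturbation only at a high level: instead of pushing an image across the decision boundary, it nudges the image so that the crafting model becomes more confident on the true label, reinforcing the features the model already uses. The sketch below is a hedged, minimal PGD-style illustration of that sign flip relative to an ordinary adversarial perturbation; it is not the authors' exact algorithm, and the crafting model, step size, and budget are assumptions.

```python
import torch
import torch.nn.functional as F

def craft_perturbation(model, x, y, eps=8/255, alpha=2/255, steps=10, hypocritical=True):
    """PGD-style crafting. With hypocritical=True the update descends the
    classification loss (making the model *more* confident on the true label);
    with hypocritical=False it ascends the loss, i.e. an ordinary adversarial
    perturbation."""
    sign = -1.0 if hypocritical else 1.0
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():
            x_adv = x_adv + sign * alpha * grad.sign()
            x_adv = x + (x_adv - x).clamp(-eps, eps)   # stay within the l_inf ball
            x_adv = x_adv.clamp(0.0, 1.0)
    return x_adv.detach()
```

Applying this with `hypocritical=True` to every training example would yield the kind of poisoned training set that the stability attack relies on.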
docsep
This paper introduces the problem of adversarial training in the face of a new type of attack called the stability attack. Stability attacks aim to compromise robust availability by slightly manipulating the training data; most existing methods neglect the test robustness of adversarially trained models when they are under training-time availability attacks. Under the threat of stability attacks, the authors demonstrate that an adversarially trained network with an epsilon perturbation budget is not enough to defend against epsilon-bounded adversarial perturbations, and they argue that it is necessary to enlarge the epsilon perturbation budget when conducting adversarial training (a hedged sketch of such an enlarged-budget training loop appears after this review).

Strengths:
1. Clear writing; an easy-to-understand and well-organized paper.
2. An important experimental result: the test robustness of an adversarially trained network against evasion attacks while it is under delusive attacks is an intriguing result in adversarial research.

Weaknesses:
1. Low originality; a comparison with the recent key reference [Ref1], which is a somewhat similar training-time availability attack, is missing. The attack generation algorithm for the training-time and test-time perturbations is similar to [Ref1]. Furthermore, [Ref1] achieved state-of-the-art attack performance against adversarially trained networks on clean data, so it is unclear that this paper made a fair comparison with current SOTA poisoning attacks on adversarially trained networks. In this regard, the novelty of the threat model on adversarially trained networks obtained by perturbing the training data is marginal.
2. Insufficient explanation of the relationship with non-robust features. This paper passes the buck to non-robust features for the experimental results showing an increase of standard accuracy and a decrease of robust accuracy under the stability attacks. The authors emphasize the role of non-robust features in the title, abstract, and throughout the paper; however, I am not convinced that non-robust features are the only reason for these results. In Table 2, natural test accuracy increases and test robustness under evasion attacks decreases when models are adversarially trained under the stability attacks, but considering that there always exists a tradeoff between standard accuracy and robust accuracy [Ref2], non-robust features cannot be solely blamed. As the authors followed the theoretical analysis process of [Ref3], they need to present additional empirical evidence (feature-level analysis or visualizations) to explain the relationship with non-robust features, as [Ref3] did.
3. Confounding usage of terms for similar concepts: it is very confusing when the terms "Hyp", "stability attacks", and similar concepts appear throughout the paper.

[Ref1] Fu, S., He, F., Liu, Y., Shen, L., and Tao, D. (2021, September). Robust unlearnable examples: Protecting data privacy against adversarial learning. In International Conference on Learning Representations.
[Ref2] Zhang, H., Yu, Y., Jiao, J., Xing, E., El Ghaoui, L., and Jordan, M. (2019, May). Theoretically principled trade-off between robustness and accuracy. In International Conference on Machine Learning, pp. 7472-7482. PMLR.
[Ref3] Tsipras, D., Santurkar, S., Engstrom, L., Turner, A., and Madry, A. (2018). Robustness may be at odds with accuracy. arXiv preprint arXiv:1805.12152.

Yes, the authors have addressed the limitations and potential negative societal impact of their work.
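The key recommendation in the review above is to enlarge the perturbation budget used during adversarial training when the training data may have been manipulated. The loop below is a hedged, minimal PGD adversarial-training epoch with such an enlarged budget; it is illustrative only, and the budget, step size, number of steps, and the choice to keep the model in training mode while crafting are assumptions rather than details taken from the paper.

```python
import torch
import torch.nn.functional as F

def adversarial_training_epoch(model, loader, optimizer, device="cuda",
                               train_eps=16/255,  # enlarged budget, e.g. 2x a test-time 8/255
                               alpha=2/255, steps=10):
    """One epoch of PGD adversarial training with an enlarged perturbation budget."""
    model.train()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        # Inner maximization: craft training-time adversarial examples.
        x_adv = (x + torch.empty_like(x).uniform_(-train_eps, train_eps)).clamp(0.0, 1.0)
        for _ in range(steps):
            x_adv.requires_grad_(True)
            grad = torch.autograd.grad(F.cross_entropy(model(x_adv), y), x_adv)[0]
            with torch.no_grad():
                x_adv = x_adv + alpha * grad.sign()
                x_adv = x + (x_adv - x).clamp(-train_eps, train_eps)
                x_adv = x_adv.clamp(0.0, 1.0)
        # Outer minimization on the crafted examples.
        optimizer.zero_grad()
        F.cross_entropy(model(x_adv.detach()), y).backward()
        optimizer.step()
```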
docsep
Summary: This paper introduces a novel data poisoning attack against adversarial training, called the stability attack. The goal is to tamper with the training data such that the robust performance of adversarial training over this manipulated dataset is degraded. To construct this attack, a hypocritical perturbation is built: unlike adversarial perturbations, the aim of hypocritical perturbations is to reinforce the non-robust features in the training data, and these perturbations can be generated by negating adversarial-example generation objectives.

Motivation: The paper motivates stability attacks from the perspective of robust vs. non-robust features. Specifically, a simple binary classification task over a mixture of Gaussians is considered; statistical analysis of this task shows that adversarial training over hypocritically perturbed data is destructive to adversarial robustness. Moreover, it is shown that a larger perturbation magnitude is needed to guard adversarial training against stability attacks.

Implementation: The effectiveness of stability attacks against adversarial training is demonstrated through extensive experimental results (a hedged sketch of how this degradation could be measured appears after this review).

Strengths:
- The paper is clear and it guides the reader skillfully.
- The paper is well motivated: the statistical analysis of the binary classification task is thorough, and the implications of the theoretical results are discussed comprehensively. This paper sheds light on the implications of robust vs. non-robust features from a novel perspective and uses these studies to introduce a new threat against adversarial training.
- The experimental settings are discussed in detail, and the effects of different hyperparameters and architectures on the performance are investigated.

Weaknesses:
- While the theoretical justifications of hypocritical perturbations on the binary classification task are discussed, the relationship of these results to the attack generation process (Eq. 10) is obscure. Although the given example for the logistic loss and the binary classification task is appreciated, the origins of the objective function in Eq. 10 need a better justification.
- A thorough discussion of the relationship of this work with existing works on the tradeoff between the clean and robust accuracy of adversarially trained neural networks seems to be missing. As the experimental results suggest, the implications align with the observations of Tsipras et al. [63] on the tradeoff between clean and robust accuracy (e.g., see Table 4). From this perspective, it seems like stability attacks are somehow just exploiting this tradeoff to pose their threat to adversarial training; thus a comprehensive discussion of the differences between this work and prior work in this area is required.
- A discussion of the potential real-world negative impacts of the current work is missing; this reviewer would encourage the authors to discuss this matter explicitly.
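The reviews repeatedly contrast two quantities: standard accuracy, which the stability attack can even increase, and robust accuracy under evasion attacks, which it degrades. The helper below computes both for a trained model, using single-step FGSM as a cheap stand-in for the stronger PGD, CW, and AutoAttack evaluations the reviews mention; the perturbation budget and the attack choice are assumptions made for illustration only.

```python
import torch
import torch.nn.functional as F

def standard_and_robust_accuracy(model, loader, eps=8/255, device="cuda"):
    """Standard accuracy and FGSM robust accuracy of `model` over `loader`."""
    model.eval()
    clean_correct = robust_correct = total = 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        with torch.no_grad():
            clean_correct += (model(x).argmax(1) == y).sum().item()
        # Single-step FGSM: one signed-gradient ascent step of size eps.
        x_fgsm = x.clone().detach().requires_grad_(True)
        grad = torch.autograd.grad(F.cross_entropy(model(x_fgsm), y), x_fgsm)[0]
        x_adv = (x + eps * grad.sign()).clamp(0.0, 1.0)
        with torch.no_grad():
            robust_correct += (model(x_adv).argmax(1) == y).sum().item()
        total += y.numel()
    return clean_correct / total, robust_correct / total
```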
### Summary: | This paper proposes a new threat model called the stability attack, which aims to hinder a model from being robust to adversarial attacks. The author proposes the hypocritical perturbation as a method for the stability attack and shows that hypocritical perturbations can indeed decrease the adversarial robustness of a model trained in a simple Gaussian mixture setting. The reviewers agree that the problem being studied is interesting, the proposed method is well motivated, and the experiments are mostly convincing. The authors are encouraged to merge the new results from the rebuttal into the publication and to discuss the efficiency of the proposed method in more detail. | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
12661,
247,
747,
4322,
1566,
1925,
7882,
2983,
253,
4736,
273,
7882,
2983,
310,
281,
35007,
1566,
432,
1146,
10237,
281,
48960,
8104,
253,
2488,
50276,
856,
6013,
3500,
406,
14762,
20452,
347,
247,
1332,
323,
7882,
2983,
285,
2722,
326,
3500,
406,
14762,
20452,
310,
19632,
275,
2426,
273,
48960,
31640,
275,
247,
2969,
305,
12064,
7802,
4758,
4720,
253,
2488,
2722,
326,
374,
48960,
3733,
310,
2217,
323,
15233,
7882,
2983,
50276,
856,
84,
50275,
284,
2080,
347,
891,
871,
436,
310,
253,
806,
789,
326,
2175,
253,
31640,
273,
48960,
3733,
1411,
6194,
941,
33254,
1925,
7882,
2983,
50276,
284,
337,
4879,
3500,
406,
14762,
20452,
310,
1929,
281,
320,
247,
5075,
2983,
275,
2426,
273,
253,
2629,
7200,
533,
436,
2929,
18303,
326,
352,
476,
320,
247,
2266,
2983,
275,
2426,
273,
253,
10237,
7200,
50276,
783,
30328,
273,
7882,
2983,
323,
24189,
273,
48960,
2983,
310,
973,
5183,
772,
50276,
262,
3133,
326,
3500,
406,
14762,
20452,
310,
2022,
2600,
533,
310,
417,
973,
2529,
3340,
253,
5301,
273,
48960,
20452,
285,
3500,
406,
14762,
20452,
19756,
50276,
262,
3133,
326,
7882,
2983,
275,
2829,
374,
285,
48960,
33254,
285,
3500,
285,
1604,
275,
2829,
495,
577,
403,
8931,
2975,
253,
15274,
273,
26086,
253,
8104,
310,
3309,
50276,
783,
5740,
273,
253,
33254,
3082,
275,
2829,
374,
310,
417,
1677,
50276,
249,
2829,
495,
50276,
24301,
285,
3500,
403,
760,
2783,
323,
253,
3082,
273,
941,
33254,
352,
651,
320,
625,
4569,
281,
513,
7103,
1411,
253,
2710,
8104,
347,
275,
2829,
374,
50276,
783,
2457,
3935,
273,
253,
2929,
310,
326,
407,
970,
247,
4067,
20452,
1979,
253,
3500,
406,
14762,
20452,
476,
320,
25860,
2299,
436,
3935,
651,
320,
4755,
285,
594,
253,
38135,
651,
417,
320,
1029,
50275,
18,
298,
489,
246,
8500,
43278,
269,
1205,
480,
2050,
1205,
340,
74,
703,
1251,
30986,
30287,
606,
285,
4498,
5092,
260,
864,
1805,
4999,
685,
7016,
13538,
1448,
15240,
18539,
3927,
342,
48960,
3733,
275,
5723,
2824,
43425,
50276,
284,
5393,
275,
253,
7714,
253,
2629,
7200,
273,
247,
10237,
314,
9644,
433,
10554,
1566,
1754,
327,
3733,
941,
47494,
407,
253,
7882,
2983,
310,
50276,
29266,
685,
253,
2629,
7200,
273,
247,
10237,
314,
10166,
1566,
1754,
327,
3733,
941,
47494,
407,
643,
3082,
275,
436,
3282,
253,
7882,
2983,
310,
1679,
4092,
685,
643,
47494,
3082,
534,
40195,
253,
2629,
7200,
347,
973,
347,
253,
10237,
7200,
8257,
3348,
4087,
5474,
33032,
2520,
2929,
10262,
256,
1430,
2983,
1411,
253,
6041,
48960,
3733,
1232,
26400,
281,
4796,
253,
27585,
10237,
7200,
273,
253,
4795,
1566,
5742,
253,
3969,
3500,
406,
14762,
26309,
403,
3732,
715,
3733,
941,
347,
247,
3733,
2606,
2983,
10527,
1783,
310,
2530,
281,
1329,
253,
2934,
273,
3500,
406,
14762,
26309,
5661,
1543,
327,
7744,
908,
9162,
15302,
751,
260,
338,
274,
740,
7568,
253,
29692,
453,
68,
6460,
273,
253,
4081,
1332,
50276,
296,
3755,
20556,
50276,
18,
436,
2929,
310,
22335,
3590,
285,
2590,
1475,
253,
10527,
1783,
5661,
1543,
403,
1534,
534,
973,
1329,
253,
3762,
50276,
19,
253,
4028,
3290,
273,
253,
2929,
310,
1175,
4583,
5742,
253,
4114,
273,
1895,
310,
25863,
5611,
50276,
20,
253,
4081,
1332,
310,
10481,
6760,
281,
320,
2173,
2234,
8104,
751,
269,
72,
3610,
23256,
69,
260,
88,
285,
6753,
35946,
403,
1246,
323,
31640,
7103,
26614,
253,
4679,
3835,
1740,
15302,
10775,
260,
338,
274,
6903,
361,
18504,
13107,
285,
10058,
303,
6533,
292,
534,
403,
4209,
281,
7568,
253,
12510,
323,
253,
4081,
1332,
50276,
21,
253,
4081,
1332,
8379,
25047,
48960,
3733,
3082,
2299,
4828,
30238,
17825,
5684,
310,
671,
4081,
285,
6760,
50275,
20881,
1255,
265,
50276,
18,
4583,
436,
310,
247,
1175,
2929,
285,
891,
858,
417,
1089,
1142,
3237,
7364,
403,
5469,
347,
8042,
562,
275,
253,
44282,
5474,
33032,
2520,
2929,
23970,
253,
1895,
273,
48960,
3733,
672,
597,
2454,
253,
747,
1511,
273,
8104,
1925,
7882,
8104,
253,
7882,
8104,
4388,
281,
18230,
253,
10237,
11659,
407,
5777,
40238,
253,
3733,
941,
954,
273,
5368,
3082,
18369,
326,
253,
1071,
31640,
273,
253,
48960,
10166,
3210,
672,
597,
403,
762,
253,
3733,
2606,
11659,
8104,
762,
253,
4322,
273,
253,
7882,
8104,
597,
7568,
326,
253,
48960,
10166,
2990,
342,
299,
4277,
20452,
7563,
310,
417,
2217,
281,
2342,
1411,
253,
299,
4277,
11542,
48960,
20452,
253,
4477,
9059,
326,
352,
310,
3309,
281,
46112,
253,
299,
4277,
20452,
7563,
672,
597,
2589,
253,
48960,
3733,
20544,
50276,
18,
2590,
4028,
3477,
281,
2096,
285,
973,
34092,
2929,
374,
1774,
5661,
906,
253,
1071,
18848,
461,
1255,
273,
48960,
10166,
2990,
1411,
612,
4930,
8104,
672,
597,
403,
762,
253,
1448,
15240,
8104,
310,
27807,
906,
275,
253,
48960,
2561,
50276,
20881,
1255,
265,
50275,
18,
1698,
3236,
414,
5816,
5301,
342,
253,
3332,
2234,
3806,
1275,
18,
534,
310,
8489,
2074,
1332,
273,
253,
3733,
2606,
11659,
8104,
253,
2983,
5978,
5933,
273,
3733,
2606,
285,
1071,
2606,
20452,
310,
2074,
281,
1275,
18,
33810,
1275,
18,
6786,
253,
1375,
23037,
14387,
2983,
3045,
1411,
48960,
10166,
2990,
327,
4076,
941,
3021,
352,
310,
12744,
326,
436,
2929,
1160,
253,
4344,
5301,
875,
1655,
256,
5503,
33254,
8104,
327,
48960,
10166,
2990,
275,
436,
2743,
253,
38135,
273,
253,
4322,
1566,
327,
253,
48960,
10166,
2990,
407,
12230,
272,
253,
3733,
941,
310,
16888,
374,
12497,
22909,
327,
253,
2954,
342,
1327,
18848,
461,
3386,
436,
2929,
11999,
253,
12433,
281,
253,
1327,
18848,
461,
4735,
323,
253,
3368,
1543,
273,
253,
2572,
273,
2629,
7200,
285,
253,
6379,
273,
10237,
7200,
672,
597,
403,
762,
253,
7882,
8104,
253,
4477,
22175,
253,
8294,
273,
253,
1327,
18848,
461,
4735,
275,
253,
4060,
12002,
285,
4768,
253,
2929,
2299,
891,
717,
417,
13762,
326,
1327,
18848,
461,
4735,
310,
253,
760,
1921,
323,
253,
3368,
906,
275,
2829,
374,
3626,
1071,
31640,
556,
2559,
285,
1071,
31640,
762,
612,
4930,
8104,
556,
6137,
672,
597,
403,
18539,
274,
1365,
10166,
762,
253,
7882,
8104,
533,
7296,
253,
958,
326,
627,
1900,
4961,
247,
5454,
2727,
875,
253,
2629,
7200,
285,
253,
10237,
7200,
1275,
19,
1327,
18848,
461,
4735,
16216,
320,
12718,
27137,
347,
253,
4477,
3560,
253,
1232,
273,
10527,
1783,
342,
253,
1275,
20,
597,
878,
281,
1246,
1529,
16774,
1941,
4735,
5251,
1783,
390,
5304,
5904,
323,
253,
22909,
327,
2954,
342,
1327,
18848,
461,
3386,
347,
1275,
20,
858,
495,
34541,
10393,
273,
253,
1307,
323,
253,
2074,
4473,
352,
310,
1077,
13477,
672,
253,
1307,
3500,
7882,
8104,
285,
2074,
4473,
4620,
4768,
253,
2929,
50275,
709,
18,
15260,
256,
344,
269,
632,
86,
340,
703,
79,
298,
50276,
893,
80,
277,
43425,
22688,
2037,
10237,
440,
29343,
494,
6667,
15233,
941,
11068,
1411,
48960,
4715,
275,
5213,
8059,
327,
4715,
14237,
50276,
709,
19,
1182,
12109,
288,
340,
86,
340,
480,
22728,
480,
1269,
272,
299,
1045,
305,
3227,
276,
74,
298,
50276,
75,
11208,
278,
6247,
778,
28055,
3505,
74,
6216,
5454,
2727,
875,
31640,
285,
7200,
275,
5213,
8059,
327,
5145,
4715,
7266,
818,
2504,
1630,
35340,
268,
1686,
83,
50276,
709,
20,
28669,
532,
6230,
277,
256,
386,
321,
18970,
256,
2209,
27621,
298,
1614,
254,
247,
50276,
34180,
610,
247,
4765,
31640,
778,
320,
387,
13653,
342,
7200,
549,
32693,
638,
3845,
549,
32693,
1093,
1762,
805,
17472,
50276,
9820,
253,
4477,
452,
9713,
253,
7364,
285,
2442,
4016,
38058,
3486,
273,
616,
789,
5474,
33032,
6010,
50276,
2520,
2929,
23970,
247,
4460,
941,
33254,
2983,
1411,
48960,
3733,
1925,
7882,
8104,
253,
4736,
310,
281,
2660,
253,
3733,
941,
824,
326,
253,
10237,
3045,
273,
48960,
3733,
689,
436,
32494,
10895,
310,
30853,
281,
3989,
436,
2983,
247,
3500,
406,
14762,
20452,
310,
4270,
12401,
48960,
26309,
253,
4388,
273,
3500,
406,
14762,
26309,
310,
281,
28432,
253,
1327,
18848,
461,
3386,
275,
253,
3733,
941,
841,
26309,
476,
320,
4561,
407,
2297,
839,
48960,
1650,
5978,
16566,
50276,
24013,
7639,
253,
2929,
15265,
684,
7882,
8104,
432,
253,
8668,
273,
10237,
4632,
1327,
18848,
461,
3386,
5742,
247,
2969,
8985,
9162,
4836,
689,
247,
7802,
273,
305,
10064,
2458,
310,
2783,
7605,
1783,
327,
436,
4836,
2722,
326,
48960,
3733,
689,
3500,
406,
902,
1037,
44711,
941,
310,
27009,
281,
48960,
31640,
25761,
352,
310,
2011,
326,
247,
4067,
20452,
9777,
310,
3058,
281,
7496,
48960,
3733,
1411,
7882,
8104,
50276,
39595,
253,
12510,
273,
7882,
8104,
1411,
48960,
3733,
310,
5183,
949,
9470,
5661,
1543,
50276,
296,
3755,
20556,
50276,
783,
2929,
310,
2590,
285,
352,
22591,
253,
9414,
10861,
2920,
50275,
783,
2929,
310,
973,
24013,
8550,
253,
7605,
1783,
273,
253,
8985,
9162,
4836,
310,
11080,
285,
253,
12739,
273,
253,
10527,
1543,
403,
5469,
9483,
1242,
50275,
2520,
2929,
703,
1397,
1708,
327,
253,
12739,
273,
10237,
4632,
1327,
18848,
461,
3386,
432,
247,
4460,
8668,
285,
29820,
841,
2175,
281,
9569,
247,
747,
4322,
1411,
48960,
3733,
50275,
783,
5661,
7533,
403,
5469,
275,
2508,
285,
253,
2538,
273,
1027,
4373,
22041,
285,
35615,
327,
253,
3045,
403,
6949,
50275,
20881,
1255,
265,
50276,
6050,
253,
10527,
816,
6787,
273,
3500,
406,
14762,
26309,
327,
253,
8985,
9162,
4836,
403,
5469,
253,
2954,
273,
841,
1543,
342,
253,
2983,
5978,
1232,
16186,
884,
310,
26591,
3738,
253,
1677,
1650,
323,
253,
21535,
2957,
285,
253,
8985,
9162,
4836,
310,
14109,
253,
20801,
273,
253,
8103,
1159,
275,
16186,
884,
878,
247,
1805,
22861,
50274,
44295,
3062,
247,
11080,
5955,
327,
253,
2954,
273,
436,
789,
342,
5368,
2987,
327,
253,
5454,
2727,
875,
253,
4076,
285,
10237,
7200,
273,
11454,
6928,
342,
48960,
3733,
3133,
5816,
347,
253,
5661,
1543,
1804,
253,
12739,
8495,
342,
253,
7313,
273,
28669,
532,
6230,
1162,
355,
9654,
327,
253,
5454,
2727,
875,
253,
4076,
285,
10237,
7200,
24088,
923,
2829,
577,
432,
436,
8668,
352,
3133,
751,
7882,
8104,
403,
10380,
816,
38883,
436,
5454,
2727,
281,
16753,
616,
4322,
327,
48960,
3733,
3021,
247,
11088,
5955,
327,
253,
3910,
875,
436,
789,
285,
2720,
789,
275,
436,
2170,
310,
2424,
50276,
66,
2442,
5955,
327,
253,
1524,
10186,
4016,
16274,
273,
253,
1655,
789,
310,
5816,
436,
37317,
651,
11907,
253,
4477,
281,
2319,
436,
2647,
11120,
2490,
187,
4118,
18435,
27,
2520,
2929,
29328,
247,
747,
4322,
1566,
1925,
7882,
2983,
534,
13698,
281,
35007,
1566,
432,
1146,
10237,
281,
48960,
8104,
253,
2488,
29328,
3500,
406,
14762,
20452,
347,
247,
1332,
323,
7882,
2983,
285,
2722,
326,
3500,
406,
14762,
20452,
476,
6296,
6379,
253,
48960,
31640,
273,
247,
1566,
10166,
275,
247,
2969,
305,
12064,
7802,
4758,
253,
30628,
5194,
326,
253,
1895,
1146,
5421,
310,
4722,
253,
4081,
1332,
310,
973,
17194,
285,
253,
4679,
403,
6571,
21414,
253,
4477,
403,
14659,
281,
17310,
253,
747,
1543,
1309,
253,
30080,
22559,
715,
253,
9311,
285,
2319,
625,
327,
253,
6733,
273,
253,
4081,
1332,
209
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
12661,
247,
747,
4322,
1566,
1925,
7882,
2983,
253,
4736,
273,
7882,
2983,
310,
281,
35007,
1566,
432,
1146,
10237,
281,
48960,
8104,
253,
2488,
50276,
856,
6013,
3500,
406,
14762,
20452,
347,
247,
1332,
323,
7882,
2983,
285,
2722,
326,
3500,
406,
14762,
20452,
310,
19632,
275,
2426,
273,
48960,
31640,
275,
247,
2969,
305,
12064,
7802,
4758,
4720,
253,
2488,
2722,
326,
374,
48960,
3733,
310,
2217,
323,
15233,
7882,
2983,
50276,
856,
84,
50275,
284,
2080,
347,
891,
871,
436,
310,
253,
806,
789,
326,
2175,
253,
31640,
273,
48960,
3733,
1411,
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper proposes a new baseline for attribution methods tailored to deep neural networks dnn attribution methods like integrated gradients deep lift and others require a baseline to compare to as part of the computation the choice of a baseline has been controversial in the literature and a good method to select a baseline remains an open problem this paper seeks to address that problem specifically this paper seeks to develop a baseline for onevsone explanations as opposed to onevsall explanations consider an mnist model a onevsone attribution would attribute why an input is say a 2 and not a 4 ie it is contrastive against a particular target class and not all classes this paper proposes to use a stargan for generating these baselines the paper then evaluates explanations derived using the new baseline and shows that they explanations perform better overall i think the paper tackles an important problem but i have several concerns with the motivation the appropriateness of the baseline definition in this work and the evaluation ill expand on these concerns in the later part of the review so i am not recommending an accept in its current form significancequality the paper tackles an interesting and potentially challenging problem however motivation is still somewhat unclear and there are critical problems with the evaluations used as justification here i go into these at the end of this review claritywriting the paper is generally easy to follow i problem i had reading it is that there are a few sentences that are stated as fact without any justification for example the paper notes the minimum distance training sample in section 22 is a true classtargeted baseline proposed in the past what is a true classtargeted baseline such statements should probably be reformulated minor changes lasted paragraph of section 23 has posted a challenging posted is probably not the desired word here questions and concerns motivation it is still not clear to me why a onevsone attribution is desirable more should probably be done here to motivate this the biggest need though is motivation for the onevsone attribution baseline in several statements the paper alludes to properties of baselines used in expected gradients and other methods stating the reasons why these baselines are undesirable i agree but why should a onevsone baseline be preferable to these ideally the paper will set out a list of desirable properties then show that the baseline derived from ganmex satisfies these it is still not clear to me why a notion of minimum distance in a different target class is the right one can the authors say more about why this should be the case evaluation ill preface my concerns here with the fact that i think evaluating model attributions or explanations in general is a difficult and open problem this said i dont think any of the evaluations presented in this paper can be taken as showing that the ganmex baseline is the desirable one first the perturbationbased evaluation does not provide consistent rankings see tomsett et al aaai sanity checks for saliency metrics i suspect the gini index will have the same problems as those discussed in the tomsett et al paper the sanity checks themselves ie the cascading randomization will tell you if a method should be ruled out and not whether a method is effective consequently i dont think the sanity checks can say much in judging baselines having said all of this i think the way to evaluate a baseline is to take a task where the truth groundtruth rankings are 
known a priori train a model to respect and align to the true ground truth now one can compare attributions from such a model for a normal baseline and a baseline from ganmex assuming the attribution method itself is a reliable one then one can quantify improvements due to the ganmex baseline a paper that might be related to this work that also incorporated generative modeling httpsarxivorgabs180708024pdf overall the concerns above make me hesitant about this current draft however i am happy to revise my assessment if the authors think i am wrong docsepsummary this paper looks to use gans to generate baselines for attribution methods the focus on onevsone feature importance explanations is novel to the best of my knowledge the paper attempts to make progress on the baseline selection problem that has plagued the feature importance community strengths as far as i know the authors contribution of one vs one attribution compared to one vs any attribution is novel whilst other works have alluded to this or ran heusristic experiments this paper does a good job of formalizing the notion the ability for ganmex to live on top of any other attribution method makes it an attractive addition to existing attribution methods thank you for visualizing the baselines generated by ganmex quite helpful weaknesses a computational complexity analysis is required to gauge the practical utility of generating baselines with ganmex also it would be nice to give complexity of ganmex compared to fido eg and simple nearest neighbor baselines in addition to the visual comparisons provided it would have been helpful to evaluate explanations using exisiting evaluation criteria in the attribution literature ie faithfulness sensitivity monotonicity etc this paper has the opportunity to broadly assess the effects of various baselines on attributions questions while gans seem like an attractive choice of deep generative model dgm for this problem can you comment on or experiment with other dgms ie vaes or specifically vaeacs 1 however any dgm that has latent class separation should suffice you would be able to perform optimization in the latent space 2 3 4 and achieve similar class separation as described in figure 1 the attributions in figure 3e seem like noise while zero baseline seems visually appealing can you provide some intuition for why this occurs the gan feels like overkill for mnist but might be suitable for other high dimensional problems wherein the baseline needs to pick up on small nuances in the data 1 httpsopenreviewnetforumidsyxtjh0qym 2 httpsarxivorgabs180608867 3 httpsarxivorgabs200606848 4 httpsarxivorgabs180708024docseppaper summary this paper considers the lessexplored baseline selection issue in attribution methods for onevsone explanations of multiclass classifiers the key insight is to construct the closest and realistic target class baseline to this end an existing imagetoimage translation gan model namely stargan is leveraged to transform an input example to another example in a target class yet is close to the input this baseline can be integrated with a variety of attribution methods including integrated gradient deeplift occlusion and deepshap and shows consistent improvements over zero baseline and minimum distance training sample for onevsone explanations the experiments are conducted on three datasets mnist svhn and apple2orange paper strengths this paper addresses an important yet overlooked baseline selection problem the way the authors address this problem is interesting by leveraging 
gan models empirical evaluations demonstrate the effectiveness and generalizability of the proposed approach paper weaknesses 1 the main weakness of this paper to me is the evaluation section the proposed approach is only validated on simple datasets like mnist and svhn it would be more convincing to show the effectiveness of the proposed approach on natural images and a large number of classes like cifar and imagenet as used in the previous work such as ig 2 following the comment in 1 the prior works such as ig and deeplift have been used to analyze other types of models and been evaluated on other types of data such as genomics and neural machine translation in addition to images would the proposed approach also apply to these domains 3 as illustrated in figure 1 the key assumption of the proposed approach is that a gan model stargan is able to generate examples that are much closer to the input examples than the training examples ie minimum distance training sample under what conditions would such an assumption hold 4 following the comment in 3 it would be interesting to show and analyze some failure cases 5 in the proposed approach the stargan directly uses the already trained model classifier as its discriminator how if the stargan trains its own discriminator without using the model classifier 6 it would be interesting to show the hyperparameter different tradeoff lambdas sensitivity 7 i understood that the authors focused on onevsone explanations but i am interested to hear the authors thoughts on how to extend the proposed approach to onevsall explanations after rebuttal i thank the authors for the rebuttal i have also read the other reviewers comments unfortunately the rebuttal is unconvincing and sometimes vague i keep my original ratingdocsepthe paper claims to present a novel ganbased model explainability for generating onevsone explanations by incorporating tobeexplained classifier as part of the gan they use gans to produce a baseline image which is a realistic instance from a target class that resembles the original instance positive aspects a novel approach for generating onevsone explanation baselines leveraging gans the proposed approach improves the saliency maps for binary classifiers negative aspects the paper lacks clarity the approach is demonstrated on cherrypicking examples have doubts of its generalization capability please find below some of my concerns 1 your claim we use gans to produce a baseline image which is a realistic instance from a target class that resembles the original instance why do you need a gan why dont you use a network to generate a confusion matrix to analyze the performance of the classifier and based on this analysis you could explain why for instance the digit 0 is classified as a 6 2 related with the previous point your analysis is very limited you assume 0 is classified as 6 could 0 be classified as an 8 or 9 it is not clear from your analysis there are no comments on these cases looks like your examples to defend your approach are cherrypicked 3 i am not sure how to interpret figure 2 some other comments 1 the paper lacks novelty the authors contribution is not clear 2 the experimental validation is limited and not convincing the authors use just some simple datasets mnist svhn what about more complex datasets like cifar10 lsun etc could your approach explain the misclassification in these cases
### Summary:
this work investigates the choice of a baseline for attribution methods such a choice is important and can heavily influence the outcome of any analysis that involves attribution methods the work proposes doing 1 onevsone attribution in a sort of contrastive fashion 2 generating baselines using stargan the reviewers have brought out a number of valid concerns about this work 1 onevsone attribution appears to be novel and distinctive enough from the more prevalent onevsall formulations i am perhaps more optimistic than the reviewers that such a formulation is in fact useful but i can see where the hesitancy can come from 2 its not clear that the evaluation shows that the proposed method is in fact superior to the others all the reviewers touched upon this one way or another 3 somewhat simplistic datasets used for evaluation noted that there are cifar10 results in the rebuttal this was more borderline than the scores would indicate i thank the authors for the extensive replies and extra experiments i encourage them to incorporate more of the feedback and resubmit to the next suitable conference i do believe that doing experiments on imagenet like previous work does such as ig would be quite worthwhile and convincing i suspect the computational expense could be mitigated by reusing pretrained networks of which there are many available for imagenet specifically
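The reviews and meta-review above repeatedly turn on what a "baseline" is in attribution methods. As an illustration only — a minimal sketch, not the ganmex implementation; the model interface, tensor shapes, and function name are assumptions — the standard integrated-gradients estimator makes the role of the baseline explicit: it is the reference input that the interpolation path starts from, so a zero image and a GAN-generated target-class image differ only in the `baseline` argument below.

```python
import torch

def integrated_gradients(model, x, baseline, target_class, steps=64):
    """Riemann-sum approximation of integrated gradients.

    `model` is assumed to map a batch of inputs to class scores; `x` and
    `baseline` are single inputs of the same shape. Only `baseline` changes
    when switching from a zero reference to a generated target-class one.
    """
    # Interpolate between the baseline and the input along a straight path.
    alphas = torch.linspace(0.0, 1.0, steps).view(-1, *([1] * x.dim()))
    path = (baseline.unsqueeze(0) + alphas * (x - baseline).unsqueeze(0))
    path = path.detach().requires_grad_(True)

    # Gradient of the target-class score at every point on the path.
    scores = model(path)[:, target_class].sum()
    grads = torch.autograd.grad(scores, path)[0]

    # Average the path gradients and scale by the input-baseline difference.
    return (x - baseline) * grads.mean(dim=0)
```

Under the one-vs-one setup discussed in the reviews, `baseline` would come from a generator conditioned on the contrast class; the rest of the computation is unchanged, which is why the evaluation of the baseline itself is the contested point.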
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this work provides a unifying review of hardness measures for mdps that have appeared in previous rl theory bounds in tabular settings the authors have also developed a benchmark with easily estimable values for these hardness measures to be used for empirical investigation of rl theory performance of four standard algorithms are measured in the environments originality the work provides a unifying perspective on what has thus far been seen as disparate notions of hardness with qualitative comparison of the strengths and weaknesses of each i think that while no new notion of hardness is investigated here which would make this a very strong paper the perspective offered is novel and worthwhile the development of a standard tabular benchmark for rl theory is again a work of synthesis from previous papers while valuable for rl theory practitioners there is less novel insight here as these environments are well known quality the paper is clearly wellthought through and well constructed with unifying insight provided i think if the environments werent chosen from the literature but instead chosen such that the different aspects of hardness described in the paper were easier to control independently with respect to the different policy and environment parameters described in the paper the benchmark would lead to even more meaningful insight clarity the paper is easy to follow and plots and charts are easy to read the visualisations of the environments give good intuition as to their structure and the experiments give good characterisation of their hardness significance the paper is significant and wellpoised to enable future work in unified hardness and empirical evaluation of rl theory results as suggested above the insight here is mainly unifying with little specific novel contributions either in terms of environment or in terms of analysis new measures of hardness or previously unseen environments with useful properties would make the paper very strong docsepthe paper introduces colosseum a python package that allows empirical investigation of mdp hardness along estimation complexity and visitation complexity for tabular reinforcement learning it surveys various existing harndess measures and argues why many of these do not capture the above mentioned complexities properly the ones which come closest to capturing these are environmental value norm for estimation complexity and diameter for visitation complexity as i understand from the paper it also implements agents and implements a benchmark for what the authors claim are the four most widely studied tabular reinforcement learning settings which i believe are a episodic ergodic b episodic communicating c continuous ergodic d continuous communicating they also perform experiments examining the hardness measures under various changes to the mpds fig 1 as also with the agents in the mentioned settings tab 1 and fig 2 i believe they motivate the approach well and the benchmark is more comprehensive than existing benhcmark for tabular rl the quality of the woek also seems high they provide code and collect it in a single package which should be of great significance to the community esp but not just the tabular rl communtiy i assume the code quality is also good based on a quick walk through the jupyter notebook colosseummaintutorialipynb the analysis plots also look good however in respect to the clarity i feel like the paper would be much more suited to the journal format i feel like many of the questions i ask below are probably clarified in 
the appendix which i only glanced through however i feel like some of these rather belong in the main paper the motivations for the different mdp families was not described in the main paper there were some statements such as the ergodic settings seem generally slightly easier than the communicating settings but the evidence was not explicitly pointed out i felt like this in many places and it feels this was done to save space because of such spacesaving measures i feel the paper is better suited to a journal in various places the clarity can be improved eg the diameter also increases almost linearly with the number of states when prand is relatively small an approximately linear relationship can still be observed this was a little confusing because subfigures being referenced are not mentioned esp for the 2nd sentence i cant see the linear relationship when p is small in fig 1a because its not zoomed in enough what values of p were meant small sum of the reciprocals of the suboptimality gaps this measure of hardness is not particularly apt at capturing estimation complexity since it focuses solely on optimal policy identification it also underestimates the increase in hardness induced by an increase in visitation complexity i understand the arguments however in fig 1a1d tbh sum of the reciprocals of the suboptimality gaps actually seems the closest to the cumulative regret of the tuned nearoptimal agent could the authors please add some reasoning as to why this measure ends up being seemingly the closet to the cumulative regret of the tuned nearoptimal agent even though it is not suited either for visitation complexity or estimation complexity for qlearning and psrl figs 2b and 2c the diameter seems to have a generally smaller influence on the average cumulative regret i cant quite see this in the figures efficiently computable hardness measures adding a table with computational complexity of calculating the measures would be highly appreciated yes docsepthis paper first presents a survey of existing hardness measures and results from the mdp literature their main contribution is the introduction of colosseum which is a benchmark for empirically validating hardness results which they use to compare various existing hardness measures originality it seems to me that section 2 hardness in theory is a review of existing literature so the original contributions of this paper would be limited to the colosseum package and the empirical investigations provided for these the work is original as far as i can tell although some of the claims seem to be a bit inflated eg a pioneering python package the most exhaustive in tabular reinforcement learning invaluable analysis tools etc quality the authors have done a reasonably thorough survey of hardness literature and evaluated these measures using the various environments in their package there are a few issues regarding clarity and correctness that i include in the questions below the code for colosseum seems to be wellwritten and welldocumented which i consider to be a core part of this papers contribution clarity the paper is very well written and motivated reasonably well some of the plots and tables are hard to digest specifically its often not clear what is being said with them theres a lot going on in figure 1 and even more in table 1 and even though the main takeaways do seem to be discussed in the text theyre mostly lost in the paragraphs in page 7 it seems that the last sentence is the main takeaway for each hardness measure i would suggest 
highlighting these in a more streamlined manner to draw the readers attention directly to the takeaways and leave the descriptive text until after table 1 is a bit overwhelming its not clear what were supposed to be looking for i would also suggest rewriting this section so there are clear takeaways and insights for the readers currently it reads just as a verbal description of the many numbers in the table although it is claimed that in figure 2 there is generally a positive relationship between both of these hardness measures and the average cumulative regret it seems almost like points uniformly spread on the plane ie i dont see a clear relationship at all there are a few other issues i mention in the questions below significance this to me is the weakest point of the paper although i appreciate the authors effort to produce a nice package for benchmarking theoretical results its not clear how significant this will be empirical evaluations on toy environments for theoretical results are typically meant to highlight characteristics or subtleties of the theory introduced but are not the end goal in itself in particular whether the empirical results suggest sublinear or linear growth say does not in any way change the theoretical results thus it is not clear what the added value would be to have a theory benchmark something that i think could make this package more impactful is to try to go beyond tabular one suggestion would be to look at bsuite which the authors do cite as they include both tabular and continuous environments in particular it would be interesting to evaluate both as they may allow one to empirically investigate how the hardness measures vary when moving from tabular to larger systems i acknowledge that in nontabular systems it may not be possible to compute all of them in closed form but there may be approximations alternatively continuous variants of the tabular systems considered could provide a nice middle ground eg by smoothing out each tabular state another aspect that could increase the significance of this work is to evaluate nontabular methods for instance with linear function approximators a lot of rl theory does exist for linear approximators and tabular of course so it would be interesting to evaluate how the dynamics of the empirical evaluations change or not when moving from tabular to nontabular methods along these lines in line 366 it says the development of such measures is theoretically and empirically important it would be nice to provide some concrete examples such as a theoretical bound dependent on one of these hardness measures or something like that otherwise its not clear why these hardness measures are important some limitations are provided mostly related to future work it would be nice to have some discussion regarding the significanceimpact or lack thereof of this work in line with some of the comments i made above regarding significance no discussion of potential negative societal impact was provided docsepthis paper reviews and categorizes some hardness measures on tabular mdps and proposes a new tabular rl benchmark named colosseum that enables exact computation of these measures in empirical evaluations extensive experiments are conducted to assess the performance of existing tabular rl methods on the proposed benchmark that spans diverse environments focusing on different hardness measures strengths 1 the paper suggests an interesting viewpoint of connecting the theory and practice of tabular rl by taking into account the hardness 
measures that are typically used only in theoretical analysis when evaluating and comparing tabular rl algorithms 2 the colosseum benchmark is of high quality flexible and with sufficient docs and tutorials 3 the empirical evaluations are thorough 4 the paper is well organized and presented weaknesses 1 the benchmark and the hardness measures only apply to tabular rl in contrast most of the current empirical work in rl is devoted to the nontabular setting and there is also an increasing amount of theoretical work that explores nontabular rl 2 the theoretical claims made by this paper are somewhat vague see the following for details no concern
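As an illustration of the gap-based hardness measure discussed in the reviews above, the sketch below computes the sum of the reciprocals of the suboptimality gaps for a small randomly generated tabular MDP. The MDP itself, the use of discounted value iteration, and all names are assumptions made for the example; this is not the colosseum implementation.

```python
import numpy as np

# Toy tabular MDP (assumed shapes): P[a, s] is a distribution over next states, R[s, a] a reward.
n_states, n_actions, gamma = 6, 3, 0.95
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
R = rng.uniform(size=(n_states, n_actions))

# Value iteration to obtain Q* and V*.
Q = np.zeros((n_states, n_actions))
for _ in range(5000):
    V = Q.max(axis=1)
    Q_new = R + gamma * np.einsum("asn,n->sa", P, V)
    done = np.max(np.abs(Q_new - Q)) < 1e-12
    Q = Q_new
    if done:
        break
V = Q.max(axis=1)

# Suboptimality gaps gap(s, a) = V*(s) - Q*(s, a); the hardness proxy sums their reciprocals
# over the suboptimal state-action pairs (optimal actions have zero gap and are skipped).
gaps = V[:, None] - Q
suboptimal = gaps > 1e-8
print("sum of reciprocals of suboptimality gaps:", np.sum(1.0 / gaps[suboptimal]))
```

Measures such as the diameter would additionally require a shortest-path style computation for every pair of states, which is one reason the table of computational complexities requested above would be informative.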
### Summary: the reviewers opinions are quite consistent towards a weak accept im not confident with the big title hardness in markov decision processes theory and practice this paper is more like a survey benchmark review instead of a research article neither the theory part nor the practice part is novel enough as a research article its a bit thin as a survey paper i personally tend to weak reject but i respect the reviewers weak accept
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper conducts an extensive set of experiments on rws and compares it against a set of benchmarks such as gmm and iwae the main contribution of the paper is the fact revealed by these experiments that rws learns better models and inference networks with increasing numbers of particles and that its benefits extend to continuous latent variable models as well the performance of rws will increase significantly if we increase the number of particles the experimental part is written in an inspiring way and i enjoyed reading it however there should be stronger baselines incorporated for example https://arxiv.org/abs/1805.07445 also i think the authors could try to put more emphasis on the shortcomings of rws discovered by the gmm experiments and how defensive importance sampling fixes it there are several other parts in the paper that indicate interesting facts diving deeper into them could possibly lead to more interesting findings in all i would consider these comparison results important to be somewhere in the literature but because of its lack of rigorous analysis and explanation for the observations i personally think these observations alone are not novel enough to be an iclr paper docsepthis manuscript investigates the performance of the reweighted wakesleep rws framework for learning deep generative models with discrete latent variables it gives a clear introduction to variational autoencoder based models for scenarios with discrete latent variables including iwae and also models based on continuous relaxations of discrete variables the paper performs several experiments which suggest that rws is more appropriate for discrete latent variables than other methods such as iwae especially increasing the number of particles unlike iwae always enhances the performance of rws while this paper investigates an important problem and also offers interesting observations it lacks a rigorous analysis of why the rws performance is consistently better than iwae more precisely the propositions should be stated in more formal language and they should be accompanied by a minimal rigorous justification docsepmain idea this paper studies a problem of the importance weighted autoencoder iwae pointed out by rainforth 18 that is tighter lower bounds arising from increasing the number of particles improve the learning of the generative model but worsen the learning of the inference network the authors show that the reweighted wakesleep algorithm rws doesnt suffer from this issue moreover as an alternative to control variate schemes and the reparameterization trick rws doesnt suffer from high variance gradients thus it is particularly useful for discrete latent variable models to support the claim they conduct three experiments 1 on attend infer repeat a generative model with both discrete and continuous latent variables 2 on mnist with a continuous latent variable model 3 on a synthetic gmm clarity issues 1 branching has been used many times but afaik this seems not a standard terminology what do branching on the samples conditional branching branching paths mean 2 zeroforcing failure mode and deltaww i find this part difficult to follow for example the following sentence the inference network q(z|x) becomes the posterior for this model which in this model also has support at most 0 9 for all x however this failure mode seems an interesting finding and since deltaww outperforms other methods it deserves a better introduction questions 1 in fig 1 right how do you estimate kl(q(z|x) || p(z|x)) 2 in sec 4.2 why do you say iwae learns a better
model only up to a point k = 128 and suffers from diminishing returns afterwards 3 in fig 4 why ws doesnt achieve a better performance when k increases experiments 1 since the motivating story is about discrete latent variable models better baselines should be compared eg rbm dvae dvae vqvae etc 2 all experiments were either on mnist or on synthetic data at least one large scale experiment on discrete data should be made to verify the performance of rws
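For readers unfamiliar with the particle count k discussed throughout these reviews, the sketch below estimates the k-particle importance-weighted bound on log p(x) for a toy linear-Gaussian model with a deliberately imperfect inference network; the model, the mismatched q, and all names are assumptions made for illustration and this is not the code of the paper under review.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

rng = np.random.default_rng(0)
x = 1.3  # a single observation

def log_joint(x, z):   # assumed toy model: p(z) = N(0, 1), p(x | z) = N(z, 1)
    return norm.logpdf(z, 0.0, 1.0) + norm.logpdf(x, z, 1.0)

def log_q(z, x):       # a deliberately imperfect inference network q(z | x) = N(x / 2, 0.9^2)
    return norm.logpdf(z, x / 2.0, 0.9)

# K-particle bound: log p(x) >= E[ log (1/K) sum_k p(x, z_k) / q(z_k | x) ] with z_k ~ q(. | x).
for K in (1, 8, 64, 512):
    z = rng.normal(x / 2.0, 0.9, size=(2000, K))   # 2000 repeats to average out Monte Carlo noise
    log_w = log_joint(x, z) - log_q(z, x)          # unnormalised log importance weights
    bound = np.mean(logsumexp(log_w, axis=1) - np.log(K))
    print(f"K = {K:4d}   importance-weighted bound ~ {bound:.4f}")

print("true log p(x) =", norm.logpdf(x, 0.0, np.sqrt(2.0)))  # p(x) = N(0, 2) for this model
```

The bound tightens towards the true log marginal likelihood as k grows, which is the sense in which more particles help the generative model; the open question raised in the reviews is what the same change does to the learning signal of the inference network, which this sketch does not address.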
### Summary: the paper presents a well conducted empirical study of the reweighted wake sleep rws algorithm bornschein and bengio 2015 it shows that it performs consistently better than alternatives such as the importance weighted autoencoder iwae for the hard problem of learning deep generative models with discrete latent variables acting as a stochastic control flow the work is wellwritten and extracts valuable insights supported by empirical observations in particular the fact that increasing the number of particles improves learning in rws but hurts in iwae and the fact that rws can also be successfully applied to continuous variables the reviewers and ac note the following weaknesses of the work as it currently stands a it is almost exclusively empirical and while reasonable explanations are argued it does not provide a formal theoretical analysis justifying the observed behaviour b experiments are limited to mnist and synthetic data confirmation of the findings on largerscale realworld data and models would provide more complete and convincing evidence the paper should be made stronger on at least one and ideally both of these accounts
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper tackles a very challenging problem and provides a novel approach the authors have an indepth understanding of the related works and provide a detailed review the theoretical contributions of this paper are solid and the experiments are quite thorough the assumption of binary data seems a bit strict nonstationarity seems to be the most critical foundation of this paper and is worth more explanation and intuition like what is the connection between the number of segments and the number of observed and latent variables for model identification in the experiments what is the reason that you only run 3 times for each case i think one may need to provide the computational complexity of the proposed algorithm could you give more discussion about how to extend your method to address the discrete setting docsep1 this paper is well written and the authors are good at providing intuitive examples for further explanation and the binary ica problems they focused on including the identifiability and estimation methods are important and potentially useful 2 the authors found the nonidentifiability for the binary ica model in the twovariable case which is somewhat surprising but they showed empirically that the model is identifiable when the dimensionality is higher further they employed correlation identifiability to derive a practical algorithm for the estimation i think overall the authors did interesting research but i have some concerns listed below since the title of this paper is binary independent component analysis via nonstationarity i think that the authors would have used some information about nonstationarity eg the invariance to help estimate such a nonstationary model but the authors do not follow this direction it seems to me that the segment variable u or the number of segments nu is given as shown especially in algorithm 1 and the authors only estimate the binary ica model one segment after another thus it confuses me whether they focus on handling the nonstationarity problem or not i have some concerns listed below what are the relationships between the identifiability for the binary causal discovery model and the identifiability for the binary ica model it might be helpful to discuss the applications of the binary ica model more in the paper in page 2 the phrase add independent noise epsilon form mathcal n should read add independent noise epsilon from mathcal n docsepindependent component analysis via nonstationarity is an important issue the identifiability of the proposed model is discussed in detail and the proposed mle is efficient why use a specific link function like phi(sqrt(pi/8) y01) it seems that this setting is critical to the identifiability of the proposed model more motivation should be given about this setting docsepthe paper is well written with clear motivation and goals the presented model is simple but provides a sound answer to a practical problem the simulation study is convincing some parts of the methodology would benefit from further explanation especially how the nonstationary part is handled also the regularization step of the blica algorithm would benefit from further justification more generally it is better from a scientific perspective to discuss related work at the beginning of a contribution and not at the end as explained in the previous section it would be beneficial to give more details about the u component it might be clear to a reader familiar with the literature in the field but not to the general audience such as the uai community for instance two major
points could be clarified in practice are the segments predefined or do you need to estimate them from the data before applying ica how does the nonstationarity increase the model identifiability on a related topic figure 3 is difficult to understand it would be nice to have a short sentence recalling that a lower value of log10(1 - mcs) implies better performance and that a model is considered as identifiable for values below say 3 some details about blica could be better explained and justified the full mle approach does not seem intractable for the dimensionalities considered in the paper it might be necessary to use parallelization for reasonable computing time this might be impractical but would be very interesting to compare the performance loss by blica also for the full mle how would you parametrize the correlation matrix to ensure that its estimate satisfies the necessary properties for blica the so called regularization step seems a bit awkward first why is it called regularization in general in ml this term refers to penalty terms penalizing for the model complexity while in this context it seems to be closer to an attempt to project on the closest positive definite matrix second the pairwise estimates do not ensure at all that the matrix is well defined and that is the reason why such regularization is needed could the authors elaborate on why beyond empirical evidence this estimate is consistent minor comments p 2 c1 use left text and right in equations p5 c1 then we fit those correlation what do you mean p6 c1 maybe add a sentence to say that ivae is introduced to be used as benchmark reference
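The regularization step questioned above appears to be a projection of pairwise correlation estimates onto a valid correlation matrix; one simple way to realise such a projection is sketched below with eigenvalue clipping. The function name and the clipping scheme are assumptions made for illustration and are not necessarily what blica actually does.

```python
import numpy as np

def project_to_correlation(C, eps=1e-6):
    """Project a symmetric matrix of pairwise estimates onto a valid correlation matrix
    by clipping negative eigenvalues and rescaling to a unit diagonal (illustrative only)."""
    C = (C + C.T) / 2.0                 # symmetrise
    w, V = np.linalg.eigh(C)
    w = np.clip(w, eps, None)           # drop negative eigenvalues
    C_psd = (V * w) @ V.T               # V diag(w) V^T
    d = np.sqrt(np.diag(C_psd))
    return C_psd / np.outer(d, d)       # restore unit diagonal

# Pairwise estimates that are each admissible on their own but jointly inconsistent.
C_hat = np.array([[ 1.0,  0.9, -0.9],
                  [ 0.9,  1.0,  0.9],
                  [-0.9,  0.9,  1.0]])
print(np.linalg.eigvalsh(C_hat))        # one negative eigenvalue: not a valid correlation matrix
print(project_to_correlation(C_hat))
```

The example matrix illustrates exactly the situation the reviewer points out: pairwise estimation can produce a matrix that is not positive semidefinite, which is why some repair step is needed before the matrix can be used downstream.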
### Summary: meta review i had trouble with this paper and i have to say that i am more skeptical than the reviewers who were generally positive some of the concerns were raised by the reviewers and everybody seemed happy after the rebuttal so i will not push this any further although i expect that the authors can clarify considerably eg on using segments in their setup my main concern is that the title is misleading in many ways first it suggests that nonstationarity is handled in some special way in this paper but it is not second such a general title binary ica suggests that they came up with a canonical way of dealing with ica for binary data however their approach is very special the choice of this specific link function is not well motivated in the paper it is in the replies i would add to that that in the classical ica the mixing matrix has a concrete physical interpretation but here such an interpretation is missing this is of course an easily fixable concern and i hope the authors will adjust their title my other concern is about the identifiability results i am not saying that the results are wrong but that the authors do not have a good understanding of the identifiability issue in this particular scenario and here the paper looks underdeveloped but after addressing the comments of the reviewers and a title adjustment i think this could be an interesting paper
41364,
390,
513,
368,
878,
281,
6642,
731,
432,
253,
941,
1078,
9433,
209,
3737,
849,
1057,
253,
1327,
20502,
79,
15752,
2572,
253,
1566,
1548,
18279,
1430,
50276,
251,
247,
2905,
9400,
4677,
495,
310,
2834,
281,
2096,
352,
651,
320,
5322,
281,
452,
247,
2159,
6197,
43800,
326,
2406,
1318,
273,
2412,
6903,
78,
6113,
8018,
1805,
3045,
285,
326,
247,
1566,
310,
2783,
347,
38640,
323,
2193,
2708,
1333,
495,
50276,
8826,
4278,
670,
270,
663,
66,
812,
320,
1805,
5544,
50276,
6309,
1245,
50275,
783,
2120,
278,
282,
2746,
1057,
417,
1646,
540,
44374,
323,
253,
15759,
1005,
2783,
275,
253,
2929,
352,
1537,
320,
3309,
281,
897,
7529,
1320,
323,
5272,
12672,
673,
436,
1537,
320,
45783,
533,
651,
320,
1077,
4722,
281,
7277,
253,
3045,
2957,
407,
270,
663,
66,
50276,
12563,
323,
253,
2120,
278,
282,
849,
651,
368,
30364,
363,
2721,
253,
5921,
4315,
281,
20096,
326,
697,
8197,
12310,
253,
3309,
3607,
50276,
1542,
270,
663,
66,
253,
594,
1925,
37820,
3213,
3133,
247,
2372,
19328,
806,
2139,
310,
352,
1925,
37820,
50276,
249,
2087,
275,
13361,
436,
1307,
10770,
281,
12339,
2426,
29697,
3006,
323,
253,
1566,
10454,
1223,
275,
436,
3634,
352,
3133,
281,
320,
8003,
281,
271,
3177,
281,
2199,
327,
253,
8642,
2762,
19040,
4315,
1273,
253,
28208,
8197,
513,
417,
5416,
387,
512,
326,
253,
4315,
310,
973,
2931,
285,
326,
310,
253,
1921,
2139,
824,
37820,
310,
3058,
812,
253,
4477,
21184,
327,
2139,
4457,
16774,
1941,
436,
6642,
310,
5185,
50275,
37585,
5701,
50276,
81,
374,
260,
18,
897,
1669,
2505,
285,
50276,
918,
275,
7424,
50276,
81,
22,
260,
18,
840,
359,
4944,
1110,
5921,
752,
513,
368,
1599,
50276,
81,
23,
260,
18,
5046,
823,
247,
6197,
281,
1333,
326,
21983,
3348,
310,
5611,
281,
320,
908,
347,
22791,
3806,
2490,
187,
4118,
18435,
27,
13518,
2278,
891,
574,
7596,
342,
436,
2929,
285,
891,
452,
281,
1333,
326,
891,
717,
625,
33872,
685,
253,
30628,
665,
497,
3839,
2762,
690,
273,
253,
7350,
497,
5439,
407,
253,
30628,
285,
11648,
4455,
5211,
846,
253,
30080,
22559,
594,
891,
588,
417,
7450,
436,
667,
2007,
3738,
891,
1902,
326,
253,
4477,
476,
19148,
10665,
24088,
327,
970,
13288,
275,
616,
9978,
50275,
2577,
2022,
4468,
310,
326,
253,
4060,
310,
24363,
275,
1142,
4088,
806,
352,
5936,
326,
1327,
20502,
15752,
310,
15726,
275,
690,
2714,
1039,
275,
436,
2929,
533,
352,
310,
417,
1273,
824,
247,
2087,
4060,
8985,
209,
3737,
5936,
326,
597,
2210,
598,
342,
247,
15516,
1039,
273,
10620,
342,
209,
3737,
323,
8985,
941,
2299,
597,
2746,
310,
1077,
2714,
253,
4327,
273,
436,
2173,
3048,
1159,
310,
417,
973,
17194,
275,
253,
2929,
352,
310,
275,
253,
32114,
891,
651,
823,
281,
326,
326,
275,
253,
8946,
209,
3737,
253,
12480,
4315,
556,
247,
11859,
3520,
7914,
533,
1060,
824,
271,
7914,
310,
5816,
436,
310,
273,
2282,
271,
4354,
4993,
494,
4468,
285,
891,
3524,
253,
4477,
588,
4575,
616,
4060,
50276,
2577,
643,
4468,
310,
670,
253,
1548,
18279,
1430,
1543,
891,
717,
417,
3981,
326,
253,
1543,
403,
3430,
533,
326,
253,
4477,
513,
417,
452,
247,
1175,
4685,
273,
253,
1548,
18279,
1430,
2523,
275,
436,
1798,
10076,
285,
1060,
253,
2929,
4453,
762,
35208,
533,
846,
15974,
253,
5701,
273,
253,
30628,
285,
247,
4060,
14000,
891,
1158,
436,
812,
320,
271,
4722,
2929,
209
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
783,
2929,
39223,
247,
1077,
11132,
1895,
285,
3400,
247,
4460,
2746,
50276,
783,
4477,
452,
271,
801,
554,
394,
4685,
273,
253,
2905,
2987,
285,
2085,
247,
7000,
2278,
50276,
783,
10527,
9021,
273,
436,
2929,
403,
4891,
285,
253,
4679,
403,
3240,
11080,
50275,
783,
9376,
273,
8985,
941,
3133,
247,
2372,
7654,
1327,
20502,
15752,
3133,
281,
320,
253,
954,
4619,
12153,
273,
436,
2929,
285,
310,
4409,
625,
8813,
285,
30328,
751,
752,
310,
253,
4602,
875,
253,
1180,
273,
13288,
285,
253,
1180,
273,
2540,
13324,
290,
4903,
323,
1566,
8137,
50274,
249,
253,
4679,
752,
310,
253,
1921,
326,
368,
760,
1408,
495,
2069,
323,
1016,
1083,
891,
1158,
581,
778,
878,
281,
2085,
253,
15180,
10454,
273,
253,
4081,
5933,
50275,
16534,
368,
1918,
625,
5955,
670,
849,
281,
9017,
634,
1332,
281,
2953,
253,
13358,
4758,
50276,
7152,
33032,
18,
436,
2929,
310,
973,
3542,
285,
253,
4477,
403,
1175,
387,
5277,
27350,
6667,
323,
2007,
8813,
285,
253,
8985,
209,
3737,
3237,
597,
7106,
327,
1690,
253,
1548,
18279,
1430,
285,
13418,
3082,
403,
1774,
285,
7826,
4217,
50275,
19,
253,
4477,
1119,
253,
1327,
888,
18279,
1430,
323,
253,
8985,
209,
3737,
1566,
275,
253,
767,
18645,
1083,
534,
310,
8489,
10084,
533,
597,
2692,
45190,
326,
253,
1566,
310,
38640,
672,
253,
7877,
1319,
310,
2169,
2007,
597,
7091,
5921,
1548,
18279,
1430,
281,
15313,
247,
8542,
5933,
323,
253,
13418,
891,
1158,
4583,
253,
4477,
858,
4722,
2561,
533,
891,
452,
690,
7350,
7117,
2708,
50276,
17480,
253,
4060,
273,
436,
2929,
310,
8985,
3907,
4445,
1783,
3066,
1327,
20502,
15752,
891,
1158,
326,
253,
4477,
651,
452,
908,
690,
1491,
670,
1327,
20502,
15752,
24088,
253,
31429,
281,
1361,
6642,
824,
247,
1327,
20502,
552,
1566,
533,
253,
4477,
513,
417,
956,
436,
3884,
352,
3133,
281,
479,
326,
253,
8223,
4778,
1484,
390,
253,
1180,
273,
13288,
8794,
310,
1677,
347,
2011,
3340,
275,
5933,
337,
285,
253,
4477,
760,
6642,
253,
8985,
209,
3737,
1566,
275,
581,
8223,
407,
1529,
581,
3021,
352,
651,
40678,
479,
1880,
597,
2770,
327,
10885,
253,
1327,
20502,
15752,
1895,
390,
417,
50275,
74,
452,
690,
7350,
7117,
2708,
50276,
5371,
403,
253,
7688,
875,
253,
1548,
18279,
1430,
323,
253,
8985,
19349,
8900,
1566,
285,
253,
1548,
18279,
1430,
323,
253,
8985,
209,
3737,
1566,
352,
1537,
320,
9371,
281,
2319,
625,
253,
4893,
273,
253,
8985,
209,
3737,
1566,
275,
253,
2929,
50275,
249,
3239,
374,
823,
3907,
6046,
299,
4277,
50276,
630,
14168,
1179,
79,
50276,
1911,
3907,
6046,
432,
14168,
1179,
79,
5474,
33032,
17777,
4445,
1783,
3066,
1327,
20502,
15752,
310,
271,
1774,
2523,
50276,
783,
50275,
888,
18279,
1430,
50276,
1171,
253,
4081,
1566,
403,
5469,
275,
2508,
285,
253,
4081,
278,
282,
310,
5919,
50274,
22309,
897,
247,
2173,
3048,
1159,
50276,
3022,
815,
261,
2274,
81,
900,
332,
854,
340,
520,
50274,
262,
3133,
326,
436,
4758,
50276,
261,
4619,
281,
253,
1548,
18279,
1430,
273,
253,
4081,
1566,
50276,
3062,
16038,
943,
320,
1677,
670,
436,
4758,
50273,
7152,
339,
431,
248,
2929,
310,
973,
3542,
342,
2590,
16038,
285,
7342,
253,
3559,
1566,
310,
2969,
533,
3400,
247,
3590,
3662,
281,
247,
8542,
1895,
253,
9864,
1263,
310,
21414,
50276,
8826,
629,
273,
253,
16182,
651,
5649,
432,
432,
2007,
8813,
3340,
849,
253,
1327,
20502,
552,
629,
310,
15726,
671,
253,
37820,
3213,
273,
253,
270,
663,
66,
5933,
651,
5649,
432,
2007,
22861,
625,
3839,
352,
310,
1805,
432,
247,
8249,
8668,
281,
2319,
2905,
789,
387,
253,
5068,
273,
247,
7680,
285,
417,
387,
253,
990,
50276,
284,
5544,
275,
253,
2045,
2593,
352,
651,
320,
12912,
281,
1918,
625,
4278,
670,
253,
1484,
4445,
352,
1537,
320,
2590,
281,
247,
9414,
7615,
342,
253,
6239,
275,
253,
1673,
533,
417,
281,
253,
2087,
8446,
824,
347,
253,
1484,
2284,
3114,
323,
4227,
767,
2201,
2792,
812,
320,
31637,
275,
3946,
403,
253,
13288,
41364,
390,
513,
368,
878,
281,
6642,
731,
432,
253,
941,
1078,
9433,
209,
3737,
849,
1057,
253,
1327,
20502,
79,
15752,
2572,
253,
1566,
1548,
18279,
1430,
50276,
251,
247,
2905,
9400,
4677,
495,
310,
2834,
281,
2096,
352,
651,
320,
5322,
281,
452,
247,
2159,
6197,
43800,
326,
2406,
1318,
273,
2412,
6903,
78,
6113,
8018,
1805,
3045,
285,
326,
247,
1566,
310,
2783,
347,
38640,
323,
2193,
2708,
1333,
495,
50276,
8826,
4278,
670,
270,
663,
66,
812,
320,
1805,
5544,
50276,
6309,
1245,
50275,
783,
2120,
278,
282,
2746,
1057,
417,
1646,
540,
44374,
323,
253,
15759,
1005,
2783,
275,
253,
2929,
352,
1537,
320,
3309,
281,
897,
7529,
1320,
323,
5272,
12672,
673,
436,
1537,
320,
45783,
533,
651,
320,
1077,
4722,
281,
7277,
253,
3045,
2957,
407,
270,
663,
66,
50276,
12563,
323,
253,
2120,
278,
282,
849,
651,
368,
30364,
363,
2721,
253,
5921,
4315,
281,
20096,
326,
697,
8197,
12310,
253,
3309,
3607,
50276,
1542,
270,
663,
66,
253,
594,
1925,
37820,
3213,
3133,
247,
2372,
19328,
806,
2139,
310,
352,
1925,
37820,
50276,
249,
2087,
275,
13361,
436,
1307,
10770,
281,
12339,
2426,
29697,
3006,
323,
253,
1566,
10454,
1223,
275,
436,
3634,
352,
3133,
281,
320,
8003,
281,
271,
3177,
281,
2199,
327,
253,
8642,
2762,
19040,
4315,
1273,
253,
28208,
8197,
513,
417,
5416,
387,
512,
326,
253,
4315,
310,
973,
2931,
285,
326,
310,
253,
1921,
2139,
824,
37820,
310,
3058,
812,
253,
4477,
21184,
327,
2139,
4457,
16774,
1941,
436,
6642,
310,
5185,
50275,
37585,
5701,
50276,
81,
374,
260,
18,
897,
1669,
2505,
285,
50276,
918,
275,
7424,
50276,
81,
22,
260,
18,
840,
359,
4944,
1110,
5921,
752,
513,
368,
1599,
50276,
81,
23,
260,
18,
5046,
823,
247,
6197,
281,
1333,
326,
21983,
3348,
310,
5611,
281,
320,
908,
347,
22791,
3806,
2490,
187,
4118,
18435,
27,
13518,
2278,
891,
574,
7596,
342,
436,
2929,
285,
891,
452,
281,
1333,
326,
891,
717,
625,
33872,
685,
253,
30628,
665,
497,
3839,
2762,
690,
273,
253,
7350,
497,
5439,
407,
253,
30628,
285,
11648,
4455,
5211,
846,
253,
30080,
22559,
594,
891,
588,
417,
7450,
436,
667,
2007,
3738,
891,
1902,
326,
253,
4477,
476,
19148,
10665,
24088,
327,
970,
13288,
275,
616,
9978,
50275,
2577,
2022,
4468,
310,
326,
253,
4060,
310,
24363,
275,
1142,
4088,
806,
352,
5936,
326,
1327,
20502,
15752,
310,
15726,
275,
690,
2714,
1039,
275,
436,
2929,
533,
352,
310,
417,
1273,
824,
247,
2087,
4060,
8985,
209,
3737,
5936,
326,
597,
2210,
598,
342,
247,
15516,
1039,
273,
10620,
342,
209,
3737,
323,
8985,
941,
2299,
597,
2746,
310,
1077,
2714,
253,
4327,
273,
436,
2173,
3048,
1159,
310,
417,
973,
17194,
275,
253,
2929,
352,
310,
275,
253,
32114,
891,
651,
823,
281,
326,
326,
275,
253,
8946,
209,
3737,
253,
12480,
4315,
556,
247,
11859,
3520,
7914,
533,
1060,
824,
271,
7914,
310,
5816,
436,
310,
273,
2282,
271,
4354,
4993,
494,
4468,
285,
891,
3524,
253,
4477,
588,
4575,
616,
4060,
50276,
2577,
643,
4468,
310,
670,
253,
1548,
18279,
1430,
1543,
891,
717,
417,
3981,
326,
253,
1543,
403,
3430,
533,
326,
253,
4477,
513,
417,
452,
247,
1175,
4685,
273,
253,
1548,
18279,
1430,
2523,
275,
436,
1798,
10076,
285,
1060,
253,
2929,
4453,
762,
35208,
533,
846,
15974,
253,
5701,
273,
253,
30628,
285,
247,
4060,
14000,
891,
1158,
436,
812,
320,
271,
4722,
2929,
209
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper studies the sample complexity for learning heuristic functions for gbfsa search on a graph with a fixed number of nodes n the analysis uses pac learning framework and the main results show the upper and lower bound of pseudo dimensions of a class of utility functions in which each utility function associates a search task to a scalar value between 0 and h the paper also continues to provide upper bounds on the expectation of gaps between the optimal costs and the suboptimal costs where the expectation was taken over the search task sampled from some distribution d and the bounds are given in terms of the number of samples and the number of nodes strengths are mathematical analysis of the sample complexity for learning heuristic functions for graph search tasks using gbfsa weaknesses are that this analysis emphasizes theoretical aspects and missing practical implications of the upper bounds i think this work is not relevant to this section docsepthe paper presents bounds on the sample complexity required for learning heuristic functions to guide greedy bestfirst search and a search for solving the shortest path problem in a given graph the classical approach to bestfirst search and heuristic search in general is to provide it with a handcrafted heuristic which is typically obtained by solving a relaxed version of the original problem in order to guide it more effectively towards the optimal solution however more recent work aims to learn the guiding heuristic directly from some training data which could be more appealing in some cases therefore deriving bounds on how much data is required to learn a heuristic function with certain guarantees is called for the paper is fairly well written and organised the quality of the presentation is overall very good and therefore the paper is relatively easy to follow most of the concepts and technical details are introduced and discussed if a fairly clear manner i think the paper needs a more detailed running example otherwise its not very easy to follow the details especially for a reader whos not very familiar with this research area minor comments definition 1 there is a typo hyi geq ti instead of hyi geq zi see above docsepthis theoretical paper presents sample complexity bounds for learning heuristics for a and bestfirst search it shows an onlogn upper bound on the pseudo dimension of bfs and on2logn for a with omegan lower bounds for both it shows that the upper bounds are nearly tight but can be improved for a when bounding edge weights and variable degrees moreover when learning a potentially suboptimal heuristic function the paper gives an upper bound on the suboptimality the paper is relatively straightforward in the sense that it gives clear questions and clear answers it is well written and explains the weaknesses of the results namely the relatively big gap between the bounds on the pseudodimension of a as well as give some explanation why it is hard to bridge them i dont see any major weaknesses i would suggest that another interesting direction here is looking at a for planning the graph is obviously exponentially large so the bounds here are useless but it has a compact representation eg the strips model could some heuristics be learned efficiently in that setting typos etc defn you use t1 tn for the values in the text and z1zn in the formula 107 disrtibution 154 gaurantees no direct societal impact
### Summary: | strong paper studying the sample complexity of learning heuristic functions for gbfs and a the reviewers were especially impressed with the theoretical results and find the paper a worthwhile contribution to this conference | [
[709 integer token ids] | [709 ones (mask)] | [709 integer token ids, matching the first array]
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper proposes a compression method for transformerbased encoderdecoder or language models the key idea of the proposed method is to decompose the standard parameters into a much smaller shared parameter matrix and independent parameters for each original matrix then the method can approximately recover the original transformer models by simple additions and multiplications the experiments are conducted on three mt tasks one summarization task and one language modeling task experimental results show that the proposed method seems to reduce model sizes and computations successfully while preventing considerable performance degradation in some cases the proposed method appears to improve the performance the idea of the proposed method is interesting but there are a few concerns in terms of the presentation therefore it is hard to judge whether this paper has enough contribution for publishing as the conference paper the following are my concerns in the current version 1 technical novelty the idea of the proposed method is interesting and might be effective however the idea itself of sharing the parameter is not very innovative i think that sharing parameters for compressing dnns is a standard technique nowadays therefore the authors need to clarify the contributions of the proposed method such as the unique properties that previous similar compression methods cannot achieve currently i do not find any strong properties in the proposed method if my understanding is correct the proposed method is a reconstruction method therefore we need a trained model for applying the proposed method this means the proposed method requires additional computation i do not fully understand why this paper compares the computational cost with the standard transformer 2 notation and equation the notations are incredibly messy and hard to understand the authors need to make notations much simpler for better understanding to readers 3 l1 constraint if my understanding is correct the relaxed l1 constraint does not guarantee to find the solution that satisfies the threshold of nonzero factors this paper seems not to explain the way if such a situation occurs in the solution 4 typo or misconfiguration in table 1 it says the results for wmt deen and wmt fren however at the beginning of section 4 the experiments are conducted on wmt ende and enfr which are not deen and fren 5 confirmation of model sizes according to the original transformer paper 1 the numbers of parameters of transformer base and large are 64m and 213m respectively however in the experiments the model size of the baseline transformer is 36m as shown in table 1 for wmt ende moreover i checked the previous paper such as the lite transfomer paper wu et al 2020 and the pay less attention paper wu et al 2019 however i could not find the precise experimental settings used in this paper i recommend clearly showing the model configurations and hyperparameter settings for keeping reproducibility otherwise the reproducibility of the proposed method may not be sufficient 1 vaswani et al attention is all you need in proc of nips2017 6 inconsistent results in table 1 and 3 i thought that the ablation study of table 3 is based on the results settings of table 1 however the numbers of parameters shown in tables 1 and 3 differ entirely so i do not understand the meaning of the ablation study in table 3 please confirm it and clarify the configuration difference between tables 1 and 3 moreover explain the results of the baseline transformer and the proposed method 
corresponding to the ablation results in table 3 additionally it seems that there is no description about what the improvedembedded is shown in table 3 if i miss the description please let me know if the paper lacks explanation this can be an apparent problem for this paper in terms of completeness the idea of the proposed method is interesting and might be effective however the idea itself of sharing the parameter is not very innovative and rather incremental experimental settings are ambiguous and seems to use very weak settings docsepthis paper describes a technique for reducing the size and computation of a transformer model by projecting and factoring weight matrices experiments on mt summarization and language modeling show improved results over competing techniques and even over standard transformers despite using significantly fewer parameters and less computation the paper contains a lot of substance but it is very dense and hard to follow the core dictionary technique isnt really explained at a high level before the paper plunges into the details it seems to be something like the approach in 1 but its difficult to be sure i gave up on section 3 after a while the results in section 5 are very impressive but some intuition about why a compressed approach like this could beat a much larger baseline on large data settings really need to be provided 1 kaiser lukasz et al fast decoding in sequence models using discrete latent variables international conference on machine learning pmlr 2018 details the first line of research would be good to add a word or two saying how these papers reduce computational complexity figure 1 is really great but you should say where these stats come from figures 2 and 3 captions crash into text this is hard to understand in this paper the params omit word embedding size that would highly dependent on the sentence length and would significantly differ for various tasks the total params in this paper includes the model size of word embedding its difficult to align table 3 with figure 5 you should include a line corresponding to the point in figure 5 with highest bleu higher than anything that appears in table 3 potentially a great paper but if so it deserves to be much better explained docsepthis work proposes a modification of the original transformer architecture by replacing attention layers and layers in its feedforward networks across all of its blocks with learned shared dictionaries the proposed model called dictformer has a smaller number of parameters and uses a smaller amount of computational operations when compared to the original transformer and some of its variations when evaluated against these models on popular machine translation summarization and language modeling benchmarks dictformer achieves comparable or better performance strengths the proposed modification to the transformer architecture reduces the number of model parameters and computational operations while sustaining competitive performance on various downstream tasks to the best of my knowledge the idea of replacing layers of the transformer with shared dictionaries is novel room for improvement shareddictionary attention i might be missing something but why is it stated that the unshared linear projection tildewiqj is approximately equal to wiqj my understanding is that this is not directly optimized for in the model groupwise shared dictionary ffn the motivation behind dividing columns of the dictionary into groups is a bit unclear what is meant by highquality performance of 
the shared dictionary projection also have the authors considered using a larger number of dictionary elements m to increase the flexibility of the model how is the number of groups g determined training the dictformer since the sparse matrix z is initialized using values in c how are coefficients c initialized results tables missing confidence intervals were the experiments run with multiple seeds suggested related work how is this work related to work on sparse transformers eg 1 2 or fixed attention such as 3 4 1 child r gray s radford a sutskever i generating long sequences with sparse transformers arxiv preprint arxiv190410509 2019 apr 23 2 correia gm niculae v and martins af 2019 adaptively sparse transformers arxiv preprint arxiv190900015 3 you w sun s and iyyer m 2020 hardcoded gaussian attention for neural machine translation arxiv preprint arxiv200500742 4 raganato a scherrer y and tiedemann j 2020 fixed encoder selfattention patterns in transformerbased machine translation arxiv preprint arxiv200210260 additional questions is it necessary to have the dictionary size less than the embedding size namely m d would it not be feasible to have a large dictionary m d but keep the number of selected components t small ie t d through a sparsity constraint have the authors tracked whether all columns of the dictionaries are used in practice have the authors tracked what percentage of the t coefficients are nonzero on average nitpicks typos p 2 first line few unshared linear projections p 3 overview paragraph given an accuracy threshold p 4 paragraph starting with the reason why cix should not x be capitalized p 5 groupwise shareddictionary ffn paragraph a n d times d weights n weights of size d times d p 6 figure 4 training sparse coefficients we train sparse coefficients p 6 first sentence of training dictformer via constraints and relaxation paragraph linear projections of a dictionary p 7 last paragraph of architecture and evaluation paragraph switch first sentence to present tense total params in p 8 machine translation paragraph dictformer obtains more compact p 8 sensitive study paragraph rename to sensitivity study p 9 first paragraph coefficient size is fixed to 60 p 9 ablation paragraph first sentence missing space after period p 9 we will release code and data is there data to be released the proposed modification to the transformer architecture is novel and i believe would be interesting for the community but the methodology and motivation could be explained more clearly and provided with more context including more details on the hyperparameter selection and on how the dictformer is trained the experimental results would be even more convincing if confidence intervals are provided updates during paper discussion based on the authors responses to the reviewers questions and updates to the manuscript including clarifying some of their methodology and statements and including confidence intervals in the results section ive decided to increase my score docsepthe authors proposed an efficient transformer layer based on a dictionary of shared parameters instead of standard selfattention the goal is to reduce redundant parameters in transformer models the main contributions are a lite transformer model modification of the selfattention parameters and evaluation on language dowstream tasks the proposed transformer model outperforms related work on the machine translation and language modelling tasks strengths clear description of background knowledge clear exposition of the proposed 
model the authors perform a comprehensive comparison on different downstream tasks such as machine translation summarization and language modeling the findings show that the proposed transformer model outperforms related work on the machine translation and language modeling tasks weaknesses it is not clear how the initialisation of hyperparameters affects model performance questions to the authors please address the following questions during the rebuttal does parameter initialization could affect model performance a possible extra contribution is to perform multiple random runs and report variance however how expensive could this exercise become please speculate on how attention representations behave across layers for example in abnar and zuidema quantifying attention flow in transformers or voita et al the bottomup evolution of representations in the transformer a study with machine translation and language modeling objectives by using other pretraining objectives in the langgue modelling task eg next sentence would it change any finding or results i recommend acceptance given that the paper clearly describes related work and proposed model the authors proposed an efficient transformer model that can be trained with less resources the authors perform an evaluation of the proposed model with different language downstream tasks and the model outperforms related work on machine translation and language modelling
### Summary: | dictformer is a method to reduce the redundancy in transformers so they can deployed on edge devices in the method a shared dictionary across layers and unshared coefficients are used in place of weight multiplications the author proposed a l1 relaxation to train the nondifferentiable objective to achieve both higher performance and lower parameter counts all reviewers ended up giving the paper a score of 6 after increasing their scores during discussions while the results are strong better performance at much lower parameter counts the paper is not clearly written several reviewers noted that the paper is difficult to understand and has a few unresolved points for example the method also ended up performing better than the base transformer model that dictformer is supposed to compress there seems to be a lack of understanding about what part of the model delivered the improvements one reviewer said that this is potentially a great paper that deserves to be better explained the basic concept of sharing a dictionary across layers should be simple enough to explain well and deserve a better explanation than eq 5 the authors promise to release the code which would be necessary for a full dissemination of this work i recommend accept | [
[2,048 integer token ids] | [ones (mask)]
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
604,
824,
247,
4112,
6634,
275,
253,
2900,
50274,
21,
1745,
80,
390,
3731,
24693,
50276,
249,
2829,
337,
352,
2296,
253,
1543,
323,
259,
6917,
372,
257,
285,
259,
6917,
28080,
2299,
387,
253,
5068,
273,
2593,
577,
253,
4679,
403,
5196,
327,
259,
6917,
19072,
285,
546,
925,
534,
403,
417,
372,
257,
285,
28080,
50273,
22,
16883,
273,
1566,
9552,
50275,
35861,
281,
253,
3236,
39707,
2929,
337,
253,
3904,
273,
3602,
273,
39707,
2613,
285,
1781,
403,
6705,
78,
285,
25098,
78,
2975,
2299,
275,
253,
4679,
253,
1566,
1979,
273,
253,
8245,
39707,
310,
5540,
78,
347,
2011,
275,
2829,
337,
323,
259,
6917,
19072,
50276,
3062,
1189,
891,
10141,
253,
2045,
2929,
824,
347,
253,
298,
614,
47415,
8056,
2929,
259,
86,
1162,
355,
9169,
285,
253,
2075,
1679,
4116,
2929,
259,
86,
1162,
355,
6247,
2299,
891,
812,
417,
1089,
253,
10799,
5661,
7533,
908,
275,
436,
2929,
50276,
74,
5583,
4518,
4645,
253,
1566,
16012,
285,
4373,
19484,
7533,
323,
7562,
38041,
5010,
253,
38041,
273,
253,
4081,
1332,
778,
417,
320,
4209,
50276,
18,
16016,
88,
6451,
1162,
355,
4116,
310,
512,
368,
878,
275,
15613,
273,
295,
2824,
7132,
50272,
23,
16706,
1543,
275,
2829,
337,
285,
495,
50276,
74,
1869,
326,
253,
28913,
1263,
273,
2829,
495,
310,
1754,
327,
253,
1543,
7533,
273,
2829,
337,
2299,
253,
3904,
273,
3602,
2011,
275,
7180,
337,
285,
495,
9184,
7094,
594,
891,
513,
417,
2096,
253,
4495,
273,
253,
28913,
1263,
275,
2829,
495,
4496,
6583,
352,
285,
19148,
253,
6661,
3064,
875,
7180,
337,
285,
495,
50276,
3062,
1189,
5513,
253,
1543,
273,
253,
8245,
39707,
285,
253,
4081,
1332,
3969,
281,
253,
28913,
1543,
275,
2829,
495,
50275,
29483,
595,
352,
3133,
326,
627,
310,
642,
5740,
670,
752,
253,
5520,
40964,
310,
2011,
275,
2829,
495,
604,
891,
2985,
253,
5740,
4496,
1339,
479,
871,
604,
253,
2929,
19756,
8813,
436,
476,
320,
271,
5165,
1895,
323,
436,
2929,
275,
2426,
273,
29867,
50264,
783,
2934,
273,
253,
4081,
1332,
310,
4722,
285,
1537,
320,
3576,
2299,
253,
2934,
3139,
273,
9628,
253,
4764,
310,
417,
1077,
16694,
285,
2581,
32809,
5661,
7533,
403,
23851,
285,
3133,
281,
897,
1077,
5075,
7533,
50276,
7152,
33032,
2520,
2929,
8631,
247,
5853,
323,
8493,
253,
1979,
285,
13782,
273,
247,
39707,
1566,
407,
35104,
285,
2803,
272,
2801,
12624,
4679,
327,
26301,
10405,
1320,
285,
3448,
14053,
921,
5520,
1543,
689,
11771,
5609,
285,
1014,
689,
2629,
4979,
398,
5747,
970,
3012,
11184,
3602,
285,
1679,
13782,
50276,
783,
2929,
4428,
247,
2257,
273,
10359,
533,
352,
310,
1077,
14086,
285,
1892,
281,
956,
253,
5161,
19034,
5853,
310,
2649,
1663,
5544,
387,
247,
1029,
1268,
1078,
253,
2929,
31314,
265,
715,
253,
4278,
352,
3133,
281,
320,
1633,
751,
253,
2746,
275,
337,
533,
697,
2834,
281,
320,
2119,
891,
3534,
598,
327,
2593,
495,
846,
247,
1223,
253,
1543,
275,
2593,
608,
403,
1077,
13943,
533,
690,
30328,
670,
2139,
247,
21012,
2746,
751,
436,
812,
7171,
247,
1199,
4067,
8245,
327,
1781,
941,
7533,
1663,
878,
281,
320,
2530,
50276,
18,
465,
34393,
298,
2788,
284,
91,
1162,
355,
3809,
28490,
275,
3425,
3210,
970,
13358,
21624,
4903,
5213,
8059,
327,
5145,
4715,
268,
1686,
83,
4765,
50276,
23454,
50276,
783,
806,
1386,
273,
2561,
651,
320,
1175,
281,
823,
247,
3159,
390,
767,
3981,
849,
841,
9380,
4796,
15180,
10454,
50276,
13206,
337,
310,
1663,
1270,
533,
368,
943,
1333,
835,
841,
22118,
1705,
432,
50276,
40203,
374,
285,
495,
3403,
621,
13035,
715,
2505,
50276,
2520,
310,
1892,
281,
2096,
209,
186,
249,
436,
2929,
253,
18912,
35991,
3159,
21496,
1979,
326,
651,
4122,
7976,
327,
253,
6197,
2978,
285,
651,
3012,
9184,
323,
2710,
8892,
253,
2264,
18912,
275,
436,
2929,
3797,
253,
1566,
1979,
273,
3159,
21496,
50276,
953,
2834,
281,
8495,
2829,
495,
342,
4677,
608,
368,
943,
2486,
247,
1386,
3969,
281,
253,
1127,
275,
4677,
608,
342,
4585,
7387,
86,
2169,
685,
2712,
326,
4620,
275,
2829,
495,
50276,
11714,
4303,
247,
1270,
2929,
533,
604,
594,
352,
22828,
281,
320,
1199,
1805,
5544,
5474,
33032,
2520,
789,
29328,
247,
11237,
273,
253,
3236,
39707,
10336,
407,
15706,
4116,
8090,
285,
8090,
275,
697,
3997,
10495,
6928,
2439,
512,
273,
697,
8336,
342,
6311,
6096,
277,
49580,
253,
4081,
1566,
1925,
10886,
19946,
556,
247,
4577,
1180,
273,
3602,
285,
4648,
247,
4577,
2408,
273,
15180,
5871,
672,
2429,
281,
253,
3236,
39707,
285,
690,
273,
697,
10575,
672,
6760,
1411,
841,
3210,
327,
4633,
5145,
10234,
10405,
1320,
285,
3448,
14053,
49602,
10886,
19946,
33526,
10870,
390,
1805,
3045,
50276,
296,
3755,
20556,
50276,
783,
4081,
11237,
281,
253,
39707,
10336,
11355,
253,
1180,
273,
1566,
3602,
285,
15180,
5871,
1223,
38933,
12085,
3045,
327,
2710,
15450,
8892,
50276,
936,
253,
1682,
273,
619,
3640,
253,
2934,
273,
15706,
8090,
273,
253,
39707,
342,
6096,
277,
49580,
310,
4460,
50275,
4461,
323,
7756,
50276,
18867,
46717,
4116,
50276,
74,
1537,
320,
5816,
1633,
533,
2139,
310,
352,
4767,
326,
253,
440,
18867,
4872,
12378,
246,
6227,
22084,
82,
75,
310,
5512,
4503,
281,
259,
29370,
75,
619,
4685,
310,
326,
436,
310,
417,
3587,
18325,
323,
275,
253,
1566,
50276,
4399,
3020,
6096,
19034,
269,
4174,
50276,
783,
16038,
3212,
23534,
9930,
273,
253,
19034,
715,
2390,
310,
247,
2372,
12744,
752,
310,
5486,
407,
1029,
15177,
3045,
273,
253,
6096,
19034,
12378,
671,
452,
253,
4477,
2783,
970,
247,
4067,
1180,
273,
19034,
3603,
278,
281,
2572,
253,
15840,
273,
253,
1566,
50274,
5430,
310,
253,
1180,
273,
2390,
305,
3413,
50276,
31158,
253,
10886,
19946,
50276,
17480,
253,
23507,
4315,
1182,
310,
31260,
970,
2193,
275,
260,
849,
403,
10303,
260,
31260,
50275,
16680,
7180,
50276,
33722,
7162,
11508,
497,
253,
4679,
1408,
342,
2709,
12922,
50275,
35640,
264,
2905,
789,
50276,
5430,
310,
436,
789,
2905,
281,
789,
327,
23507,
4979,
398,
24088,
337,
374,
390,
4229,
4116,
824,
347,
495,
577,
50275,
18,
1429,
391,
11978,
256,
1985,
4379,
247,
256,
14298,
413,
332,
891,
11365,
1048,
6430,
342,
23507,
4979,
398,
549,
32693,
638,
3845,
549,
32693,
746,
2125,
740,
17013,
6247,
1049,
83,
3495,
50276,
19,
2643,
571,
305,
78,
6815,
335,
3348,
362,
285,
16172,
968,
6706,
6247,
5223,
1242,
23507,
4979,
398,
549,
32693,
638,
3845,
549,
32693,
746,
2693,
5831,
22,
50276,
20,
368,
259,
5101,
256,
285,
891,
90,
7885,
278,
9169,
1892,
38059,
305,
12064,
4116,
323,
11454,
5145,
10234,
549,
32693,
638,
3845,
549,
32693,
1518,
5388,
24,
2945,
50276,
21,
391,
12043,
4611,
247,
660,
379,
6554,
340,
285,
12331,
39480,
480,
9169,
4229,
32049,
1881,
42959,
6127,
275,
39707,
3169,
5145,
10234,
549,
32693,
638,
3845,
549,
32693,
10016,
11335,
1549,
50275,
38092,
3533,
50276,
261,
352,
3309,
281,
452,
253,
19034,
1979,
1679,
685,
253,
21496,
1979,
10775,
278,
50276,
69,
651,
352,
417,
320,
17887,
281,
452,
247,
1781,
19034,
278,
50276,
69,
533,
1978,
253,
1180,
273,
4236,
4295,
246,
1355,
26332,
246,
50276,
69,
949,
247,
37139,
414,
7658,
50275,
9802,
253,
4477,
27173,
1880,
512,
9930,
273,
253,
277,
49580,
403,
908,
275,
3946,
50276,
9802,
253,
4477,
27173,
752,
7155,
273,
253,
246,
10303,
403,
28078,
327,
3388,
50275,
32202,
81,
5519,
50276,
555,
993,
50276,
81,
374,
806,
1386,
1643,
440,
18867,
4872,
20553,
50276,
81,
495,
18389,
12494,
1677,
271,
7200,
7887,
50276,
81,
577,
12494,
4983,
342,
253,
1921,
2139,
260,
895,
50276,
11425,
417,
1269,
320,
5347,
1025,
50275,
81,
608,
1387,
3020,
6096,
46717,
269,
4174,
12494,
247,
295,
277,
2069,
277,
13461,
50276,
79,
13461,
273,
1979,
277,
2069,
277,
50273,
81,
721,
4677,
577,
3733,
23507,
10303,
50276,
664,
6194,
23507,
10303,
50276,
81,
721,
806,
6197,
273,
3733,
10886,
19946,
3066,
10806,
285,
17040,
12494,
4872,
20553,
273,
247,
19034,
50275,
81,
818,
1390,
12494,
273,
10336,
285,
7103,
12494,
5234,
806,
6197,
281,
1246,
29341,
2264,
18912,
275,
50276,
81,
854,
5145,
10234,
12494,
10886,
19946,
31326,
625,
8566,
50275,
81,
854,
7996,
1263,
12494,
41838,
281,
7340,
1263,
50276,
81,
898,
50276,
7053,
12494,
10235,
1979,
310,
4229,
281,
3925,
50276,
81,
898,
50276,
1752,
318,
12494,
806,
6197,
5816,
2317,
846,
2180,
50276,
81,
898,
359,
588,
3727,
2127,
285,
941,
310,
627,
941,
281,
320,
4439,
50274,
783,
4081,
11237,
281,
253,
39707,
10336,
310,
4460,
285,
891,
2868,
651,
320,
4722,
323,
253,
3114,
533,
253,
16182,
285,
16038,
812,
320,
5544,
625,
4518,
285,
2530,
342,
625,
3634,
1690,
625,
4278,
327,
253,
4373,
19484,
5438,
285,
327,
849,
253,
10886,
19946,
310,
10166,
253,
5661,
1543,
651,
320,
1014,
625,
21414,
604,
7162,
11508,
403,
2530,
50275,
484,
24275,
1309,
2929,
5955,
1754,
327,
253,
4477,
6128,
281,
253,
30628,
3533,
285,
11269,
281,
253,
7714,
1690,
8254,
5411,
690,
273,
616,
16182,
285,
7234,
285,
1690,
7162,
11508,
275,
253,
1543,
2593,
209,
422,
4425,
281,
2572,
619,
4868,
50276,
7152,
339,
431,
248,
4477,
4081,
271,
5919,
39707,
3828,
1754,
327,
247,
19034,
273,
6096,
3602,
3185,
273,
2629,
1881,
42959,
50276,
783,
4736,
310,
281,
4796,
28116,
3602,
275,
39707,
3210,
50276,
783,
2022,
9021,
403,
247,
298,
614,
39707,
1566,
11237,
273,
253,
1881,
42959,
3602,
285,
7103,
327,
3448,
48418,
4963,
8892,
253,
4081,
39707,
1566,
41731,
13015,
2905,
789,
327,
253,
5145,
10234,
285,
3448,
26278,
8892,
20544,
50276,
8250,
5740,
273,
4114,
3640,
50274,
8250,
47284,
273,
253,
4081,
1566,
50276,
783,
4477,
1347,
247,
11088,
5301,
327,
1027,
15450,
8892,
824,
347,
5145,
10234,
10405,
1320,
285,
3448,
14053,
50276,
783,
4342,
921,
326,
253,
4081,
39707,
1566,
41731,
13015,
2905,
789,
327,
253,
5145,
10234,
285,
3448,
14053,
8892,
50276,
20881,
1255,
265,
50276,
262,
310,
417,
2590,
849,
253,
3302,
5837,
273,
4373,
22041,
11852,
1566,
3045,
50274,
34974,
281,
253,
4477,
4496,
2953,
253,
1563,
3533,
1309,
253,
30080,
22559,
50275,
18566,
4764,
31850,
812,
2818,
1566,
3045,
247,
1896,
4465,
7680,
310,
281,
1347,
2709,
50276,
14719,
6613,
285,
1304,
11041,
2299,
849,
8214,
812,
436,
5763,
2489,
50275,
32897,
30821,
327,
849,
4116,
14237,
21319,
2439,
8090,
323,
1650,
275,
490,
27380,
285,
10736,
5710,
66,
2677,
5411,
4116,
2685,
275,
4979,
398,
390,
3273,
5741,
1162,
355,
253,
5004,
484,
5606,
273,
14237,
275,
253,
39707,
247,
1263,
342,
5145,
10234,
285,
3448,
14053,
16566,
50275,
1615,
970,
643,
3215,
26208,
16566,
275,
253,
19457,
25070,
26278,
4836,
24088,
1735,
6197,
651,
352,
1818,
667,
4560,
390,
1543,
50275,
74,
5583,
14924,
1677,
326,
253,
2929,
4518,
8631,
2905,
789,
285,
4081,
1566,
253,
4477,
4081,
271,
5919,
39707,
1566,
326,
476,
320,
10166,
342,
1679,
5300,
253,
4477,
1347,
271,
50276,
15419,
2368,
273,
253,
4081,
1566,
342,
1027,
3448,
15450,
8892,
285,
253,
1566,
41731,
13015,
2905,
789,
327,
5145,
10234,
285,
3448,
26278,
50276,
187,
187,
4118,
18435,
27,
8102,
19946,
310,
247,
1332,
281,
4796,
253,
39296,
275,
4979,
398,
594,
597,
476,
18329,
327,
5024,
4095,
275,
253,
1332,
247,
6096,
19034,
2439,
8090,
285,
440,
18867,
10303,
403,
908,
275,
1659,
273,
2801,
30840,
569,
253,
2488,
4081,
247,
298,
18,
17040,
281,
6194,
253,
27370,
7413,
6051,
8103,
50276,
936,
5115,
1097,
2169,
3045,
285,
2406,
4764,
9372,
50276,
455,
30628,
7402,
598,
4933,
253,
2929,
247,
4868,
273,
721,
846,
3629,
616,
7363,
1309,
11985,
1223,
253,
1543,
403,
2266,
1805,
3045,
387,
1199,
2406,
4764,
9372,
253,
2929,
310,
417,
4518,
3542,
2067,
30628,
4879,
326,
253,
2929,
310,
2834,
281,
2096,
285,
556,
247,
1643,
39394,
2792,
323,
1650,
253,
1332,
671,
7402,
598,
9591,
1805,
685,
253,
2613,
39707,
1566,
326,
10886,
19946,
310,
6326,
281,
19477,
627,
3133,
281,
320,
247,
3480,
273,
4685,
670,
752,
629,
273,
253,
1566,
8549,
253,
11701,
581,
37317,
753,
326,
436,
310,
7826,
247,
1270,
2929,
326,
22828,
281,
320,
1805,
5544,
253,
5044,
4473,
273,
9628,
247,
19034,
2439,
8090,
943,
320,
2969,
2217,
281,
5513,
973,
285,
17337,
247,
1805,
8813,
685,
16186,
608,
50275,
783,
4477,
9023,
281,
3727,
253,
2127,
534,
651,
320,
3309,
323,
247,
2120,
36404,
273,
436,
789,
891,
5583,
2997
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper is a continuation of an original associated learning paper by kaochen 2021 it attempts to propoose new learning approach associated learning as an alternative way to backpropagation on top of the original paper it discovers more interesting properties and extend al to cnn lstm and transformers though lacking sufficient details the authors have resolved my concerns on the technical details of how al is applied on rnns and transformers the paper is well written experiments show that in text classification and image classification the proposed method outperform bp in some basic architecture setting here are my concerns it is unclear how the al is applied on rnns and transformers section 211 only very briefly described them but i could not figure out some of the details for example how is the temporal data processed in lstm in cnn when flatting the hidden representation it also lost the spatial information in feature maps furthermore how to convert si to si if ti is also a 3d feature map when the spatial information is lost from the description in section 2 it seems that al introduces around double the parameters for a given neural network what is the impact of the increased parameters in computation cost the experiments uses relatively simple network architecture for text classification does the same benefits carry over to large transformer models and still beats currently popular models like bert the architecture information on cnn in section 33 is missing if my understanding is correct the proposed architecture would not work in sequence generation task like lstm and transformers could do right in summary i think though the paper proposes al framework as an alternative to bp it is actually a simple extension to a previous work and does not proposes substantially new ideas some details are missing and experiments are not extensive enough to cover stateoftheart architectures docsepassociated learning puts forth a template that can be applied to almost any network to achieve faster training and inference they apply their template to several existing deep learning models and perform experiments that show they can achieve comparable if not better results with less training time the paper clearly lays out the advantages of associated learning faster inference dynamic layer accumulation and pipeline the paper is clearly written with good figures the experiments appear to be easy reproducible too the decrease in epochs needed for lstms is particularly impressive i found the biological basis a little lacking perhaps some type of curriculum learning or more exploration on what the various shortcuts are doing could make this argument stronger the related works section neglects to mention other gradientisolated methods like httpsarxivorgabs190511786 i think in some ways this work can be seen as encoderdecoder with additional regularization too i would recommend this paper to be accepted while there are several issues the empirical results are strong particularly the lstm reduction in epochs i think is a lot more to explore with the dynamic layer accumulation and gradient isolation too that would be interesting to other researchers docsepthis paper studies and benchmarks an alternative to backprop named associated learning they analyze the pros and cons thank you for this read the results and the methodology are definitely compelling why i cannot accept the manuscript as is is that the motivation is not clear enough it is clear wrt why bp is not ideal but it is not clear how you landed on 
this method specifically as compared to many other attempts on finding more optimal neural network optimization methodologies section 2 is very difficult to follow i would spend some more effort explaining how your method works in the manuscript it would be nice to include an algorithm of how to implement al a selection of minor comments some typos throughout the manuscript eg in abstract associate and paragraph 4 in the introduction in section 4 we notation must be introduced eg f y etc in section 2 are not introduced properly in relation to figure 1 it is difficult to follow the difference in notation when using h b and f i recommend you spend some more time on making this very clear to the reader i find the table 2 epochs for ag news difficult to follow there is a clear pattern that al is faster but then things changes radically for ag news would be nice with some further analysis into this with some more clarification on how you ended up with this methodology and a clear algorithm for how to implement al the reviewer would be happy to accept the manuscript docsepthis paper proposes associated learning al for cnn rnn and transformer different from backpropagation bp al decomposes bps global endtoend training strategy into several small local optimization targets such that each subnetworks has an isolated gradient flow to achieve this the paper proposes to map input x and output y into intermediate al layers and performs metric learning eg t1b1s1 and autoencoder learning t1t1 as shown in figure 2 moreover each al layer can be optimized locally the idea is interesting the experiments demonstrate the effectiveness on imdb review ags news corpus dbpedia ontology the stanford sentiment treebank cifar10 and fashionmnist first as in figure 2 the paper proposes to map input x and output y into a latent space for metric learning fxgy and autoencoder learning yhgy are also investigated in multilabel classificationr1r2 which are not discussed in this paper in my opinion the main difference is the design of multiple latent spaces compared with these multilabel classification methods second in traditional machine learning we often map a high dimensional space to a low dimensional space for metric learning it is unclear why maps the target y to the intermediate layers in this paper given a high dimensional space eg images the inference model extracts useful features and filters unrelated features for metric learning however in this paper i find that the authors conduct experiments on some single label classification eg cifar10 and fashionmnist in this case y is a scalar or onehot vector i am curious about the exact form of g1 g2 g3 in figure 2 does the proposed method map a low dimensional latent space to a high one what is the motivation for expanding representation space if g1y and g2g1x are still in a low dimensional space or gi are very simple do we really need inverse transformations from y to al layers in this case we can simply fuse different al layers to top layers for metric learning for example we can move y after t3 and remove h1 h2 h3 in figure 2 since y is a specific label it is unclear why we need to map to a high dimensional space the design of multilabel classification is reasonable to me because the target y is complex eg the multiple label vectors could miss some labels and the multilabels could be in a high dimensional space in this case one can map high dimensional space x and y into a low dimensional latent space for metric learning r1 learning deep latent spaces for multilabel 
classification aaai 2017 r2 multilabel classification via featureaware implicit label space encoding 2014 third it would be better to set a baseline by moving y after t3 and removing h1 h2 h3 in figure 2 for comparison fourth the architecture of al layers is similar to ladder networks it is suggested to analyze the differences r3 semisupervised learning with ladder networks 2015 1 the proposed method can be optimized locally and achieve competitive results the proposed framework can be used for cnns rnns and transformers the idea is interesting 2 more analyses about the motivation and the necessity of inverse transformation for y to latent space are needed 3 the analyses and discussions about related works such as multilabel classification and ladder networks are missed 4 some experiments are suggested to support the authors opinions eg 2 and 3 if possible
### Summary: | the authors propose a method for associative learning as an alternative to back propagation based learning the idea is to interesting the coupling between layers are broken down into local loss functions that can be updated independently the targets are projected to previous layers and the information is preserved using an autoencoder loss function the projections from the target side are then compared with the projections from input side using a bridge function and a metric loss the method is evaluated on text and image classification tasks the results suggest that this is a promising alternative to back propagation based learning pros a novel idea that seems promising evaluated on text and image classification tasks and demonstrated utility cons the impact of the number of additional parameters and the computation is not clarified even though epochs are lower the authors utilized the discussion period very well running additional experiments that were suggested especially ablation studies they also clarified all the questions that were raised in all the paper has improved substantially from the robust discussion | [
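The summary above describes the core mechanism of associated learning: each layer is trained with its own local objective (a metric loss that associates a bridged input-side projection with a target-side projection, plus an autoencoder loss that preserves label information), with gradients isolated between layers. A minimal PyTorch-style sketch of one such block is given below; the module names f, g, b, h, the MSE losses, and the dimensions are illustrative assumptions and are not taken from the reviewed paper or its code.

```python
# Illustrative sketch only (assumed structure, not the authors' implementation):
# one associated-learning block with a local metric loss and an autoencoder loss,
# gradient-isolated from neighbouring blocks via detach().
import torch
import torch.nn as nn
import torch.nn.functional as F

class ALBlock(nn.Module):
    def __init__(self, in_dim, hid_dim, code_dim):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())    # input-side transform
        self.g = nn.Sequential(nn.Linear(code_dim, code_dim), nn.ReLU()) # target-side transform
        self.b = nn.Linear(hid_dim, code_dim)                            # bridge into the target code space
        self.h = nn.Linear(code_dim, code_dim)                           # decoder used by the autoencoder loss

    def forward(self, t_prev, s_prev):
        t = self.f(t_prev.detach())                 # detach() keeps this block's gradients local
        s = self.g(s_prev.detach())
        metric_loss = F.mse_loss(self.b(t), s)      # associate input side with target side
        recon_loss = F.mse_loss(self.h(s), s_prev.detach())  # keep label information recoverable
        return t, s, metric_loss + recon_loss
```

In such a sketch, blocks are stacked by feeding t and s forward, each block's local loss is minimised by its own optimizer step, and no end-to-end backpropagation path exists between blocks.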
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors present a unifying framework for objectcentric learning bringing together a wide array of distinct methods under a single framework i personally find this flavor of manuscripts particularly useful as theyve previously helped me better understand fields of research for example cunningham and ghahramani 2015 is in a similar spirit albeit for a different set of problems i think the current manuscript will be a valuable addition to the workshop and serve to generate useful discussion within the community however one aspect of the manuscript which i felt could potentially be improved perhaps as future iterations of the work are the insights that can be gained from the proposed interpretation of objectcentric learning for example given that the authors propose to interpret objectcentric learning as nested optimization perhaps there are relevant methods from the nested optimization literature which could now be more easily ported over and used to improve objectcentric learning or instead perhaps the proposed framework can be used to further outline similaritiesdifferences between current work minortypos abstract promising results in unsupervised decomposition simple visual scenes promising results in unsupervised decomposition of simple visual scenes references cunningham john p and zoubin ghahramani linear dimensionality reduction survey insights and generalizations the journal of machine learning research 161 2015 28592900 docsepthe paper aims to identify the underlying computational problem that existing iterative approaches to objectcentric learning are trying to solve specifically the paper classifies existing approaches into two categories those that metalearn posterior inference and those that metalearn parameter estimation the paper then proposes an optimization problem that unifies these two categories where the inner layer optimizes elbo with respect to the perdatapoint parameters eg slot representations cluster assignments and the outer layer optimizes the task objective eg reconstruction classification with respect to network weights eg encoder and decoder the paper also suggests some connections to other fields pros the paper is wellmotivated a unified problem formulation can shed light on ways to improve the existing methods cons the clarity of the paper can be improved for example i didnt understand the key difference between the two proposed categories in particular why cant slot attention fit in the first category i am not sure whether the proposed framework is general enough in particular why does the inner objective have to be elbo the paper mentioned that the soft kmeans algorithm is known to monotonically improve the elbo however in slot attention the soft kmeans algorithm is replaced by learnable updates it is unclear whether the learnable updates is still optimizing the same objectivedocsepthe authors unify the iterative algorithms in objectcentric learning methods into a particular nested optimization problem with solving a maximization of elbo they join metalearn posterior inference and metalearn parameter estimation in existing methods to the same nested optimization problem and interpret it as the essence of objectcentric learning although unification is always what scientists pursue a simple combination is not enough some pivotal questions should be answered in this paper 1 why do we need unification in objectcentric learning methods whats the advantage of regarding these practical algorithms as a theoretical optimization formulation 2 is there any detailed 
examples that your idea can bring some difference to recent research
### Summary: | this paper is relevant to the workshop and outlines an interesting connection between iterative objectcentric representation learning approaches and nested optimization as such i believe it can provide a valuable contribution to this workshop despite not having shown an immediate practical advantage of this unification as pointed out by reviewer iyd1 we encourage the authors to take the reviewers feedback into account when preparing the cameraready version of the paper | [
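The review above interprets iterative object-centric methods as nested optimization: an inner loop fits per-datapoint parameters (slot representations or cluster assignments) by improving an ELBO-like objective, while an outer loop updates the shared encoder/decoder weights against the task loss. The sketch below illustrates that reading with a soft k-means-style inner update standing in for the ELBO maximization; every name, shape, and the specific update rule are assumptions made for illustration rather than the formulation of any paper discussed here.

```python
# Illustrative sketch only: inner loop = per-datapoint slot refinement (soft k-means-like),
# outer loop = ordinary gradient step on shared network weights for the task objective.
import torch
import torch.nn.functional as F

def inner_slot_updates(features, slots, n_iters=3, temperature=1.0):
    # features: (batch, n_points, dim), slots: (batch, n_slots, dim)
    for _ in range(n_iters):
        logits = torch.einsum('bnd,bkd->bnk', features, slots) / temperature
        assign = logits.softmax(dim=-1)                           # soft assignment of points to slots
        weights = assign / (assign.sum(dim=1, keepdim=True) + 1e-8)
        slots = torch.einsum('bnk,bnd->bkd', weights, features)   # assignment-weighted slot means
    return slots

def outer_step(encoder, decoder, init_slots, images, optimizer):
    features = encoder(images)                       # shared weights: the outer variables
    slots = inner_slot_updates(features, init_slots) # per-datapoint variables: the inner loop
    recon = decoder(slots)
    loss = F.mse_loss(recon, images)                 # outer task objective (reconstruction)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under this reading, replacing the fixed inner update with learned update functions (as in slot attention) changes how the inner problem is approximately solved, but not the two-level structure itself.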
30003, 310, 1677, … (input_ids token sequence truncated) ] | [ 1, 1, 1, … (attention_mask truncated) ] | [ 30003, 310, 1677, … (labels token sequence truncated) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes a new method of updating deep neural networks for combinatorial optimization problems during search using reinforcement learning in particular the authors show that by updating only part of the network better results can be achieved at lower cost they describe and evaluate their method on different combinatorial optimization problems comparing to other machinelearningbased approaches as well as traditional solvers the paper presents an interesting idea that seems to have a large impact the evaluation is thorough and fair the results are convincing this is a good paper that should be accepted there are a few minor points that were unclear to me and might warrant further discussion the results for the tsp in table 1 show that concorde is often the fastest solver this is somewhat counterintuitive especially compared to lkh as it is a complete solver the proposed method is also often much slower some explanation of this would be helpful for the reader to understand what exactly is going on there the most nebulous part of the proposed method to me is the placement of the new layer which sounds like it might be quite difficult in practice and potentially require expensive evaluation of different alternatives a more indepth discussion of how the authors determined this for their experiments along with some recommendations on how to do this in a new setting would make the paper stronger and more applicable in practice interesting method with promising results docsepthe paper deals with endtoend learning of heuristics for combinatorial optimization problems the authors propose an extension of the active search method of bello et al 2016 where only part of the model parameters are updated at test time for each instance they propose three ways of applying this idea that consist in finetuning part of the instance embeddings the parameters of an additional layer or directly the prediction scores of the model applied to the pomo method kwon et al 2020 for the tsp and cvpr and the l2d method of zhang et al 2020 for the jssp the proposed efficient active search leads to significant improvements on instances of the same size and larger than the training ones strengths 1 the paper is clear well organised and well written 1 the presented approach seems applicable to any constructive method as long as it has a encoderdecoder type of architecture 1 in the experiments the proposed approach is applied to 2 models for solving 3 different problems and the results are consistently positive which hints at the generality of the proposed approach 1 it improves the performance of the underlying model on test instances from the same distribution as the training instances as well as to larger instances from the same distribution effectively addressing the wellknown difficulty of standard learningbased models to perform well on larger instances 1nice discussion in sec 44 to explain possible reasons of why one of the proposed variants work best for each problem weaknesses 1 in the experiments the scale of instances is limited 200 nodes for tsp and cvrp while recent learningbased methods such as the cited fu et al 2021 manage to solve tsp instances with up to 10000 nodes 1 generalisation is only illustrated and claimed with respect to the size of the instances it would be interesting to see the results on other distributions shifts eg applying eas to a model trained on tsp100 on instances of tsplib 1 for cvrp results are indeed provided for other distributions but for each family of instances the authors 
say that the model is trained for 3 weeks and then tested on similar instances for each family could eas be helpful to learn good solutions starting from the same model recommendation i would vote for accepting the paper the contribution is interesting as a middle ground between finetuning a whole model for each instance active search and other nonlearning based search strategies beamsearch sampling etc the proposed approach is illustrated on 2 models for 3 standard co problems and consistently shows good results questions 1 what is the motivation of adding a new layer to finetune easlay versus finetuning some existing layers of the model 1 in tables 1 and 2 why are there no entries for most of the learningbased baselines for n 125 since there is no strict timelimit in this setting i guess with a smallenough batch size for the models to fit into memory all these methods would provide some results for n up to 200 1 in appendix table 4 have you tried simply applying eas to the model trained on the uniform distribution cvrp100 that would be a natural test of the impact of eas on generalization 1 looking at figure 3 it seems the value of the best lambda depends on the problem and the range of potential values is quite wide 001100 have you checked the scale of the different losses and could it help explain such a difference 1 if one were to apply eas to another modelco problem could you deduce from your experiments a general kind of rule of thumb of which variant would work better for which situation additional feedback in introduction these methods do not react towards the solutions seen so far ie the underlying distribution from which instances are sampled is never changed throughout the search not clear to me do you mean from which solutions are sampled wide adaption adoption sec 3background the decoder is introduced with parameter omega but this one is only defined as the embeddings in the next paragraph sec 31 in the definition of the total gradient right after equation 2 shouldnt there be a minus before one of the gradients since one gradient corresponds to the minimisation of the cost and the other to the maximisation of the likelihood figure 2 yaxis is the average costs optimality gap would be more relevant and consistent especially if instances have different sizes the paper provides an interesting contribution to learn to search for highquality solutions at test time nicely completing endtoend learning pipelines for solving co problems the proposed approach could be applied to any model that has an encoderdecoder type of architecture and is experimentally validated on 2 models and 3 problems a limitation is that it is not clear if it could help a model generalize to instances that are much larger than the training ones or with significantly different characteristics i vote for accepting the paper update after rebuttal i thank the authors for precisely answering all my questions and concerns i am happy to confirm my initial recommendation of accepting the paper docsepthe paper studies machine learningbased methods for combinatorial optimization the paper builds upon bello et al 2016 on using reinforcement learning to generate solutions for combinatorial optimization problems eg tsp the novelty of the paper is to optimize only a subset of the model parameters the paper then proposes three different implementations based on this idea significance one limitation of existing rlbased approaches for combinatorial optimization is its resource requirement as demonstrated in table 1 the active 
search technique in bello et al 2016 takes 5 days to solve 10000 test instances of tsp the paper aims to tackle this limitation by proposing to optimize only a subset of the model parameters i think the paper is making a good and meaningful contribution towards research in the field novelty the paper extends the active search method in bello et al 2016 the three proposed implementations are based on one general idea of optimizing only a subset of the model parameters the novelty of the proposed technique is therefore limited however if the method performs well its simplicity could be of high interest presentation the paper is wellwritten the related literature is discussed in detail the experimental results are clearly presented with ablation study and trajectory analysis there are some minor ambiguities in presenting the proposed techniques as elaborated below there are some ambiguities in the paper 1 on page 4 below figure 1 the paper proposes the first strategy update the embedding of the using the loss function jrl and jil these loss functions are not explicitly specified anywhere in the paper only their gradient wrt the embeddings are presented in eq1 and eq2 readers who are not familiar with rlimitation learning literature may not know what jrl and jil are it would be great if the author can be more explicit about the loss functions before presenting their gradients 2 in table 1 the authors provide wallclock time for the proposed algorithms and other baselines on a set of 10000 tsp instances eas achieves competitive performance while taking only 57 hours to run as compared to 5 days using the original active search is this improvement due to a eas uses less memory and hence we can solve more instances in parallel or b eas is computationally more efficient ie it uses less cputime to achieve the competitive performance or c a combination of the above it would be better if the authors report the cputime instead of wallclock time and separately report the space memory and time cpu of these algorithms minor comments in figure 3 the xaxis should be labeled lambda the paper proposes a simple extension to an existing rlbased method for combinatorial optimization its effectiveness is demonstrated empirically however i feel that the results of the experiments should be reported in greater detail ie to compare with the original active search in different performance metrics such as memory and cpu time usage docsepthis paper studies deep learning methods for solving combinatorial optimization problems the authors write that stateoftheart methods typically use models that consist of encoder and decoder units the methods first create an embedding of the problem instance using the encoder then starting with an empty solution to the problem the embedding and the decoder are used to autoregressively construct a solution over a series of time steps given an already trained model and a test instance this paper studies how to quickly update the model parameters in order to improve the quality of the solution returned by this procedure the authors propose three techniques which adjust 1 the normally static embeddings of the problem instance that are generated by the encoder model 2 the weights of additional instancespecific residual layers added to the decoder and 3 the parameters of a lookup table that directly affect the probability distribution returned by model strengths this paper studies an exciting areamachine learning for combinatorial optimizationwhere machine learning has the potential to make 
a big impact from the experiments especially tables 1 and 2 it looks as through the proposed approaches are much faster than competitor active search bello et al 16 which from my understanding searches for ways to adjust all parameters of the trained model at test time in contrast the proposed approach only searches for ways to adjust specific subsets of model parameters which makes the approach faster i appreciate that the authors evaluate their approach on a few different types of combinatorial optimization problems two different types of routing problems and a scheduling problem for the scheduling problem the improvement over active search is a bit more modest weaknesses i found the problem description somewhat hard to follow in section 3 it would be helpful to clarify what exactly an action corresponds to in this setting one way to do this would be to summarize the combinatorial problems studied in the experiments section and explain what an action corresponds to and what the state st1 corresponds to after applying an action at in terms of solution quality the improvements over problemspecific baselines are sometimes really small eg in tables 1 and 2 a fraction of a percentage on such a small scale i wasnt sure if i could trust the superiority of any particular method confidence intervals would really help here detailed comments page 2 im not sure that exemplary is the right word here i would remove it equation 1 im not sure what you mean when you say that b0 is a baseline used to reduce variance can you say more overall im leaning toward acceptance because the proposed approach seems to provide a notable improvement over prior methods in particular active search by bello et al 16 in terms of runtime
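As a reading aid for the reviews above, here is a hedged numpy sketch of the "update only a small subset of parameters at test time" idea: a frozen distance-based scoring rule stands in for the trained model, and only an instance-specific logit table is adjusted with a REINFORCE-style update, loosely mirroring the lookup-table variant the reviewers discuss. The instance size, learning rate, and all function names are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of test-time search that tunes only a small per-instance table.
import numpy as np

rng = np.random.default_rng(1)
N = 12
coords = rng.random((N, 2))                         # one toy TSP test instance
dist = np.linalg.norm(coords[:, None] - coords[None], axis=-1)
bias = np.zeros((N, N))                             # the ONLY trainable parameters
LR, SAMPLES, STEPS = 0.5, 8, 150

def sample_tour():
    """Sample a tour from softmax(frozen score + per-instance bias); also return
    the gradient of the tour's log-probability w.r.t. `bias`."""
    tour, grad = [0], np.zeros_like(bias)
    visited = np.zeros(N, bool)
    visited[0] = True
    for _ in range(N - 1):
        cur = tour[-1]
        logits = -dist[cur] * 10.0 + bias[cur]      # frozen heuristic + learned bias
        logits[visited] = -1e9
        p = np.exp(logits - logits.max())
        p /= p.sum()
        nxt = int(rng.choice(N, p=p))
        grad[cur] -= p                              # d log p(nxt) / d bias[cur]
        grad[cur, nxt] += 1.0
        tour.append(nxt)
        visited[nxt] = True
    return tour, grad

def tour_len(t):
    return sum(dist[t[i], t[(i + 1) % N]] for i in range(N))

best = float("inf")
for _ in range(STEPS):                              # test-time search on ONE instance
    batch = [sample_tour() for _ in range(SAMPLES)]
    lens = np.array([tour_len(t) for t, _ in batch])
    best = min(best, float(lens.min()))
    adv = lens - lens.mean()                        # simple shared baseline
    for (t, g), a in zip(batch, adv):
        bias -= LR * a * g                          # REINFORCE step on bias only
print("best tour length found:", round(best, 3))
```

The intent is that the best sampled tour improves over the test-time steps even though the underlying scoring rule never changes; that small trainable footprint is the behaviour the reviewers credit for the speed-up over full active search.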
### Summary: | this paper gives a framework for using learning in combinatorial optimization problems in particular active search is used to learn heuristics the reviewers thought the paper had nice conceptual contributions for this approach and that the results would be very interesting to the community | [
247, 747, 4758, … (input_ids token sequence truncated) ] | [ 1, 1, 1, … (attention_mask truncated) ] | [ 247, 747, 4758, … (labels token sequence truncated) ] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes a method for the detection of adversarial examples based on identification of critical paths called effective paths in dnn classifiers borrowing from the analysis of execution paths of controlflow programs the authors use backpropagation from the neuron associated from the final class decision to identify a minimal subset of input synapses accounting for more than a threshold proportion theta of the total input weight the identification process is then recursively applied at the preceding layer for those neurons associated with the selected minimal subset of synapses forming a tree of synapses the effective path the authors then propose to compare the effective paths actually unions of paths of different examples using simple structural dissimilarity measures which they extend to allow comparison to a typical aggregated path for multiple examples drawn from a common class in their experimentation with their measure they noted that examples generated by a number of adversarial attacks tend to be less similar to their firstranked estimated class than normal examples are to their own firstranked classes similarly they note that these same adversarial attacks tend to be more similar to their secondranked classes than normal examples are to their own secondranked classes as the authors point out this is likely due to the increased likelihood of the secondranked class of adversarial examples being the true class for the original example from which it was perturbed the authors then propose the difference between these two similarities that is firstranked dissimilarity minus secondranked dissimilarity as a characterization of adversarial examples the idea of using critical paths in the dnn to detect adversarial examples is interesting and the authors deserve credit for showing that these critical paths as defined in this paper do show differences from those of normal examples however the originality of the approach is undercut by the recent work of wang et al cvpr 2018 which the authors acknowledge only in the discussion of experimental results although the details are different as to how critical paths are identified and how adversarial examples can be detected using them the strategies are definitely related a more detailed explanation of this should have been given in the introduction of the paper more troubling is the fact that a headtohead experimental comparison is not provided neither with wang et al nor with other state of the art detectors other than a qualitative assessment of the capabilities of some detectors in table 1 note that even this qualitative discussion does not include some of the recent detection approaches such as bpda athalye et al icml 2018 or lid ma et al iclr 2018 the question of how best to define critical paths and their similarities is still very much open the authors approach is rather simplistic and straightforward for example is their similarity measure biased towards the contributions from early layers can a layerbylayer weighting of contributions improve the performance the authors do not always interpret their own experimental results correctly for example their results in figures 7i and 7j dont really support their conclusion that performance remains almost unchanged when theta is in the range 05 10 also figure 4 does not show that their effective path similarity is not directly a great metric to distinguish between normal and adversarial examples because a large proportion of adversarial examples have scores that fall in the typical range 
for normal examples however there are differences in tendency which can be exploited as the authors do show the organization of the paper is in some need of improvement for example the discussion of densities of effective paths section 2 comes well before the details of the choice of threshold value theta used to generate them section 41 to summarize pros a good case is made for the use of critical paths as a way of differentiating adversarial examples from normal examples the reported improvement in similarity of adversarial examples with respect to their secondranked classes is particularly intriguing the paper is generally well written and easy to follow cons the experimental treatment is insufficient in particular a more carefully considered experimental justification is needed with respect to other detection strategies the question of how best to define critical paths and their similarities is still very much open the authors do not always interpret their own experimental results correctly the organization of the paper is in some need of improvementdocsepthe authors propose the notion of effective path for the purpose of identifying neurons that contributes to the predictions and being able to detect adversarial images in the context of image classification overall the paper is well written except that the authors are mixing two highly related but still different topics explanation and adversary detection so that the motivation is confusing the experimental results indeed show promises that effective path can help understand class similarities and network efficiencies but doesnt really show how the proposed work is adding value to the field it lacks the experimental comparison with previous methods but only include discussion in texts this paper could turn out to be a stronger paper but it is not ready yet below are some more detailed comments 1 the authors motivate by stating that the vulnerability of nn to input perturbations is due to the lack of interpretability section introduction abstract i can understand that we want more interpretability and we want less vulnerability but i cant agree that vulnerability is caused by lack of interpretability also the authors are trying to accomplish both tasks interpretability and adversary detection by showing data analysis of how the findings coincide with prior knowledge eg class of digit 1 is the most different from other classes in mnist task and by showing detecting adversary images however neither has valid quantitative comparison with previous work actually for the interpretability topic the authors didnt really provide a tool or a generalizable method thus i would suggest to choose one of the two topics ie adversarial image detection and focus on it by adding thorough comparison with other methods in the discussion and result section include the interpretability analysis to justify why the proposed adversary detection method is behaving in certain ways 2 one topic that is missing from the paper is the time complexity of the proposed method at a nave estimate it would require tracking and finding the minimum set of effective neurons with threshold theta and thus per instance at least om log m is required at prediction phase where m is the number of features for n instances the asymptotic complexity is onm log m how does it compare to the other adversary detection methods 3 page 3 mentions that the work for critical routing path wang et al 2018 requires retraining for every image this statement is not really true without more context 
also authors discuss this work again very briefly in page 8 due to the high similarity in methods and motivation with the proposed method but the authors dont show any quantitative comparison after all both methods are trying to identify neurons that contribute the most to the prediction some more concrete comparison would be nice 4 page 3 mentions that the derived overall effective path is highly sparse compared to the original network and the effective path density for five trained models ranges from 13 to 42 which conforms with the 80 claim from another paper together with the other similar statements it would be really nice to note what theta is used for such statements how does such statement change with different theta also some discussion would be nice about what such sparsity implies specifically does the sparsity suggest the opportunity for feature selection or does it suggest a way for detecting overfitting 5 page 5 shows the path similarity between the normal and the adversary examples from the figure 5a and 5b we can see the on the first layer the mean deviate between normal and others but why the last layer they almost reach to the same point it seems it is the middle layer that distinguish the normal from the adversary examples the most some more discussion would be good 6 some justification of why theta05 is chosen would be good on page 6 7 on page 7 the authors are discussing the performance of the proposed method however there is no really comparison with other methods but rather the authors stated better accuracy auc is better by comparing different evaluation scenarios i dont find such discussion helpful in showing the contribution of the proposed method also in the parameter sensitivity it would be nice to add the analysis for the effective path density and see if it still conforms with the 80 claim with different theta 8 page 1 need to add citations for the statement and even outperformed human beings 9 minor issue page 1 such computer vision should be such as computer vision docsepthis paper proposes a measure effective path of which units and weights were most important for classification of a particular input or input class using the effective path the authors analyze the overlap between paths across classes for cnns and between adversarially modified and unmodified images finally the paper proposes an adversarial defense method based on effective path which detects adversarially manipulated images with high accuracy and generality to a variety of settings overall this paper is interesting and provides several novel observations the clarity of the exposition is generally good but can be improved in several places mentioned below as for significance effective path is likely to inform future analyses of neural networks and the adversarial defense may prove impactful though ultimately its impact will depend on if and when the defense is broken however there are several important controls missing from the analysis several claims which are unsubstantiated and experimental details are lacking in a few places as such in its current form i can only weakly recommend this paper for acceptance if in the revision the controls requested below are included additional evidence is provided for the unsubstantiated claims or if those claims are toned down and exposition of missing experimental details is included id be happy to raise my score major points 1 while the observation regarding path specialization is very interesting one cannot gauge whether or not the degree of overlap 
observed between classspecific paths signals path specialization or simply high inputtoinput path variance which is similar both within and across classes in order to distinguish between these possibilities a measure of intraclass path similarity is necessary in addition an experiment similar to that in figure 2 with cifar10 would be quite helpful in evaluating whether this phenomenon exists in more natural datasets the imagenet results are difficult to interpret due to the large number of classes 2 several claims in the path specialization section are unsubstantiated 2a in particular the claim that 1 has the highest degree of specialization because of its unique shape is made without evidence as is the similarity between 5 and 8 6 is also similar to 8 and yet does not show the same similarity in the path specialization these differences may very well simply be due to chance 2b the claim that the path specialization in imagenet matches the class hierarchy is made only based on the rough nonlinearity of figure 3 please either measure the overlap within and across class categories or soften this claim 3 the similarity analysis for adversarial images is also very interesting but a comparison between unmodified and randomly perturbed images with matched norms to the adversarially perturbed images is necessary to establish whether this effect is due to noise generally or adversarial noise its unclear how the effective path is calculated when negative weights are involved further exposition of this aspect would be helpful minor pointstypos 1 there are several places where confusing concepts are introduced in one paragraph but explained several paragraphs later in particular the distinction between synapses and weights is introduced halfway through page 2 but explained on page 3 and the fact that the coefficients for the defense metric are learned is unclear until page 4 even though theyre introduced on page 3 2 typos 2a section 1 fourth paragraph and adversarial images we uncover should be and adversarial images and we uncover 2b section 1 fourth paragraph by small perturbation the network should be by small perturbations the network 2c section 2 first paragraph the blackboxed neural should be the blackbox neural 2d section 2 first paragraph in the high level should be at a high level 2e section 4 first paragraph as it does no modify should be as it does not modify 2f title should be neural network
### Summary: | the paper presents an approach to estimate the effective path of examples in a network to reach a decision and consider this to analyze if examples might be adversarial reviewers think the paper lacks some clarity and experiments they point to a confusion between interpretability and adversarial attacks they ask questions about computational complexity and point to some unsubstanciated claims authors have not responded to reviewers overall i concur with the reviewers to reject the paper | [
2299, 253, 3236, 414, 273, ..., 2929 (input_ids: 2,048-entry token-ID sequence, abridged)
] | [
1, 1, 1, ..., 1 (attention_mask: 2,048 ones, abridged)
] | [
2299, 253, 3236, 414, 273, ..., 2929 (labels: 2,048-entry token-ID sequence, abridged)
] |
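The reviews in the row above describe the detection recipe only at a high level: backtrack from the predicted output unit, keep for each traced neuron the smallest set of incoming synapses covering a threshold fraction theta of its total input, and flag an input whose resulting path is unusually dissimilar to the aggregate path of its top-ranked class while unusually similar to that of the runner-up class. The sketch below is a minimal reading of that description, not the reviewed paper's code; the |weight x activation| contribution measure, the Jaccard overlap, the union-based class aggregate, and all function names are assumptions made purely for illustration.

```python
import numpy as np

def effective_path(weights, activations, theta=0.5):
    # weights[l][i, j]: synapse from unit j in layer l to unit i in layer l+1
    # activations[l]: unit outputs of layer l for one example (layer 0 = input)
    # Greedily keep, for every traced neuron, the smallest set of incoming synapses
    # whose absolute contributions cover a fraction theta of that neuron's total
    # input, then recurse toward the input layer.
    active = {int(np.argmax(activations[-1]))}   # start at the predicted output unit
    path = set()
    for l in range(len(weights) - 1, -1, -1):
        next_active = set()
        for i in active:
            contrib = np.abs(weights[l][i] * activations[l])
            total = contrib.sum()
            covered = 0.0
            for j in np.argsort(contrib)[::-1]:
                if covered >= theta * total:
                    break
                path.add((l, int(j), i))
                next_active.add(int(j))
                covered += contrib[j]
        active = next_active
    return path

def path_similarity(p, q):
    # Jaccard overlap between two synapse sets; the reviews only say
    # "simple structural dissimilarity measures", so this is one possible choice.
    return len(p & q) / max(len(p | q), 1)

def class_aggregate(paths):
    # Aggregate path for a class: union of the paths of its normal examples.
    return set().union(*paths) if paths else set()

def detection_score(example_path, class_paths, top1, top2):
    # Per the reviews: first-ranked dissimilarity minus second-ranked dissimilarity,
    # so adversarially perturbed inputs (less like their predicted class, more like
    # the runner-up) should score higher than normal inputs.
    d1 = 1.0 - path_similarity(example_path, class_paths[top1])
    d2 = 1.0 - path_similarity(example_path, class_paths[top2])
    return d1 - d2
```

Under this reading, adversarially perturbed inputs should push the score upward relative to normal ones; where to place the decision threshold, and whether a flat Jaccard overlap is the right dissimilarity at every layer (a point one review explicitly questions), would have to be settled empirically.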
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
the author proposes to use a competitive multiagent setting for encouraging exploration i very much agree with most of previous reviewers and their constructive suggestions however i find a major issue with this paper is the lack of baseline comparisons the paper shows that cer her her cer i do not think cer should be compared to her at all cer to me attacks the exploration problem in a very different way than her it is not trying to reuse experience which is the core in her instead it uses 2 agents and their competition for encouraging visiting new states this method should be compared to method that encourages exploration via some form of intrinsic motivation there are methods proposed in the past such as 12 that uses intrinsic motivationcuriosity driven prediction error to encourage exploration note that these methods are also compatible with her id suggest comparing cer with one of these methods if not all both with and without her minor in the beginning paragraph of 31 the paper states while the relabelling strategy introduced by her provides useful rewards for training a goalconditioned policy it assumes that learning from arbitrary goals will generalize to the actual task goals as such exploration remains a fundamental challenge for goaldirected rl with sparse reward we propose a relabelling strategy designed to overcome this challenge i think overcoming this particular challenge is a bit overstating the method proposed in this paper is not guaranteed to address the fundamental challenge either ie why can you assume that learning from arbitrary goals that results from the dynamics of two agents will generalize to the actual task goals i will change my rating accordingly if there are more meaningful comparisons made in the rebuttal 1 curiositydriven exploration by selfsupervised prediction pathak et al 2 largescale study of curiositydriven learning burda et aldocsepthe authors propose a new method for learning from sparse rewards in modelfree reinforcement learning settings this is a challenging and important problem in modelfree rl mainly due to the lack of effective exploration they propose a new way of densifying the reward by encouraging a pair of agents to explore different states using competitive selfplay while trying to learn the same task one of the agents a receives a penalty for visiting states that the other agent b also visits while b is rewarded for visiting states found by a they evaluate their method on a few tasks with continuous action spaces such as ant navigation in a maze and object manipulation by a simulated robotic arm their method shows faster convergence in some cases and better performance than comparable algorithms strengths attempts to solve a longstanding problem in modelfree rl effective exploration in sparse reward environments clear writing and structure easy to understand except for some minor details novel intuitive and simple method building on ideas from previous works good empirical results better than state of the art in terms of performance on some challenging tasks weaknesses not very clear why and when the method works more insight from experiments in less complex environments or some theoretical analysis would be helpful it would also be useful to better understand the conditions under which we can expect this to bring significant gains and when we can expect this to fail or not help more than other methods not clear how stable to train and robust to different environment dynamics the method is main comments questions the paper makes the 
claim that their technique automatically generates a curriculum of exploration which seems to be based more on intuition rather than clear experiments or analysis i would suggest to either avoid making such claims or include stronger evidence for that for example you could consider visualizing the visited states by a and b for a fixed goal and initial state at different training epochs other such experiments and analysis would be very helpful it is known that certain reward shaping approaches can have negative consequences and lead to undesired behaviors ng et al 1999 clark amodei 2016 why can we expect that this particular type of reward shaping doesnt have such side effects can it be the case that due to this adversarial reward structure a learns a policy that takes it to some bad states from which it will be difficult to recover or that a b get stuck in a cyclic behavior have you observed such behaviors in any of your experiments do you train the agents with using the shaped reward from the exploration competition between a and b for the entire training duration have you tried to continue training from sparse reward only eg after the effect ratio has stabilized one problem i see with this approach is the fact that you never directly optimize the true sparse reward of the tasks so in the late stages of training your performance might suffer because the agent a is still trying to explore different parts of the state space can you comment on how stable this method is to train given its adversarial nature and what potential tricks can help in practice except for the discussion on batch size please make clear the way you are generating the result plots ie is a evaluated on the full task with sparse reward and initial goal distribution with no relabelling in algorithm 1 can you include the initialization of the goals for a and b does b receive identical goals as a it would also be helpful to more clearly state the limitations and advantages of this method compared to other algorithms designed for more efficient exploration eg the need for a resettable environment for intcer but not for indcer etc minor comments questions you might consider including more references in the related work section that initializing from different state distributions such as hosu rebedea 2016 zhu et al 2016 and kakade langford 2002 and perhaps more papers tackling the exploration problem can you provide some intuition on why intcer performs better than indcer on most tasks and why in figure 1 her intcer takes longer to converge than the other methods on the s maze in figure 4 why are you not including indcer without her have you considered training a pool of agents with selfplay for the competitive exploration instead of two agents is there any intuition on expecting one or the other to perform better plots what is the xaxis of the plots number of samples episodes epochs please label it please be explicit about the variance shown in the plots is that the std it would be helpful if to have larger numbers on the xyaxes it is difficult to read when on paper can you explain how you smoothed the curves whether before or after taking the average and perhaps include the min and max as well i believe this could go in the appendix notation i dont understand the need for calling the reward rg instead of r i believe this introduces confusion since the framework already has r taking as argument the goal g eq 1 while the g in the subscript doesnt seem to refer to a particular g but rather to a general fact that this is a reward 
for a goaloriented task with sparse reward where the goals are a subset of the states eq 4 please use a consistent notation for q in sections 21 and 22 at times you use qsag qasg or qsa typos page 6 last paragraph of section 41 interestingly even the is enough to support page 7 last paragraph of section 43 interestingly adversely affects both docsepthe authors propose a states relabeling strategy cer to encourage exploration in rl algorithms by organizing a competitive game between a pair of agents to verify their strategy they extend maddpg as their framework then they compare the performance of agents trained with her and both variants of cer and both variants of cer with her the experiments show that cer can improve the performance of her with faster converge and higher accuracy my major concerns are as follows 1 the authors may want to conduct more experiments to compare cer with other stateoftheart methods such as ppo1 as illustrated in figure 1 the performance of her is better than that of cer the authors may want to analyze whether cer strategy alone could properly address the sparse reward problems and why cer strategy can improve her the authors have mentioned that cer is orthogonal to her i suggest authors provide more discussions on this statement 2 the authors may want to improve the readability of this paper for example in figure 1 the authors may want to clarify the meanings of the axes and the plots the results shown in figure 3 are confusing how can the authors come to the conclusion that the optimal conguration requires balancing the batch sizes used for the two agents to better illustrate the framework of cer the authors may want to show its flow chart 3 there are some typos for example in section 21 the authors use tssa without index t in section 22 the authors use both qasg and qsag there is something wrong with the format of the reference tim salimans and richard chen demonstration 2018 in the bottom of page 10 1 schulman j wolski f dhariwal p et al proximal policy optimization algorithmsj 2017 docsepthe paper is well written and easy to read exploration is one of the fundamental problems in rl and the idea of using two agents for better exploration is interesting and novel however an explanation of the intuition behind the method would be useful the experimental results show that the method works well in complex tasks since states are compared to each other in l2 distance the method might not generalize to other domains where l2 distance is not a good distance metric pros well written a simple and novel idea tackling a hard problem good results on hard tasks cons an explanation of why the method should work is missing plot text is too small what is the unit of xaxis questions what is the intuition behind the method during training randomly sampled two states are compared why it is a good idea how the replay buffer size will affect it since it is a twoplayer game is there anything you can say about its nash equilibrium why a is better than b at the task when comparing states are whole raw observations including velocity etc used section 42 doesnt seem to be that relevant or helpful is it really necessary fig 4 is missing cer alone results why is that it doesnt work by itself on those tasks
### Summary: | the paper proposes a new method to improve exploration in sparse reward problems by having two agents competing with each other to generate shaping reward that relies on how novel a newly visited state is the idea is nice and simple and the results are promising the authors implemented more baselines suggested in initial reviews which was also helpful on the other hand the approach appears somewhat ad hoc it is not always clear why and when the method works although some intuitions are given one reviewer gave a nice suggestion of obtaining further insights by running experiments in less complex environments overall this work is an interesting contribution | [
2722, 326, 15733, ..., 7680 (input_ids: 2,048-entry token-ID sequence, abridged)
] | [
... attention_mask: 2048 ones omitted ...
] | [
... labels: 2048 token IDs omitted (they appear to duplicate the input_ids) ...
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper studies the problem of realtime semantic segmentation with transformer the authors proposed an rtformer block with two attention models to aggregate information on differentresolution features the experimental results on several datasets demonstrate the effectiveness of the proposed method strengths the proposed method achieves great performance in the several datasets compared to the baselines the proposed methods could bring constant improvements weaknesses some important ablation studies are missing the choice of architectural design the authors put the proposed rtformer block on the last two stages and do not provide the results to support this design the baseline which does not use any attention needs to be included in table 3 a comparison with the other lightweight attentions is also needed yes docsepthe manuscript presents an efficient model for semantic segmentation the main contribution corresponds to a gpu friendly attention layer which improves the efficiency by using keys and values as learnable parameters dimensionality of keys and values is a hyperparameter that is much less than n hxw furthermore the mlp from the standard transformer is replaced with plain convolutions the resulting module is somewhat similar to the classic selfattention layer from the pretransformer era finally some further performance improvement is obtained through crossresolution attention experiments address cityscapes and ade20k strengths a reasonable hybrid model with convolutions on higherresolution representations and selfattention on lowerresolution representations stateoftheart ratio between performance and computational complexity weaknesses incremental contribution gpu friendly attention appears quite related to previous work 13 as well as to linformer and nystromformer see below hybrid convolutionaltransformer models have been proposed before eg dpt hybrid see below the proposed improvements perform only slightly better than baselines in fig3 missing configuration in fig3a ea ca large training footprint it appears that only 3 crops 512x1024 can fit into a v100 incomplete related work in the field of efficient models for semantic segmentation eg hardnet swiftnet see below missing related work rene ranftl alexey bochkovskiy vladlen koltun vision transformers for dense prediction iccv 2021 1215912168 yunyang xiong zhanpeng zeng rudrasis chakraborty mingxing tan glenn fung yin li vikas singh nystromformer a nystrombased algorithm for approximating selfattention aaai 2021 sinong wang belinda z li madian khabsa han fang hao ma linformer selfattention with linear complexity corr abs200604768 2020 marin orsic sinisa segvic efficient semantic segmentation with pyramidal fusion pattern recognit 110 107611 2021 ping chao chaoyang kao yushan ruan chienhsiang huang younlong lin hardnet a low memory traffic network iccv 2019 it appears that large memory footprint precludes training on single gpu systems docsepthis paper proposes rtformer for realtime semantic segmentation the rtformer leverages gpufriendly attention with linear complexity and discards the multihead mechanism the authors demonstrate the efficacy of their method on several benchmarks strengths 1 the proposed method achieves good performance on the benchmarks 2 this paper is well organized and clearly described 3 efficient segmentation is a valuable problem weaknesses 1 the method proposed in the paper is a hybrid of various existing methods such as linearcomplexity selfattention hrnet and cnn and transformer hybrid model therefore the
novelty is weakened by previous works 2 this paper does not say whether to use tensorrt to accelerate the model so i dont know if the comparison is fair 3 in terms of performance and model size there is no significant advantage between this method and the compared methods please refer to paper weaknesses docsepthis paper proposes rtformer an efficient transformer for realtime semantic segmentation which achieves a better tradeoff between performance and efficiency than cnnbased models to achieve high inference efficiency on gpulike devices the rtformer leverages gpufriendly attention with linear complexity and discards the multihead mechanism besides the crossresolution attention is more efficient to gather global context information for the highresolution branch by spreading the high level knowledge learned from the lowresolution branch extensive experiments on mainstream benchmarks demonstrate the effectiveness of the proposed rtformer it achieves stateoftheart on cityscapes and camvid and shows promising results on ade20k strengths of this paper are as follows 1 a novel rtformer block is proposed which achieves a better tradeoff between performance and efficiency on gpulike devices for the semantic segmentation task 2 a new network architecture rtformer is proposed which can make full use of global context for improving semantic segmentation by utilizing attention deeply without loss of efficiency 3 rtformer achieves stateoftheart on cityscapes and camvid and shows promising performance on ade20k in addition it provides a new perspective for practice on the realtime semantic segmentation task weaknesses of this paper are as follows 1 the proposed crossresolution attention is just a variant of selfattention which is widely used in network design this paper is incremental compared with previous work ddrnet the novelty of this paper is limited 2 the performance improvement on the cityscapes dataset is limited as shown in table 1 the miou score and fps improvement are both limited not obvious enough 3 the experiments on semantic segmentation are thorough enough and can be used to support the proposed method but my concern is that the method is simple and not novel enough in addition why not apply this method to other vision tasks like classification object detection yes
### Summary: | reviewers agree that the proposed rtformer block and overall network architecture achieve a good tradeoff between performance and efficiency on several datasets the design of gpufriendly attention and crossresolution attention improves the computational efficiency over multihead attention and captures global context information well when updating highresolution embeddings the main concern as mentioned by several reviewers is the overall novelty as some ideas are related to previous work gpufriendly attention and hybrid convolutionaltransformer architecture another issue is missing baselines that are based on lightweight attention designs or do not use attention at all but this has been well resolved in the author feedback in summary the pros outweigh the cons and therefore the ac recommends acceptance
| [ ... input_ids: 1266 token IDs omitted (machine tokenization of the Input text above) ...
] | [
... attention_mask: 1266 ones omitted ...
] | [
... labels: 1266 token IDs omitted (they appear to duplicate the input_ids) ...
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses the problem of modeling sequential data based on one of the deep recurrent gaussian process drgp structures proposed by mattos et al 2016 this structure acts like a recurrent neural net where every layer is defined as a gp one of the main limitations of the original method proposed by mattos et al 2016 is that it is limited to a small set of covariance functions as the variational expectations over these have to be analytically tractable the main contributions of this paper are the use of previously proposed inference namely i the sparse spectrum ss of lazarogredilla et al 2010 its variational improvement by gal and turnner 2015 vss and the inducingpoint ip framework of titsias and lawrence 2010 into the recurrent setting of mattos et al 2016 most if not all of the technical developments in the paper are straightforward applications of the results in the papers above therefore the technical contribution of the paper is largely incremental furthermore while it is sensible to use randomfeature approximation approaches such as ss and vss in gp models it is very unclear why combining the ip framework with ss approaches makes any sense at all indeed the original ip framework was motivated as a way to deal with the scalability issue in gp models and the corresponding variational formulation yielded a nice property of an additional regularization term in the variational bound however making the prior over a equation 9 conditioned on the inducing variables u is rather artificial and lacks any theoretical justification to elaborate on this in the ip framework both the latent functions f in the original paper and the inducing inputs come from the same gp prior hence having a joint distribution over these comes naturally however in the approach proposed in this paper a is a simple prior over the weights in a linearintheparameters model and from my perspective having a prior conditioned on the inducing variables lacks any theoretical motivation the empirical results are a bit of a mixed bag as the methods proposed beat by a small margin the corresponding benchmarks on 6 out of 10 problems while one would not expect a proposed method to win on all possible problems no free lunch it will be good to have some insights into when the proposed methods are expected to be better than their competitors while the proposed method is motivated from an uncertainty propagation perspective only pointerror metrics rmse are reported the paper needs to do a proper evaluation of the full predictive posterior distributions what is the point of using gps otherwise other comments i recommend the authors use the notation pv and qv everywhere rather than v as the latter may lead to confusion on how the priors and the variational distributions are defined it is unnecessary to cite bishop to explain how one obtains a marginal distribution would it be possible to use the work of cutajar et al 2017 who use random feature expansions for deep gps in the sequential setting if so why arent the authors comparing to this the analysis of figure 1 needs expanding what are the performance values obtained with a standard recurrent neural net lstm docsepthis paper proposes deep recurrent gp models based on the existing drgp framework two works on sparse spectrum approximation as well as that of inducing points in these models uncertainty is propagated by marginalizing out the hidden inputs at every layer the authors have combined a series of known ideas in the proposed work there is a serious lack of discussion or 
technical insights from the authors for their technical formulations in particular what are the nontrivial technical challenges addressed in the proposed work furthermore the authors are quite sloppy in referencing equations and inconsistent in the use of their defined notations and acronyms i also find it hard to read and understand the main text due to awkward sentence structures have the authors revealed their identity on page 2 of the paper i quote we refer to the report foll et al 2017 for a detailed but preliminary formulation of our models and experiments and drgpvss code available from httpgithubcomromanfoelldrgpvss detailed comments are provided below for the first contribution stated by the authors what are the theoretical and practical implications of the different regularization termsproperties between the lower bounds in equations 10 vs 8 these are not described in the paper can the authors provide a detailed derivation of dvi for equation 13 as well as for the predictive distributions in sectio 635 can the authors provide a time complexity analysis of all the tested deep recurrent gps would the authors proposed approach be able to extend the framework of hoang et al 2017 see below that has generalized the ss approximation of lazarogredilla et al 2010 and the improved vss approximation of gal turner 2015 hoang q m hoang t n and low k h 2017 a generalized stochastic variational bayesian hyperparameter learning framework for sparse spectrum gaussian process regression in proc aaai 20072014 minor issues just below equation 6 equation 9 and throughout the entire paper the authors need to decide whether to italicize their notations in bold or not equations are not properly referenced in a number of instances the authors have used their commas too sparingly which makes some sentences very hard to parse what is the difference between revarbvssip drgpvssip and drgpvssip equation 7 lhs should be conditioned on u page 4 vssgp does not have the same equation 8 qa and qz should be placed next to the expectation page 4 choosen page 5 will makes it possible page 5 drgpssgp vssgp ssgpip vssgip page 5 to simplify notation we write hl1hx1 yhx1 such a notation does not look simplified equation after equation 12 on lhs should ul be a random variable page 17 should the expressions begin with docsepoverall score 710 confidence score 710 detailed comments this paper introduces various deep recurrent gaussian process drgp models based on the sparse spectrum gaussian process ssgp models and the variational sparse spectrum gaussian process vssgp models this is a good paper and proposed models are very sound so i recommend for acceptance although as main weakness i can say that is very technical so it can be difficult to follow adding more intuitive ideas motivation and maybe a figure for each step would be a solution apart from that it is a really good paper congratulations related to rnn models and sparse nystrom approximation strengths models are very sound solutions are solid the proposed methodology is correct and the empirical results and experiments are valid and properly done weaknesses it is too difficult to follow and it is written in an extreme technical way more intuitions and a proper motivation both in the abstract and introduction may be put in order to make the paper easier to read and hence more used by researchers and data scientists does this submission add value to the iclr community yes it does the experiments show the efficiency of the proposed methods in some scenarios and are 
valid methodologies quality is this submission technically sound yes it is are claims well supported by theoretical analysis or experimental results experimental results prove empirically the methods and appendixes show the analysis performed in a clear and elegant way is this a complete piece of work or work in progress complete piece of work are the authors careful and honest about evaluating both the strengths and weaknesses of their work yes and i would enfatize that i have liked that some experiments are won by other methods such as gplstm they are very honest clarity is the submission clearly written yes but it is difficult for newcomers due to the reasons that i have stated before is it well organized yes it is does it adequately inform the reader yes it is originality are the tasks or methods new yes they are sound is the work a novel combination of wellknown techniques yes it is is it clear how this work differs from previous contributions yes is related work adequately cited yes being a strength of the paper significance are the results important i would argue that they are and are a clear alternative to consider in order to solve these problems are others likely to use the ideas or build on them if the paper is written in a more friendly way yes does the submission address a difficult task in a better way than previous work yes i think does it advance the state of the art in a demonstrable way yes empirically arguments for acceptance models are very sound solutions are solid the proposed methodology is correct and the empirical results and experiments are valid and properly done arguments against acceptance clarity of the paper minor issues and typos vss not defined before being used abstract should be rewritten adding a motivation and focusing more on the problems being solved and less in the details of the solutions recurrent indexes that go backwards i of eq 1 should be explained why are going backwards before being used like that newcomers may be confused section 2 writing style lacks a bit of cohesion relating the paragraphs may be a solution q is not defined in section 31 paragraph 1 a valid covariance function must produce a psd matrix put that in section 31 i do not see how u marginalizes in eq 7 kind of confused about that i think that it should be pyxu section 34 statistics should be explained reading thread and authors response rebuttal decision i consider that the authors have perfomed a good rebuttal and reading the other messages and the authors response i also consider that my issue with clarity is solved hence i upgrade my score to 7 and recommend the paper for publication
### Summary: | this paper is concerned with combining past approximation methods to obtain a variant of deep recurrent gps while this variant is new 2-3 reviewers make very overlapping points about this extension being obtained from a straightforward combination of previous ideas furthermore r3 is not convinced that the approach is well motivated beyond filling the gap in the literature all reviewers also pointed out that the paper is very hard to read the authors have improved the manuscript during the rebuttal but the ac believes that the paper is still written in an unnecessarily complicated way overall the ac believes that this paper needs some more work specifically in a) improving its presentation and b) providing more technical insights about the methods as suggested by r2 and r3 which could be a means of boosting the novelty
input_ids: 2,039 integer token ids encoding this row's Input text
] | [
attention_mask: 2,039 ones, one per token (no padding)
] | [
labels: 2,039 integer token ids, identical to the input_ids sequence above
] |
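The three bracketed columns in each row, input_ids, attention_mask and labels, are the standard output of a subword tokenizer applied to the row's text, with labels simply repeating input_ids for language-model style training. The dataset does not state which tokenizer produced these ids, so the sketch below is only a generic illustration of how such columns are built; gpt2 is an arbitrary example checkpoint rather than the tokenizer actually used here, and the 2,048-token cap mirrors the sequence lengths seen above.

```python
# Generic illustration of producing input_ids / attention_mask / labels columns.
# "gpt2" is only an example checkpoint; the dataset's actual tokenizer is not stated.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

text = "Below is given review of a research paper ..."  # a row's Input text
enc = tokenizer(text, truncation=True, max_length=2048)

row = {
    "input_ids": enc["input_ids"],            # integer token ids, one per subword token
    "attention_mask": enc["attention_mask"],  # 1 for every real (non-padding) position
    "labels": list(enc["input_ids"]),         # copied from input_ids, as in the rows above
}
print(len(row["input_ids"]), row["input_ids"][:8])
```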
Below is given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary the paper proposes visual transformer network which encodes the relationship between all detected object instances in a frame and uses it for navigation the paper uses detr for object detection and learns an association between local descriptors from the object detector and global descriptors resnet18 using the proposed vt model they show that using vt improves performance on the object navigation task in the ai2thor simulator compared to existing methods strengths the paper proposed a novel transformer architecture that learns an association between local object descriptors and global image region features so that actions can be grounded to visual regions in the image different from prior work the paper uses all the objects detected for a label instead of just the most confident detection weaknesses the paper doesnt fully address why detr performs better than fasterrcnn features appearance features from fasterrcnn have been widely used for several downstream tasks in vision and language navigation1 vision and language tasks2 from the experiments its not clear why detr is doing better than fasterrcnn especially when the detection accuracy of detr is also better than faster rcnn additionally i didnt fully follow how the authors obtain the appearance features from the faster rcnn based method the authors mention that object appearance features are extracted from different layers of a backbone network how is it different from the approach taken by the bottomup topdown3 paper in which 2048dim appearance features are extracted for each visual region the experimental setup isnt fully reflective of the object goal navigation task the experiments are conducted in ai2thor scenes which only contain one room its not clear how this method will perform when evaluated on significantly more complicated environments like matterport gibson 4 specifically i am interested in how the proposed architecture will perform when the goal object is not in the same room as the agent the navigation task is also made simpler by discretizing into a grid single room environments and discrete grids simplify a lot of navigationrelated challenges and the authors dont discuss how the proposed architecture will generalize to more complex object navigation tasks the use of spatial embeddings as well as appearance embeddings isnt all that surprising existing work including du et al uses bounding box coordinates to help learn spatial associations between objects other questions instead of pretraining without employing the navigation policy did the authors try using shortestpath based demonstrations to help learn the navigation policy as well in the first stage the navigation policy learns using imitation learning and then is finetuned with a3c what is the step size of the agent for the forward step what are the turn angles for turnleft turnright actions what are the tilt angles for lookup and lookdown actions whats the reason for improvement over org in the absence of tpn is it superior visual representations faster rcnn vs detr or the fact that org only chooses objects with the highest confidence while vt uses all the detected objects how does the agent learn longterm associations between objects across multiple frames in my opinion the proposed architecture puts all the burden of learning these longterm object relationships across multiple frames on the lstm policy since the vt only learns associations within a single frame 1 improving visionandlanguage navigation with imagetext pairs from the web majumdar et al 2 oscar objectsemantics aligned 
pretraining for visionlanguage tasks li et al 3 bottomup and topdown attention for image captioning and visual question answering anderson et al docsepthis paper demonstrates a model that uses the transformer to encode the visual features that appeared in the visual input image during navigation the model is firstly pretrained under an imitation learning objective with selfgenerated shortestpath trajectories the empirical results show that the model used in the paper outperforms previous methods on the ai2thor environment the authors also show some studies on the contributions of each component in the model paper strengths the proposed method further shows that the transformer is a powerful model for feature extraction the authors demonstrate one method to make the training of the transformer work ie pretraining transformers using shortestpath trajectories empirical results support the authors claims a thorough ablation study and discussions are provided cons the paper adopts the transformer and adapts it to the navigation problem no new architecturemodel is proposed it seems that a similar usage of the transformer already appeared in the visionandlanguage navigation task 1 the paper also shows that pretraining of navigation tasks using transformers can help to boost the performance minor two missing citations 2 and 3 that are potentially relevant 1 towards learning a generic agent for visionandlanguage navigation via pretraining 2 evolving graphical planner contextual global planning for visionandlanguage navigation 3 are you looking grounding to multiple modalities in visionandlanguage navigation ive read the authors response and would like to maintain my original score docsep summary this paper introduces the transformer network to visual navigation specifically objectgoal navigation it also develops several new feature descriptors as the input of the transformer encoder and decoder to properly train the whole model a supervised pretraining stage is used to warm up the transformer model great performance has been achieved on the ai2thor benchmark pros 1 lots of people must have thought to use the transformer to replace the rnnlstm in lots of visual navigation frameworks this paper provides a good example most importantly this paper focuses on the representation learning part of the whole pipeline which isnt that straightforward in terms of how to use a transformer 2 the writing is mostly clear with clear motivation and background discussion 3 the performance boost especially spl is relatively significant compared to previous sota and the ablation studies have verified most of the design choices cons there are a couple of things which are not clear to me or confused me when i was reading the paper 1 the writing in the approach section isnt very clear first it would be much better to define clear notations for all the featuresdescriptors and use such notations in the figure the current writing uses instance feature global feature positionalglobal spatial feature spatialenhanced which are a little bit confusing to me second i think most details are properly ignored in fig2 it becomes not as informative as the detailed version fig4 in the appendix note that these two figures are not consistent in that the add symbol for positional enhancement is missing in fig4 i also suggest that the positional embedding blob not cross the arrow of the global feature they are just added together third sec42 writes we first reduce the channel dimension of a highlevel activation map from d to a smaller dimension d how the reduction is done exactly from the appendix 
it seems like a 256dim vector is transformed into 249dim fourth h and w are abused in the figure they are annotated on the long side of the tensor in eq 1 they seem to be the output of the positional embedding and in the sec42 description they are the resolution of 7 similarly l is abused as it means the input of the encoder in sec41 but the output of the encoder in sec43 let me stop here but these things make the approach not super clear to me 2 in sec41 im not fully convinced of the statement about faster rcnn even though the experiments empirically verified it faster rcnn wo fpn only outputs features after conv4roipooling resnet101c4 variant why is it blamed for being scalesensitive actually what does scalesensitive mean here why doesnt detr suffer from it honestly i dont think thats the reason why faster rcnn performs worse 3 also im not fully convinced of the statement about the early stopping in sec44 the penalties are the same for different models in rl why does this transformerbased representation learner suffer from early stopping is there a plausible explanation its fine that you cannot conclude something for sure because transformers are always hard to train but the statement in the paper reads as not super convincing to me 4 sec51 the spl formulation seems to be wrong the success indicator seems missing the current equation is simply a ratio between any episode length over the optimal length regardless of whether its a success episode or not 5 why not also add global features into the transformer encoder for example reshape and concat with the input is the encoder supposed to be local misc 1 the best results of vtnet in tab1 used tpn it might be better to introduce tpn in the appendix for completeness 2 variance is not reported in tab1 which is uncommon for rlcontrol papers 3 because the transformer has an attention module and the relationship can be easily visualized i was expecting more interpretationvisualization like fig1 right to show the proposed methods actually attend to proper areas from the numbers it is hard to tell what each module does exactly questions 1 just to make sure i understand correctly the instance feature 100x249 and spatial feature 100x7 are fed into an mlp for fusion can you describe the archi 2 the local spatial feature contains the normalized bounding box confidence and toprated semantic label is the semantic label the class index 12c why not use a onehot embedding or something 3 is ai2thor the most popular benchmark for objectgoal nav i have seen lots of prior papers running on habitat what are the specific reasons for using ai2thor over habitat please address my questions im looking forward to discussing with the authors and the peer reviewers docseppaper summary the paper addresses the problem of navigation towards objects objectnav in a virtual environment the idea of the paper is to incorporate spatial information of objects using a transformerbased framework called visual transformer network the paper compares the results with a number of stateoftheart objectnav models and provides an ablation study the results have been reported on the ai2thor framework paper strengths the idea of incorporating object information using transformers for a navigation agent is new the proposed method outperforms a number of strong baselines the ablation studies show that the introduced components are effective paper weaknesses it is hard to understand some parts of the paper for example the introduction discusses details such as the difference between detr and faster rcnn or the difficulty of training the transformers it is difficult to understand 
these details without knowing the proposed method the introduction should provide a highlevel overview of the paper instead of these types of details also the paper requires proof reading there are several sentences with grammar issues it is a bit strange that nothing is learned without the imitation pretraining it would be good to dig deeper and provide a better explanation for why this happens equation 1 is not clear a brief explanation would help i recommend running the method on some other frameworks which include slightly larger scenes to see if the method generalizes to those as well robothor httpsgithubcomallenairobothorchallenge is very close to the framework used in this paper so it might be a good choice for these experiments justification of rating overall i am leaning towards accepting this paper since it introduces a new way for incorporating object information and it outperforms strong object navigation baselines writing is the main issue of this paper postrebuttal i read the rebuttal and the other reviews the rebuttal addresses my concerns to some extent writing has improved in the revised version but it still has some issues so i am going to keep my rating
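Several of the reviewer questions above concern how the per-object descriptors and the global image features are actually fused. The sketch below is only a guess at a minimal version of such a fusion module, not the authors' code: it takes the dimensions quoted in the reviews (up to 100 detections with 249-dim appearance and 7-dim spatial descriptors, and a 7x7 global feature map) as assumptions, and it uses a single cross-attention layer in place of the full visual transformer.

```python
# Hypothetical fusion module, written only to make the reviews' description concrete.
# Dimensions (100 detections, 249-d appearance, 7-d spatial, 7x7 global map with 512
# channels) are taken from the reviewer comments; everything else is an arbitrary choice.
import torch
import torch.nn as nn

class ObjectGlobalFusion(nn.Module):
    def __init__(self, d_model=256, n_heads=4):
        super().__init__()
        self.obj_mlp = nn.Sequential(             # fuse appearance (249) + spatial (7) per object
            nn.Linear(249 + 7, d_model), nn.ReLU(), nn.Linear(d_model, d_model)
        )
        self.glob_proj = nn.Linear(512, d_model)  # project a global feature map, e.g. from a resnet
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.to_action_feat = nn.Linear(d_model, d_model)

    def forward(self, appearance, spatial, global_map):
        # appearance: (B, 100, 249), spatial: (B, 100, 7), global_map: (B, 49, 512)
        obj_tokens = self.obj_mlp(torch.cat([appearance, spatial], dim=-1))
        glob_tokens = self.glob_proj(global_map)
        # global positions query the detected objects, grounding image regions in object evidence
        fused, attn = self.cross_attn(glob_tokens, obj_tokens, obj_tokens)
        return self.to_action_feat(fused).mean(dim=1), attn

model = ObjectGlobalFusion()
feat, attn = model(torch.randn(2, 100, 249), torch.randn(2, 100, 7), torch.randn(2, 49, 512))
print(feat.shape, attn.shape)   # torch.Size([2, 256]) torch.Size([2, 49, 100])
```

A recurrent policy such as an LSTM would sit on top of the fused feature; whether attention runs in this direction or the reverse, and how the pretrained detector features enter, are exactly the details the reviewers ask the authors to spell out.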
### Summary: | this paper addresses the problem of visual object navigation by defining a novel visual transformer architecture where an encoder consisting of a pretrained object detector extracts objects ie their visual features position semantic label confidence that will serve as keys in an attentionbased retrieval mechanism and a decoder computes global visual features and positional descriptors as a coarse feature map the visual transformer is first pretrained using imitation learning on simple tasks consisting in moving the stateless agent camera towards the target object then an rl agent is defined by adding an lstm to the vtnet and training it endtoend on the singleroom subset of the ai2thor environment where it achieves stateoftheart performance after rebuttal all four reviewers converged on a score of 6 the reviewers praised the novelty of the method extensive evaluation with ablation studies and the sota results main points of criticism were about clarity of writing and some explanations which the authors improved using detr vs faster rcnn and the relative simplicity of the task single room and discrete action space there were also minor questions a request for more recent transformerbased vln bibliography and a request for a new evaluation on robothor one area of discussion where i empathise with the authors was regarding the difficulty of pure rl training of transformerbased agents and the necessity to pretrain the representations taking all this into account i suggest this paper gets accepted | [
input_ids: 2,048 integer token ids encoding this row's Input text
] | [
attention_mask: 2,048 ones, one per token (no padding)
] | [
588,
39970,
281,
625,
2570,
1789,
15034,
8892,
50274,
783,
897,
273,
8820,
46234,
347,
973,
347,
7286,
21496,
310,
2649,
512,
326,
10084,
5368,
789,
1690,
3443,
1162,
355,
4648,
41113,
3817,
11627,
281,
1361,
3037,
8820,
12485,
875,
5113,
50275,
977,
3533,
50275,
34235,
273,
3215,
26208,
1293,
19693,
253,
15034,
3646,
858,
253,
4477,
1611,
970,
30505,
3967,
1754,
32367,
281,
1361,
3037,
253,
15034,
3646,
347,
973,
275,
253,
806,
3924,
253,
15034,
3646,
33772,
970,
45738,
4715,
285,
840,
1442,
292,
37437,
342,
247,
20,
68,
50274,
5371,
310,
253,
3213,
1979,
273,
253,
5570,
323,
253,
3579,
3213,
752,
403,
253,
1614,
14636,
323,
1614,
1274,
1614,
918,
5231,
752,
403,
253,
20569,
14636,
323,
31994,
285,
1007,
3487,
5231,
50273,
5371,
84,
253,
1921,
323,
7756,
689,
4955,
275,
5928,
273,
246,
16077,
310,
352,
8936,
5304,
14237,
7938,
27657,
9866,
4632,
22636,
390,
253,
958,
4955,
760,
28467,
5113,
342,
253,
4585,
7162,
1223,
362,
85,
4648,
512,
253,
5189,
5113,
50274,
5430,
1057,
253,
5570,
3037,
1048,
3945,
12485,
875,
5113,
2439,
2709,
13009,
275,
619,
4743,
253,
4081,
10336,
12516,
512,
253,
7977,
273,
4715,
841,
1048,
3945,
1789,
7688,
2439,
2709,
13009,
327,
253,
298,
296,
78,
3646,
1580,
253,
362,
85,
760,
33772,
5864,
1561,
247,
2014,
3665,
50275,
18,
11138,
8113,
395,
12982,
15034,
342,
4440,
292,
2068,
8557,
432,
253,
4384,
19684,
360,
27083,
1162,
355,
50275,
19,
258,
19378,
1789,
6017,
28601,
15616,
3215,
26208,
323,
8113,
12982,
8892,
632,
1162,
355,
50275,
20,
5004,
484,
285,
1755,
3487,
4116,
323,
2460,
11743,
272,
285,
5304,
1953,
22291,
285,
3796,
1162,
355,
50276,
7152,
33032,
2520,
2929,
14371,
247,
1566,
326,
4648,
253,
39707,
281,
22573,
253,
5304,
3386,
326,
5420,
275,
253,
5304,
3280,
2460,
1309,
15034,
253,
1566,
310,
41005,
3215,
11273,
762,
45738,
4715,
8103,
342,
1881,
20419,
30505,
3967,
24102,
253,
16774,
1543,
921,
326,
253,
1566,
908,
275,
253,
2929,
41731,
13015,
2045,
3082,
327,
23105,
19,
42771,
3126,
253,
4477,
671,
921,
690,
2175,
327,
253,
9021,
273,
1016,
4445,
275,
253,
1566,
50276,
20790,
20544,
50275,
783,
4081,
1332,
2007,
921,
326,
253,
39707,
310,
247,
6422,
1566,
323,
4735,
11998,
50275,
783,
4477,
7568,
581,
1332,
281,
1056,
253,
3733,
273,
39707,
789,
26332,
3215,
26208,
4979,
398,
970,
30505,
3967,
24102,
50275,
358,
5378,
474,
906,
1329,
253,
4477,
3916,
50275,
66,
11080,
28913,
1263,
285,
11985,
403,
2530,
50276,
5040,
50275,
783,
2929,
47932,
253,
39707,
285,
12956,
352,
715,
253,
15034,
1895,
642,
747,
10336,
7645,
310,
4081,
50275,
262,
3133,
326,
247,
2074,
10393,
273,
39707,
2168,
5420,
275,
253,
8113,
395,
12982,
15034,
4836,
337,
253,
2929,
671,
2722,
326,
3215,
26208,
273,
15034,
8892,
970,
4979,
398,
476,
1361,
281,
9510,
253,
3045,
50275,
37585,
767,
5816,
30404,
3495,
326,
403,
7826,
4623,
50275,
18,
4404,
4715,
247,
12314,
5570,
323,
8113,
395,
12982,
15034,
3066,
3215,
26208,
50276,
19,
25537,
29886,
499,
9582,
33876,
4156,
7219,
323,
8113,
395,
12982,
15034,
50276,
20,
403,
368,
2819,
3216,
272,
281,
2709,
33433,
275,
8113,
395,
12982,
15034,
50274,
422,
1239,
253,
4477,
2380,
285,
651,
751,
281,
6558,
619,
3236,
11691,
406,
33032,
6010,
436,
2929,
23970,
39707,
2990,
281,
5304,
15034,
5742,
1789,
41881,
15034,
352,
671,
24357,
2067,
747,
4735,
42785,
347,
253,
3280,
273,
253,
39707,
32049,
285,
29810,
281,
6283,
6194,
253,
2644,
1566,
247,
22296,
3215,
26208,
3924,
310,
908,
281,
5890,
484,
253,
39707,
1566,
1270,
3045,
556,
644,
6786,
327,
247,
334,
263,
22791,
50275,
856,
84,
337,
8783,
273,
952,
1364,
452,
1869,
281,
897,
39707,
281,
8171,
253,
391,
9866,
42663,
78,
275,
8783,
273,
5304,
15034,
7792,
436,
2929,
3400,
247,
1175,
1650,
954,
15538,
436,
2929,
2770,
327,
253,
6779,
4715,
629,
273,
253,
2644,
15722,
534,
310,
2649,
326,
15246,
273,
849,
281,
897,
247,
39707,
374,
253,
4028,
310,
6571,
2590,
342,
2590,
16038,
285,
4114,
5955,
50276,
20,
253,
3045,
9510,
3340,
6821,
310,
4942,
1534,
2429,
281,
2045,
256,
5503,
285,
253,
28913,
2175,
452,
16058,
954,
273,
253,
2216,
10165,
50275,
5040,
627,
403,
247,
4564,
273,
1841,
534,
403,
417,
2590,
281,
479,
390,
13477,
479,
672,
891,
369,
4361,
253,
2929,
337,
50276,
783,
4028,
275,
253,
2746,
2593,
310,
2649,
1077,
2590,
806,
352,
651,
320,
1199,
1805,
281,
4853,
2590,
41818,
323,
512,
253,
3386,
3229,
1687,
641,
285,
897,
824,
41818,
275,
253,
4677,
253,
1655,
4028,
4648,
4227,
4735,
4156,
4735,
40798,
14456,
8820,
4735,
8820,
35465,
534,
403,
247,
1652,
2372,
21643,
281,
479,
1273,
891,
1158,
954,
4278,
403,
6283,
12841,
275,
3036,
19,
352,
4916,
417,
347,
27096,
347,
253,
7000,
2715,
3036,
21,
275,
30762,
3877,
326,
841,
767,
8442,
403,
417,
5185,
326,
253,
823,
9484,
323,
40798,
14314,
310,
5816,
310,
3036,
21,
891,
671,
1804,
326,
253,
40798,
21496,
37905,
417,
14270,
253,
14150,
273,
4156,
4735,
597,
403,
816,
2879,
2366,
19016,
4706,
2945,
12013,
359,
806,
4796,
253,
5048,
7877,
273,
247,
1029,
5251,
5743,
3711,
432,
277,
281,
247,
4577,
7877,
277,
849,
253,
5141,
310,
2218,
4555,
432,
30762,
352,
3133,
751,
247,
17558,
4528,
4972,
310,
13657,
715,
29503,
4528,
7002,
288,
285,
259,
403,
19848,
275,
4677,
597,
403,
28267,
50276,
251,
253,
1048,
1930,
273,
253,
13148,
275,
16186,
337,
597,
1646,
281,
320,
253,
3453,
273,
40798,
21496,
285,
275,
4706,
2945,
5740,
597,
403,
253,
6064,
273,
818,
12014,
298,
310,
19848,
347,
352,
2097,
3280,
273,
32049,
275,
4706,
3156,
533,
3453,
273,
32049,
275,
4706,
3079,
1339,
479,
3523,
1060,
533,
841,
1841,
1056,
253,
2746,
417,
2221,
2590,
281,
479,
50276,
19,
275,
4706,
3156,
516,
417,
4751,
13762,
273,
253,
3908,
273,
7938,
27657,
9866,
1014,
1869,
253,
4679,
45190,
16058,
352,
7938,
27657,
9866,
32063,
269,
16077,
760,
3453,
3386,
846,
2410,
21,
287,
532,
1062,
272,
501,
3024,
6903,
68,
21,
12955,
2139,
310,
352,
27137,
323,
11498,
18917,
2686,
752,
1057,
11498,
18917,
1599,
1060,
2139,
22636,
36908,
11089,
432,
352,
20509,
891,
13414,
1158,
28763,
253,
1921,
2139,
7938,
27657,
9866,
17923,
7197,
495,
671,
516,
417,
4751,
13762,
273,
253,
3908,
273,
253,
2393,
15910,
275,
4706,
2031,
253,
22414,
403,
253,
1072,
323,
1027,
1566,
275,
391,
77,
50276,
22309,
436,
39707,
1754,
6779,
458,
47612,
27171,
432,
2393,
15910,
310,
627,
247,
21541,
8813,
697,
4030,
326,
368,
2550,
7525,
1633,
323,
2119,
984,
4979,
398,
403,
1900,
1892,
281,
6194,
533,
253,
3908,
275,
2929,
9563,
417,
2221,
21414,
281,
479,
577,
4706,
3712,
6821,
15895,
3133,
281,
320,
3430,
253,
2323,
15301,
3133,
5816,
253,
1655,
5150,
310,
3365,
247,
4313,
875,
667,
9037,
2978,
689,
253,
8654,
2978,
10159,
1880,
697,
271,
2323,
9037,
390,
417,
608,
2139,
417,
671,
6240,
4156,
3386,
715,
253,
39707,
32049,
323,
1650,
40206,
2259,
285,
7036,
255,
342,
253,
3280,
310,
253,
32049,
6326,
281,
320,
1980,
50275,
43671,
337,
253,
1682,
1543,
273,
362,
85,
3024,
275,
10334,
18,
908,
246,
16077,
352,
1537,
320,
1805,
281,
9569,
246,
16077,
275,
30762,
323,
29867,
50276,
19,
11041,
310,
417,
2361,
275,
10334,
18,
534,
310,
24666,
323,
391,
77,
8519,
2929,
50276,
20,
984,
39707,
556,
4116,
6333,
285,
253,
2954,
476,
320,
4354,
27130,
891,
369,
16764,
625,
7914,
34309,
1320,
751,
3036,
18,
987,
281,
921,
253,
4081,
3082,
2686,
8041,
281,
1463,
3672,
253,
3904,
403,
1892,
281,
2028,
752,
513,
1016,
11911,
513,
4555,
50275,
34974,
337,
816,
281,
1056,
2119,
891,
2096,
9113,
253,
4227,
4735,
2233,
89,
21361,
285,
8820,
4735,
2233,
89,
24,
403,
10208,
715,
247,
13361,
81,
323,
11781,
476,
368,
6266,
253,
4222,
74,
374,
1980,
8820,
4735,
4428,
253,
12650,
41113,
3817,
7162,
285,
281,
1087,
456,
24705,
5203,
310,
253,
24705,
5203,
253,
966,
3605,
1249,
68,
2139,
417,
897,
247,
581,
12022,
21496,
390,
1633,
495,
310,
23105,
19,
42771,
253,
954,
4633,
22791,
323,
1789,
41881,
6563,
891,
452,
2326,
8783,
273,
2720,
2929,
3515,
327,
20571,
47515,
253,
2173,
4606,
273,
970,
23105,
19,
42771,
689,
20571,
50275,
32897,
2953,
619,
3533,
516,
2819,
3579,
281,
16585,
342,
253,
4477,
285,
253,
14218,
30628,
50276,
7152,
339,
377,
6653,
6010,
50276,
783,
2929,
12453,
253,
1895,
273,
15034,
4404,
5113,
1789,
8002,
275,
247,
7503,
3126,
253,
2934,
273,
253,
2929,
310,
281,
19071,
8820,
1491,
273,
5113,
970,
247,
39707,
3169,
7792,
1925,
5304,
39707,
2990,
253,
2929,
26662,
253,
1543,
342,
247,
1180,
273,
1375,
23037,
14387,
1789,
8002,
3210,
285,
3400,
271,
28913,
1263,
253,
1543,
452,
644,
2361,
327,
253,
23105,
19,
42771,
7792,
50276,
20790,
20544,
50275,
783,
2934,
273,
24049,
1789,
1491,
970,
4979,
398,
323,
247,
15034,
5570,
310,
747,
50275,
783,
4081,
1332,
41731,
13015,
247,
1180,
273,
2266,
1666,
25379,
50275,
783,
28913,
2175,
921,
326,
253,
5611,
4295,
403,
3576,
50276,
20790,
32213,
50275,
262,
310,
1892,
281,
2096,
690,
4243,
273,
253,
2929,
323,
1650,
253,
10199,
25339,
4278,
824,
253,
3064,
875,
22636,
285,
7938,
27657,
9866,
390,
10183,
273,
3733,
253,
4979,
398,
352,
310,
2834,
281,
2096,
841,
4278,
1293,
8958,
253,
4081,
1332,
253,
10199,
943,
2085,
247,
1029,
5251,
18389,
273,
253,
2929,
3185,
273,
841,
3510,
273,
4278,
671,
253,
2929,
4419,
4737,
4361,
627,
403,
2067,
14683,
342,
28146,
3374,
50275,
262,
310,
247,
2372,
8921,
326,
2717,
310,
6311,
1293,
253,
45738,
3215,
26208,
352,
651,
320,
1175,
281,
2836,
12861,
285,
2085,
247,
1805,
8813,
323,
2139,
436,
6569,
50275,
29813,
337,
310,
417,
2590,
247,
4864,
8813,
651,
1361,
50275,
74,
5583,
3515,
253,
1332,
327,
690,
643,
31225,
534,
2486,
5777,
4067,
13451,
281,
923,
604,
253,
1332,
2087,
4219,
281,
1110,
347,
973,
4848,
837,
263,
5987,
7280,
681,
455,
257,
22466,
15617,
263,
48781,
310,
1077,
2810,
281,
253,
7792,
908,
275,
436,
2929,
594,
352,
1537,
320,
247,
1175,
4327,
323,
841,
4679,
50276,
6309,
1877,
273,
13716,
50276,
1189,
455,
891,
717,
25661,
4404,
18738,
436,
2929,
1580,
352,
23970,
247,
747,
1039,
323,
24049,
1789,
1491,
285,
352,
41731,
13015,
2266,
1789,
15034,
1666,
25379,
4028,
310,
253,
2022,
2523,
273,
436,
2929,
50275,
5996,
250,
2858,
22559,
50276,
74,
1239,
253,
30080,
22559,
285,
253,
643,
10123,
253,
30080,
22559,
12453,
619,
7350,
281,
690,
6070,
4028,
556,
5520,
275,
253,
17265,
2715,
533,
352,
1335,
556,
690,
3374,
594,
891,
717,
1469,
281,
1978,
619,
13716,
50276,
187,
187,
4118,
18435,
27,
2520,
2929,
12453,
253,
1895,
273,
5304,
1789,
15034,
407,
13947,
247,
4460,
5304,
39707,
10336,
835,
271,
32049,
11253,
273,
247,
3215,
11273,
1789,
13562,
16756,
5113,
26332,
616,
5304,
3386,
1899,
24705,
5203,
7162,
326,
588,
5752,
347,
10149,
275,
271,
4116,
3169,
25064,
5122,
285,
247,
29810,
48169,
4156,
5304,
3386,
285,
40798,
42785,
347,
247,
25319,
4735,
3711,
253,
5304,
39707,
310,
806,
3215,
11273,
970,
45738,
4715,
327,
2969,
8892,
11253,
275,
4886,
253,
1098,
6134,
5570,
50276,
32499,
4404,
253,
2303,
1789,
840,
271,
391,
77,
5570,
310,
2931,
407,
6240,
271,
298,
296,
78,
281,
253,
362,
85,
3024,
285,
3733,
352,
990,
936,
423,
327,
253,
1625,
2146,
6188,
8578,
273,
253,
23105,
19,
42771,
3126,
835,
352,
33526,
1375,
23037,
14387,
3045,
50276,
6438,
30080,
22559,
512,
1740,
30628,
5975,
2400,
327,
247,
4868,
273,
721,
253,
30628,
26108,
253,
38135,
273,
253,
1332,
9470,
7103,
342,
28913,
2175,
285,
253,
256,
5503,
1543,
2022,
2792,
273,
14226,
497,
670,
19843,
273,
4028,
285,
690,
22909,
534,
253,
4477,
5520,
970,
22636,
4632,
7938,
27657,
9866,
285,
253,
4103,
17647,
273,
253,
4836,
2014,
2316,
285,
13358,
2250,
2317,
627,
497,
671,
5884,
3533,
247,
2748,
323,
625,
3332,
39707,
3169,
362,
6677,
20314,
20561,
285,
247,
2748,
323,
247,
747,
7103,
327,
4848,
837,
263,
581,
2170,
273,
5955,
50276,
2811,
891,
802,
3967,
885,
342,
253,
4477,
50276,
4238,
5001,
253,
10183,
273,
6313,
391,
77,
3733,
273,
39707,
3169,
6083,
285,
253,
15504,
281,
3215,
1949,
253,
14237,
50276,
29114,
512,
436,
715,
2395,
891,
1804,
436,
2929,
4850,
7607,
209
] |
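The numeric columns collapsed above (and in the examples that follow) carry no information beyond the text cells themselves: they appear to be the tokenized form of the Input/Output text, an all-ones attention mask, and the training labels. A minimal sketch of how such a row could be rebuilt is given below; the Hugging Face tokenizer checkpoint is an assumption (the ids look GPT-NeoX-like, but no tokenizer is named in this dump), as is the choice to copy the labels straight from the input ids.

```python
# Illustrative sketch only -- not part of the dataset.
# Assumptions: a Hugging Face tokenizer produces the id columns; the GPT-NeoX
# checkpoint below is a guess, and labels are copied from input_ids (many
# fine-tuning setups would instead mask the prompt tokens with -100).
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed tokenizer

def encode_row(input_text: str, output_text: str) -> dict:
    ids = tok(input_text + " " + output_text)["input_ids"]
    return {
        "input_ids": ids,
        "attention_mask": [1] * len(ids),  # matches the all-ones column above
        "labels": list(ids),               # assumed; see note above
    }
```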
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper proposes two novel methods for combinatorial blackbox optimization ie over an unconstrained binary domain based on optimistic tree search one based on a known lipschitz constant olts and another one when it is unknown octs the general idea of the olts is to evaluate nodes in a tree with large upper bounds in their subtrees where the upper bound is based on the lipschitz constant and the diameter of the subtree this is extended in octs when the lipschitz constant is not known by searching a superset of nodes that would contain the node in olts both methods are proven to have linear convergence rates with a dependence on the lipschitz constant computational experiments show that octs outperform several other heuristicbased methods the blackbox methods proposed in the paper are very appealing they are simple to implement theoretically grounded and appear to work well in practice the approach appears to be original as far as i am aware the computational section is sufficiently extensive with six different problem classes and one experiment to illustrate the convergence rates and the method generally outperforms the baselines i particularly appreciate the theoretical guarantees and their computational analysis in section 61 the paper could have benefited from a comparison with modelbased methods but i believe it is not too unreasonable to omit them given that they typically have more expensive iterations the presentation is overall clear but there are several minor issues that need to be addressed below most of my comments below are regarding presentation which should be fixable assuming those are addressed i recommend acceptance for this paper no limitations besides the ones discussed above docsepthis paper presents an algorithm for solving combinatorial optimization problems where the objective function is a black box accessible only via an oracle the algorithm is targeted at problems where this oracle is relatively cheap as opposed to the standard bayesian optimization setting and is accompanied by finite time termination guarantees the core algorithm relies heavily on lipschitz constants to guide search and prune the tree as this constant is often not known the authors present a variant that instead only relies on the existence of a lipschitz constant the authors conclude with a computational analysis of the performance of the algorithms as a function of the number of function evaluations the paper presents a novel algorithm in an area of interest to the neurips community includes interesting theoretical results and is clearly written the only weakness i can identify is the lack of a computational comparison against bayesian optimization techniques see questions there is no explicit discussion of potential negative societal impact docsepthe paper considers the blackbox optimization of combinatorial binary functions the functions are assumed to obey a lipschitz condition given some metric on the hypercube for the optimization problem the authors propose two algorithms depending on the knowledge of the lipschitz constant both algorithms rely on tree search and optimistic upper bounds theoretical guarantees are provided for the convergence of the algorithms the empirical work show that the algorithm with unknown lipschitz constant octs outperforms the considered baselines on a variety of problems the proposed algorithm is fairly natural given the lipschitz assumption the case of unknown lipschitz constant is treated in a similar way as the direct algorithm although it is not 
referenced the theoretical results are straightforward but nevertheless useful the binary tree is assumed as provided but i would assume that the ordering of the indices might have significant influence on the performance given the optimistic tree search approach the problem is somewhat related to the combinatorial bandit problem the main difference here is that the function is deterministic which allows much stronger bounds but i would assume some techniques from combinatorial bandits could carry over the empirical performance is a strong argument for the paper the baselines are difficult to evaluate since there is little detail provided regarding their implementation and parametrization it is not clear how meaningful the lipschitz condition is for the practical problems considered beyond the constant that results from the discrete nature of the problem docsepthis paper addresses the problem of combinatorial blackbox optimization the solution is built upon a treestructure search procedure with optimistic search strategy the contribution of the paper in my opinion is twofold 1 algorithmically it designs a new combinatorial blackbox optimization solver olts and its practical variant octs by adapting the optimistic strategy applied on treesearch optimizer 2 theoretically it provides convergence analysis on the proposed solver and its variant octs which is shown to be superior than random search strengths 1 the structure of the paper is clear and the paper is overall wellwritten the clarity is in general good except for a few points that will be discussed in the weakness part 2 the problem of combinatorial blackbox optimization is an important problem that has vast applicability of various domains including machine learning 3 the paper provides the first finitetime linear convergence rates for the problem it is a significant improvement compared to the logarithmic rates of baselines random search 4 the empirical results are promising the algorithm though simple has been shown to be outperforming the baselines on a set of benchmark blackbox combinatorial optimization problems including labs mis ising maxsat and contamination weakness 1 the novelty of the proposed solvers olts and octs is limited both the treebased search and the optimistic strategy have been well studied under similar contexts the main critique from me is not that the algorithms are not novel but that the novelty is somewhat overclaimed for example the tree based search has been discussed in a few previous papers eg in 39 and also uct ucb for trees but this has not been acknowledged in the paper it appears that the tree structure is first proposed in this paper as another example the optimistic strategy for estimating the potential of the tree nodes is also adapted from 39 though the paper lists three major differences of oltsocts vs 39 it still seems incremental also it is not clearly explained why these differences are made to adapt to the tree structure and what are the advantages 2 it is not clear what are the intuitions of ln and nc in the propositions and theorems so that it is hard to understand how tight the derived convergence bounds are in the respective theoremspropositions at least from a first look the bounds do not seem tight and therefore the theory is not as informative the paper would be stronger if these are better explainedclarified 3 bayesian optimization is an important category of methods for blackbox combinatorial optimization problems but it is not included in the set of baselines why is it it would be 
good to explain 4 the empirical results are promising in general one question from me though is that what are the reasons that certain problems are selected for evaluation for example reference 18 and reference 41 each provided a set of benchmark problems but this paper selected a subset from each of these two references instead of evaluating all of the settings in either one of them it does not seem that the proposed octs cannot work on the other problems eg the neural architecture search benchmark which is of potential high interest to the ml community minor aspects 1 the introduction well motivates the paper but is a bit too condensed perhaps better to split it into multiple paragraphs 2 line 186 ih seems to be a typo it should be il 3 line 270 proposition a3 is it a typo the authors claimed that they discussed potential limitations 1b and negative societal impacts of the work 1c in the appendix but i cannot see any obvious discussions of this kind
### Summary: | this paper proposes two methods for black box optimization of lipschitz combinatorial binary functions the reviewers agree that the paper is well written the methods are sufficiently novel and that the results are of interest to the neurips community the main drawback with the paper is that reviewer n1bw felt that the theoretical results are straightforward but nevertheless useful several reviewers also had hoped for comparisons with bayesian optimization techniques but during the discussion period it was decided that this comparison can be omitted due to the much higher computational cost of bayesian methods i tend to agree with the reviewers that this paper is above the bar for neurips | [
… (input_ids token-id sequence omitted) … ] | [ … (attention_mask values, all 1, omitted) … ] | [ … (labels token-id sequence omitted) … ] |
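As a reading aid (not part of the dataset, and not the paper's published code), the optimistic Lipschitz tree search that the review in the example above describes can be sketched as follows: score each tree node by an optimistic upper bound built from a known Lipschitz constant and the diameter of its subtree, then expand the most promising node. The Hamming metric, the fix-one-bit-per-level tree, and the all-zeros completion used as the evaluated centre are assumptions made purely for illustration.

```python
# Illustrative sketch of the optimistic, Lipschitz-based tree search (OLTS) idea
# described in the review above -- not the paper's actual implementation.
def olts(f, n, lipschitz, budget):
    """Maximize a black-box f over {0,1}^n using optimistic subtree bounds."""
    def centre(node):                      # any completion of the fixed prefix
        return node + (0,) * (n - len(node))

    def upper_bound(node, value):          # centre value + L * subtree diameter
        return value + lipschitz * (n - len(node))

    frontier = [((), f(centre(())))]       # (node = fixed prefix, centre value)
    best_x, best_val = centre(()), frontier[0][1]

    for _ in range(budget):                # budget counts node expansions
        frontier.sort(key=lambda nv: upper_bound(*nv), reverse=True)
        node, _ = frontier.pop(0)          # most optimistic node
        if len(node) == n:
            continue                       # leaf: nothing left to expand
        for bit in (0, 1):                 # fix the next coordinate both ways
            child = node + (bit,)
            val = f(centre(child))
            if val > best_val:
                best_x, best_val = centre(child), val
            frontier.append((child, val))
    return best_x, best_val
```

The unknown-constant variant (OCTS) that the review also mentions would, per the review's description, drop the single known constant and instead search a superset of the nodes that OLTS would visit.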
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the work examines properties of neural processes np more precisely of deterministic nps and how they for finitedimensional representations of infinitedimensional function spaces np learn functions f that best representfit discrete sets of points in space based on signal theoretic aspects of discretisation authors infer a maximum theoretical upper bond of frequencies of functions f that can be used to represent the points the bond depends on the latent dimensionrepresentation size and the finite interval spawn by the points simulations are computed to test the validity of the upper bond authors find that nps behave like a fourier transform and decompose the spectrum of the signal since the representation during training learns to represent specific frequencies nps can be used as band passstop filter the paper is well written and the basic approach is clearly outlined the quality of the work and the evaluation are good and support the authors claims however it is not fully clear to which extend the claims translate to other data or generalise well the finding that nps interpret points in space as signals and implement a frequency decomposition like fourierwavelet transforms seems reasonable not sure however if an application as filter is ecological in terms of computational complexity the paper provides a strong theoretical foundation of the method and authors support their claims by empirical stimulation also explainability and more importantly interpretability of how methods generate results is essential so the message the paper sends is relevant however the relevance and significance of the findings and the consequences thereof are not clear docsepthe paper tries to analyze the behavior of neural processes in the frequency domain and concludes that such processes can only represent oscillations up to a certain frequency while drawing a parallel between neural processes and signal processes i think that there is some weakness in the experiments of the paper in particular the authors only seem to consider the exponential quadratic kernel to generate examples which would mostly show examples of smooth functions as would sampling fourier linear combinations i am also unsure how this paper could be helpful to our community in its present form as it sheds some light on the inner workings of neural processes but only in a very limited practical settingdocsepthis paper addresses an interesting and timely problem which is to understand how neural processes work to learn a representation of a function space offering a closer investigation into a recently introduced framework this work will likely be of interest to the iclr community the work focuses on the 1dimensional case and tries to analyze the simplest case in a rigorous way which i think is a good approach in general however i have some concerns about the main claims of this paper as listed below one of the main findings of the paper is an observation that neural processes perform a frequency decomposition however i think this is an insufficiently supported and even misleading overstatement indeed figure 2 shows that there are different modes dominated by varying characteristic frequencies where a higherrank mode shows a more slowly varying feature but there is no further evidence that the decomposition is actually based on the frequency of the signal one would get a similar result by simply doing a principal component analysis too when you say frequency decomposition it carries a clear mathematical meaning and it is a much stronger statement 
than what the paper reports empirically that said i agree that the empirical observations are interesting perhaps the observations in the papers experiments may be better described in a frame of global mode decomposition cnp vs local feature detection np i also think that the claim about the theoretical upper bound on the frequency is overstated the way it is stated currently the validity of the statement theorem 31 really depends on the assumption of uniform sampling which is mentioned as a note after theorem 31 of course i fully agree that it is an important starting step to get rigorous results in simplified conditions but those conditions should be mentioned as part of the statement especially when it is highly likely that the conditions are not met in the use case there is no reason to expect that the x values in the context set is close to uniform for example it is possible to encode functions with a localized feature whose local frequency is higher than your derived bound by using more samples around that highfrequency feature this paper will get views partly because it is actually asking an interesting question and partly because of the boldness and attractiveness of the claims made how exciting is it to discover a naturally emerging fourier transform except thats not exactly what one can say just yet i think i believe the authors should either support the papers claims by further work or tone down their overall framing major changes either way while i think this work is headed to a promising direction given the concerns described above i recommend a rejection at this time update i appreciate the authors responses and the engaged discussion however i still think that the claims of the paper are not sufficiently supported by the presented results and maintain my original ratingdocsepthis paper presents an analysis on the neural processes in the signal processing point of view and gives a bound on the highest frequency of the function that a neural process can represent i recommend to reject this manuscript my comments are below the key point of this work is theorem 31 however the theorem itself is just a direct outcome of the nyquistshannon sampling theorem and it is generally true to not only neural processes but also to all the other approaches meanwhile the authors did not talk about the relationship quantitatively between the representability and the error tolerance in definition 31 in addition the analysis is limited to only scalarvalued function on a 1d interval the writing could also be improved concerns the definition of neural processes in the background section is confusing despite the way of defining a map p is a mathematical object defined by a set of tuples and a map meaning that the neural processes are also defined by data in the original paper the neural processes were however defined as random functions in the background section the words say some sources define could the authors give the sources in def 31 what do the authors mean by discrete measurements in the experiment section do the authors mean sampling from a gaussian process by saying gp prior i dont see a gp plays the role of prior in terms of bayesian inference the examples given in the experiment section lack quantitative results it is better for evaluating the reconstruction by showing the posterior or predictive distribution instead of single reconstructions in sec 42 how did the authors sample regular grid on the 2d plane as y is determined by x eq11 is defined in the appendix better to use separate 
numbering
### Summary: | the paper analyses the behaviour of neural processes in the frequency domain and in particular how it suppresses highfrequency components of the input functions while this is entirely intuitive the paper adds some theoretical analysis via the nyquistshannon theorem but the analysis remains too generic and it is not clear it will be of broad interest to the community | [
… (input_ids token-id sequence truncated here; the remaining values continue below) …
8770,
273,
11454,
4870,
275,
253,
4294,
5028,
285,
275,
1798,
849,
352,
41033,
1029,
18163,
4295,
273,
253,
3280,
3470,
1223,
436,
310,
7094,
27350,
253,
2929,
11323,
690,
10527,
1783,
3066,
253,
31804,
32446,
1200,
16554,
10012,
533,
253,
1783,
4558,
1512,
12314,
285,
352,
310,
417,
2590,
352,
588,
320,
273,
3862,
1600,
281,
253,
3114,
209
] | [
1, 1, …, 1 (attention_mask: all ones; full list omitted)
] | [
… (labels token values omitted)
] |
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary this paper proposes a new method that adaptively merges intervals to form a discrete action space where on each interval qi values are learned via deep neural networks then it applies readymethods designed for discrete action spaces to do offpolicy evaluation reasons for score the paper offers a new way to apply methods designed for discrete action spaces onto continuous action spaces and it seems to perform better than the two chosen baselines as seen from the experiment results although the authors mentioned problems with the baseline models quickly it would be nice to see a more indepth analysis in the experiments to demonstrate these problems that this paper has set out to overcome it is also not very clear to me when and why djqe performs better than the baselines does it always perform better than the baselines i gave a conservative score 4 but im willing to change my evaluation if convinced pros the paper provided theoretical support to the proposed method by proving its consistency under two reasonable assumptions the method was tested on both synthetic data and simulated real world data cons overall the paper is not very clear to me it would be nice to see more indepth theoretical analysis on the main advantages of djqe compared to the baselines the lack of which generates the following questions will this method always achieve lower biases than baselines on new datasets im not sure about the quality of evaluation from a simulation model on the personalized dose finding application what are the potential problemslimitations of djqe if there are any although these questions are commonly raised on methods that rely on experimental proofs of their superior performance they seem particularly relevant for this paper questions during rebuttal period how does the computational cost of djqe scale with a decreasing maximum threshold of bias how accurate is the simulation model trained on warfarin docsepthis paper considers the problem of offpolicy evaluation with continuous actions the main idea is to first using multiscale change point detection to discretize the action space and then apply traditional ipw or dr methods to estimate the value the djqe method is theoretically analyzed under both the cases that the q function is either a piecewise function or a continuous function for continuous function it is not surprising that as the number of splits m goes to infinity as n the estimation is consistent while additional results in theorem 2 also shows that for limited m the estimator can also be shown as a uniform approximation of the q value experiments consider both a toy dataset and a real problem in personalized does finding and the results show that the djqe method is superior than existing methods for continuous q evaluation the paper is clearly written and easy to follow i only have a few comments 1 in the experiments since computing the optimal bandwidth is very time consuming for the baseline methods it would good to provide a detailed computation cost comparison 2 as mentioned in the method part m is initially set to be proportional to n and the final partition size is much smaller than m would the authors shows these detailed numbers in the experiments 3 it could be great if more realworld problems can be evaluated in the current experimental section such as the dynamic pricing example introduced previously docsepsummary this paper proposes a new method for offline evaluation when the action space is continuous one dimensional this overcomes the drawbacks of the kernel 
based method which cannot be applied to nonsmooth q functions and requires heavy computation to optimize the bandwidth the proposed method can be applied to discontinuous q functions like step functions and achieves smaller bias this is made possible by the adaptive jump q learning method pros while the kernel method requires a single bandwidth to control the bias and variance of the value estimator the proposed method adapts to the shape of the q function by dividing the action space in an adaptive way so that the mlp fitted in each interval of the action space approximates well the real q function hence the intervals can have possibly different lengths according to the shape of the true q function a multiscale change point detection method is used for determining the intervals which requires only a linear computational cost experiment results are convincing cons 1 in algorithm 1 collect cost function step computes mlp regressor for every possible interval hence computation will become heavy when the number of initial intervals m is large authors should add discussion about this point 2 some notations are confusing minor comments 1 gamma appears before it is defined 2 l is both the number of subsets and the numer of layers in neural networks are they meant to be the same as they increase with n or are they different in the latter case they should be distinguished docsepsummary of paper the main contribution of this paper is a new algorithm to learn the expected reward function for a given target policy using the historical data generated by a different behavior policy in continuous action domains all current offlinepolicy evaluation ope methods for handling continuous action domains use a kernel function to extend inverse probability weighting ipw or doubly robust dr approaches for discrete action domains the algorithm proposed in this work adaptively discretizes the action space by combining methods in multiscale changepoint detection multilayer perceptron regression and ope in discrete action domains the finite sample performance of the proposed method known as deepjump qevaluation djqe is compared to that of two kernelbased methods one due to kallus and zhou 2018 and another due to colangelo and lee 2020 on synthetic as well as realworld data to generate synthetic data four scenarios are considered where in each case the qfunction is continuous in the action domain or is a piecewise function of the action in almost all of these cases djqe outperforms the two kernelbased methods similarly when applied to realworld warfarin data after calibration djqe outperforms the two kernelbased methods with respect to the bias standard deviation and mean squared error even when the sample size is small n50 the average runtime of djqe in each scenario for synthetic or realworld data is about 5 minutes plus points the experimental results seem to demonstrate quite convincingly that djqe outperforms the two kernelbased methods in almost all cases the methodology seems sound the theoretical results also appear correct and prove the soundness of the method for a fairly wide range of functions those that are continuous in the feature space and action domain as well as those that are piecewise constant the method can model jump discontinuities in the qfunction questions why is it reasonable to assume that the qfunction can be wellapproximated using piecewise linear combinations of mlps how is the performance of djqe affected by the choice of the regularization parameter gamma minor commentsquestions page 2 
line 2 exists to exist page 3 line 4 of section 23 segments segment page 3 line 4 was there a reason for choosing the logarithm function here page 4 lines 12 to 13 is there a theoretical justification for such a choice of m or is it based on empirical observations also to what extent does the performance of djqe depend on the initial choice of m page 5 equation 4 could it be justified why the minimizer is unique page 5 third line after equation 4 should it be figure 3 appendix a there is no figure a page 5 line 4 how is hatq used in the solution of equation 5 page 6 assumption 1 number of nodes in each hidden layer page 6 last line of the statement of theorem 1 should d be d0 page 6 lines 7 to 6 i did not fully understand what this means do the change points of hatdell vary with m page 7 line 7 data dataset page 7 line 5 how was the exponent 02 chosen page 7 last line and page 8 line 9 of section 52 with 10 hidden layers fifth reference on page 10 double occurrence of technical report update after reading authors response thank you very much for the detailed answers to my questions as well as the other reviewers commentsquestions i have upgraded my score wishing you all the best
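To make the method described in the review above concrete, here is a minimal illustrative sketch (not code from the paper under review) of how an off-policy value estimate can be stratified over a discretized one-dimensional action space. The interval boundaries `edges`, the per-interval regressors `q_hat`, the `target_action` policy, and the `behaviour_pdf` density are all hypothetical stand-ins introduced only for illustration.

```python
# Illustrative sketch only: stratified doubly-robust off-policy estimate over a
# discretized 1-d action space. All names below (edges, q_hat, target_action,
# behaviour_pdf) are hypothetical stand-ins, not objects from the paper.
import numpy as np

def discretize(actions, edges):
    # Map each continuous action to the index of the interval [edges[k], edges[k+1]).
    idx = np.searchsorted(edges, actions, side="right") - 1
    return np.clip(idx, 0, len(edges) - 2)

def dr_value_estimate(x, a, r, edges, q_hat, target_action, behaviour_pdf):
    """Doubly-robust value estimate for a deterministic target policy.

    x, a, r        : logged contexts, continuous actions, rewards (arrays)
    edges          : sorted interval boundaries from some change-point step
    q_hat          : list of fitted regressors, q_hat[k](x_i) for interval k
    target_action  : function mapping a context to the target policy's action
    behaviour_pdf  : density of the logging policy, behaviour_pdf(a_i, x_i)
    """
    edges = np.asarray(edges, dtype=float)
    k_obs = discretize(a, edges)                      # interval of logged action
    k_tgt = discretize(np.array([target_action(xi) for xi in x]), edges)
    q_obs = np.array([q_hat[k](xi) for k, xi in zip(k_obs, x)])
    q_tgt = np.array([q_hat[k](xi) for k, xi in zip(k_tgt, x)])
    # Crude importance weight: indicator that the logged action falls in the
    # target interval, divided by an approximation of the behaviour probability
    # mass of that interval (density at the logged action times interval width).
    width = edges[k_tgt + 1] - edges[k_tgt]
    mass = np.array([behaviour_pdf(ai, xi) for ai, xi in zip(a, x)]) * width
    w = (k_obs == k_tgt) / np.maximum(mass, 1e-8)
    return float(np.mean(q_tgt + w * (r - q_obs)))    # direct term + correction
```

Here `q_hat[k]` plays the role of the per-interval MLP regressors mentioned in the review; a full implementation would also need the multiscale change-point step that chooses `edges`.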
### Summary: | the paper considers the ope problem under the contextual bandit model with continuous action they studied the model of a piecewise constant value function according to the actions the assumption is new though still somewhat restrictive as it requires the piecewise constant partitions to be the same for all x the proposed algorithm estimates the partitions and then used it to build a doubly robust estimator with stratified importance sampling fitting an mlp for each partition separately the reviewers have mixed views about the paper the following is the acs evaluation based on reading the paper and consolidating the reviewers comments and the authors responses pros the algorithm is new and it makes sense for the new problem setup though computationally intractable the experimental results outperform the baseline and reinforces the theory but its a toy example at best cons the method is called qlearning but it is somewhat disappointing to see that it actually applies only to the contextual bandit model without dynamics there is quite a bit of branding issues here i suggest the authors to revise it to reflect the actual problem setup the estimator is assumed to be arg min but the objective function is nonconvex and cannot be solved efficiently in general eg 3 involves searching over all partitions and 4 involves solving neural network partitions in other words the result applies to a hypothetical minimizer that the practical solvers may or may not obtain the authors cited scikitlearn for the optimization algorithm and claims that the optimization problem can be solved which is not the case the sgd algorithm can be applied to solve it but it does not necessarily find you the solution the theory is completely asymptotic and generic there is no rate of convergence specified and no dependence on the number of jumps d0 at all in theorem 1 theorem 3 is obnoxiously sloppy the assumptions are not made explicit do you need assumption 1 and 2 what is the choice of rho the notion of minimax rate is not defined at all usually the minimax rate is the property of problem setting ie min over all algorithms and max over all problems with in a family however in the way the authors described the results in theorem 3 it says the the minimax convergence rate of kernelbased estimator is opn13 which seems to be restricting the algorithms instead such nontypical choices require clear definitions and justification based on what is stated it really appears that the authors are just comparing upper bounds of the two methods i looked at the appendix and while there is a lower bound analysis the bound is not informationtheoretical but rather a fixed example where an unspecified family of algorithms i think it is a specific kernel smoothing method with a arbitrary choice of the bandwidth parameter h will fail suggestions to the authors instead of a piecewise constant and uniformly bounded function why not consider the total variation class which is strictly more general and comes with the same rate for formalizing the lower bound i suggest the authors to look into classical lower bounds for linear smoother eg donoho liu macgibbon 1990 which clearly illustrates that kernel smoothingtype methods do not achieve the minimax rates and that waveletsbased approaches locally adaptive regression splines and fused lasso you can think about the haar wavelets as a basis function of piecewise linear functions do the authors can improve the paper by ensuring that the theoretical parts are clearly and rigorously 
presented and perhaps to iron out the more useful finitesample analysis that depends on model parameters of interest | [
… (input_ids token values omitted)
] | [
1, 1, …, 1 (attention_mask: all ones; full list omitted)
] | [
… (labels token values omitted)
] |
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper proposes a class of model called monotone deep boltzmann machines where the underlying potentials are parameterized eg by cnns such that they obey some monotonicity constraint this constraint ensures that the inference problem has a global optimum which can be found using some generalized variant of parallel mean field the method is inspired from monotone deq previously proposed by winston kolter 2020 experiments on a joint task of image denoising and classification show that the proposed method can effectively model complex data distributions such as images on one hand this paper has some significant strengths first the paper is fairly well written in general second while this work is heavily inspired by winston kolter 2020 i find that the connection between mean field and monotone deq is quite interesting although relatively straightforward and the proposed method is theoretically well founded on the other hand the paper also has some limitations 1 first and foremost i find the experiments quite limited which is also acknowledged by the authors a more diverse set of applications would have made the paper much more solid at the very least i would have expected some experimental comparison with restricted boltzmann machines not to mention also its variants such as extensions to multilabel the proposed model is theoretically sound but it is not clear why one should use it 2 the paper also has some minor presentation issues but before ending my review with them i would like to have some comments on the bibliographical discussion 2a since the convergence of mean field is presented as an emphasis in the paper i would like to point out a very recent neurips 2021 paper on the topic regularized frankwolfe for dense crfs generalizing mean field and beyond httpsarxivorgabs211014759 in this paper they view parallel mean field as an instance of the generalized conditional gradient method and thus obtain different convergent variants of parallel mean field with different stepsize rules it seems to me that these variants do not have the same limitations as krahenbuhls and baqus as discussed in this paper even though their resulting algorithms seem to be similar to baqus at first glance could you give some comments on this including such discussion would give the reader a broader and more uptodate view of the current state of the art of course no experimental comparison would be needed thats not the focus of the paper 2b numerous works also try to combine deep neural networks with conditional random fields crf arnab et al 2018 schwartz et al 2017 zheng et al 2015 even though this is just a minor detail in the current paper i would like to take this opportunity to raise an important issue regarding credit assignment the first to view crfs as rnns for the dense crfs of krahenbuhl koltun 2011 was actually krahenbuhl koltun 2013 and not zheng et al 2015 krahenbuhl koltun 2013 had two major contributions in their paper a convergent parallel mean field and b parameter learning of dense crfs with reversemode automatic differentiation ie viewing crfs as rnns and backpropagating through time unfortunately krahenbuhl koltun 2013 have been often credited with a only and not b while b is to me even more significant than a this is not fair and i think this happened because some previous work didnt cite them correctly or in a misleading manner for example arnab et al 2018 didnt even cite this paper even though they did cite in their previous work zheng et al 2015 not sure why they removed the citation from 
the journal version the fact that arnab et al 2018 completely ignored krahenbuhl koltun 2013 and credited zheng et al 2015 for viewing crfs as rnns made their presentation misleading and unacceptable to me i would like to encourage the authors to give proper credits to krahenbuhl koltun 2013 whenever they have an opportunity to do so starting with the current submission i would suggest to slightly change the above sentence to the following for example numerous works also try to combine conditional random fields crf with pixelwise classifiers such as neural networks to obtain fully endtoend models krahenbuhl koltun 2013 schwartz et al 2017 zheng et al 2015 but of course it is up to the authors to decide 3 some comments on the presentation major in the abstract in addition we show that our procedure outperforms existing meanfield approximation methods while avoiding any issue of local optima i guess the authors are referring to the comparison with krahenbuhls and baqus that is presented in the appendix if something is mentioned in the abstract then its important enough to be included in the main content instead of being left in the appendix i would suggest to either remove the above sentence from the abstract or to move such comparison from the appendix to the main content the former seems more appropriate to me since this is not the focus of the paper minor eq 1 should end with a comma instead of a dot page 3 1st paragraph proposed a deep parameterization of mrf however their page 3 1st paragraph proposed a deep parameterization of mrf but their section 31 1st paragraph lines 34 are not clear to me page 6 before eq 13 similarlyfactored a matrix similarlyfactored matrix a interesting and theoretically sound model the set of experiments is quite modest in addition to some minor presentation issues docsepin this paper the authors propose a restricted parameterization of the boltzmann machine that guarantees that for any set of observations the mean field objective has a single global optimum furthermore that global optimum can be provably achieved using damped parallel meanfield updates which make inference efficient to turn inference into learning the model is treated as a supervised learning model some of its variables are considered to be observed inputs and some of its variables are considered to be target outputs known at test time the usual marginal crossentropy loss is the optimization target for learning the paper is well written and easy to follow most of its contents come from existing literature but this work nicely puts those existing pieces single fixed point parallel updates together providing a probabilistic interpretation as a boltzmann machine that is new while this paper emphasizes how the proposed approach enables the use of general boltzmann machines bm and not just stacked restricted bms the resulting model might actually be more restricted than the stacked rbms that it intends to improve upon it is true that the proposed model can contain intralayer and skiplayer connections that a dbn lacks but all the parameters are restricted so as to produce a monomodal posterior approximation for any partial evidence the true posterior even for a singlelayer rbm can be multimodal if the parameters are not restricted this means that as a modeling tool the proposed bm with restricted weights might be less flexible than a dbn many densities of interest are multimodal particularly as we reduce the available evidence in fact in the absence of evidence any useful bm will have to be 
multimodal for instance to be able to sample different mnist digits from it the proposed mechanism for training is also lacking in that it only allows for marginal supervised learning it cannot be used for unsupervised learning which is the typical mode of operation for rbms and dbns basically the tasks that it can solve need to be crafted in such a way that the evidence provided is enough to disambiguate a single mode of the posterior for instance if we want to perform mnist digits inpainting and we provide only the top 25 of the image showing a semicircle possible completions could be 0 2 3 6 8 9 this method would fail at this task since it would consistently default to a single digit or even worse a single combination of the possible completions thus the presented paper does provide an efficient mechanism for conditional training of parameterrestricted bms and does a good job at it but the use cases in which it can be applied are severely limited both due to the type of training and parameters it can use the experimental section does not contain meaningful comparisons with other methodsbaselines baseline 1 use your loss function with damped parallel mean field inference ie consider damping a hyperparameter and do not impose any restriction on the parameters of the bm baseline 2 use a dbn less raw expressive power but unrestricted in parameters and with a more proper loss function so it is difficult to gauge the practical advantage in the provided examples minor comments and questions you show that the mean field inference problem has a single global optimum but is the true posterior monomodal under this parameterization that would be a stronger result and convenient to know is the query the split between observed variables and variables one wants to predict fixed throughout training although this is not explicitly pointed out in the theoretical part of your paper it seems to be fixed citing domke 2013 this split could be different for each training sample which seems to be the case based on your experiments using different splits is called query training in this aaai 2021 paper query training learning a worse model to infer better marginals in undirected graphical models with hidden variables which seems to propose a very similar approach although using a different type of inference itd be good to clarify which approach you are using in the description of training the definition of the function in eq 11 is a bit confusing because of how the domain is included could you define it by parts or define i which is the value of alpha that you use for your experiments the figure 9295 test accuracy corresponds to the 10way labels of each digit or to the 4way categories of the pixels typo that owning to the restricted owing pros this paper does a good job at providing a mechanism for inference in parameter restricted bms with convergence guarantees as well as an efficient method to learn the parameters of these bms cons the proposed method cannot be applied in many settings in which bms can unsupervised learning sampling use of multimodal posteriors little experimental validation of the usefulness of the convergent inference so the settings in which this approach can be used is very limited but within that setting it provides the required details for efficient training and robust guarantees for inference docsepthis paper theoretically shows that the meanfield equation for a certain family of boltzmann machines with hidden variables called the monotone dbms can be modeled as the recently proposed 
monotone deep equilibrium deq model this paper further characterizes properties of such boltzmann machines and its training and shows its behavior in experiments on mnist and cifar10 strength the strength of this paper is the technical contribution of finding the connection between the dbms and the deq model by characterizing the monotone dbms i think this is a good contribution as it can be essential for further development of bms considering that the current progress of bms is not so rapid in my understanding the quality of presentation is also good and this paper is clearly written overall i have just a minor comment since the current explanation of a block hollow matrix is vague please mathematically define it for the selfcompleteness weaknesses the significance of this paper is not high and evaluation is weak in particular the practical advantage of the proposed monotone dbms is not clear although it is true that the family of bms to which the contrastive divergence algorithm can be applied is limited any bms with any connection patterns of hidden variables can be trained by directly applying gibbs sampling which of course includes the monotone dbms therefore the monotone dbms have no merit with respect to the effectiveness of inference and i guess the only practical advantage of the monotone dbms can be the efficiency however there is neither analysis of computational complexity nor empirical runtime comparison to such a straightforward approach in addition it has been already proven that rbms can represent any distribution therefore from the viewpoint of the representation power there is no difference between rbms and monotone dbms of course i agree that monotone dbms can be more effective than rbms and existing dbms in practice for example monotone dbms can achieve more accurate inference with less parameters than rbms however there are no such comparisons in this paper i am happy to increase my score if the above my concerns are properly addressed by the authors response this paper potentially includes an interesting technical contribution while the significance is not convincing and the evaluation is weak docsepthis paper proposes a new family of monotone deep boltzmann machines where the pairwise potentials satisfy a monotonicity condition giving rise to efficient meanfield iteration with provable convergence guarantees the convergence is obtained by drawing connections with monotone deep equilibrium models smallscale experiments are done as proof of concept the paper is very wellwritten and easy to read however i found the novelty aspect of the work to be a bit lacking aside from the new parameterization eq 34 introduced to satisfy the monotonicity condition the method of this paper seems like a straightforward combination of krahenbuhl kolten 2013 baque et al 2016 and winston kolter 2020 it is also unclear how restirctive this parameterization is which itself is quite simple within all possible pairwise potentials that satisfy the monotonicity condition the parallel updates and the convergence proof are almost exactly the same as winston kolter 2020 except for the extension to softmax operation i would be happy with the novelty aspect if convincing experiments results are shown sadly this does not seem to be the case the practical benefit of deep boltzmann machine compared to more traditional neural architectures eg cnn for image classification is not clear to me and has not been highlighted in the paper when would someone use deep bm instead of the alternatives the 
experimental results do not seem to answer this question although deep boltzmann machine can be more flexible for modeling different conditional distributions without retraining it seems to come at the cost of being much harder to train while relying on meanfield approximation im wondering how crude the meanfield approximation of the posterior distribution is in the current papers setting which has not been discussed the experiments are very smallscaled the images are all with very lowresolution this seems to suggest the impracticality of deep boltzmann machine in cifar10 experiment the test accurarcy is only 58 which is a lot lower than using conventional neural architectures in eq 20 a very arbitrary scaling is used after convergence to the meanfield solution this seems like an adhoc fix for a method that doesnt really work due to the monotonicity constraint id be interested in seeing experimental comparison between the scaled version and the original version for the patch case the model works better without the monotonicity constraint this seems to be against the whole point of the paper the proposed method does not seem to have significant improvement compared to past works in this line of work eg diagonal entries in table 3 4 additional comments page 3 we remark the readers upon this doesnt sound grammatically correct sec 35 mentions the model is trained directly to output correct marginals instead of the usual likelihood maximization which can be intractable what is lost in this simplification in addition to meanfield approximation matching only marginals seems very coarse to me how to train the proposed model on batches of images if i understand correctly the current training procedure would sample a single image split it into xh and xo run meanfield inference given xo in a differentiable manner then backpropagate through loss ellqh xh are multiple meanfield inferences run in parallel if so do they use the same number of iterations if not i would imagine the training to have very high variance at the end of sec 35 at the top of page 8 why is gqh not the damped version this paper is wellwritten but its contributions are incremental with somewhat weak experimental results
### Summary: | this is an interesting contribution to the boltzmann machine bm literature that makes a nice connection to deq models on a positive note reviewers found that it was wellwritten clear and interesting unfortunately there were significant concerns with the manuscript that were not fully addressed in the revision inappropriate or incomplete baselines insufficient credit given to previous works and the fact that this model is limited as compared to its bm relatives i would recommend that the authors take into account the reviewers feedback in a revision of the work | [
... (input_ids: token ID sequence for the sample above, truncated for readability) ...
] | [
1, 1, 1, ... (attention_mask: all ones, truncated for readability) ...
] | [
... (labels: token ID sequence for the sample above, truncated for readability) ...
]
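
The reviews in the sample above repeatedly refer to damped parallel mean-field updates for pairwise (Boltzmann-machine style) models. As a purely illustrative aside, and not code from the reviewed paper, the following is a minimal sketch of such an update for a binary pairwise model; the coupling matrix W, bias b, damping factor alpha, iteration budget and stopping tolerance are all assumptions chosen for this example.

```python
import numpy as np

def damped_parallel_mean_field(W, b, alpha=0.5, num_iters=200, tol=1e-8):
    """Damped parallel mean-field for a binary pairwise model.

    Iterates q <- (1 - alpha) * q + alpha * sigmoid(W @ q + b), the kind of
    damped fixed-point update the reviews describe; W, b, alpha and the
    stopping rule are illustrative assumptions, not the reviewed paper's.
    """
    n = b.shape[0]
    q = np.full(n, 0.5)                       # uniform initial marginals
    for _ in range(num_iters):
        logits = W @ q + b                    # parallel update of every unit
        q_new = (1.0 - alpha) * q + alpha / (1.0 + np.exp(-logits))
        if np.max(np.abs(q_new - q)) < tol:   # stop once updates fall below tolerance
            return q_new
        q = q_new
    return q

# toy usage: a random symmetric coupling with zero diagonal
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(5, 5))
W = 0.5 * (W + W.T)
np.fill_diagonal(W, 0.0)
b = rng.normal(size=5)
print(damped_parallel_mean_field(W, b))
```

With alpha = 1 the fully parallel update can oscillate, which is the failure mode the convergence discussion in those reviews revolves around; damping (alpha < 1) is the standard remedy.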
Below is given review of a research paper from cnoference journal. Please write a summary the review.
### Review:
this paper provides a theoretical analysis for batch normalization with gradient descent gdbn under a simplified scenario ie solving an ordinary least squares problem the analysis shows that gdbn converges to a stationary point when the learning rate is less than or equal to 1 regardless of the condition number of the problem some practical experiments are carried out to justify their theoretical insights the paper is in general easy to follow pros this paper provides some insights for bn using the simplified model 1 it shows that the optimal convergence rate of bn can be faster than vanilla gd 2 it shows that gdbn doesnt diverge even if the learning rate for trainable parameters is very large cons 1 in the main theorem when the learning rate for the rescaling parameter is less than or equal to 1 the algorithm is only proved to converge to a stationary point for ols problem rather a global optimal 2 to show convergence to the global optimal the learning rate needs to be sufficiently small but it is not specified how small it is overall i think this paper provides some preliminary analysis for bn which should shed some lights for understanding bn however the model under analysis is very simplified and the theoretical results are still preliminarydocsepthe paper presents an analysis of the batch normalization idea on a simple ols problem the analysis is interesting as presented but several key questions remain as described below it is unclear that these questions are answered to the point where the insight gained can be considered transferable to bn in large neural network models the reason why the auxiliary variable a is included in the formulation 7 is unclear the whole reason for using bn is to rescale intermediate outputs to have an expectation of zero and variance of one the authors claim that bn produces order 1 output and so a is needed can you please explain his better the scaling proposition 32 is claimed to be important but the authors dont provide a clear explanation of why that is so two different settings of algorithms are presented where the iterates should roughly be in the same order if input parameters of the formulation or the algorithm are scaled in a specific way it is unclear how this leads to the claimed insight that the bn algorithm is yielded to be insensitive to input parameters of step length etc due to this proposition also where is the proof of this proposition i couldnt find it in the appendix and i apologize in advance if thats an oversight on my part the u referred to in eqn 14 is the optimal solution to the original ols problem so has form h1 g for some g that depends on input parameters doesnt this simplify the expression in 4 does this lead to some intuition on how the condition number of h relates to h does this operation knock off the highest or lowest eigenvalue of h to impact the condition number additionally it is bad notation to use twoletter function names in a mathematical description such as bnz this gets confusing very fast in theorems and proofs though the cs community seems to be comfortable with this convention docsepthe author analyze the convergence properties of batch normalization for the ordinary least square ols objective they also provide experimental results on the ols objective as well as small scale neural networks first of all understanding the properties of batch normalization is an important topic in the machine learning community so in that sense contributions that tackle this problem are of interest for the community however this 
paper has a significant number of problems that need to be addressed before publication perhaps the most important one being the overlap with prior work please address this point clearly in your rebuttal 1 overlap with kolher et al 2018 the authors erroneously state that kolher et al considered the convergence properties of bngd on linear networks while after taking a close look at their analysis they first derive an analysis for leastsquares and then also provide an extension of their analysis to perceptrons the major problem is that this paper does not correctly state the difference between their analysis and kolher et al who already derived similar results for ols i will come back to this aspect multiple times below 2 properties of the minimizer the authors should clearly state that kolher et al first proved that a and w have similar properties to eq 8 if i understand correctly the difference seem to be that the algorithm analyzed in kohler relies on the optimal a while the analysis presented here alternates between optimizing a and w is this correct is there any advantage in not using a i think this would be worth clarifying 3 scaling property i find this section confusing specifically a the authors say they rely on this property in the proof but it is not very clear why this is beneficial can you please elaborate b it seems to me this scaling property is also similar to the analysis of kolher et al who showed that the reparametrized ols objective yields a rayleigh quotient objective can you comment on this c the idea of restarting is not clear to me are you saying that one the magnitude of the vector w goes above a certain threshold then one can rescale the vector therefore going back to what you called an equivalent representation i dont see why the text has to make this part so unclear looking at the proof of theorem 33 this property seem to be used to simply rescale the a and w parameters d the authors claim that the scaling law proposition 32 should play a significant role to extend the analysis to more general models this requires further explanation why would this help for say neural networks or other more complex models 4 convergence rate it seems to me that the results obtained in this paper are weaker than previous known results i would have liked to see a discussion of these results specifically a theorem 33 is an asymptotic convergence result so it is much weaker than the linear rate of convergence derived in kolher et al the authors require a sufficiently small step size looking at the analysis of kolher et al they show that the reparametrized ols objective yields a rayleigh quotient objective wouldnt a constant step size also yield convergence in that case b proposition 34 also only provides a local convergence rate the authors argue bngd could have a faster convergence this does seem to again be a weaker result so again i think it would be very beneficial if the authors could clearly state the differences with previous work 5 saddles for neural nets the authors claim they have not encountered convergence to saddles for the experiments with neural networks how did you check whether the limit point reached by bngd was not a saddle point this requires computing all the eigenvalues of the hessian which is typically expensive how was this done exactly 6 extension of the analysis to deep neural networks the analysis provided in this paper only applies to ols while kolher et al also derived an analysis for neural networks can the authors comment on extending their own analysis 
to neural nets and how this would differ from the one derived in kolher et al 7 experiments how would you estimate the range of suitable step sizes for both a and w for bngd for a neural network
### Summary: | the reviewers agree that providing more insights on why batch normalization works is an important topic of investigation but they all raised several problems with the current submission which need to be addressed before publication the ac thus proposes revise and resubmit | [
[input_ids: integer token-id encoding of the Input text above, beginning 30003, 310, 1677, 2278, 273, 247, ...; the full sequence is omitted here for readability]
[attention_mask: all-ones sequence (every position attended); omitted]
[labels: token-id sequence beginning with the same ids as input_ids (30003, 310, 1677, ...); omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the authors propose a method for dimensionality reduction of image data they provide a structured and deterministic function g that maps a set of parameters c to an image x gc the number of parameters c is smaller than the number of free parameters in the image x so this results in a predictive model that can be used for compression denoising inpainting superresolution and other inverse problems the structure of g is as follows starting with a small fixed multichannel white noise image linearly mix the channels truncate the negative values to zero and upsample this process is repeated multiple times and finally the output is squashed through a sigmoid function for the output to remain in the 01 range this approach makes sense and the model is indeed more principled than the one taken by ulyanov et al in fact the dip of ulyanov et al can hardly be considered a model or a prior for that matter and instead should be considered an algorithm since it relies on the early stopping of a specific optimization algorithm this means that we are not interested in the minimum of the cost function associated to the model which contradicts the very concept of cost function if only global optimizers were available dip wouldnt work showing its value is in the interplay of the cost function and a specific optimization algorithm none of these problems exist with the presented approach the exposition is clear and the presented inverse problems as well as demonstrated performance are sufficient one thing that i missed while reading the paper is more comment on negative results did the authors tried any version of their model with convolutions or pooling and found it not to perform as well measuring the number of parameters when including pooling or convolutions can become tricky was that part of the reason minor regularizing by stopping early for regularization in this paper large compression ratios means little compression which i found confusing docsepbrief summary this paper presents a deep decoder model which given a target natural image and a random noise tensor learns to decode the noise tensor into the target image by a series of 1x1 convolutions relus layer wise normalizations and upsampling the parameter of the convolution are fitted to each target image where the source noise tensor is fixed the method is shown to serve as a good model for natural image for a variety of image processing tasks such as denoising and compression pros an interesting model which is quite intriguing in its simplicity good results and good analysis of the model mostly clear writing and presentation few typos etc nothing too serious cons and comments the author say explicitly that this is not a convolutional model because of the use of 1x1 convolutions i disagree and i actually think this is important for two reasons first though these are 1x1 convolutions because of the upsampling operation and the layer wise normalizations the influence of each operation goes beyond the 1x1 support furthermore and more importantly is the weight sharing scheme induced by this using convolutions is a very natural choice for natural images no pun intended due to the translation invariant statistics of natural images i doubt this would have worked so well hadnt it been modeled this way not to mention this allows a small number of parameters the upsampling analysis is interesting but it is only done on synthetic data will the result hold for natural images as well should be easy to try and will allow a better understanding of this 
choice natural images are only approximately piecewise smooth after all the use of the name batchnorm for the layer wise normalization is both wrong and misleading this is just channelwise normalization with some extra parameters no need to call it this way even if its implemented with the same function as there is no batch i would have loved to see actual analysis of the methods performance as a function of the noise standard deviation specifically for a fixed k how would performance increase or decrease and vice versa for a given noise level how would k affect performance the actual standard deviation of the noise is not mentioned in any of the experiments as far as i could tell what does the decoder produce when taking a trained c on a given image and changing the source noise tensor i think that would shed light on what structures are learned and how they propagated in the image possibly more than figure 6 which should really have something to compare to because its not very informative out of contextdocsepthe paper builds upon deep image prior dip work which shows that one can optimize a neural generator to fit a single image without learning on any dataset and the output of the generator which approximates the image can be used for denoising super resolution etc the paper proposes a new architecture for the dip method which has much less parameters but works on par with dip another contribution of the paper is theoretical treatment of a simplified version of the proposed architecture showing that it cant fit random noise and thus maybe better suited for denoising the paper is clearly written and the proposed architecture has too cool properties its compact enough to be used for image compression and it doesnt overfit thus making early stopping notnesesary which was crucial for the original dip model i have two main concerns about this paper first it is somewhat misleading about its contributions its not obvious from abstractintroduction that the whole model is the same as dip except for the proposed architecture specifically the first contribution listed in the introduction makes it look like this paper introduces the idea of not learning the decoder on the dataset the one that starts with the network is not learned and itself incorporates all assumptions on the data my second concern is about the theoretical contribution on the one hand i enjoyed the angle the authors tackled proving that the network architecture is underparameterized enough to be a good model for denoising on the other hand the obtained results are very weak only one layered version of the paper is analysed and the theorem applies only to networks with less than some threshold of parameters roughly the theorem states that if for example we fix any matrix b of size eg 256 x k and matrix u of size 512 x 256 and then compute u relub c where c is the vector of parameters of size k x 1 and if k 25 ie if we use at most 2 parameters then it would be very hard to fit 512 iid gaussian values ie minc u relub c eta where eta n0 1 this restriction of the number of parameters to be small is only mentioned in the theorem itself not in the discussion of its implications also the theorem only applies to the iid noise while most natural noise patterns have structure eg jpeg artifacts broken pixels etc and thus can probably be better approximated with deep models since the paper manages to use very few parameters btw how many parameters in total do you have can you please add this number to the text it would be cool to see if second 
order methods like lbfgs can be applied here some less important points fig 4 is very confusing first it doesnt label the x axis second the caption mentions that early stopping is beneficial for the proposed method but i cant see it from the figure third i dont get what is plotted on different subplots the text mentions that a is fitting the noisy image b is fitting the noiseless image and c is fitting noise is it all done independently with three different models then why does the figure says test and train loss and why dip loss goes up it should be able to fit anything right if not and its a single model that gets fitted on the noisy image and tested on the noiseless image then how can you estimate the level of noise fitting gc eta should be high if gc x also in this quote in fig 4a we plot the mean squared error mse over the number of iterations of the optimizer for fitting the noisy astronaut image x ie formula the formula doesnt correspond to the text and finally the discussion of this figure makes claims about the behaviour of the model that seems to be too strong to be based on a single image experiment i dont get the details of the batch normalization used with respect to which axis the mean and variance are computed the authors claim that the model is not convolutional but first its not obvious why this would be a good thing or a bad thing for that matter second its not exactly correct as noted in the paper itself the architecture uses 1x1 convolutions and upsampling which combined give a weak and underparametrized analog of convolutions the deep decoder is a deep image model g r n r n where n is the number of parameters of the model and n is the output dimension which is typically much larger than the number of parameters n n i think it should be vice versa n n the following footnote specifically we took a deep decoder g with d 6 layers and output dimension 5125123 and choose k 64 and k 128 for the respective compression ratios uses unintroduced at that point notation and is very confusing it would be nice to have a version of figure 6 with k 6 so that one can see all feature maps in contrast to a subset of them im also wondering is it harder to optimize the proposed architecture compared to dip the literature on distillation indicates that overparameterization can be beneficial for convergence and final performance
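The decoder the reviewers describe can be summarized in a few lines. The sketch below is a minimal forward pass with fixed input noise, per-layer channel mixing (equivalent to a 1x1 convolution), relu, nearest-neighbour upsampling, channelwise normalization, and a final sigmoid; the number of layers, the channel count k, the upsampling scheme, and the normalization placement are illustrative assumptions, and fitting would then optimize the mixing weights (not the noise) to minimize the squared error to the target image.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_norm(t, eps=1e-5):
    # per-channel normalization (the "layer-wise" normalization the review distinguishes
    # from true batch norm: there is no batch dimension here)
    mu = t.mean(axis=(1, 2), keepdims=True)
    sd = t.std(axis=(1, 2), keepdims=True)
    return (t - mu) / (sd + eps)

def upsample(t):
    # nearest-neighbour upsampling by 2 in both spatial dimensions
    return t.repeat(2, axis=1).repeat(2, axis=2)

def deep_decoder(weights, z):
    t = z                                        # fixed noise tensor of shape (k, h, w)
    for W in weights[:-1]:                       # W mixes channels: a 1x1 convolution
        t = np.einsum('oc,chw->ohw', W, t)
        t = np.maximum(t, 0.0)                   # relu
        t = upsample(t)
        t = channel_norm(t)
    t = np.einsum('oc,chw->ohw', weights[-1], t) # map to 3 output channels
    return 1.0 / (1.0 + np.exp(-t))              # sigmoid keeps pixels in [0, 1]

k, n_layers = 64, 4
z = rng.normal(size=(k, 8, 8))                   # the source noise stays fixed throughout fitting
weights = ([0.1 * rng.normal(size=(k, k)) for _ in range(n_layers)]
           + [0.1 * rng.normal(size=(3, k))])
img = deep_decoder(weights, z)                   # shape (3, 128, 128); only `weights` are fitted to the target
print(img.shape)
```

Because the upsampling spreads each mixed value over a growing neighbourhood, the effective receptive field grows with depth even though every mixing step is 1x1, which is the point the first reviewer makes about the model being more convolution-like than the name suggests.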
### Summary:
in this work the authors propose a simple underparameterized network architecture which can fit natural images well when fed with a fixed random input signal. this allows the model to be used for a number of tasks without requiring that the model be trained on a dataset. further, unlike a recently proposed related method, dip (ulyanov et al 18), the method does not require regularization such as the early stopping needed with dip. the reviewers noted the simplicity and experimental validation and were unanimous in recommending acceptance.
[input_ids: integer token-id encoding of the Input text above, beginning 30003, 310, 1677, 2278, ...; omitted for readability]
[attention_mask: all-ones sequence (every position attended); omitted]
[labels: token-id sequence beginning with the same ids as input_ids (30003, 310, 1677, ...); omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper presents an approach for only partially grounding classical planning tasks which are too large to be fully grounded by common grounding algorithms. the approach uses machine learning techniques to estimate the probability of operators belonging to a plan of the task, using information from small instances of the same domain. these operators are then considered first for grounding, which can be stopped early if otherwise risking to run out of resources. the resulting partially grounded task is not guaranteed to be solvable even if the original task was. an experimental evaluation shows that the approach works well in several ipc domains where very large tasks can be solved with partial grounding.

since this paper is already published at aaai and the submitted version is identical and not a longer technical-report style variant, i will refrain from writing a full review. while the paper does not directly fall into the category of search or heuristics for planning, it addresses the problem of solving domains that are challenging because of their size, and as such it also fits the scope of the workshop. the paper is very well written and easy to follow, and hence my recommendation is a clear accept for the workshop.

if i had to point out anything that could be improved, in case the authors would actually like to turn this into a one page longer extended paper, then i would suggest to include a description of the used machine learning techniques, because many researchers at icaps are probably not very familiar with these. more importantly, i think it would be interesting to discuss why particular ml methods worked or didnt work for the purpose of the paper, or, if that cannot be explained easily, at least state what differences were observed, what parameters ended up being used, and how this affected results.

short summary of the paper: this work introduces partial grounding techniques for planning tasks, shows how machine learning techniques can be used to prioritize the operator order of the grounding process, and presents an empirical evaluation of the techniques on multiple planning domains.

detailed review: in planning, most planners nowadays perform grounding as a preprocess step before search, therefore even the strongest search algorithms wont be of use if the planner is not even able to complete the grounding step. this was indeed the case in some problems of the latest international planning competition, ipc 19, therefore this work investigates a very relevant field for planning and fits in the scope of the workshop. the paper reads very well and does a good job in presenting the problem and the idea of partial grounding and operator ordering. related work is cited, although it could relate itself to the recent introduction of action schema networks (toyer et al 2018), which also apply machine learning techniques to the lifted task representation, although these are used to guide search. nevertheless the presented techniques are novel and for the most part clearly defined. an extensive empirical evaluation shows how the different techniques compare to each other and shows that the partial grounding techniques can significantly increase the coverage of a planner, although there is not one dominating technique. all in all this paper presents highly relevant work to the field of classical planning and i do not see a reason to not accept this work at the workshop.

my only real criticism is that the presentation of the ilp and the classification approach is somewhat informal and a bit convoluted, while the remainder of the paper is very well and fluently written. i had to reread this section several times before fully understanding the underlying concepts. i think either a more formal setting or a more detailed example on both approaches could be helpful for the reader.

sam toyer, felipe w trevizan, sylvie thibaux, lexing xie: action schema networks: generalised policies with deep learning. aaai 2018, 6294-6301.

minor comments:
- i would argue that a stopping condition is a condition on when the algorithm stops, but in this work it is a condition on when the algorithm continues
- "let nop be a constant: require the algorithm to continue while ...": the colon does not really make sense here; maybe just end the sentence and start with "we require ..."
- the text of figure 1 is hard to read on printed paper, consider the use of bold font
- fast downward is cited twice
- ridder and fox 2014 2014: duplicate year
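to make the operator-ordering idea that both reviews describe concrete, here is a minimal sketch of score-guided partial grounding with early stopping; it is an illustration only, not the authors' code, and the operator names, scores, instantiation table and action budget are toy assumptions:

```python
import heapq

def partially_ground(operators, instantiations, score_fn, max_ground_actions=8):
    """Ground the highest-scoring operators first and stop at a fixed budget.

    operators: list of operator names; instantiations: dict mapping an operator
    name to its ground actions; score_fn: stands in for a model trained on small
    instances that estimates how likely an operator is to be part of a plan.
    The returned task may be unsolvable because grounding can stop early.
    """
    # max-heap via negated scores; the index breaks ties between equal scores
    queue = [(-score_fn(op), i, op) for i, op in enumerate(operators)]
    heapq.heapify(queue)

    ground_actions = []
    while queue and len(ground_actions) < max_ground_actions:
        _, _, op = heapq.heappop(queue)
        ground_actions.extend(instantiations[op])  # may stop before the queue is empty
    return ground_actions

# toy example: 'drive' is predicted far more plan-relevant than 'debark'
ops = ["drive", "board", "debark"]
insts = {op: [f"{op}-{k}" for k in range(4)] for op in ops}
scores = {"drive": 0.9, "board": 0.6, "debark": 0.1}
print(partially_ground(ops, insts, scores.get, max_ground_actions=8))
```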
### Summary: | dear authors thank you very much for your submission we are happy to inform you that we have decided to accept it and we look forward to your talk in the workshop please go over the feedback in the reviews and correct or update your papers in time for the camera ready date may 24 best regards hsdip organizers | [
Below is given a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
this paper focuses on the network load balancing problem in data centers using a multiagent rl paradigm. the main goal in load balancing problems is to minimize the makespan. the authors prove various properties of the setting, with the main result being that such a setting is a markov potential game. they showed this result via properly defining a workload distribution fairness potential function. moreover, using facts established in leonardos et al, they design a distributed algorithm to approximate nash equilibrium policies. the authors provide an extensive experimental section that suggests that the proposed algorithm is effective.

pros: the paper is interesting, with both theoretical and applied merits, and an interesting modeling of the network load balancing problems in data centers as a marl system.

cons: the result that the proposed framework is a markov potential game is not very surprising, as load balancing games are known to be potential games, see koutsoupias and papadimitriou 99.

this work has no negative societal impact as far as the reviewer can foresee.

this paper proposes an mpg-based marl solution for the load-balancing problem. applying rl directly for load balancing is not favorable as the load balancers, ie multiple agents, need to synchronize observations, and the action space grows with the number of agents, requiring retraining etc.

strengths:
1 does not require retraining with increasing number of multiple agents, as the proposed approach decomposes the joint state and action space
2 does not require synchronization between the load balancers

weaknesses:
1 poor evaluation: no scaling experiments, ie increasing number of lbs or servers; only 2 lbs in evaluation. if the paper had scaled the experiments more, they would find that their approach may not be practical for real dcs (discussed later). no real traffic workload: the supposedly real benchmark is a mocked up small testbed that does not mimic real distribution of traffic or scale. experimentally weak: no variation in traffic and limited variation of io/cpu. what about qos 99th percentile behavior, an important metric to evaluate for lbs?
2 invalid and inconsistent assumptions wrt problem statement: in the intro the paper claims that existing algorithms are not adaptive due to dynamic environments, yet the assumption made in this paper is that each server is capable of processing a certain amount of workload vj, which is a number only dependent on server capabilities and not on the request or traffic type itself. for example, a get request of type x can take 2s whereas another get request of type y can take 20s. the paper mentions collided elephants and yet does not provide any experiments that the proposed technique can handle such situations. in other words, vj should be stochastic and not fixed on just the server but also on the request characteristics. the previous assumption invalidates most of the derivation presented later in the paper. another assumption is that active probing is impractical, however it is okay for lbs to communicate with servers to observe the server state. why? there is no citation or experiment showing that indeed that is a reasonable assumption, and all of the work is based on this key assumption.
3 limited insights: why markov potential games for handling the stated problem? why not use a mean-field theorem to approximate the behavior of all the other agents using mean or median behavior? overall the paper reads as an application of mpg rather than there being a nice solution to the load-balancing problem. insights and analysis of different approaches are missing. no evaluation of the overhead of the rl vs marl solution in terms of performance as well as overhead to justify that marl is needed over rl. there are several solutions proposed, such as rlbsac (neurips 2021), which reports similar high performance, and park, an open platform for learning-augmented computer systems. what happens if rl makes bad decisions? safety of rl: towards safe online reinforcement learning in computer systems, neurips 2021.
4 writing needs significant improvement, esp the introduction and related work section: citations on key assumptions/claims, or experiments to make those statements; abbreviations introduced without describing what they stand for, eg ne for mpg.

limited evaluation; strong assumptions for a practical system; no comparison with other ml-based approaches.

this paper considers the load balancing problem in a network of multiple heterogeneous servers and multiple load balancers. the authors formulate the problem as a multi agent reinforcement learning problem and specifically consider a markov potential game. the setting is that of multiple load balancers, each responsible for sending jobs to a set of servers. there might be overlaps in the set of servers the various load balancers serve, and the load balancers thus have partial observability of the system state. using the cumulative total fairness as the potential function, where fairness is defined as either variance fairness or product fairness, the authors show that the job allocation game, where the objective is to minimize the makespan while maximizing the variance fairness or product fairness, is a markov potential game.

a network with multiple load balancers managing load to multiple and overlapping servers is a complex problem. the interactions between the load balancers are such that a closed form solution to the balancing problem is not evident. this approach of setting a potential game within an rl environment is interesting and seems novel. the authors propose a distributed load balancing method where each agent independently learns a policy through policy gradient methods. the reward function is set to be the per-lb variance or product fairness. the authors show that maximizing these local fairness metrics is sufficient for minimizing makespan, a global metric.

the exact model with respect to overlap of servers among the load balancers is not clear. are all servers allocated jobs by all lbs? what do the results look like with partial overlaps? this seems to be a harder problem. the experimentation doesnt include comparison with classical methods such as lsq. it would be interesting to see how a distributed blind greedy lsq compares to the distributed marl method proposed here, especially since the computation costs are so vastly different.

this paper proposes a distributed multiagent reinforcement learning based approach for load balancing at the network layer, formulated as a markov potential game. current network load balancers have limited observability over the workloads and servers performance and are prone to misconfiguration due to heterogeneity and elasticity; centralized approaches (ctde) incur an additional overhead from centralized communication. this work addresses this by using a local variance-based fairness function in each load balancer which, when maximized, can minimize the potential function of the markov potential game; this approximates the nash equilibrium of the game.

strengths:
- significant gains by using the proposed design over current in-production load balancing algorithms
- strong theoretical foundation of formulating load balancing as a multiagent rl-based markov potential game
- well written paper that puts the pieces of the design in an easy to understand order

weaknesses:
- dcs typically have high bandwidth for internal communication; the paper states that centralized communication leads to heavy overhead, which is not convincing in the main paper. the evaluation section mentions it in passing as being evaluated in the appendix, but i feel it would be helpful to show in the main paper
- not sure if i agree with largescale dc networks having only 20 servers; the largest of data centers have thousands of servers and load balancers. this makes the realworld setup slightly less impressive
- fault tolerance is not evaluated in the paper, in terms of failed requests leading to incorrect job completion estimates for the next time period, network partitions, etc
- the paper doesnt seem to address elastic setups, even though their motivation included both heterogeneous and elastic infrastructures
- simulator not as complex as realworld (addressed in paper); it still allows to test parts of the system without stochastic network parameters (these could be synthetically injected though)
- the need for low communication overhead in a dc is not motivated strongly; centralized methods (qmix for example) still show comparable performance in some application setups

the paper explores the task of multiagent network load balancing via formulation as a markov potential game, using workload distribution fairness as a potential function. a marl algorithm is proposed based on this formulation and provides for fully-decentralized learning. the paper further develops an event-based simulator which, along with a realworld network setup, is used to evaluate the proposed algorithm against several marl baselines.

strengths:
- rigorous formulation of network load balancing as mpg, with proofs that appear sound
- generally interesting and well-motivated application for marl with promising potential

weaknesses:
- concern regarding representativeness of baselines used for evaluation
- practical benefits in terms of communication overhead / training time could be more strongly motivated

detailed comments: overall the paper was interesting to read and the problem itself is well motivated. the formulation of the problem as an mpg appears sound and offers a variety of important insights with promising applications. there are however some concerns regarding evaluation fairness and practical benefits. the baselines used for evaluation do not seem to accurately represent the state of the art in ctde; in particular, there have been a variety of recent works that explore more efficient strategies, eg 1-3, and consistently outperform qmix with relatively low inter-agent communication. although the proposed work appears effective as a fully-decentralized approach, it is unclear how well it would perform against more competitive ctde baselines; comparison against these more recent works would greatly improve the strength of evaluation. benefits in terms of reduced communication overhead could also be more strongly motivated: presumably communication between agents could be done over purpose-built inter-lb links, thus avoiding qos degradation due to contention on links between lbs and servers, and even without inter-lb links the increase in latency demonstrated in appendix e22 appears relatively low. robustness against dynamic changes in network setup is discussed to some degree, but its unclear how significant this issue is in a realworld environment: even in a largescale setup, the number of lbs/servers is likely to remain fairly constant at the timescales considered in this work, ie minutes. given this, it seems that the paper should at least discuss tradeoffs with a longer training time, which could impact the relative benefits of various approaches.

some confusion in notation: algorithm 2 l8 should be t = 1...h for the horizon; l100, m denotes the set of lbs.

minor notes: some abbreviations are not defined, eg ne on l73; the superscript notation in eq 6 is not defined until much later (l166), which hindered understanding in an initial read.

1 s zhang et al, efficient communication in multiagent reinforcement learning via variance based control, neurips 2019
2 z ding et al, learning individually inferred communication for multiagent cooperation, neurips 2020
3 t wang et al, learning nearly decomposable value functions via communication minimization, iclr 2020

na
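as a concrete reading of the per-lb variance fairness reward that several of these reviews refer to, the following sketch shows one way such a local reward could be computed; the normalization and the observation format are assumptions made for the illustration, not the paper's actual definition:

```python
import numpy as np

def variance_fairness_reward(server_loads):
    """Local reward for one load balancer: larger (closer to zero) when the
    workload it observes is spread evenly over its servers."""
    loads = np.asarray(server_loads, dtype=float)
    normalized = loads / (loads.mean() + 1e-8)
    return -np.var(normalized)

# toy check: a balanced assignment versus a collided "elephant" flow
print(variance_fairness_reward([4, 4, 5, 3]))   # approx -0.03
print(variance_fairness_reward([12, 1, 1, 2]))  # approx -1.34
```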
### Summary: | the paper received an uniformly positive evaluation although all the scores are in the borderline weak accept range the authors included a long and comprehensive rebuttal and actively participated in the discussion which made some of the reviewers updating their scores i recommend the paper to be accepted but i understand the decision could be reverted when comparing the paper with the other candidates | [
Below is given a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
the authors present an imaginary coordinator agent that executes graph selection as its action, based on possible combinations of pairwise edges and the underlying individual and pairwise utilities. the results show some effectiveness of the presented method, sopcg, in simple examples. the main strength of the paper lies in its algorithm being of polynomial-time nature, and that characteristic needs much more emphasis in both text and figures. does sopcg reach peak performance faster? is its peak performance higher? these could be some guideline questions when structuring the paper contents around the polynomial-time algorithm.

q1: i would appreciate if the authors would discuss the relevance of a www 2020 paper, "how much and when do we need higher-order information in hypergraphs? a case study on hyperedge prediction" by yoon et al. i think it could also provide some useful insights as to how high of an order the hyperedges need to be to save some function expressiveness, ie representational capacity, at the cost of time complexity.

q2: how did the authors go about designing the didactic examples? how simple of a task is the sweet spot for sopcg? as task complexity is altered, when does sopcg begin losing its simplicity advantage, and when do other models take over? which measures would you say most greatly determine the task complexity which in turn compromises sopcgs performance?

q3: how would you position sopcg in the marl research? since the graph selector agent is there at both training and execution times, would sopcg be a centralized training, centralized execution work? if it is, comparison against ctde works such as vdn and qmix would not be fair; it is as though a free centralized coordinator is helping out the sopcg agents at execution time as well. it may be a good idea to compare against some communication-enabled marl works.

q4: despite starting out strong, the paper seems to fall off rather dramatically, especially when it comes to the didactic nature of the examples, tied in with the page 6 remark that sopcg would perform best, given its tradeoff, in tasks where the restricted graph classes are enough to express the coordination dependencies. this remark really sounds like going back on the vdn/qmix/qtran line of research, whose focus was about covering a richer class of joint action-value functions. going back on that trend now, only to pursue the polynomial-time nature of the running algorithm, would in my opinion require far more diverse evaluation examples, backed by a stronger motivation highlighting realworld threats of all the other marl algorithms taking longer than polynomial time. as is, sopcg does not contend amazingly against other marl algorithms that chose the "np-hard curse of dimensionality? fine, we'll approximate, approximate, approximate" path rather than the "polynomial time is our topmost priority, function expressiveness can wait" path. that leads me back to the question of why pursue polynomial time at the cost of losing both the function expressiveness and the peak performance in the apparent trilemma.

my biggest concern is the imaginary agent freely collecting information, making decisions, and delivering those decisions to all the agents at both the training and execution times. comparison against the chosen baselines is not fair, even when the chosen evaluation task is a simple enough one in which sopcgs limited representational capacity would not appear that pronounced.

this paper proposes an extension of deep coordination graph, called self-organized polynomial-time coordination graphs (sopcg). instead of the prespecified graph topology used in dcg, their method allows the graph topology to be state-dependent, which is achieved by a coordinator agent, and the optimization of this agent is incorporated in a modified temporal difference learning paradigm. two prespecified undirected acyclic graph classes are used to ensure polynomial-time graph selection and accurate greedy action selection. the results on sensor network, grid world and mpe show that such a tradeoff between the representational capacity of the graph topology and the computational accuracy can improve the performance of marl and learn meaningful graph topology.

strengths:
1 to determine a state-dependent coordination graph is interesting
2 incorporating graph selection into td learning is desirable

concerns/questions:
1 a state-dependent coordination graph needs to be determined at each time step in both training and execution, which means sopcg is a centralized method; thus a baseline of single-agent rl is desired for comparison in experiments
2 it is not clear how qi and qij are learned; are they parameter-sharing for agents?
3 the operation argmax in equation 4 takes o(n^3) for the graph class gp; it may be too costly for both training and execution. one evidence is that each run of sopcg takes up to 25 days in these simple experimental tasks
4 since both dcg and casec include smac experiments, it would be better to also include it here to show the performance of sopcg in complex environments

in summary, it is currently hard to see the benefit of determining the coordination graph in a centralized way; moreover, the proposed method is not verified in complex environments.

minor comments:
1 as claimed in appendix c, a graph relabeling technique is used to solve the extra overestimation error introduced by the additional max operator over graphs; however, this paper is currently missing an ablation to validate this point

a state-dependent coordination graph is important; however, the paper currently has several weaknesses as mentioned above, and it seems clearly below the bar of iclr.

this paper introduced a novel method called self-organized polynomial-time coordination graphs (sopcg), aiming to handle the decentralized constraint optimization problem (dcop). this paper is well organized and the experiments are explicitly presented, therefore i think the work of this paper is very interesting and the contributions are sufficient. the detailed comments regarding the quality of this paper are listed as follows:
1 in the introduction section, one of the biggest concerns is that this paper cannot tell the novelty of the proposed sopcg compared with the methods in peer research; besides, the illustration of experimental results is not convincing enough to prove the advantages of the proposed sopcg in this section
2 somewhere in the paper, the authors presentation is unclear and is worth being improved; for example, in the background section the meanings of some symbols in the model are not clear, and furthermore the authors should give the purposes of introducing formula 2
3 what is the intuition behind the process of investigating polynomial-time coordination graphs in section 4, or is there any intuition at all? how did you come up with that idea? have you borrowed this idea from somewhere else? it is better to give a detailed explanation about polynomial-time coordination graphs
4 in page 6 the authors proposed self-organized polynomial-time coordination graphs; whatever techniques are used in the manuscript, there is a need to tabulate the computational cost of the proposed algorithm in this paper
5 in the experiments section, the authors claimed
that the graph structures learned by sopcg definitely match the groundtruth demands for effective collaboration however it is not clear to us why the method can be used in demonstrating the ability of the proposed approach to organizing coordination relations therefore the authors should provide a detailed explanation about the above issue the work presented is indeed interesting and relevant to the real scenarios i recommend that this paper could be accepted
### Summary: | description of paper content the paper studies the problem of achieving coordination among a group of agents in a cooperative multiagent task coordination graphs reduce the computational complexity of this problem by reducing the joint value function to a sum of local value functions depending on only subsets of agents in particular the qfunction of the entire system is expanded up to secondorder in agent interactions q sumi in n qi sumij in g qij where the qi is function of the ith agents history and current action and qij is a function of two agents histories and current actions as g does not include higherorder third and above terms the algorithm does not have exponential dependence on the number of agents if g includes only a subset of pairs of agents then the computational complexity is reduced to less than quadratic since the coordination problem is cooperative the authors propose a metaagent coordinator that selects the graph g in a dynamic statebystate fashion in order to maximize return the optimization problems of the metaagent and the subagents are performed by deep qlearning summary of paper discussion the critical comment made by one reviewer was going back on that trend now only to pursue the polynomialtime nature of the running algorithm would in my opinion require far more diverse evaluation examples backed by a stronger motivation highlighting realworld threats of all the other marl algorithms taking longer than polynomial time as is sopcg does not contend amazingly against other marl algorithms that chose the nphard curse of dimensionality fine well approximate approximate approximate path rather than the polynomial time is our topmost priority function expressiveness can wait path that leads me back to the question of why pursue polynomial time at the cost of losing both the function expressiveness and the peak performance comments from area chair looking at the experiments the number of agents in the empirical problems is not large for example there are 15 agents in sensor any focus on computational complexity at this scale is hard to justify especially with algorithms that are approximate it seems favorable at this small scale to use function approximators that can take in all the agents histories and actions this obvious baseline is not included in comparisons it is hard to justify inclusion of this paper in the conference | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, … ] | [
1, 1, 1, … ] | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, … ]
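Each row of this dump pairs the prompt-plus-reviews text and its summary with three integer sequences: the token ids of that text, an attention mask of ones (nothing is padded), and a labels sequence whose visible entries duplicate the token ids, the usual causal-language-modelling convention of supervising on a copy of the inputs. The sketch below shows how such a row could be rebuilt with a Hugging Face transformers tokenizer; it is a minimal illustration under stated assumptions: the tokenizer name, the MAX_LEN cap, and the build_row helper are placeholders, not the pipeline that actually produced this data.

# Minimal sketch (illustrative assumptions only): rebuild one row's token ids,
# attention mask and labels from its review text and summary text.
from transformers import AutoTokenizer

TOKENIZER_NAME = "gpt2"   # placeholder; the tokenizer used for this dump is not identified here
MAX_LEN = 2048            # assumed truncation cap

tokenizer = AutoTokenizer.from_pretrained(TOKENIZER_NAME)

def build_row(input_text: str, output_text: str) -> dict:
    # The prompt text already ends with "### Summary:", so the summary is appended directly after it.
    enc = tokenizer(input_text + " " + output_text, truncation=True, max_length=MAX_LEN)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],   # all ones when no padding is applied
        "labels": list(enc["input_ids"]),          # causal-LM style: labels mirror the inputs
    }

Calling build_row(prompt_and_reviews, summary) on the text of a row returns the same three fields seen above; the attention mask is all ones because a single unbatched example needs no padding.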
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
this work introduces a novel approach to tackling object rearrangement the main idea is to learn a scoring function in the form of a gradient field the learned gradient field could be used by a modelbased planning algorithm to find collisionfree arrangement plans or as a reward function to train rl agents experiment results show promising results of the proposed method for rearranging a room and sample efficient policy learning strength 1 the paper is wellwritten with a clear introduction of the implementation details 2 i really enjoy reading the paper following the motivation of how the learned gradient field could be used to tackle three major challenges of object arrangement which is easy to follow and makes a lot of sense 3 the velocitybased action space of the designed task showcases that the proposed method could be used in complex continuous control scenarios the discussion of collision avoidance planning with orca further strengthens the practicality of the proposed approach in realworld tasks 4 it is impressive to see such a gradientbased scoring function could be used as a reward function for training the rl agent also the gradientbased action could further alleviate the burden of the action generation network so that only a small residual policy network needs to be trained which largely improves the sample efficiency of the policy learning process concerns 1 how does the rl residual policy learn to avoid collision as mentioned in the modelbased planning method the gradient field does not have environment information so the gradient cannot avoid collision since the residual policy is based on gradientbased action does it need an additional collision penalty reward during training in the paper you mentioned the centralized and decentralized reward but more details here would be better 2 the gradient field might struggle to handle multimodel distributions what if there are multiple arrangement goals for the same environment will the gradient become a mean of different distributions which will mislead the planner or the policy learning i feel a probabilistic model would make more sense for this situation 3 the authors mentioned that the roomarrangement task is too complex for planningbased methods i would suggest trying with bilevel planning which first plans highlevel arrangements plans with the learned gradient and then use a collisionavoidance trajectory planner to optimize the path to avoid the collision 1 might fail to handle multimodel goal distributions 2 missing some details of the experimental setups in the main paper docsep post rebuttal i thank the authors for their response i have increased my score pre rebuttal this paper attempts to tackle the problem of learning to rearrange a set of objects to a sensible state without a predefined reward function towards this end they propose target gradient fields targf targf estimates the gradient of the likelihood that the current environment state is a goal or sensible state for the objects targf is learned by randomly permuting correct arrangements the output of targf is used to estimate the best action to take at every timestep andor used to estimate the reward the method evaluated in two scenarios in the first a set of balls must be rearranged into either a circle clusters by color or a combination in the second the furniture in rooms must be rearranged back to its initial states strengths the proposed method is able to learn solely from examples and does not require any additional data other than correct examples negatives 
are generated by adding noise to the correct examples in the circle scenario the proposed method outperforms various baselines across a series of metrics namely collision count and success weaknesses the evaluation is very simple there isnt an agent that rearranges these objects instead the objects all rearrange themselves this is a good initial evaluation that shows the method provides a reasonable reward signal but it leaves a lot of questions since this reduces the timehorizon isnt representative of how the objects would be moved by a single agent one object moves at a time and removes the initial exploration to find the objects an evaluation that is closer to a real scenario would be helpful ideally the authors would evaluate on something like hab szot et al 2021 or ai2thor rearrangement the residual policy derived from the gradientbased action seems quite important to overall performance of the model free approach however this residual policy requires an oracle entitybased representation of the state requires all object positions object categories and bounding boxes while it is fine to assume this at training time this is also required at evaluation time i am concerned by the assumption that such a representation is available as the task complexity increases on this topic it is unclear if the baselines receive the same information the supplement says they operate on state but state is defined just as object position not position category bounding box in the paper finally in the most realistic scenario room arrangement the proposed method likely doesnt outperform gail sac by a statistically significant margin can the authors report confidence intervals instead of variance limitations are addressed docsepthis paper learns a reward function score function for objective rearrangement tasks making use of recent advancements in scorebased generative modeling the authors term this score function a target gradient field and show that it can be used with either a path planner or a reinforcement learning algorithm to solve object rearrangement tasks in a toy setting ie where you can directly control object velocities the authors compare against a number of baselines and competing methods in these simulated object rearrangement domains and show that their method performs well across several different metrics strengths learning reward functions is an interesting application of scorebased generative modeling and has not been explored before as far as i know proposed approach performs well against other reward learning approaches on the objectrearrangement task weaknesses the paper is somewhat limited in scope it is only applied to a very specific robotics problem that of object rearrangement and even here some major simplifying assumptions had to be made such as the fact you can directly control the velocity of any object the paper only shows results in lowdimensional domains small graphs since scorebased generative modeling also works in highdimensional domains such as images it would be interesting to see if the method can be used for reward learning from scene images for example while the paper is nicely structured it contains a large number of typos and grammatical errors these errors must be fixed before the paper is ready for publication i will provide a nonexhaustive list here but i would also suggest making useful of professional proofreading services if possible typos and grammatical errors we do also tries to find a examplebased control method differently our proposed approach focus on 
learning the target gradient fields from the examples besides both rl method and traditional planner can be supported by our target gradient fields in the object arrangement task even if we are accessible to the target distribution in address these problems reducing the risk of objects being collided i dont think the authors have done a good job of addressing the limitation of the work in the limitations section the only limitation that the authors mentioned is the fact that their work only considered planar settings and not 3d settings there is only a very brief mention of the fact that future work could use real robots i think the authors should provide a more detailed limitations section by reducing space used elsewhere some ideas for limitations can be seen in the weaknesses section above a somewhat more comprehensive plan on how the method could be applied to more realistic settings would also be useful
### Summary: | after a strong rebuttal from the authors and an extensive discussion among the reviewers i believe this work will be a valuable contribution to neurips i recommend it for acceptance and encourage the authors to address the reviewers comments for the cameraready version of the paper especially the point about the simplistic evaluation of the method please consider a more realistic evaluation scenario | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, … ] | [
1, 1, 1, … ] | [
30003, 310, 1677, 2278, 273, 247, 2561, 2929, …
285,
921,
326,
616,
1332,
17923,
973,
2439,
2067,
1027,
17082,
50275,
296,
3755,
20556,
50276,
28269,
10921,
3470,
310,
271,
4722,
2898,
273,
4868,
3169,
1006,
800,
14053,
285,
556,
417,
644,
14859,
1078,
347,
2080,
347,
891,
871,
50275,
856,
7334,
2746,
17923,
973,
1411,
643,
10921,
4715,
7274,
327,
253,
1789,
250,
3298,
606,
1003,
4836,
50275,
20881,
1255,
265,
50276,
783,
2929,
310,
8489,
3710,
275,
7990,
50276,
262,
310,
760,
3732,
281,
247,
1077,
2173,
15688,
982,
1895,
326,
273,
1789,
47410,
285,
1014,
1060,
690,
2201,
8077,
5411,
13260,
574,
281,
320,
1160,
824,
347,
253,
958,
368,
476,
3587,
1453,
253,
7602,
273,
667,
1789,
50275,
783,
2929,
760,
2722,
1543,
275,
1698,
6967,
10625,
1355,
14580,
1580,
4868,
3169,
1006,
800,
14053,
671,
2987,
275,
1029,
6967,
10625,
824,
347,
3888,
352,
651,
320,
4722,
281,
923,
604,
253,
1332,
476,
320,
908,
323,
10921,
4715,
432,
6200,
3888,
323,
1650,
50275,
6050,
253,
2929,
310,
23395,
18872,
352,
4428,
247,
1781,
1180,
273,
963,
993,
285,
47412,
474,
6332,
841,
6332,
1364,
320,
4229,
1078,
253,
2929,
310,
4704,
323,
9311,
891,
588,
2085,
247,
44382,
8648,
422,
1618,
1060,
533,
891,
651,
671,
1804,
2403,
4217,
273,
5702,
4737,
24042,
3238,
604,
1896,
50275,
555,
993,
285,
47412,
474,
6332,
50276,
664,
513,
671,
14177,
281,
1089,
247,
1650,
3169,
1453,
1332,
50276,
19623,
314,
776,
4081,
2746,
2770,
327,
4715,
253,
2303,
11786,
4910,
432,
253,
6667,
50276,
67,
11587,
1097,
391,
77,
1332,
285,
5899,
499,
9582,
476,
320,
4516,
407,
776,
2303,
11786,
4910,
275,
253,
1789,
11461,
4836,
50276,
9154,
604,
359,
403,
12482,
281,
253,
2303,
3268,
50276,
249,
2953,
841,
3237,
50276,
17773,
272,
253,
2495,
273,
5113,
1146,
3007,
1356,
50276,
74,
13414,
1158,
253,
4477,
452,
2218,
247,
1175,
2628,
273,
15974,
253,
12291,
273,
253,
789,
275,
253,
7364,
2593,
253,
760,
12291,
326,
253,
4477,
5393,
310,
253,
958,
326,
616,
789,
760,
2783,
23601,
7533,
285,
417,
495,
69,
7533,
627,
310,
760,
247,
1077,
4864,
3748,
273,
253,
958,
326,
2852,
789,
812,
897,
1524,
25497,
891,
1158,
253,
4477,
943,
2085,
247,
625,
7000,
7364,
2593,
407,
8493,
2317,
908,
11358,
690,
5697,
323,
7364,
476,
320,
2326,
275,
253,
32213,
2593,
1840,
247,
8489,
625,
11088,
2098,
327,
849,
253,
1332,
812,
320,
3732,
281,
625,
15958,
7533,
651,
671,
320,
4217,
50276,
187,
187,
4118,
18435,
27,
6438,
247,
2266,
30080,
22559,
432,
253,
4477,
285,
271,
9470,
5955,
2190,
253,
30628,
891,
2868,
436,
789,
588,
320,
247,
9865,
7680,
281,
5723,
2824,
891,
5583,
352,
323,
14924,
285,
11907,
253,
4477,
281,
2953,
253,
30628,
5701,
323,
253,
4049,
254,
609,
5102,
2715,
273,
253,
2929,
3340,
253,
1127,
670,
253,
8077,
2531,
7103,
273,
253,
1332,
50276,
32897,
1908,
247,
625,
15958,
7103,
10076
] |